No Sentry, llm invoke as a chain
In the end, Sentry was the culprit: its tracing fails on LangChain's OpenAI objects when the OpenAI endpoint is custom (our LLM). So the Sentry bits are commented out for now; I'll be discussing it on their Discord.
In the process I also made the llm invocation more "chainy", composing it as a chain instead of a direct call.