
No Sentry, llm invoke as a chain

Laurian Gridinoc requested to merge no-sentry into main

In the end Sentry was the culprit: it fails to trace the LangChain OpenAI objects when the OpenAI endpoint is custom (our LLM). So the Sentry bits are commented out for now; I'll be discussing it on their Discord.
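Roughly what the disabled bits look like (module and option names here are illustrative, not copied from the diff):

```python
# Sentry initialisation, commented out until the custom-endpoint tracing issue is resolved.
# import sentry_sdk
# from sentry_sdk.integrations.langchain import LangchainIntegration
#
# sentry_sdk.init(
#     dsn=SENTRY_DSN,
#     traces_sample_rate=1.0,
#     # The LangChain integration is what chokes on ChatOpenAI pointed at our custom endpoint.
#     integrations=[LangchainIntegration()],
# )
```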

In the process I also made the LLM call more "chainy" 😄, since I was trying to debug it as a chain anyway. A sketch of the new shape is below.
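A minimal sketch of the chain composition, with placeholder endpoint, model, and prompt (not the exact code in this MR):

```python
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

# Placeholders for our custom OpenAI-compatible endpoint.
LLM_BASE_URL = "https://llm.example.internal/v1"
LLM_API_KEY = "sk-..."
LLM_MODEL = "our-model"

llm = ChatOpenAI(
    base_url=LLM_BASE_URL,  # the custom endpoint that trips up Sentry's tracing
    api_key=LLM_API_KEY,
    model=LLM_MODEL,
)

prompt = ChatPromptTemplate.from_template("{input}")

# LCEL composition: each step (prompt -> llm -> parser) can be inspected on its own,
# which is what made debugging this as a chain easier.
chain = prompt | llm | StrOutputParser()

result = chain.invoke({"input": "Hello"})
print(result)
```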
