
Cocoindex update #656

Answered by badmonster0
Psylence0609 asked this question in Q&A


Thanks for describing the problem in detail! Sorry for the inconvenience.

This is most likely caused by an Ollama issue: ollama/ollama#8200

30 pages is long: most models won't perform well on input of that size, and it makes the Ollama bug more likely to trigger. You likely need to split the document into chunks first. Our docs_to_knowledge_graph example had an earlier version that split documents into chunks first (we simplified the example so there's no splitting step, but for large docs it should actually be split first).
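For illustration, splitting can be as simple as fixed-size chunks with overlap, breaking at paragraph boundaries where possible. The sketch below is a generic character-based splitter, not CocoIndex's actual splitting API, and the chunk_size/overlap values are placeholder assumptions:

```python
def split_into_chunks(text: str, chunk_size: int = 2000, overlap: int = 200) -> list[str]:
    """Split text into overlapping chunks, preferring paragraph boundaries."""
    if len(text) <= chunk_size:
        return [text]
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + chunk_size, len(text))
        # If we're not at the end, try to break at a paragraph boundary
        # inside the current window instead of mid-sentence.
        if end < len(text):
            boundary = text.rfind("\n\n", start, end)
            if boundary > start:
                end = boundary
        chunks.append(text[start:end])
        if end >= len(text):
            break
        # Step back by `overlap` so adjacent chunks share context.
        start = max(end - overlap, start + 1)
    return chunks
```

Each chunk can then be sent to the LLM extraction step independently, which keeps every request well under the model's effective context size.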

If it's still stuck after splitting, you may try switching to a different LLM API. OpenAI and Google Gemini are usually quite stable, and LiteLLM provides a proxy that integrates with a variety of…

Replies: 1 comment


Answer selected by badmonster0