Hi guys,
I attended your webinar on GraphRAG a couple of weeks ago and tried to implement your code, but with no luck. I didn't want to use OpenAI, since I want to develop my applications locally, privately, and completely offline if possible, so my first option was to use Ollama models. I copied and pasted your code:
...
kg_builder_pdf = SimpleKGPipeline(
    llm=llm,
    driver=driver,
    text_splitter=FixedSizeSplitter(chunk_size=500, chunk_overlap=100),
    embedder=emb,
    entities=node_labels,
    relations=rel_types,
    prompt_template=prompt_template,
    from_pdf=True,
)

pdf_file_paths = [
    'data/biomolecules-11-00928-v2-trunc.pdf',
    'data/GAP-between-patients-and-clinicians_2023_Best-Practice-trunc.pdf',
    'data/pgpm-13-39-trunc.pdf'
]

for path in pdf_file_paths:
    pdf_result = await kg_builder_pdf.run_async(file_path=path)
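For completeness, the await in the loop above lives inside an async function that I drive with asyncio.run; roughly like this (the main() wrapper is mine, not part of the webinar code):

import asyncio

async def main():
    # run the KG construction pipeline over each PDF in turn
    for path in pdf_file_paths:
        pdf_result = await kg_builder_pdf.run_async(file_path=path)
        print(pdf_result)

asyncio.run(main())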
- Using OllamaLLM and OllamaEmbeddings:
from neo4j_graphrag.llm import OllamaLLM
from neo4j_graphrag.embeddings.ollama import OllamaEmbeddings
llm = OllamaLLM(model_name='llama3.2:3b')
emb = OllamaEmbeddings(model='snowflake-arctic-embed2')
...
I get the following error:
.venv\Lib\site-packages\pydantic\main.py", line 214, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for Neo4jNode
embedding_properties.embedding.0
  Input should be a valid number [type=float_type, input_value=[-0.021932466, 0.01252911...0.03033293, 0.014525634], input_type=list]
    For further information, visit https://errors.pydantic.dev/2.10/v/float_type
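From the message it looks like each element of the embedding property is itself a list rather than a float, i.e. the vector arrives wrapped in an extra list. As a quick, unverified workaround I tried wrapping the embedder so it flattens a single nested vector before the pipeline sees it. The FlatOllamaEmbeddings name is mine, and I'm assuming the Embedder base class with an embed_query method is the interface the pipeline calls:

from typing import Any
from neo4j_graphrag.embeddings.base import Embedder
from neo4j_graphrag.embeddings.ollama import OllamaEmbeddings

class FlatOllamaEmbeddings(Embedder):
    """Hypothetical wrapper that unwraps a [[...floats...]] result into [...floats...]."""

    def __init__(self, model: str):
        self._inner = OllamaEmbeddings(model=model)

    def embed_query(self, text: str, **kwargs: Any) -> list[float]:
        vector = self._inner.embed_query(text)
        # If Ollama hands back a nested list, keep only the inner vector
        if vector and isinstance(vector[0], list):
            vector = vector[0]
        return [float(x) for x in vector]

emb = FlatOllamaEmbeddings(model='snowflake-arctic-embed2')

I don't know whether this is the intended fix or just papering over a version mismatch, so any guidance here is welcome.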
- Using OpenAILLM and OpenAIEmbeddings:
from neo4j_graphrag.llm import OpenAILLM
from neo4j_graphrag.embeddings.openai import OpenAIEmbeddings
llm = OpenAILLM(
    model_name='gpt-4o-mini',
    model_params={
        'response-format': {'type': 'json_object'},
        'temperature': 0
    }
)
emb = OpenAIEmbeddings()
...
I get the following error:
.venv\Lib\site-packages\neo4j_graphrag\llm\openai_llm.py", line 138, in ainvoke
    response = await self.async_client.chat.completions.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\bduran\Documents\Coding\KnowledgeGraphs\Neo4j\graphRAG\.venv\Lib\site-packages\openai\_utils\_utils.py", line 279, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
TypeError: AsyncCompletions.create() got an unexpected keyword argument 'response-format'
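For this second error I suspect the hyphen in the parameter name is the culprit: the OpenAI client expects response_format with an underscore, so I assume the call should read:

llm = OpenAILLM(
    model_name='gpt-4o-mini',
    model_params={
        'response_format': {'type': 'json_object'},  # underscore, not hyphen
        'temperature': 0
    }
)

That only addresses the OpenAI path, though, and my main goal is the fully local setup.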
I'd appreciate any direction on how to implement a working GraphRAG script, especially when using OllamaLLM and OllamaEmbeddings.
Thanks in advance,
Boris