Hangs with remote Ollama #952

Open
@axibo-reiner

Description
Describe the bug
When running against a remote Ollama server instead of a local one, smart_scraper_graph.run() hangs and never returns (also left running overnight, just in case).

  • resource usage on the remote machine goes up, so the requests are reaching it
  • same model on both the remote and local machines
  • tested with all v1.4x versions
  • tested different temperatures
  • checked with and without the base_url parameter when running locally (both work fine)
  • tested with and without headless
  • ollama v0.6.1 (both machines)
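To help isolate where the hang occurs, a minimal sketch (not part of ScrapeGraphAI; the base_url and model are taken from the config below) that sends one request straight to the remote Ollama /api/generate endpoint with a hard timeout. If this call also stalls, the problem is in Ollama or the network rather than in the graph pipeline:

```python
import json
import urllib.request


def build_generate_payload(model: str, prompt: str) -> dict:
    # Mirror the options used in the graph config: JSON output, no streaming.
    return {"model": model, "prompt": prompt, "format": "json", "stream": False}


def check_ollama(base_url: str, model: str = "llama3.2", timeout: float = 60.0) -> str:
    payload = json.dumps(
        build_generate_payload(model, 'Reply with {"ok": true}')
    ).encode()
    req = urllib.request.Request(
        base_url.rstrip("/") + "/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # A timeout here distinguishes "server never answers" from "graph never returns".
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(check_ollama("http://192.168.1.1:11434"))
```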

To Reproduce

import json

from scrapegraphai.graphs import SmartScraperGraph

graph_config = {
    "llm": {
        "model": "ollama/llama3.2",
        "max_tokens": 8192,
        "base_url": "http://192.168.1.1:11434",  # remote Ollama server
        "format": "json",
    },
    "verbose": True,
    "headless": False,
}

smart_scraper_graph = SmartScraperGraph(
    prompt="I want a list of all the links of the issues on the page",
    source="https://github.com/ScrapeGraphAI/Scrapegraph-ai/issues",
    config=graph_config,
)

result = smart_scraper_graph.run()  # hangs here when base_url points at the remote machine
print(json.dumps(result, indent=4))

Expected behavior
smart_scraper_graph.run() should return the scraped result (the list of issue links), just as it does when pointed at a local Ollama server.
