
Documentation bug? Inference server on port 9001 #1075


Open
jonashaag opened this issue Mar 13, 2025 · 1 comment


jonashaag commented Mar 13, 2025

The docs say that a notebook is exposed on port 9002, but for me nothing is running on port 9002 (connection refused). Port 9001 shows the intro page instead.

inference server start --dev
No GPU detected. Using a CPU image.
Pulling image: roboflow/roboflow-inference-server-cpu:latest
...
Image roboflow/roboflow-inference-server-cpu:latest pulled.
Starting inference server container...

❯ docker ps
CONTAINER ID   IMAGE                                           COMMAND                  CREATED         STATUS                PORTS                                                           NAMES
7b7b4fde8446   roboflow/roboflow-inference-server-cpu:latest   "/bin/sh -c 'uvicorn…"   2 minutes ago   Up 2 minutes          0.0.0.0:9001-9002->9001-9002/tcp, :::9001-9002->9001-9002/tcp   elegant_murdock
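Not part of the original report, but a small stdlib-only sketch for checking which of the two published ports actually accepts TCP connections (Docker maps 9001-9002 above, yet only 9001 responds in this report); host and port values are just the ones from the report:

```python
import socket


def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unreachable hosts.
        return False


if __name__ == "__main__":
    # Probe the two ports the container publishes.
    for port in (9001, 9002):
        state = "open" if port_open("localhost", port) else "refused/closed"
        print(f"localhost:{port} -> {state}")
```

On the reporter's machine this would show 9001 open and 9002 refused, which confirms the mismatch with the documentation rather than a local firewall issue.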
jonashaag (Author) commented

Another bug?

https://deploy-quickstart.roboflow.com/results.html says to use inference.load_roboflow_model, but that function does not seem to exist.
