OpenAI/ChatGPT OAuth is not a viable NemoClaw provider path without manual host-side OpenClaw gateway bridging #1793

@coco779

Description

Summary

On nemoclaw v0.0.13, I could not complete a clean, supported setup that uses OpenAI/ChatGPT OAuth as NemoClaw's inference backend, even though the overall NemoClaw/OpenClaw stack strongly suggests this should be possible in principle.

What ultimately worked was a manual workaround:

  1. install the host openclaw CLI separately
  2. complete OpenAI OAuth there
  3. expose host OpenClaw as an OpenAI-compatible gateway
  4. point OpenShell/NemoClaw's compatible-endpoint provider at that host gateway

So the containerized NemoClaw/OpenShell/OpenClaw runtime can work with OpenAI OAuth indirectly, but there does not appear to be a first-class NemoClaw path for it.
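To make the shape of the bridge concrete, here is a minimal sketch of the kind of OpenAI-compatible request the OpenShell compatible-endpoint provider ends up sending to the host gateway. The endpoint, port, token value, and model name below are assumptions taken from the workaround in this report, not documented defaults.

```python
# Sketch only: illustrates the OpenAI-compatible request the bridge carries.
# GATEWAY_BASE and GATEWAY_TOKEN are hypothetical values from the workaround.
import json
import urllib.request

GATEWAY_BASE = "http://host.docker.internal:18889/v1"  # host OpenClaw gateway
GATEWAY_TOKEN = "example-token"                        # hypothetical token auth value


def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build a standard OpenAI-compatible /chat/completions request
    aimed at the host gateway (not sent here)."""
    body = json.dumps({
        "model": "gpt-4o",  # hypothetical model id; the gateway maps it to OAuth-backed inference
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{GATEWAY_BASE}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {GATEWAY_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_chat_request("hello")
```

Because the gateway speaks the standard chat-completions wire format, any OpenAI-compatible client in the sandbox can use it; the OAuth state lives entirely on the host side.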

Environment

  • OS: Linux Mint 22.3
  • nemoclaw v0.0.13
  • openshell 0.0.26
  • Host openclaw 2026.4.10
  • Docker installed and working
  • No GPU in this VM

What I expected

Because NemoClaw bundles OpenClaw/OpenShell and advertises OpenClaw-based secure sandboxing, I expected one of these to work cleanly:

  • a native NemoClaw/OpenShell provider option for OpenClaw/OpenAI OAuth
  • or a documented way to import/use existing OpenClaw auth profiles
  • or a supported way to route NemoClaw inference through an OpenClaw gateway without custom patching and manual bridge setup

What actually happened

1. NemoClaw onboarding does not offer a clean OpenAI OAuth path

During nemoclaw onboard, the relevant provider choices are API-key-oriented (OpenAI, Other OpenAI-compatible endpoint, etc.).

None of these appears to support OpenAI/ChatGPT OAuth directly.

2. Host OpenClaw OAuth worked only after local patching

I had to install the host openclaw CLI separately and use its OAuth flow.

That initially failed because the shipped OpenClaw OAuth flow requested invalid scopes; I filed that scope bug upstream against OpenClaw.

After patching the installed OpenClaw CLI locally, OAuth succeeded.

3. NemoClaw/OpenShell could only use OAuth indirectly via a manual bridge

To make NemoClaw actually work with that OAuth-backed model path, I had to:

  • run the host openclaw gateway
  • configure it with token auth and a non-loopback bind
  • disable its control UI for bridge use
  • point the OpenShell compatible-endpoint provider at http://host.docker.internal:18889/v1
  • raise the OpenShell inference timeout from 60s to 180s
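Putting those settings in one place, the bridge configuration looks roughly like this. This is an illustrative sketch only: the key names and file layout are hypothetical and do not reflect an actual OpenClaw/OpenShell config schema; the values come from the steps above.

```toml
# Hypothetical config sketch -- key names are illustrative, not a real schema.

# Host-side OpenClaw gateway (the bridge)
[gateway]
bind = "0.0.0.0:18889"      # non-loopback so the container can reach it
auth = "token"
token = "example-token"      # hypothetical token auth value
control_ui = false           # disabled for bridge use

# OpenShell provider inside the NemoClaw sandbox
[provider.compatible-endpoint]
base_url = "http://host.docker.internal:18889/v1"
timeout_seconds = 180        # raised from the 60s default
```

The non-loopback bind is what makes the gateway reachable from inside Docker via host.docker.internal; the token auth compensates for no longer binding to 127.0.0.1 only.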

Only after that did the sandboxed NemoClaw path succeed via:

  • host gateway: http://127.0.0.1:18889/v1/chat/completions
  • sandbox route: https://inference.local/v1/chat/completions
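Once both routes are up, responses coming back through the bridge follow the standard OpenAI chat-completions schema, so a quick sanity check can be as simple as pulling the assistant message out of the JSON body. The helper name and the canned sample below are illustrative, not part of any NemoClaw/OpenClaw API.

```python
# Sanity-check helper for responses returned through the bridge.
# Assumes the standard OpenAI-compatible /v1/chat/completions response shape.
import json


def extract_reply(raw: bytes) -> str:
    """Pull the assistant message content out of a chat-completions response body."""
    data = json.loads(raw)
    return data["choices"][0]["message"]["content"]


# Canned example of the standard response shape:
sample = json.dumps({
    "choices": [{"message": {"role": "assistant", "content": "pong"}}]
}).encode("utf-8")
print(extract_reply(sample))  # pong
```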

Why I think this is a NemoClaw issue too

The OAuth scope bug is an upstream OpenClaw issue, but NemoClaw still has a product/integration gap here:

  • no first-class OpenAI OAuth onboarding path
  • no documented host OpenClaw gateway bridge workflow
  • no import/reuse of host OpenClaw auth state
  • the default OpenShell inference timeout (60s) was too short for this routed OAuth-backed path

So today, a user trying to use ChatGPT/OpenAI OAuth with NemoClaw ends up in an awkward place where:

  • OpenClaw can authenticate
  • NemoClaw can sandbox
  • but the official path between those two is missing or undocumented

Reproduction (high level)

  1. Install Docker and nemoclaw
  2. Run nemoclaw onboard
  3. Try to choose a provider path that uses OpenAI/ChatGPT OAuth
  4. Observe there is no clean first-class option
  5. Install host openclaw separately and complete OAuth there
  6. Try to make NemoClaw use that auth state directly
  7. Observe that this does not work without a manual OpenAI-compatible bridge/gateway setup

Workaround that eventually worked

  • install the host openclaw CLI
  • patch the host OpenClaw OAuth scopes locally so login succeeds
  • run host OpenClaw as an OpenAI-compatible gateway
  • point the NemoClaw/OpenShell compatible-endpoint provider at that gateway
  • increase the OpenShell inference timeout to 180s

Request

Please consider one or more of these improvements:

  1. Add a first-class NemoClaw/OpenShell path for OpenClaw/OpenAI OAuth.
  2. Document a supported bridge workflow using host OpenClaw gateway.
  3. Support importing or referencing host OpenClaw auth profiles from NemoClaw/OpenShell.
  4. Revisit default routed inference timeouts for slower OAuth-backed/gateway-backed providers.
  5. Clarify in docs which provider/auth combinations are officially supported today.

Notes

I can provide more exact local config details if helpful, but I wanted to keep this report focused on the user-visible integration gap rather than my local patches.

Metadata

Labels: enhancement: provider