
onnxruntime-genai does not support VitisAI (NPU) or RyzenAI in public builds despite RyzenAI 1.7.0 installation for RyzenAI RAG #333

@bmhydar

Description


The LLM examples provided for AMD's documented RyzenAI RAG workflow do not run as shipped, even with RyzenAI 1.7.0 installed.
https://github.com/amd/RyzenAI-SW/tree/main/LLM-examples/RAG-OGA
When running the RAG example exactly as documented (changing only the dataset path and model path, as instructed), LLM initialization fails. The AMD-distributed model configuration references the provider name `RyzenAI`, which produces the following error from onnxruntime-genai:

RuntimeError: Unknown provider name 'RyzenAI'
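
For reference, the provider name that triggers this error is declared inside the model folder's `genai_config.json`. Below is a small sketch for inspecting it; the file name and the `model.decoder.session_options.provider_options` nesting follow the usual onnxruntime-genai layout, the model path is a placeholder, and the exact keys in the AMD-distributed config may differ slightly:

```python
import json
import pathlib

# Placeholder path; substitute the AMD-distributed OGA model folder.
model_dir = pathlib.Path("path/to/amd-oga-model")
config = json.loads((model_dir / "genai_config.json").read_text())

# In typical onnxruntime-genai configs, execution providers are listed under
# model.decoder.session_options.provider_options as [{provider_name: options}, ...].
provider_options = (
    config.get("model", {})
          .get("decoder", {})
          .get("session_options", {})
          .get("provider_options", [])
)
print([name for entry in provider_options for name in entry])
# With the unmodified AMD config this prints something like: ['RyzenAI']
```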

After correcting the provider name in the model configuration to `VitisAI` (the name onnxruntime-genai expects), initialization fails again with:

RuntimeError: VitisAI execution provider is not supported in this build
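
A minimal reproduction sketch of the initialization step, assuming the standard onnxruntime-genai Python API and a placeholder model path; both errors above surface from the same `og.Model()` call:

```python
import onnxruntime_genai as og

model_dir = "path/to/amd-oga-model"  # placeholder

try:
    # og.Model() parses genai_config.json and creates the underlying ORT session.
    model = og.Model(model_dir)
except RuntimeError as err:
    # With the unmodified config:  Unknown provider name 'RyzenAI'
    # After renaming to VitisAI:   VitisAI execution provider is not supported in this build
    print(f"LLM initialization failed: {err}")
```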

This indicates that the public onnxruntime-genai packages do not include NPU (VitisAI) support, even though:

- the RyzenAI 1.7.0 installation sets up the NPU drivers successfully, and
- AMD's RAG examples and downloadable model configs imply that NPU-backed LLM inference should work out of the box.

As a result, AMD's downloadable files and documented instructions do not work as-is: users must either modify the AMD-provided configuration files and fall back to CPU execution, or obtain a non-public build with VitisAI support to proceed.
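
With the public wheel, the only path forward here was CPU execution. A sketch of that fallback, assuming a recent onnxruntime-genai release that exposes `og.Config` with `clear_providers()` (otherwise the same effect requires emptying `provider_options` in `genai_config.json` by hand):

```python
import onnxruntime_genai as og

model_dir = "path/to/amd-oga-model"  # placeholder

# Drop all execution providers declared in genai_config.json so onnxruntime
# falls back to its default CPU execution provider.
config = og.Config(model_dir)
config.clear_providers()

model = og.Model(config)
tokenizer = og.Tokenizer(model)
print("Model loaded on CPU; NPU (VitisAI) is unavailable in this build.")
```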
