This repository was archived by the owner on Oct 15, 2025. It is now read-only.
9 changes: 9 additions & 0 deletions quickstart/README-minikube.md
@@ -31,6 +31,13 @@ You can use the installer script that installs all the required dependencies. Th
> ⚠️ You may need to visit Hugging Face [meta-llama/Llama-3.2-3B-Instruct](https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct) and
> accept the usage terms before you can pull this model with your HF token, if you have not already done so.

### Required [Gateway API Inference Extension (GIE)]

```shell
export GIE_VERSION=v0.3.0
kubectl apply -f https://github.com/kubernetes-sigs/gateway-api-inference-extension/releases/download/$GIE_VERSION/manifests.yaml
```
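The `kubectl apply` line above builds the manifest URL from the pinned release tag, so upgrading GIE later is a matter of changing `GIE_VERSION` and re-applying. A minimal sketch of that expansion, plus an optional post-install check (the CRD group name is an assumption taken from the upstream project, so verify it against your release):

```shell
# Sketch: the release tag selects which manifests.yaml asset is applied.
GIE_VERSION=v0.3.0
MANIFEST_URL="https://github.com/kubernetes-sigs/gateway-api-inference-extension/releases/download/${GIE_VERSION}/manifests.yaml"
echo "$MANIFEST_URL"
# → https://github.com/kubernetes-sigs/gateway-api-inference-extension/releases/download/v0.3.0/manifests.yaml

# Optional sanity check once applied (CRD names assumed, not verified):
# kubectl get crd | grep inference
```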

### Target Platform

#### MiniKube
@@ -300,3 +307,5 @@ To delete the Minikube cluster, simply run:
```bash
minikube delete
```

[Gateway API Inference Extension (GIE)]: https://github.com/kubernetes-sigs/gateway-api-inference-extension
9 changes: 9 additions & 0 deletions quickstart/README.md
@@ -49,6 +49,13 @@ You can use the installer script that installs all the required dependencies.
> ⚠️ Your Hugging Face account must have access to the model you want to use. You may need to visit Hugging Face [meta-llama/Llama-3.2-3B-Instruct](https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct) and
> accept the usage terms if you have not already done so.

### Required [Gateway API Inference Extension (GIE)]

```shell
export GIE_VERSION=v0.3.0
kubectl apply -f https://github.com/kubernetes-sigs/gateway-api-inference-extension/releases/download/$GIE_VERSION/manifests.yaml
```

### Target Platforms

Since the llm-d-deployer is based on helm charts, llm-d can be deployed on a variety of Kubernetes platforms. As more platforms are supported, the installer will be updated to support them.
@@ -347,3 +354,5 @@ make a change, simply uninstall and then run the installer again with any change
```bash
./llmd-installer.sh --uninstall
```

[Gateway API Inference Extension (GIE)]: https://github.com/kubernetes-sigs/gateway-api-inference-extension