Commit 40fa235

Fix LLAVA example on CPU (#11271)
* update * update * update * update
1 parent: ca0e69c

File tree: 2 files changed, +6 −8 lines

Diff for: python/llm/example/CPU/PyTorch-Models/Model/llava/README.md (+4 −7)

````diff
@@ -20,13 +20,11 @@ conda activate llm
 # install the latest ipex-llm nightly build with 'all' option
 pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
-pip install einops # install dependencies required by llava
-pip install transformers==4.36.2
-
 git clone https://github.com/haotian-liu/LLaVA.git # clone the llava libary
-cp generate.py ./LLaVA/ # copy our example to the LLaVA folder
 cd LLaVA # change the working directory to the LLaVA folder
 git checkout tags/v1.2.0 -b 1.2.0 # Get the branch which is compatible with transformers 4.36
+pip install -e . # Install llava
+cd ..
 ```

 On Windows:
@@ -36,13 +34,12 @@ conda create -n llm python=3.11
 conda activate llm

 pip install --pre --upgrade ipex-llm[all]
-pip install einops
-pip install transformers==4.36.2

 git clone https://github.com/haotian-liu/LLaVA.git
-copy generate.py .\LLaVA\
 cd LLaVA
 git checkout tags/v1.2.0 -b 1.2.0
+pip install -e .
+cd ..
 ```

 ### 2. Run
````

Diff for: python/llm/example/CPU/PyTorch-Models/Model/llava/generate.py (+2 −1)

```diff
@@ -291,7 +291,8 @@ def get_stopping_criteria(conv, tokenizer, input_ids):
 # Load model
 tokenizer, model, image_processor, _ = load_pretrained_model(model_path=model_path,
                                                              model_base=None,
-                                                             model_name=model_name)
+                                                             model_name=model_name,
+                                                             device_map=None)

 # With only one line to enable IPEX-LLM optimization on model
 model = optimize_model(model)
```
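
The `generate.py` change above hinges on one detail: LLaVA's `load_pretrained_model` defaults its `device_map` parameter to `"auto"`, which invokes Accelerate-style device dispatch that presumes GPU placement; passing `device_map=None` instead loads the weights as a plain CPU model, which IPEX-LLM's `optimize_model` can then handle. The snippet below is a minimal, self-contained sketch of that before/after behavior using a hypothetical stub — it is not LLaVA's actual loader, only an illustration of the default-argument pitfall the commit fixes:

```python
# Hypothetical stub mimicking the relevant part of LLaVA's
# load_pretrained_model signature, whose device_map defaults to "auto".
# On a CPU-only machine, automatic dispatch is the wrong strategy; the
# commit's fix is to pass device_map=None explicitly so the model stays
# on CPU for ipex_llm.optimize_model.
def load_pretrained_model_stub(model_path, model_base=None,
                               model_name=None, device_map="auto"):
    """Return the device-placement strategy the loader would use."""
    if device_map is None:
        return "cpu"           # whole model loaded on CPU (what the fix wants)
    return "auto-dispatch"     # accelerate-style placement, assumes GPUs

# Before the fix: device_map is silently left at its "auto" default.
print(load_pretrained_model_stub("some-model-path", model_name="llava"))

# After the fix: device_map=None forces plain CPU loading.
print(load_pretrained_model_stub("some-model-path", model_name="llava",
                                 device_map=None))
```

The same pattern applies to any loader that defaults `device_map="auto"`: on CPU-only hosts, override the default explicitly rather than relying on automatic dispatch.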

Comments: 0