4 files changed: +7 −4 lines changed
File tree: CPU/HF-Transformers-AutoModels/Model

File 1 of 4:
@@ -25,6 +25,7 @@ conda activate llm
 # install the latest ipex-llm nightly build with 'all' option
 pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
 pip install einops # additional package required for phi-1_5 to conduct generation
+pip install "transformers>=4.37.0,<4.42.0" # install right transformers version
 ```

 On Windows:

File 2 of 4:
@@ -25,7 +25,7 @@ conda activate llm
 # install the latest ipex-llm nightly build with 'all' option
 pip install --pre --upgrade ipex-llm[all] --extra-index-url https://download.pytorch.org/whl/cpu
 pip install einops # additional package required for phi-2 to conduct generation
-pip install transformers==4.37.0
+pip install "transformers>=4.37.0,<4.42.0" # install right transformers version
 ```
 On Windows:
@@ -35,7 +35,7 @@ conda activate llm

 pip install --pre --upgrade ipex-llm[all]
 pip install einops
-pip install transformers==4.37.0
+pip install "transformers>=4.37.0,<4.42.0" # install right transformers version
 ```

 ### 2. Run

File 3 of 4:
@@ -16,6 +16,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

 pip install einops # additional package required for phi-1_5 to conduct generation
+pip install "transformers>=4.37.0,<4.42.0" # install right transformers version
 ```

 #### 1.2 Installation on Windows
@@ -28,6 +29,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

 pip install einops # additional package required for phi-1_5 to conduct generation
+pip install "transformers>=4.37.0,<4.42.0" # install right transformers version
 ```

 ### 2. Configures OneAPI environment variables for Linux

File 4 of 4:
@@ -16,7 +16,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

 pip install einops # additional package required for phi-2 to conduct generation
-pip install transformers==4.37.0
+pip install "transformers>=4.37.0,<4.42.0" # install right transformers version
 ```

 #### 1.2 Installation on Windows
@@ -29,7 +29,7 @@ conda activate llm
 pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/

 pip install einops # additional package required for phi-2 to conduct generation
-pip install transformers==4.37.0
+pip install "transformers>=4.37.0,<4.42.0" # install right transformers version
 ```

 ### 2. Configures OneAPI environment variables for Linux
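
Usage note (a minimal sketch, not part of the changed files): every hunk above swaps the exact `transformers==4.37.0` pin for the range `>=4.37.0,<4.42.0`. The quotes in the new lines are needed because `>` and `<` are shell redirection operators; the snippet below shows the same install command and a quick check of which version pip resolved.

```bash
# Sketch only: install transformers within the pinned range and confirm
# the resolved version. Quoting the requirement string keeps the shell
# from interpreting '>' and '<' as redirection.
pip install "transformers>=4.37.0,<4.42.0"
python -c "import transformers; print(transformers.__version__)"
```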