
Commit d3472dd

jiapingW and canghua authored
Support using the rope_theta parameter (#270)

fix lint

Co-authored-by: canghua <[email protected]>
1 parent 7064fc6 commit d3472dd

File tree: 1 file changed, +3 −1 lines

specforge/modeling/draft/llama3_eagle.py

Lines changed: 3 additions & 1 deletion
@@ -478,7 +478,9 @@ def __init__(self, config):
     def _init_rope(self):
         if self.config.rope_scaling is None:
             self.rotary_emb = LlamaRotaryEmbedding(
-                self.head_dim, max_position_embeddings=self.max_position_embeddings
+                self.head_dim,
+                max_position_embeddings=self.max_position_embeddings,
+                base=getattr(self.config, "rope_theta", 10000),
             )
         else:
             scaling_type = self.config.rope_scaling["rope_type"]
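For context on what the forwarded value does: rope_theta is the base of the geometric frequency schedule in rotary position embeddings, and before this patch the draft model always fell back to the default of 10000 regardless of the target model's config. The sketch below is illustrative only, assuming the standard HF-style inverse-frequency computation; the helper name rope_inv_freq is made up here, and it is not the actual LlamaRotaryEmbedding source.

import torch

def rope_inv_freq(head_dim: int, base: float = 10000.0) -> torch.Tensor:
    # Each even channel pair rotates at a geometrically decaying frequency.
    # `base` corresponds to rope_theta: a larger base stretches the longest
    # wavelengths, which is how long-context models (e.g. Llama 3 configs
    # with rope_theta=500000) keep positions distinguishable over more tokens.
    return 1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim))

# The commit's getattr(config, "rope_theta", 10000) keeps 10000 only as a
# fallback when the config does not define rope_theta.
print(rope_inv_freq(8))                 # default base of 10000
print(rope_inv_freq(8, base=500000.0))  # lower frequencies at high indices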
