Conversation

@nitsanluke (Contributor)
✨ Description

Review cleanup.

```
class AprielSSMHybridConfig(MistralConfig):
    model_type = "apriel_ssm_thinker_hybrid"

    def __init__(self, hybrid_block_layout=["m2d"], ssm_cfg=None, **kwargs):
```
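One small cleanup worth flagging in this signature: `hybrid_block_layout=["m2d"]` is a mutable default argument, so every construction shares the same list object. A minimal sketch of the usual `None`-sentinel fix (the attribute assignments and the `super().__init__` call below are assumed for illustration, not taken from the PR):

```
from transformers import MistralConfig

class AprielSSMHybridConfig(MistralConfig):
    model_type = "apriel_ssm_thinker_hybrid"

    def __init__(self, hybrid_block_layout=None, ssm_cfg=None, **kwargs):
        # None sentinel: each instance gets its own fresh list instead of
        # sharing the single ["m2d"] default across all constructions.
        self.hybrid_block_layout = (
            hybrid_block_layout if hybrid_block_layout is not None else ["m2d"]
        )
        self.ssm_cfg = ssm_cfg
        super().__init__(**kwargs)
```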
@nitsanluke (Contributor, Author) commented on Oct 16, 2025:

Should we clean up these configs? @oleksost

```
ssm_config_default = {
    # discrete mamba2
    "d_state": 64,
    "n_v_heads": 32,
    "n_qk_heads": 32,
    "expand": 1,
    "chunk_size": 128,
    "activation": "identity",
    "bias": False,
    "d_conv": 4,
    "d_inner": 32 * 128,
    # mamba2
    "d_xb": None,  # will be set to model dim
    "dt_rank": "auto",
    "dt_min": 0.001,
    "dt_max": 0.1,
    "dt_init": "random",
    "dt_scale": 1.0,
    "dt_init_floor": 1e-4,
    "conv_bias": True,
```
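If the cleanup question is how to handle this free-floating default dict, one option (a sketch under that assumption; `resolve_ssm_cfg` is a hypothetical helper, not part of the PR) is to deep-copy the defaults and let a caller-supplied `ssm_cfg` override only the keys it names:

```
import copy

def resolve_ssm_cfg(ssm_cfg=None):
    # Deep-copy so instances never share nested state, then apply overrides;
    # passing None (or {}) keeps every default untouched.
    cfg = copy.deepcopy(ssm_config_default)
    cfg.update(ssm_cfg or {})
    return cfg

# Example: override just the dt range, keep all other defaults.
cfg = resolve_ssm_cfg({"dt_min": 0.01, "dt_max": 0.2})
```

This keeps `AprielSSMHybridConfig.__init__` free of mutable defaults while still letting callers specify partial SSM configs.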
