Hyperparameter tuning #67

Open
@ablaom

Description

Looking a bit into MLJ integration. For better or worse, hyper-parameter optimization (e.g., grid search) in MLJ generally works by mutating the field values of the model struct. I wonder if TableTransforms.jl would consider changing their transformer types to mutable structs? I think in ML applications, at least, any loss in performance would be pretty minimal, but perhaps there are wider use-cases to consider?
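For context, here is a minimal sketch of the distinction, using a hypothetical `Scale`-like transformer (not the actual TableTransforms.jl definition): with an immutable struct a tuner must build a fresh instance per candidate value, whereas a mutable struct lets it update fields in place, which is how MLJ-style tuning works.

```julia
# Immutable (current TableTransforms.jl style): a new instance is
# required for every candidate hyper-parameter value.
struct ImmutableScale
    low::Float64
    high::Float64
end

# Mutable: a tuning strategy can mutate fields between evaluations.
mutable struct MutableScale
    low::Float64
    high::Float64
end

t = MutableScale(0.0, 1.0)
t.high = 5.0   # the kind of in-place update MLJ tuning performs
```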

The alternative for our use case is for the MLJ model wrapper to be mutable (for now a wrapper is necessary anyway) and that a user wanting to do a search does something like

```julia
values = [Scale(low=0, high=x) for x in 1.0:0.1:10]   # <-- extra step
values = range(wrapped_transformer, :model, values=values)
```

However, while this might be fine for grid search, it doesn't really work for other optimization strategies.

Thoughts?
