Now that we have https://github.com/FluxML/Flux.jl/pull/1849#event-5967026617, there are some things we could do:

- [ ] Pass the `rng` argument of the `Short` builder to the `Dropout` layer at https://github.com/FluxML/MLJFlux.jl/blob/b5a9c419bb3af4aae18fdbd3fb69b03497b6ea9e/src/builders.jl#L61
- [ ] Use `Short` instead of the test-local builder `Short2` in tests, and remove `Short2`: https://github.com/FluxML/MLJFlux.jl/blob/b5a9c419bb3af4aae18fdbd3fb69b03497b6ea9e/test/regressor.jl#L6
- [ ] Update this section of the README.md: https://github.com/FluxML/MLJFlux.jl#random-number-generators-and-reproducibility. We should add a sentence explaining the limitations of `Dropout` RNGs for GPU training (and copy @darsnack in the PR to check we have this correct).
- [ ] In the README.md example at https://github.com/FluxML/MLJFlux.jl#random-number-generators-and-reproducibility, include a `Dropout` layer with an `rng`.
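For the first item, a minimal sketch of what the change at `src/builders.jl` might look like, assuming Flux's `Dropout` accepts an `rng` keyword after the linked PR (the exact `Short` field names and `build` signature here are illustrative, not copied from the source):

```julia
using Flux, Random

# Hypothetical sketch: thread the builder's rng through to Dropout,
# not just through the Dense initializers.
function build(builder, rng::AbstractRNG, n_in::Int, n_out::Int)
    init = Flux.glorot_uniform(rng)
    return Chain(
        Dense(n_in, builder.n_hidden, builder.σ; init=init),
        Dropout(builder.dropout; rng=rng),  # <- new: pass rng to Dropout
        Dense(builder.n_hidden, n_out; init=init))
end
```

With this, two builds from identically seeded RNGs should produce identical dropout masks during training, which is the reproducibility property the README section is about. Note the GPU caveat from the third item: a CPU RNG passed to `Dropout` may not be usable (or reproducible) when the model is moved to the GPU.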