Description
See JuliaLogging/TensorBoardLogger.jl#77
In one form or another, it would be great to have hyperparameters visible in the TensorBoard output. See https://pytorch.org/docs/stable/tensorboard.html#torch.utils.tensorboard.writer.SummaryWriter.add_hparams
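For reference, PyTorch's `add_hparams` takes a dictionary of hyperparameter names and values (plus the metrics to associate with them). A rough Julia analogue, assuming a hypothetical dict-taking `log_hparams` method as the underlying writer (not an existing TensorBoardLogger.jl function), might look like:

```julia
using TensorBoardLogger

lg = TBLogger("runs/nuts_experiment")

# Hypothetical dict-taking method in the spirit of PyTorch's `add_hparams`;
# purely illustrative, not an existing TensorBoardLogger.jl API.
log_hparams(lg, Dict("n_adapts" => 1000, "δ" => 0.65))
```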
Maybe add support so that the samplers themselves could do this? For example, have a dispatch that logs the hyperparameters for a given sampler:
```julia
log_hparams(lg::TBLogger, sampler) = nothing  # no default logging

function log_hparams(lg::TBLogger, sampler::NUTS, prefix = "")
    # Assumes an underlying method that writes a `Dict` of hyperparameters.
    log_hparams(lg, Dict("$(prefix)n_adapts" => sampler.n_adapts, "$(prefix)δ" => sampler.δ))
    return nothing
end
```

And until hparams are supported, maybe it could be done with custom scalars? https://philipvinc.github.io/TensorBoardLogger.jl/dev/explicit_interface/#Custom-Scalars-plugin Though I am not sure how that stuff works:
```julia
log_hparams(lg::TBLogger, sampler) = nothing  # no default logging

function log_hparams(lg::TBLogger, sampler::NUTS, prefix = "")
    log_custom_scalar(lg, Dict("$(prefix)n_adapts" => sampler.n_adapts, "$(prefix)δ" => sampler.δ))
    return nothing
end
```

I put in the `prefix` argument so that you could do things recursively with `Gibbs`, keeping the hyperparameters of each component sampler distinguishable, e.g. something like:
```julia
function log_hparams(lg::TBLogger, sampler::Gibbs, prefix = "")
    for (i, alg) in enumerate(sampler.algs)
        # Prefix each component so its hyperparameters stay distinguishable.
        log_hparams(lg, alg, "$(prefix)component_$(i)_")
    end
    return nothing
end
```

But with all that said, it is not immediately clear to me where this could hook into the current Turing callback structure. One possibility is to hack it by keeping a toggle for whether the hparams have been logged: make `TensorBoardCallback` mutable with a `has_logged_hparams` field, and then check the toggle in the callback (https://github.com/torfjelde/TuringCallbacks.jl/blob/master/src/callbacks/tensorboard.jl#L100)...
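Here is a minimal sketch of that toggle hack, assuming a stripped-down `TensorBoardCallback` and callback signature (both simplified; the actual struct and method in TuringCallbacks.jl carry more state and arguments):

```julia
using TensorBoardLogger

# Simplified stand-in for the real TensorBoardCallback; made mutable so the
# toggle can be flipped after the first call.
mutable struct TensorBoardCallback{L<:TBLogger}
    logger::L
    has_logged_hparams::Bool
end

TensorBoardCallback(logger::TBLogger) = TensorBoardCallback(logger, false)

function (cb::TensorBoardCallback)(rng, model, sampler, transition, iteration; kwargs...)
    if !cb.has_logged_hparams
        # `log_hparams` is the dispatch proposed above; log once, on the first call.
        log_hparams(cb.logger, sampler)
        cb.has_logged_hparams = true
    end
    # ... the usual per-iteration logging of `transition` would go here ...
    return nothing
end
```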