Question on the scale-stationarity assumption in MeanScaleUniformBins and possible ways to relax it #321
Unanswered
HarvestStars
asked this question in Q&A
Replies: 1 comment 1 reply
@HarvestStars The scaling here is only intended to normalize the values and make them amenable for downstream processing. There's no stationarity assumption here and non-stationary scales should not be a problem for the model. Unrelated: have you checked out Chronos-2? It natively supports univariate, multivariate and covariate-informed forecasting tasks.
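To make the point concrete, here is a minimal sketch of what mean-scale normalization does in isolation. This is not the actual `MeanScaleUniformBins` code; it assumes the scale is the mean absolute value of the context and that decoding multiplies by the same scale (the real implementation additionally handles padding and binning). The rescaling itself is lossless, which is why it carries no stationarity assumption on its own:

```python
import numpy as np

def mean_scale(context: np.ndarray, eps: float = 1e-8) -> float:
    # Assumption: scale = mean absolute value of the context, with a small
    # epsilon to avoid division by zero on all-zero contexts.
    return float(np.mean(np.abs(context))) + eps

context = np.array([1.0, -2.0, 3.0])
s = mean_scale(context)          # ≈ 2.0
normalized = context / s         # values handed to downstream binning
decoded = normalized * s         # decoding inverts the same scale

assert np.allclose(decoded, context)  # the round trip recovers the input
```

Any bias therefore comes not from the rescaling itself but from how the finite bin range interacts with a scale estimated on a different regime than the one being forecast.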
Hi Chronos Team,
I am planning to use Chronos/bolt as a codebase for my crypto trading research. While reviewing the tokenizer, I noticed an interesting detail in the scaling: the tokenizer computes a mean-scale from the historical context and reuses it for both the label window and the decoding of future predictions.
This seems to imply a (local) scale-stationarity assumption. For non-stationary series (e.g., financial data with volatility shifts), this may introduce systematic bias (under- or over-estimated amplitudes).
Example:
When a market transitions from a sideways consolidation phase into a strong uptrend, the mean-scale estimated from the calm, low-volatility period will underestimate the amplitude of the upcoming surge. As a result, the decoded forecast may appear “too flat” and fail to capture the true strength of the move.
This seems related to the discussion in issue #57.
I’d like to ask whether there are recommended ways (existing or planned) to relax it, such as sliding/learnable scaling, multi-scale tokens, or quantile normalization.
(Btw, I also noticed that issue #249 proposed a multi-scale encoder design, which seems relevant conceptually, though it doesn’t directly address the scale normalization assumption at the tokenizer level.)
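As one concrete illustration of the "sliding scaling" idea mentioned above, here is a hypothetical sketch (not an existing Chronos feature): a per-timestep scale computed over a trailing window, so the normalization tracks recent volatility instead of being fixed by the whole context:

```python
import numpy as np

def sliding_mean_scale(x: np.ndarray, window: int = 16, eps: float = 1e-8) -> np.ndarray:
    # Hypothetical helper: mean absolute value over a trailing window,
    # one scale per timestep rather than one scale for the whole context.
    scales = np.empty(len(x), dtype=float)
    for i in range(len(x)):
        lo = max(0, i - window + 1)
        scales[i] = np.mean(np.abs(x[lo:i + 1])) + eps
    return scales

# Calm phase followed by a surge: the scale adapts upward during the surge
# instead of staying at the calm-phase level.
x = np.concatenate([np.full(32, 1.0), np.linspace(1.0, 20.0, 32)])
s = sliding_mean_scale(x)
assert s[-1] > s[0]
```

A design like this trades the simplicity of a single invertible scale for regime-awareness; applying it inside the tokenizer would also require deciding which window's scale to use when decoding future predictions.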