Stoplight Elements uses JavaScript's float64 `Number` type to store and display all numbers coming from a JSON or YAML spec, regardless of their size.
Below is one example where I encountered precision loss caused by this bug/limitation. There are certainly other areas where this happens, but I see no reason to explore them, as the problem applies to every case where numbers are silently converted to float64 without checking that information is lost.
Context
This is a problem because it means that backends cannot have their large integers represented faithfully in Elements, even though the YAML describes them exactly.
Standard JSON not supporting these types is a separate topic, but YAML does support them, as it places no limit on the size of number types. Moreover, the Go language encodes these types correctly, so in my case the JSON payload produced by Go is lossless. Go can also decode JSON in this lossless manner, by calling json.Decoder.UseNumber before decoding.
Current Behavior
As an example, when a schema's maximum is set to the maximum value of uint64, Elements shows 18446744073709552000 instead of 18446744073709551615. The same goes for int64: a different number is shown because of the rounding inaccuracy of JavaScript's float64.
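The rounding can be reproduced directly in any JavaScript runtime (a minimal demonstration of the float64 behavior, not Elements' actual code path):

```javascript
// float64 has a 53-bit mantissa, so integers above 2^53 - 1
// (Number.MAX_SAFE_INTEGER) cannot all be represented exactly.
const maxUint64 = "18446744073709551615"; // 2^64 - 1

// Parsing into a Number silently rounds to the nearest float64:
console.log(Number(maxUint64).toString()); // "18446744073709552000"

// BigInt keeps the value exact:
console.log(BigInt(maxUint64).toString()); // "18446744073709551615"
```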
Expected Behavior
I expected correct behavior: represent the value as-is, throw an error, or stay silent. Anything but a silent modification of the given OpenAPI spec is acceptable behavior in my opinion.
Possible Workaround/Solution
As an example, there is an NPM package that solves this issue on the JavaScript/JSON side by keeping full precision for large numbers: https://www.npmjs.com/package/lossless-json
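For illustration, here is a self-contained sketch of the same idea without any package: quote over-long integer literals before JSON.parse sees them, then revive them as BigInt. The function name and regex are mine, and the regex is deliberately naive (it does not guard against digit runs inside strings); a real implementation such as lossless-json handles this robustly.

```javascript
// Parse JSON while preserving large integers as BigInt instead of
// letting JSON.parse round them to float64.
function parseWithBigInt(text) {
  // Tag bare integer literals of 16+ digits (a coarse heuristic for
  // "may exceed Number.MAX_SAFE_INTEGER") so they survive parsing as strings.
  const tagged = text.replace(
    /(?<=[:,\[\s])(-?\d{16,})(?=[,\}\]\s])/g,
    '"BIGINT:$1"'
  );
  // Revive the tagged strings back into exact BigInt values.
  return JSON.parse(tagged, (key, value) =>
    typeof value === 'string' && value.startsWith('BIGINT:')
      ? BigInt(value.slice(7))
      : value
  );
}
```

With this, `parseWithBigInt('{"maximum": 18446744073709551615}')` yields the exact value as a BigInt instead of a rounded Number.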
Two workarounds currently exist that I know of:
1. Completely avoid large number types on the backend: convert them to strings, and convert them back to big numbers on the frontend. This is an artificial limitation that is hard to justify, since most modern languages support these types natively.
2. Define these big numbers in the spec as strings. This preserves their integrity, but it is a type mismatch, so ambiguity creeps in.
I think this bug can be fixed by treating numbers losslessly. I also do not see the fix breaking any existing behavior, because it only concerns displaying a spec file as it really is, a read-only operation.
Steps to Reproduce
Here is a minimal spec that shows these two large types in valid YAML:
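A minimal spec along these lines (the schema and property names here are illustrative, not necessarily those from the original report) reproduces the problem when rendered in Elements:

```yaml
openapi: 3.0.3
info:
  title: Big number precision demo
  version: 1.0.0
paths: {}
components:
  schemas:
    BigNumbers:
      type: object
      properties:
        unsigned:
          type: integer
          format: uint64
          # maximum uint64; Elements renders 18446744073709552000
          maximum: 18446744073709551615
        signed:
          type: integer
          format: int64
          # maximum int64; also rounded by float64
          maximum: 9223372036854775807
```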
Environment
This issue is not environment dependent; it happens everywhere.