int64 and uint64 are not supported #2767

Open
NorbertHauriel opened this issue Feb 2, 2025 · 0 comments
NorbertHauriel commented Feb 2, 2025

Stoplight Elements uses JavaScript's 64-bit float (`Number`) to store and display every number coming from a JSON or YAML spec, regardless of its size.
Below is one example where I encountered precision loss caused by this bug/limitation. There are certainly other places where this happens, but I see no reason to enumerate them: the problem applies everywhere numbers are silently converted to float64 without verifying that no information is lost.
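A minimal Node.js demonstration of the silent rounding described above (the specific value shown is uint64 max, the same one used in the example further down):

```javascript
// JSON.parse stores every number as a 64-bit float, so uint64 max is rounded.
const raw = "18446744073709551615"; // 2^64 - 1, the maximum uint64 value

const asFloat = JSON.parse(raw);
console.log(String(asFloat)); // "18446744073709552000" — precision lost

// BigInt can hold the value exactly, but JSON.parse never produces one.
const asBigInt = BigInt(raw);
console.log(asBigInt.toString()); // "18446744073709551615" — exact
```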

Context

This is a problem because it means backends cannot represent larger integers in Elements, even though the YAML describes reality.
That standard JSON does not support these types is a separate topic; YAML does, as it places no limit on the size of its number types. Moreover, Go encodes these types correctly, so in my case the JSON payload produced by Go is lossless. Go can also decode JSON in this lossless manner, by calling json.Decoder.UseNumber before decoding.

Current Behavior

As an example, when the schema's `maximum` is set to the maximum value of uint64, Elements shows 18446744073709552000 instead of 18446744073709551615. The same goes for int64: it shows a different number because of the rounding inherent in JavaScript's float64.

Expected Behavior

I expected correct behavior: either represent the value as-is, throw an error, or stay silent. Anything but a silent modification of the given OpenAPI spec would be acceptable in my opinion.

Possible Workaround/Solution

As an example, there is an NPM package that solves this issue on the JavaScript/JSON side by preserving precision for large numbers:
https://www.npmjs.com/package/lossless-json
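The core idea behind such a fix can be sketched with nothing but the standard library: quote large bare integer literals before parsing, then revive them as BigInt. This is an illustrative simplification only (the hypothetical `parseWithBigInt` below is not part of any library, and its regex would misfire on large digit runs inside string values; lossless-json uses a real tokenizer instead):

```javascript
// Sketch: pre-quote long bare integers, then revive them as BigInt.
// Assumption: no 16+ digit runs appear inside string values.
function parseWithBigInt(json) {
  // Quote any bare integer of 16+ digits and tag it with a trailing "n".
  const quoted = json.replace(
    /(?<=[:\[,\s]|^)(-?\d{16,})(?=[,\}\]\s]|$)/g,
    '"$1n"'
  );
  // Revive tagged strings back into exact BigInt values.
  return JSON.parse(quoted, (key, value) =>
    typeof value === "string" && /^-?\d+n$/.test(value)
      ? BigInt(value.slice(0, -1))
      : value
  );
}

const parsed = parseWithBigInt('{"maximum": 18446744073709551615}');
console.log(parsed.maximum.toString()); // "18446744073709551615"
```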

I currently know of two workarounds:

  • Avoid large number types on the backend entirely: convert them to strings, then convert back to big numbers on the frontend. This is an artificial and unjustified limitation, since most modern languages support these types natively.
  • Define the big numbers in the spec as strings. This keeps their integrity, but it is a type mismatch, so ambiguity creeps in.
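The second workaround, sketched in a few lines (assuming the consumer knows which fields carry big numbers and converts them back explicitly):

```javascript
// Workaround 2: the spec declares the field as a string, so JSON.parse
// preserves it verbatim; the consumer restores the exact integer via BigInt.
const payload = JSON.parse('{"maximum": "18446744073709551615"}');
const exact = BigInt(payload.maximum);
console.log(exact === 18446744073709551615n); // true — no precision lost
```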

I think this bug can be fixed easily by treating numbers losslessly. And I do not see this change breaking any existing behavior, because it only concerns displaying a spec file faithfully, a read-only operation.

Steps to Reproduce

Here is a minimal spec that shows these two large types in a valid YAML:

requestBody:
    content:
        application/json:
            schema:
                properties:
                    LargeNumberUint64:
                        maximum: 18446744073709551615
                        minimum: 0
                        type: integer
                    LargeNumberInt64:
                        maximum: 9223372036854775807
                        minimum: -9223372036854775808
                        type: integer
                type: object

Environment

This issue is not environment dependent; it happens everywhere.
