Fallback reason missing for incompatible casts #1429

Open

andygrove opened this issue Feb 20, 2025 · 0 comments
Labels
bug Something isn't working
Milestone
0.7.0

Comments

@andygrove
Member

Describe the bug

I see a HashAggregate falling back to Spark, but the root cause is hidden:

HashAggregate [COMET: Unsupported result expressions found in: List((0.2 * cast((avg(UnscaledValue(l_quantity#130))#124 / 100.0) as decimal(15,6))) AS (0.2 * avg(l_quantity))#125, l_partkey#127L)]
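
For context, the unsupported result expression looks like the inner aggregate of a TPC-H q17-style query. A hypothetical Spark snippet of that shape (the lineitem table, its registration, and the decimal column type are assumptions, not taken from this issue) would be:

// Hypothetical reproduction sketch, assuming a TPC-H-style lineitem table with a
// decimal l_quantity column is already registered in the session catalog.
// Spark rewrites avg over a decimal column via UnscaledValue and a double division,
// then casts the result back to decimal, which is the cast flagged in the message below.
spark.sql("""
  SELECT l_partkey, 0.2 * avg(l_quantity)
  FROM lineitem
  GROUP BY l_partkey
""").show()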

After adding some debug logging, I discovered the root cause:

Comet does not guarantee correct results for cast from DoubleType to DecimalType(15,6) with timezone Some(America/Denver) and evalMode LEGACY (There can be rounding differences). To enable all incompatible casts, set spark.comet.cast.allowIncompatible=true

I should not need to add debug logging to discover this.
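
The message itself points at a workaround. A minimal sketch of applying it from a Spark session, assuming the rounding differences it warns about are acceptable for the workload:

// Opt in to casts that Comet marks as incompatible, using the config key quoted
// in the fallback message above. This accepts possible rounding differences in
// exchange for (presumably) letting the surrounding HashAggregate run natively.
spark.conf.set("spark.comet.cast.allowIncompatible", "true")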

Steps to reproduce

No response

Expected behavior

No response

Additional context

No response

@andygrove andygrove added the bug Something isn't working label Feb 20, 2025
@andygrove andygrove added this to the 0.7.0 milestone Feb 20, 2025