JDBC adapter query Postgres numeric field error: Cannot get simple type for type DECIMAL
Stack Trace
Invalid Input Error: arrow_scan: get_next failed(): java.lang.RuntimeException: Error occurred while getting next schema root.
at org.apache.arrow.adapter.jdbc.ArrowVectorIterator.next(ArrowVectorIterator.java:190)
at org.apache.arrow.adbc.driver.jdbc.JdbcArrowReader.loadNextBatch(JdbcArrowReader.java:87)
at org.apache.arrow.c.ArrayStreamExporter$ExportedArrayStreamPrivateData.getNext(ArrayStreamExporter.java:66)
Caused by: java.lang.RuntimeException: Error occurred while consuming data.
at org.apache.arrow.adapter.jdbc.ArrowVectorIterator.consumeData(ArrowVectorIterator.java:112)
at org.apache.arrow.adapter.jdbc.ArrowVectorIterator.load(ArrowVectorIterator.java:163)
at org.apache.arrow.adapter.jdbc.ArrowVectorIterator.next(ArrowVectorIterator.java:183)
... 2 more
Caused by: java.lang.UnsupportedOperationException: Cannot get simple type for type DECIMAL
at org.apache.arrow.vector.types.Types$MinorType.getType(Types.java:815)
at org.apache.arrow.adapter.jdbc.consumer.CompositeJdbcConsumer.consume(CompositeJdbcConsumer.java:49)
at org.apache.arrow.adapter.jdbc.ArrowVectorIterator.consumeData(ArrowVectorIterator.java:98)
... 4 more
How can we reproduce the bug?
A numeric column declared without precision and scale triggers the error:
create table xxx (
  numeric_a numeric
)
I also tried debugging the code and found that the Postgres JDBC driver's getBigDecimal returns a BigDecimal with precision 1. This causes the actual error: "BigDecimal precision cannot be greater than that in the Arrow vector"
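The driver behavior described above can be reproduced with plain java.math.BigDecimal, independent of the database: per the BigDecimal contract, a zero value has precision 1, while the Arrow vector built for an unconstrained NUMERIC column reports precision 0. A minimal sketch (the vectorPrecision value of 0 is taken from the debug comment in the snippet below):

```java
import java.math.BigDecimal;

public class PrecisionDemo {
    public static void main(String[] args) {
        // What the Postgres JDBC driver hands back for a NUMERIC value of 0:
        // by the BigDecimal spec, the precision of a zero value is 1.
        BigDecimal value = new BigDecimal("0");
        System.out.println(value.precision()); // 1
        System.out.println(value.scale());     // 0

        // The Arrow vector for an unconstrained NUMERIC reports precision 0,
        // so checkPrecisionAndScale's value.precision() > vectorPrecision
        // comparison fails for every value, including 0.
        int vectorPrecision = 0;
        System.out.println(value.precision() > vectorPrecision); // true
    }
}
```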
// org.apache.arrow.adapter.jdbc.consumer.DecimalConsumer.NullableDecimalConsumer.consume
public void consume(ResultSet resultSet) throws SQLException {
    // value's scale is 0 and precision is 1, this will cause the error
    BigDecimal value = resultSet.getBigDecimal(this.columnIndexInResultSet);
    if (!resultSet.wasNull()) {
        this.set(value);
    }
    ++this.currentIndex;
}
// org.apache.arrow.vector.util.DecimalUtility.checkPrecisionAndScale
public static boolean checkPrecisionAndScale(BigDecimal value, int vectorPrecision, int vectorScale) {
    if (value.scale() != vectorScale) {
        throw new UnsupportedOperationException(
            "BigDecimal scale must equal that in the Arrow vector: " + value.scale() + " != " + vectorScale);
    } else if (value.precision() > vectorPrecision) {
        // value precision is 1 and vector precision is 0
        throw new UnsupportedOperationException(
            "BigDecimal precision cannot be greater than that in the Arrow vector: " + value.precision() + " > " + vectorPrecision);
    } else {
        return true;
    }
}
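For the scale half of that check, a consumer could in principle coerce the incoming value to the vector's scale before writing it; widening the scale is always lossless. This is only an illustrative sketch of that idea (coerceToVectorScale is a hypothetical helper, not an Arrow API), and it does not help with the precision failure, since a vector precision of 0 rejects every value:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class CoerceDemo {
    // Hypothetical helper: rescale a BigDecimal to the scale the Arrow vector
    // expects. RoundingMode.UNNECESSARY makes setScale throw ArithmeticException
    // if the adjustment would lose digits, so only lossless widening succeeds.
    static BigDecimal coerceToVectorScale(BigDecimal value, int vectorScale) {
        return value.setScale(vectorScale, RoundingMode.UNNECESSARY);
    }

    public static void main(String[] args) {
        BigDecimal raw = new BigDecimal("12.5");      // scale 1
        BigDecimal coerced = coerceToVectorScale(raw, 3);
        System.out.println(coerced);         // 12.500
        System.out.println(coerced.scale()); // 3
    }
}
```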
Environment/Setup
No response
Hmm, Postgres NUMERIC fields without a fixed precision/scale can't actually be supported by Arrow because those are variable/unlimited precision and Arrow assumes a fixed precision per field.
For BigQuery, we need to read the type correctly.
Note that we have been considering a JNI bridge to use the native ADBC drivers for both these databases. That should be faster than the JDBC driver and should handle these cases better as the drivers have had more individual attention for each database's quirks (vs for JDBC which just tries to generically adapt the results from JDBC).
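Given that constraint, one workaround on the query side is to cast the column to an explicit precision and scale, e.g. select numeric_a::numeric(38, 10) from xxx, so the JDBC metadata reports a fixed type that Arrow can map to Decimal(38, 10). The 38/10 figures are an assumption chosen for illustration, not values from this issue; the client-side effect of such a cast can be simulated with BigDecimal:

```java
import java.math.BigDecimal;
import java.math.MathContext;
import java.math.RoundingMode;

public class CastDemo {
    public static void main(String[] args) {
        // Simulate what a numeric(38, 10) cast does to an unconstrained value:
        // round to at most 38 significant digits, then fix the scale at 10.
        BigDecimal unconstrained = new BigDecimal("123.456");
        BigDecimal constrained = unconstrained
                .round(new MathContext(38, RoundingMode.HALF_UP))
                .setScale(10, RoundingMode.HALF_UP);

        // The result now fits a fixed Decimal(38, 10) Arrow vector.
        System.out.println(constrained.precision() <= 38); // true
        System.out.println(constrained.scale());           // 10
    }
}
```

The same idea works in a view, so consumers do not have to repeat the cast in every query.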