How to use Spark with large decimal numbers?

My database has a numeric column whose values can be as large as a 256-bit unsigned integer. However, Spark's DecimalType is limited to a maximum precision of 38 digits, e.g. Decimal(38, 18).

When I try to do calculations on the column, an exception is thrown:

java.lang.IllegalArgumentException: requirement failed: Decimal precision 39 exceeds max precision 38

Is there a third-party library or workaround that solves this issue? Or is Spark simply designed for numbers no larger than Decimal(38, 18)?



Solution 1:[1]

If you must store the value as a numeric type, your options are:

  • cast(value as Double) — possible loss of precision (doubles carry only about 15–17 significant digits)
  • cast(value as Decimal(38, 18)) — possible overflow for values wider than 38 digits

If the column does not need to take part in numeric calculations, you can store it as a String, which preserves every digit.
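A quick sketch in plain Python (outside Spark) of why the options above behave differently: a 256-bit value survives a String round-trip exactly, while the double cast silently loses precision. The Spark-side casts would be expressed as `col("value").cast("double")` or `col("value").cast("string")`; the arithmetic below just illustrates the precision behavior, not Spark itself.

```python
from decimal import Decimal, getcontext

# A 256-bit maximum value: 78 decimal digits, far beyond
# Spark's 38-digit DecimalType precision limit.
big = 2**256 - 1

# Option "cast to Double": silent precision loss.
# A double keeps only ~15-17 significant digits, so the
# round-trip through float does not recover the original value.
assert int(float(big)) != big

# Option "transform to String": round-trips exactly,
# every one of the 78 digits is preserved.
assert int(str(big)) == big

# If you later need arithmetic outside Spark, Python's Decimal
# can be widened beyond the default 28-digit context.
getcontext().prec = 80
assert Decimal(str(big)) * 2 == Decimal(big * 2)
```

The String route trades away Spark-side numeric operations (aggregations, comparisons by magnitude) for exactness; any arithmetic then has to happen in application code, as sketched above.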

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Songv