fix: parse integers larger than int32 max in WHERE clauses #3575
Conversation
icehaunter left a comment
Thanks a lot for your contribution! I think this can be slightly simplified
case Integer.parse(value) do
  {int_value, ""} when is_pg_int8(int_value) ->
    {:ok, %Const{type: :int8, value: int_value, location: loc}}

  _ ->
    {:ok, %Const{type: :numeric, value: String.to_float(value), location: loc}}
end
There's no reason to add this case, because Float.parse/1 parses strings without a decimal point just fine.
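A quick illustrative check of that suggestion (not part of the review thread): Float.parse/1 does accept a digits-only string, returning a float:

iex> Float.parse("2793017076")
{2793017076.0, ""}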
@icehaunter Happy to make this change; quick question:
If we have an integer just above 2^53, such as 9007199254740993, libpg_query emits it as a float string, and Float.parse("9007199254740993") returns {9007199254740992.0, ""}. Wouldn't that be an issue, since we just lost precision by going past 2^53, the limit for exact integers in an Elixir float? The use case is someone using BIGINT with random IDs up to 10^18.
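For context, the precision concern is easy to reproduce in iex (an illustrative check using the values from this thread):

iex> Float.parse("9007199254740993") |> elem(0) |> trunc()  # 2^53 + 1 rounds to 2^53 as a float
9007199254740992
iex> Integer.parse("9007199254740993")                      # Elixir integers are arbitrary precision
{9007199254740993, ""}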
Codecov Report
✅ All modified and coverable lines are covered by tests.

@@ Coverage Diff @@
##             main    #3575      +/-   ##
==========================================
- Coverage   75.40%   75.32%   -0.08%
==========================================
  Files          51       51
  Lines        2744     2744
  Branches      408      405       -3
==========================================
- Hits         2069     2067       -2
- Misses        673      675       +2
  Partials        2        2
You should also run
✅ Deploy Preview for electric-next ready!
Force-pushed from 47c1ee9 to 90c6ac5
Fixes a bug where WHERE clauses with integer literals larger than
2,147,483,647 (int32 max) would fail with an error like:
** (ArgumentError) errors were found at the given arguments:
* 1st argument: not a textual representation of a float
Root cause:
- libpg_query/PgQuery uses int32 for integer constants in the AST
- Integers exceeding int32 max are stored as Float nodes with string values
(e.g., {:fval, "2793017076"}) - this is documented behavior
- The parser called String.to_float/1 on these strings, but Elixir's
String.to_float/1 requires a decimal point and fails on "2793017076"
(reproduced below)
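A minimal reproduction of the root cause in iex (illustrative, not part of the commit):

iex> String.to_float("2793017076")
** (ArgumentError) errors were found at the given arguments:

    * 1st argument: not a textual representation of a float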
Solution:
- Use Integer.parse/1 to detect pure integer strings in fval values
(see the sketch after this list)
- Parse them as int8 (bigint) instead of numeric
- This matches PostgreSQL's type inference for integer literals and
ensures correct type matching with bigint columns (no implicit cast
exists from numeric to int8)
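As a sketch of how the detection works (illustrative, not the literal diff): Integer.parse/1 returns an empty remainder only for a pure integer string, so genuine decimals still fall through to the numeric branch:

iex> Integer.parse("2793017076")  # pure integer string -> matches the {int_value, ""} clause
{2793017076, ""}
iex> Integer.parse("1.5")         # leftover ".5" -> falls through to the numeric branch
{1, ".5"}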
Affected queries (now working):
- WHERE id = 2793017076
- WHERE id IN (2147483648, 3500000000)
- WHERE id > 2147483647
References:
- libpg_query wiki: https://github.com/pganalyze/libpg_query/wiki/Differences-between-JSON-output-formats
("This leads to problems with overly long integers or floats")
Co-Authored-By: Claude <[email protected]>
Force-pushed from 90c6ac5 to ff6d63a