In JSON fields, very large numbers are converted to DOUBLE

We are running into a situation where we are ingesting large JSON flat files into SingleStore. The tables contain numeric fields with very large values (some 25+ digits). When querying the data, some of our QA staff noticed the numbers were slightly off. It appears that when querying a JSON field, numeric values, which may or may not contain a decimal point, are converted to the DOUBLE data type.
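A simplified version of what we are doing (table and key names are made up for illustration, and the value is just a representative 25-digit number):

```sql
CREATE TABLE staging_orders (
  payload JSON NOT NULL
);

-- The flat files contain very large integers (25+ digits)
INSERT INTO staging_orders VALUES
  ('{"order_ref": 1234567890123456789012345}');

-- Reading the numeric field back out of the JSON gives us a rounded
-- DOUBLE-style value rather than the original digits
SELECT payload::order_ref AS order_ref
FROM staging_orders;
```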

Is there any way to prevent the automatic interpretation of large numbers as the DOUBLE data type?

Can you provide a little more context here? When is this conversion happening (on load, or on query)?

If you truly don’t want to lose any precision, perhaps you can read the numbers as strings and then cast them to NUMERIC(x,y) at the SQL level in your queries, after extracting the value from the JSON as a string.
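A minimal sketch of that idea, assuming a JSON column named `payload` with a key `order_ref`, using JSON_EXTRACT_STRING plus a plain CAST (DECIMAL(38, 0) is just a placeholder for whatever precision and scale fit your data):

```sql
-- Pull the value out of the JSON as a string, then cast it to a wide
-- DECIMAL so it is never interpreted as a DOUBLE along the way
SELECT CAST(JSON_EXTRACT_STRING(payload, 'order_ref') AS DECIMAL(38, 0)) AS order_ref
FROM staging_orders;
```

Note that if the precision is already lost when the document is stored (i.e. on load rather than on query), extracting as a string won't recover it, which is why the load-versus-query question above matters.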