Return type for SELECT * ... INTO S3 .. query

A SELECT * … INTO S3 … SQL query returns an INT value, whose maximum supported value is 2147483647. The issue is that when I try to transfer more records than this, the query returns seemingly random values around 18446744071660463162. I'm not sure what is happening in the backend. Is it trying to cast the value to BIGINT? And what is this value?

Without seeing more of the query, I can't be sure what the issue is. But based on the high-level description, I'd recommend checking that all the expressions in your query's SELECT list are correct, and casting some of the expression results or intermediate results to BIGINT (e.g. `expr :> BIGINT`) if you want to consistently get BIGINT results.

It's possible that INT-valued expressions that overflow wrap around and give unexpected results, instead of being auto-cast to BIGINT.
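As a rough illustration of how a value like the one reported can arise (this is a back-of-the-envelope sketch, not a claim about the actual backend implementation; the "true count" below is hypothetical, back-calculated from the reported number):

```python
# Sketch: if a row count is accumulated in a signed 32-bit counter, a true
# count above 2**31 - 1 wraps to a negative number; sign-extending that to
# 64 bits and reading it as unsigned yields a huge value like the one seen.

def wrap_int32(n: int) -> int:
    """Interpret n modulo 2**32 as a signed 32-bit integer."""
    n %= 2**32
    return n - 2**32 if n >= 2**31 else n

def as_uint64(n: int) -> int:
    """Reinterpret a (possibly negative) integer as an unsigned 64-bit value."""
    return n % 2**64

# Hypothetical true row count just over the INT limit:
true_count = 2_245_878_842           # > 2147483647
wrapped = wrap_int32(true_count)     # negative after 32-bit wraparound
print(wrapped)                       # -2049088454
print(as_uint64(wrapped))            # 18446744071660463162
```

So a number in the 1.8e19 range is consistent with a negative 32-bit wraparound being reinterpreted as unsigned 64-bit, rather than a cast to BIGINT.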

I am using Python to run the following query:

```python
target_count = _connection.execute(
    f"""SELECT * FROM table_name
        INTO S3 '{file_path}'
        CONFIG '{config}'
        CREDENTIALS '{creds}'
        FORMAT PARQUET;"""
)
```

The issue is that I am getting a target_count around 18446744071660463162 when the source row count exceeds 2147483647 (the maximum INT value).
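One way to sanity-check the transfer in the meantime is to get the source count independently of the INTO S3 return value. A minimal sketch, assuming the asker's `_connection` object and SingleStore's `:>` cast operator (the helper function here is hypothetical):

```python
# Build a COUNT(*) query whose result is explicitly cast to BIGINT, so the
# count itself cannot be affected by an INT-typed return value.

def count_query(table_name: str) -> str:
    """Return a SQL string counting rows in table_name as BIGINT."""
    return f"SELECT COUNT(*) :> BIGINT FROM {table_name};"

# Usage (with the connection object from the question above):
# source_count = _connection.execute(count_query("table_name"))
```

Comparing that count against the number of rows actually landed in the Parquet files would tell you whether only the return value is wrong or the export itself is affected.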

What connector and version are you running? Different connectors may have that same .execute() method. If you can give us that, I'll pass it on to our connectors team.