Avoiding Memory Issues with SQL Alchemy and pandas read_sql


I have a query that fetches all records for a given day under certain conditions, and it runs inside a Python loop to pull data from my MySQL-compatible (MemSQL/SingleStore, I believe) database.

At some point during the loop, depending on the size of the query output, I receive the following error (some info masked):

pymysql.err.OperationalError: (1712, 'Leaf Error (ip_address:port): Query execution memory use (xxx MB) has reached the maximum memory for its resource pool `pool_name` (y%).')
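Since error 1712 is raised server-side when a single query exceeds the resource pool's memory limit, one common workaround is to split the day into smaller query windows and read each slice with `pandas.read_sql(..., chunksize=...)`, so neither the server nor the client materializes the whole result at once. The sketch below uses an in-memory SQLite database as a stand-in for the real MySQL/SingleStore instance; the table name, columns, and filter are hypothetical.

```python
import pandas as pd
import sqlalchemy as sa

# Hypothetical in-memory database standing in for the MySQL/SingleStore instance.
engine = sa.create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(sa.text("CREATE TABLE events (id INTEGER, day TEXT)"))
    conn.execute(sa.text(
        "INSERT INTO events VALUES (1, '2023-01-01'), (2, '2023-01-01'), (3, '2023-01-02')"
    ))

# chunksize makes read_sql return an iterator of DataFrames instead of
# loading the entire result set into memory at once.
with engine.connect() as conn:
    chunks = pd.read_sql(
        sa.text("SELECT * FROM events WHERE day = :d"),
        conn,
        params={"d": "2023-01-01"},
        chunksize=1,
    )
    total = sum(len(chunk) for chunk in chunks)

print(total)  # 2 rows match the '2023-01-01' window
```

For the real database, the same idea applies per loop iteration: narrow each query's WHERE clause (e.g. to an hour instead of a day) so the per-query memory stays under the pool limit, then process or concatenate the chunks as they arrive.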

Here’s some information about troubleshooting memory errors.

https://docs.singlestore.com/db/v7.8/en/reference/troubleshooting-reference/memory-errors.html


Thanks for the reply, I appreciate it. Mr. Hanson, could you point me to a more specific solution? The linked Memory Errors page is very broad.
Thanks