Is it possible to consume SingleStore in a streaming manner?

As the title describes, is it possible to consume SingleStore from a Spark Streaming or Flink job?
If the table is append-only, in theory we could repeatedly scan the table and convert it to a stream, but I think that would not be efficient…
Is there any way to do it?

Thanks

The best way to do this in SingleStore is to scan the table in “chunks”. Each chunk should represent a non-overlapping range of the table. Since your table is append-only, it’s likely that you have some way to “sort” the table into non-overlapping ranges. Scanning the table in small chunks like this is extremely efficient in SingleStore.
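As a rough illustration, here is a minimal keyset-pagination loop in Python. It assumes a hypothetical append-only table `events(id, payload)` with a monotonically increasing `id` column, and it uses `pymysql` since SingleStore speaks the MySQL wire protocol; the connection parameters and chunk size are placeholders to adapt to your setup.

```python
import time

import pymysql  # SingleStore is MySQL wire-protocol compatible

CHUNK_SIZE = 10_000  # rows per chunk; tune to your workload


def stream_chunks(conn, last_id=0, poll_interval=5.0):
    """Yield non-overlapping chunks of a hypothetical append-only
    events(id, payload) table, keyed on a monotonically increasing id."""
    while True:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT id, payload FROM events "
                "WHERE id > %s ORDER BY id LIMIT %s",
                (last_id, CHUNK_SIZE),
            )
            rows = cur.fetchall()
        if rows:
            yield rows
            last_id = rows[-1][0]  # high-water mark for the next chunk
        else:
            time.sleep(poll_interval)  # caught up; wait for new appends


conn = pymysql.connect(host="localhost", user="root", password="",
                       database="demo")
for chunk in stream_chunks(conn):
    # Hand each chunk to your Spark/Flink source here.
    print(f"read {len(chunk)} rows up to id {chunk[-1][0]}")
```

The key point is that each query asks only for rows above the last id you have already seen, so successive chunks never overlap and the scan never rereads old data.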

Furthermore, if you’re using a columnstore table that is sorted on the same column or columns you are chunking by, this process will be even more efficient: SingleStore will read the minimum amount of data each time you query for the next chunk.
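For example, with a hypothetical schema like the one below (assuming a recent SingleStore version, where `SORT KEY` declares the columnstore sort key), sorting the table on the same `id` you chunk by lets segment elimination skip everything below your high-water mark:

```python
import pymysql

conn = pymysql.connect(host="localhost", user="root", password="",
                       database="demo")
with conn.cursor() as cur:
    # Sorting the columnstore on the chunking column means each
    # "WHERE id > ?" chunk query touches only the newest segments
    # (segment elimination) instead of scanning the whole table.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS events ("
        "  id BIGINT NOT NULL,"
        "  payload JSON,"
        "  SORT KEY (id)"
        ")"
    )
```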

Hope that makes sense!

It made sense to me, thanks Carl.