Dynamic INSERT query in a Pipelined Procedure

Trying to figure this out without much success…

I’m trying to avoid creating nearly 100 pipelines, and instead build a pair of generic pipelines with stored procedures.

I am passing the data that needs loading as JSON row images, and I’m successfully parsing the JSON to get the keys (which equate to table column names); however, I’m struggling to find a way to generate a dynamic INSERT statement.

EXECUTE IMMEDIATE does not ‘see’ the QUERY parameter passed into the procedure.
TO_QUERY / COLLECT aren’t suitable, as they are designed for SELECT statements?

Looking for ideas on how I can achieve this…
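
For reference, this is the rough shape of what I’m attempting (table and column names here are placeholders; in reality the column list is built from the JSON keys):

```sql
CREATE OR REPLACE PROCEDURE load_generic(batch QUERY(row_data JSON NOT NULL))
AS
DECLARE
    v_sql TEXT;
BEGIN
    -- Build the INSERT from the JSON keys at runtime (key parsing omitted here).
    v_sql = CONCAT(
        'INSERT INTO target_table (col_a, col_b) ',
        'SELECT row_data::$col_a, row_data::$col_b FROM batch');
    -- This is where it falls over: the dynamic statement cannot reference
    -- the QUERY parameter "batch".
    EXECUTE IMMEDIATE v_sql;
END;
```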

Consider putting the JSON data into a staging table, then processing it with a regular SELECT against that table, and then cleaning it up from the staging table. The table could be a regular, temporary, or global temporary table. If it’s a regular temporary table, any work to process it would have to happen in the same session.
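
As a rough sketch of the procedure body (stg_rowdata and the column names are just for illustration):

```sql
-- 1. Stage the raw JSON rows; this is static SQL, so it can see the QUERY parameter.
INSERT INTO stg_rowdata (row_data)
SELECT row_data FROM batch WHERE row_data IS NOT NULL;

-- 2. Process the staged rows; this statement only references a real table,
--    so it could also be built dynamically and run with EXECUTE IMMEDIATE.
INSERT INTO target_table (col_a, col_b)
SELECT row_data::$col_a, row_data::$col_b FROM stg_rowdata;

-- 3. Clean up the staging table ready for the next batch.
DELETE FROM stg_rowdata;
```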

It was going so well when I tested the procedure… unfortunately:

Stored procedure containing ‘CREATE TEMPORARY TABLE tmp_rowdata AS SELECT DISTINCT row_data FROM batch WHERE row_data IS NOT NULL’ cannot be run inside a pipeline.

Thank you, Hanson, that worked well using a temporary table.


What did you do to get past the error?

Hi Hanson,

I’ve created a global temporary table (GTT); I’m putting the query data (JSON) into that table, then using EXECUTE IMMEDIATE to read from the GTT and insert into the target table.

This is actually really fast… I’m processing ~25k rows of data from a TSV into the target table via the GTT in less than a second.

This includes deleting rows from the target table that are flagged as deleted in the incoming query data.
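
For anyone else who ends up here, the working version looks roughly like the sketch below. The table, column, and flag names (target_table, id, is_deleted, etc.) are placeholders, and in the real procedure the column list for the dynamic INSERT is generated from the JSON keys:

```sql
-- Created once, outside the pipeline, which avoids the
-- "CREATE TEMPORARY TABLE ... cannot be run inside a pipeline" error.
CREATE GLOBAL TEMPORARY TABLE gtt_rowdata (row_data JSON NOT NULL);

CREATE OR REPLACE PROCEDURE load_generic(batch QUERY(row_data JSON NOT NULL))
AS
DECLARE
    v_sql TEXT;
BEGIN
    -- Static INSERT, so the QUERY parameter "batch" is visible here.
    INSERT INTO gtt_rowdata (row_data)
    SELECT row_data FROM batch WHERE row_data IS NOT NULL;

    -- Remove target rows that the incoming data flags as deleted
    -- (assumes an "id" key and an "is_deleted" flag in the JSON).
    v_sql = CONCAT(
        'DELETE FROM target_table WHERE id IN ',
        '(SELECT row_data::$id FROM gtt_rowdata WHERE row_data::$is_deleted = ''Y'')');
    EXECUTE IMMEDIATE v_sql;

    -- Dynamic INSERT generated from the JSON keys; it only references the GTT,
    -- so EXECUTE IMMEDIATE can run it.
    v_sql = CONCAT(
        'INSERT INTO target_table (col_a, col_b) ',
        'SELECT row_data::$col_a, row_data::$col_b ',
        'FROM gtt_rowdata WHERE row_data::$is_deleted <> ''Y''');
    EXECUTE IMMEDIATE v_sql;

    -- Clear the staged rows ready for the next batch.
    DELETE FROM gtt_rowdata;
END;
```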
