AVRO as input to a stored proc from a pipeline?

Is it possible to pass Avro messages from a Kafka pipeline to a stored procedure?
If so, do you have some examples of what the pipeline and the stored procedure would look like? For example, where does the Avro schema get placed?

Yes, pipelines into stored procedures are supported for all input formats (JSON, CSV, and Avro).

Input sub-values are converted into SQL values to build the QUERY-type argument passed to the destination stored procedure, so it looks like any other pipeline into a stored procedure. The Avro schema is only needed (and accepted) in the CREATE PIPELINE statement, since it tells MemSQL how to find the requested sub-values.
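
To answer the Kafka part of the question: raw Avro messages on a topic don't carry an embedded schema the way Avro container files do, so in that case the schema text goes in a SCHEMA clause at the end of the CREATE PIPELINE statement. Here's a sketch, where the broker/topic kafka-host/forum-topic is hypothetical and proc is the procedure defined below:

create pipeline p_kafka as
load data kafka 'kafka-host/forum-topic'
into procedure proc
format avro
(first_param <- id, second_param <- data)
schema '{ "type": "record", "name": "rec", "fields": [
  { "name": "id", "type": "int" },
  { "name": "data", "type": "string" } ] }';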

Here’s an example where the procedure just inserts the input into a table:

~: avrocat /tmp/forum_example/sanity.avro
{"id": 1, "data": "hello"}

MySQL [db]> create table t(a int, b text);
MySQL [db]> delimiter //
MySQL [db]> create procedure proc(q query(first_param int, second_param text)) as begin insert into t select * from q; end //
MySQL [db]> delimiter ;
MySQL [db]> create pipeline p as load data fs '/tmp/forum_example' into procedure proc format avro (first_param <- id, second_param <- data);
MySQL [db]> start pipeline p foreground;
MySQL [db]> select * from t;
+------+-------+
| a    | b     |
+------+-------+
|    1 | hello |
+------+-------+

Note that, in the CREATE PIPELINE statement, you name the fields of the procedure's QUERY-type argument as if they were columns in a proper destination table (e.g. 'first_param <- id').
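
Since the mapped sub-values arrive through an ordinary QUERY-type argument, the procedure body can also transform them rather than just inserting. A minimal sketch, where proc_transform and the upper-casing are made up for illustration:

delimiter //
create procedure proc_transform(q query(first_param int, second_param text)) as
begin
  -- upper-case the text sub-value before inserting; any select over q works here
  insert into t select first_param, upper(second_param) from q;
end //
delimiter ;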
