Error with a vector function: argument sizes do not match

Hi,
Recently updated to MemSQL 6.8.3, and I'm getting exceptions on a query & procedure (not consistently).
Before, I was using 6.7 and even 6.8.1 for a long period and no issue occurred.

Getting "Leaf Error (127.0.0.1:3307): Error with a vector function: argument sizes do not match"
on several developer machines, each running 1 master aggregator & 1 leaf.

I am 100% sure there are no mismatched sizes: every table stores only one vector length of 1024 bytes (256 floats), in a VARBINARY(2048) field (ready for 512 floats), together with a specific flag column indicating the type, used to filter the query.

What should I check/test to get to the bottom of it?

Regards.

Can you post a repro, with table creation, INSERT, and SELECT statements that make it happen? Please also describe your hardware including the processor type and version, like from

cat /proc/cpuinfo

Hi,
I'm not sure I am allowed to share the code / design of our logic (not so sophisticated, but still proprietary).
Is there an email / private message address I can send it to?

Double check that all the column types involved are exactly the same length. Consider using binary(n) even though varbinary(n) should work. If you think there’s a bug, forward it to info at memsql dot com and ask them to forward it to me, Eric Hanson.
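To illustrate the length arithmetic behind that advice: with the standard f32 element type, 256 floats pack to exactly 1024 bytes, while 512 floats fill the full VARBINARY(2048) column, and dot_product requires both arguments to have the same byte length. A rough Python sketch of that arithmetic (the 256/512-float and 1024/2048-byte sizes are taken from the original post; little-endian f32 packing is assumed):

```python
import struct

def pack_f32(values):
    # Pack floats the way json_array_pack does: 4-byte little-endian f32 each.
    return struct.pack("<%df" % len(values), *values)

v256 = pack_f32([0.0] * 256)  # the 1024-byte vectors described above
v512 = pack_f32([0.0] * 512)  # a vector filling the full VARBINARY(2048) column

print(len(v256))  # 1024 bytes
print(len(v512))  # 2048 bytes

# Mixing the two in one dot_product() call would fail:
# 1024 != 2048 bytes, i.e. 256 vs 512 elements -> "argument sizes do not match"
print(len(v256) == len(v512))  # False
```

If both vector sizes ever end up in the same column, a query that compares rows of different sizes will hit exactly this error, which is one reason to verify the stored byte lengths row by row.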


Hi, I have a similar issue when trying to calculate dot_product for vectors.

First I get an error ERROR 1940 ER_VECTOR_FUNCTION_ERROR: Error with a vector function: argument sizes do not match.

I couldn’t figure out how to check the dimension of the vectors in SingleStore, but they should all have the same dimension of around 300. When I run the length function on the vectors I get different values for each row, ranging around 8000.

Finally, I tested calculating a dot_product of a single vector with itself (i.e. a copy) and got this error:

ERROR 1940 ER_VECTOR_FUNCTION_ERROR: Error with a vector function: the size of the arguments is not a multiple of 4 byte

Any advice will be appreciated. Thanks!

@desislava.vasileva welcome aboard our forums!

With the vector functions, the vectors must have exactly the same number of elements of exactly the same type, and the vector function you use must be the one for that type.

We don’t have typed vectors. You have to use binary or blob types and make sure they have the right size and contents. The easiest way to ingest is to use json_array_pack and always use the standard vector element type, which is f32. E.g.

singlestore> set @v1 = json_array_pack('[1,2,3,4]');
Query OK, 0 rows affected (0.01 sec)

singlestore> set @v2 = json_array_pack('[1,1,0,0]');
Query OK, 0 rows affected (0.01 sec)

singlestore> set @v3 = json_array_pack('[0,1]');
Query OK, 0 rows affected (0.01 sec)

singlestore> select dot_product(@v1,@v2);
+----------------------+
| dot_product(@v1,@v2) |
+----------------------+
|                    3 |
+----------------------+
1 row in set (0.01 sec)

singlestore> select dot_product(@v1,@v3);
ERROR 1940 (HY000): Error with a vector function: argument sizes do not match

and

singlestore> set @v4 = json_array_pack_i8('[0,0,0,0,1]');
Query OK, 0 rows affected (0.01 sec)

singlestore> select dot_product(@v1, @v4);
ERROR 1940 (HY000): Error with a vector function: argument sizes do not match

singlestore> select length(@v1), length(@v4);
+-------------+-------------+
| length(@v1) | length(@v4) |
+-------------+-------------+
|          16 |           5 |
+-------------+-------------+
1 row in set (0.01 sec)

Note that length() is the length in bytes, not elements.
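The byte-vs-element distinction above can be sketched in Python, assuming json_array_pack stores little-endian f32 elements and json_array_pack_i8 stores signed single bytes (both assumptions mirror the lengths shown in the console output):

```python
import struct

# What json_array_pack('[1,2,3,4]') stores: four 4-byte little-endian floats.
v1 = struct.pack("<4f", 1, 2, 3, 4)
# What json_array_pack_i8('[0,0,0,0,1]') stores: five signed 1-byte ints.
v4 = struct.pack("<5b", 0, 0, 0, 0, 1)

print(len(v1), len(v4))  # 16 and 5 -- byte lengths, not element counts

# The plain dot_product reads blobs as f32, so v1 holds 4 elements:
a = struct.unpack("<4f", v1)
b = struct.unpack("<4f", struct.pack("<4f", 1, 1, 0, 0))
print(sum(x * y for x, y in zip(a, b)))  # 3.0, matching the SQL result above
```

So two vectors with the same number of elements but different element types (f32 vs. i8) still have different byte lengths, which is why the second dot_product above fails.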

ERROR 1940 ER_VECTOR_FUNCTION_ERROR: Error with a vector function: the size of the arguments is not a multiple of 4 byte

Are your vectors built from F32 or I32 elements, and are you calling dot_product() / dot_product_i32() accordingly? The 4-byte element types are F32 and I32.

How do you insert the (varbinary) data into the table?
With json_array_pack / unhex, or as raw binary written directly into the table?

Our standard vector functions like dot_product, with no type-name suffix, operate on f32 elements (4 bytes each). You have to use a type-name suffix (like _i8, etc.) to operate on data of other types and sizes. This is described here.

To get the data in, the easiest way is to use json_array_pack.

singlestore> create table t(a int, b longblob);
Query OK, 0 rows affected (0.03 sec)

singlestore> insert t values(1, json_array_pack('[0.2, 0.3, 0, 0.4]'));
Query OK, 1 row affected (0.22 sec)

singlestore> select 1, length(b) from t;
+---+-----------+
| 1 | length(b) |
+---+-----------+
| 1 |        16 |
+---+-----------+
1 row in set (0.02 sec)

singlestore> select a, json_array_unpack(b) from t;
+------+-----------------------------------------+
| a    | json_array_unpack(b)                    |
+------+-----------------------------------------+
|    1 | [0.200000003,0.300000012,0,0.400000006] |
+------+-----------------------------------------+

But you can use unhex also, like so:

singlestore> set @s = "000000000000000000000000000080BF";
Query OK, 0 rows affected (0.00 sec)

singlestore> insert t values(2,unhex(@s));
Query OK, 1 row affected (0.00 sec)

singlestore> select a, json_array_unpack(b) from t where a = 2;
+------+----------------------+
| a    | json_array_unpack(b) |
+------+----------------------+
|    2 | [0,0,0,-1]           |
+------+----------------------+

There may be other ways to insert binary data right from the client app too.
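For instance, a client app can build the same blob itself. A small Python sketch (assuming the standard little-endian f32 layout) that produces the [0, 0, 0, -1] vector shown above, either as a hex string for unhex() or as raw bytes for a parameterized INSERT:

```python
import struct

# Pack [0, 0, 0, -1] as four little-endian f32 values,
# the layout the plain dot_product() and json_array_unpack() expect.
blob = struct.pack("<4f", 0.0, 0.0, 0.0, -1.0)
hex_str = blob.hex().upper()

print(hex_str)    # 000000000000000000000000000080BF -- usable with unhex()
print(len(blob))  # 16 bytes = 4 f32 elements
```

Passing `blob` directly as a query parameter avoids the hex round-trip entirely; the unhex() form is mainly convenient for hand-written SQL.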