Problem creating a filesystem (FS) pipeline when the source directory is on a different server

Hi Team,

I have two servers, server A and server B. MemSQL is installed on server A and our data files live on server B. I am trying to create an FS pipeline to load the files from server B into MemSQL tables (on server A), but I am getting an error like "File or directory 'XYZ' does not exist".

Do I need some kind of trust/security certificate to establish communication between these two servers? The pipeline syntax I am trying is below. Please suggest how to set up this kind of pipeline.

CREATE PIPELINE pipeline_test
AS LOAD DATA FS '/xyz/*.txt'  -- path of the files on server B
BATCH_INTERVAL 1000
MAX_PARTITIONS_PER_BATCH 8
INTO PROCEDURE test_proc
FIELDS TERMINATED BY '|' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n' STARTING BY ''
IGNORE 1 LINES
(
col1,
col2,
col3
);

Be sure the directory you want to read from is mounted on the same server where the SingleStore server process (memsqld) is running.
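One common way to do that is an NFS mount; here is a minimal sketch, assuming server B can act as an NFS server (the hostnames are placeholders, and /xyz is the path from your pipeline):

# On server B: export /xyz read-only to server A (add to /etc/exports, then reload):
# /xyz serverA(ro,sync,no_subtree_check)

# On server A (the MemSQL host): mount the export at the path the pipeline reads
sudo mkdir -p /xyz
sudo mount -t nfs serverB:/xyz /xyz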

Then, make sure the target directory is readable by the user running the memsqld process.
That is usually the memsql user. You can verify which user that is with

ps aux | grep memsqld

Use chmod to change the directory permissions if needed.
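For example, assuming memsqld runs as the memsql user and the data lives under /xyz:

# Identify the user running memsqld (usually memsql)
ps aux | grep memsqld

# Give that user ownership of the directory tree
sudo chown -R memsql:memsql /xyz
# or, less invasively, grant read/execute to others
sudo chmod -R o+rX /xyz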

I’m not sure this is a full answer, but that’s where I’d start.

Thanks Hanson for your suggestion, but I have a different scenario: my data files exist on a different (remote) server. Is there any way to create a pipeline from a remote server (directory)?

No, but the standard way around that is to mount the remote directory locally on your cluster machines (as I said above).
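If NFS is not available, sshfs is another way to mount a remote directory over plain SSH; a sketch, with user and host as placeholders. Keep in mind that for a multi-node cluster the mounted path typically needs to be visible on every node that extracts files:

sudo mkdir -p /xyz
sudo sshfs -o ro,allow_other user@serverB:/xyz /xyz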

Thanks Hanson,
If our data lives in S3 on a remote server, does the same limitation apply (do we need to mount the remote directory on our local cluster machines) when creating the pipeline?
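For reference: S3 is a native pipeline source, so no mounting is needed there; the cluster pulls objects directly from the bucket using the credentials you supply. A minimal sketch, reusing the procedure from above (bucket name, region, and keys are placeholders):

CREATE PIPELINE pipeline_s3_test
AS LOAD DATA S3 'my-bucket/xyz/*.txt'
CONFIG '{"region": "us-east-1"}'
CREDENTIALS '{"aws_access_key_id": "YOUR_KEY", "aws_secret_access_key": "YOUR_SECRET"}'
INTO PROCEDURE test_proc
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;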