I'm using the COPY INTO statement below to unload a table to a file in S3:

COPY INTO @s3_stage FROM my_sf_table
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP EMPTY_FIELD_AS_NULL = FALSE NULL_IF = ('') FIELD_DELIMITER = ',')
SINGLE = FALSE
MAX_FILE_SIZE = 4900000000;

I was able to generate a 300 MB file and upload it to S3. However, the maximum file size I can upload is 100 MB, so I changed MAX_FILE_SIZE to 104857600 (100 MB):

COPY INTO @s3_stage FROM my_sf_table
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP EMPTY_FIELD_AS_NULL = FALSE NULL_IF = ('') FIELD_DELIMITER = ',')
SINGLE = FALSE
MAX_FILE_SIZE = 104857600;

My expectation: get three files of roughly 100 MB each and upload them to S3. But it's not working; I still get one 300 MB file. Does anyone know why?


Best Answer


Your expectation isn't correct here: you will not get three files of 100 MB each. The number of files you get is determined by the size of the warehouse you use to run the statement. However, that does not explain why you are still getting just one file. I believe this line in the documentation might be the issue, but I'm not sure. Please take a look and let me know; I can modify the answer if needed.
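As a side note, you can check how many files a given run actually produced by listing the stage afterwards. A minimal sketch, assuming the same @s3_stage from your statement:

-- List the files currently on the stage; the size column shows bytes per file
LIST @s3_stage;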

Boolean that specifies whether to generate a single file or multiple files. If FALSE, a filename prefix must be included in path.

This would indicate that you need a filename prefix as part of your path. You don't have one, so perhaps the command falls back to the SINGLE = TRUE behavior and creates only one file.
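If that reading is right, adding a filename prefix to the stage path should be enough. A minimal sketch of your second statement with a prefix added (unload_ is just an illustrative name, not something from your setup):

COPY INTO @s3_stage/unload_ FROM my_sf_table
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP EMPTY_FIELD_AS_NULL = FALSE NULL_IF = ('') FIELD_DELIMITER = ',')
SINGLE = FALSE
MAX_FILE_SIZE = 104857600;

Keep in mind that MAX_FILE_SIZE acts as a per-file size threshold rather than an exact target, so the individual output files may well come out smaller than 100 MB.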