Facing 16 MB File Size Issue in Snowflake?
I run into random problems every day in whatever technology I use. I believe every engineer faces something similar, do you agree?
I have been working on Snowflake for the last year and have faced my share of problems. Some were resolved outright, and for others I found workarounds, but one was the most challenging for me: the 16 MB file size limit hit while copying data into a raw table. When we use such a file in a COPY command to load the raw table, it complains that the 16 MB size limit has been exceeded.
The error looks like the following:
```
Error parsing JSON: document is too large, max size 16777216 bytes
  File 'data.gz', line 26, character 16777215
  Row 25 starts at line 25, column $1
If you would like to continue loading when an error is encountered, use other
values such as 'SKIP_FILE' or 'CONTINUE' for the ON_ERROR option.
For more information on loading options, please run 'info loading_data' in a SQL client.
```
The error indicates that you are hitting a hard limit of ~16 MB: Snowflake imposes a 16 MB size limit per row for VARIANT/JSON values. See the following link for more information: https://docs.snowflake.com/en/user-guide/data-load-considerations-prepare#semi-structured-data-size-limitations
This problem can arise in a few scenarios. One can be resolved by a minor change to the COPY command, but another cannot be fixed at the Snowflake level.
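For the scenario that a COPY-command change can fix, a common culprit is a JSON file whose records are wrapped in a single outer array, so Snowflake parses the whole array as one oversized VARIANT row. A minimal sketch of that fix (the table, stage, and file names here are hypothetical) uses the STRIP_OUTER_ARRAY file format option, which tells Snowflake to load each array element as its own row:

```sql
-- Hypothetical raw table with a single VARIANT column
CREATE TABLE IF NOT EXISTS raw_events (v VARIANT);

-- Without STRIP_OUTER_ARRAY, the entire outer array is parsed as one
-- VARIANT value and can exceed the 16 MB per-row limit.
COPY INTO raw_events
FROM @my_stage/data.gz
FILE_FORMAT = (TYPE = JSON STRIP_OUTER_ARRAY = TRUE);
```

This only helps when the individual elements are each under the limit; if a single JSON document itself exceeds 16 MB, no COPY option will save you, which is the harder scenario mentioned above.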