Uploading a big file to the Sqlify API in chunks

If you need to convert huge files with the Sqlify API, a single upload request might not be enough. You can use our multipart upload feature to upload your file reliably, one chunk at a time.

First, you’ll need to specify the file’s total size when you create it:

curl https://sqlify.io/api/file/{FILE_ID} \
     -u "{API_KEY}:" \
     -d "size={SIZE_IN_BYTES}"

You’ll receive a response like this one:

{
   "status":"ok",
   "message":"file created",
   "data":{
      "upload_url":"https://sqlify.io/api/file/{FILE_ID}/upload",
      "file_id":"{FILE_ID}"
   }
}
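
For example, you can compute the size in a shell with wc -c and pull the upload_url out of the response with a JSON tool such as jq (a minimal sketch; jq isn’t required by the API, and {API_KEY} / {FILE_ID} stand in for your own values):

# Exact size of the local file in bytes
SIZE=$(wc -c < example.csv)

# Create the file with that size and keep the upload URL from the response
UPLOAD_URL=$(curl -s https://sqlify.io/api/file/{FILE_ID} \
     -u "{API_KEY}:" \
     -d "size=${SIZE}" | jq -r '.data.upload_url')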

Now all you need to do is upload your chunks to the upload_url. Note that every chunk except the last must be at least 5 MiB (5 * 1024 * 1024 = 5242880 bytes):

curl https://sqlify.io/api/file/{FILE_ID}/upload\?offset\=0 \
     -u "{API_KEY}:" \
     -H "content-type: application/octet-stream" \
     --data-binary @example.csv

Here offset is the byte offset at which the chunk starts within the file. For example, if you’ve got a 10 MiB file to upload (10485760 bytes), you’d split it into two chunks of 5 MiB (5242880 bytes) each. The first request, starting at offset 0, would be:

curl https://sqlify.io/api/file/{FILE_ID}/upload\?offset\=0 \
     -u "{API_KEY}:" \
     -H "content-type: application/octet-stream" \
     --data-binary @chunk-1.csv

And the second request would be:

curl https://sqlify.io/api/file/{FILE_ID}/upload\?offset\=5242880 \
     -u "{API_KEY}:" \
     -H "content-type: application/octet-stream" \
     --data-binary @chunk-2.csv
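
If you’re wondering how to produce those two chunk files, split from GNU coreutils can do it. This is just one option; the flags below only exist to reproduce the chunk-1.csv / chunk-2.csv names used above:

split -b 5242880 -a 1 --numeric-suffixes=1 --additional-suffix=.csv example.csv chunk-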

The API will return 201 Created for the last chunk and 200 OK for the rest.
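
Putting the upload loop together, here’s a minimal bash sketch. It assumes the chunk files from the split command above and that API_KEY and FILE_ID are exported in your environment; the status handling is only illustrative:

#!/usr/bin/env bash
set -euo pipefail

UPLOAD_URL="https://sqlify.io/api/file/${FILE_ID}/upload"

offset=0
for chunk in chunk-*.csv; do
  # --data-binary sends the chunk bytes untouched (plain -d would strip newlines)
  status=$(curl -s -o /dev/null -w "%{http_code}" \
       "${UPLOAD_URL}?offset=${offset}" \
       -u "${API_KEY}:" \
       -H "content-type: application/octet-stream" \
       --data-binary @"${chunk}")
  echo "${chunk} at offset ${offset} -> HTTP ${status}"   # expect 200, then 201 for the last chunk
  offset=$((offset + $(wc -c < "${chunk}")))
done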

Once the file is uploaded, you can export it to any format! Check out our other guide to see how to convert a file.