Dev guide: how to upload files

One of the first things you will want to try with the LAIKA API is file upload. It is useful in various scenarios: asset and product creation, data migration, and other tasks. The LAIKA API offers several ways to do this.

IMPORTANT: several API calls exist for uploading files, and some of them are marked as legacy. Those calls are limited in the amount of data they can transfer (since a single request has a maximum size). We strongly recommend using only the chunked upload calls for any kind of upload.

Select the appropriate option

For chunked upload, two options exist, and the right one depends on the file you want to upload:

  • direct chunked upload (sync) - all actions related to further file processing take place in the same context. It is suitable for simple file formats where preview generation, metadata extraction and media operations are fast (raster images, etc.).
  • chunked upload to the upload queue (async) - after upload, the asset is enqueued for processing. This approach is suitable for large files or complex file types (like videos or documents). The processing queue has a limited set of threads, so processing files from the queue does not put additional load on the machine.

IMPORTANT: please choose the right way to upload. With direct upload, each upload call generates a new upload thread, and if file processing takes a long time this can cause issues (the number of threads grows, which can negatively impact the whole platform). The upload queue does not affect platform performance: the system takes the next file for processing only once the previous file has been processed.

Queued upload is great for data migration or for uploading complex file formats. Direct upload is good for fast upload of simple file types or small files.

But how do you select the right one for your case? We cannot say what applies to your case, but our team usually takes two criteria into account: file type and file size. If the file belongs to a type that requires complex processing (for example, a video or a complex document), or if it is a huge file, it is better to use the upload queue. A small routing check is sketched below.
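
Here is a hypothetical routing check in Python: the extension list and the size threshold are illustrative only, so adjust them to your own content and infrastructure.

```python
import os

# Purely illustrative values - tune these to your content and platform.
COMPLEX_TYPES = {".mp4", ".mov", ".avi", ".pdf", ".docx", ".psd"}
SIZE_THRESHOLD = 100 * 1024 * 1024  # 100 MB cut-off, an arbitrary example

def should_use_upload_queue(path: str) -> bool:
    """Return True when the file should go through the upload queue."""
    ext = os.path.splitext(path)[1].lower()
    return ext in COMPLEX_TYPES or os.path.getsize(path) > SIZE_THRESHOLD
```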


Direct chunked upload

Perform the following steps for direct chunked upload:

  1. The POST api/upload/initLargeUpload?fileName=fileName call starts the upload process and returns an upload identifier. The X-Content-Length header is required for this request: it is the total file size in bytes.
  2. Perform a POST api/upload/{identifier}/chunksUpload call with a chunk of the data in the body, once per chunk. The X-Chunk-Id and X-Chunk-Size headers are required for these calls.
  3. Call POST api/upload/{identifier}/endChunksUpload to finish the upload procedure.
  4. Call POST api/upload/{assetId}/assetMetadataUpload with the asset metadata in the body to apply it.

After that, the asset will be available in LAIKA.
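
A minimal sketch of this flow in Python (using the requests library) could look like the following. The base URL, the chunk size, the exact response format of initLargeUpload, and the assumption that the returned identifier is also the asset id for the metadata call are all placeholders - check the Swagger UI of your instance for the exact contract.

```python
import os
import requests

BASE_URL = "https://laika.example.com"   # hypothetical host - point this at your LAIKA instance
CHUNK_SIZE = 5 * 1024 * 1024             # 5 MB per chunk, an arbitrary choice

def direct_chunked_upload(path: str, metadata: dict, session: requests.Session) -> None:
    # 1. Start the upload; X-Content-Length carries the total file size in bytes.
    resp = session.post(
        f"{BASE_URL}/api/upload/initLargeUpload",
        params={"fileName": os.path.basename(path)},
        headers={"X-Content-Length": str(os.path.getsize(path))},
    )
    resp.raise_for_status()
    identifier = resp.text.strip('"')  # assumption: the identifier is returned in the response body

    # 2. Send the file chunk by chunk with the X-Chunk-Id / X-Chunk-Size headers.
    with open(path, "rb") as f:
        for chunk_id, chunk in enumerate(iter(lambda: f.read(CHUNK_SIZE), b"")):
            session.post(
                f"{BASE_URL}/api/upload/{identifier}/chunksUpload",
                headers={"X-Chunk-Id": str(chunk_id), "X-Chunk-Size": str(len(chunk))},
                data=chunk,
            ).raise_for_status()

    # 3. Finish the chunked transfer.
    session.post(f"{BASE_URL}/api/upload/{identifier}/endChunksUpload").raise_for_status()

    # 4. Apply the asset metadata.
    #    Assumption: the upload identifier is used as the asset id in this call -
    #    verify the exact contract in the Swagger UI of your instance.
    session.post(
        f"{BASE_URL}/api/upload/{identifier}/assetMetadataUpload",
        json=metadata,
    ).raise_for_status()
```

Authentication is omitted here; attach whatever headers or cookies your LAIKA instance requires to the session before calling the function.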

Queued chunked upload

To upload in a queued way, perform the following calls:

  1. The POST api/upload/initLargeUpload?fileName=fileName call starts the upload process and returns an upload identifier. The X-Content-Length header is required for this request: it is the total file size in bytes.
  2. Perform a POST api/upload/{identifier}/chunksUpload call with a chunk of the data in the body, once per chunk. The X-Chunk-Id and X-Chunk-Size headers are required for these calls.
  3. Perform POST api/assetUploadQueue/{assetId} with the asset metadata in the body to add the asset to the upload queue.

The asset will appear in LAIKA once it has been processed by the queue workers.
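
A sketch of the queued variant in the same spirit as the direct example above (host, chunk size, response parsing, and the use of the upload identifier as the asset id are all assumptions to verify against the Swagger UI):

```python
import os
import requests

BASE_URL = "https://laika.example.com"   # hypothetical host - point this at your LAIKA instance
CHUNK_SIZE = 5 * 1024 * 1024             # 5 MB per chunk, an arbitrary choice

def queued_chunked_upload(path: str, metadata: dict, session: requests.Session) -> None:
    # 1. Start the upload and obtain the upload identifier.
    resp = session.post(
        f"{BASE_URL}/api/upload/initLargeUpload",
        params={"fileName": os.path.basename(path)},
        headers={"X-Content-Length": str(os.path.getsize(path))},
    )
    resp.raise_for_status()
    identifier = resp.text.strip('"')  # assumption: the identifier is returned in the response body

    # 2. Transfer the file chunk by chunk.
    with open(path, "rb") as f:
        for chunk_id, chunk in enumerate(iter(lambda: f.read(CHUNK_SIZE), b"")):
            session.post(
                f"{BASE_URL}/api/upload/{identifier}/chunksUpload",
                headers={"X-Chunk-Id": str(chunk_id), "X-Chunk-Size": str(len(chunk))},
                data=chunk,
            ).raise_for_status()

    # 3. Enqueue the asset for background processing instead of finishing synchronously.
    #    Assumption: the upload identifier is used as the asset id here.
    session.post(
        f"{BASE_URL}/api/assetUploadQueue/{identifier}",
        json=metadata,
    ).raise_for_status()
```

The only difference from the direct flow is the final step: instead of endChunksUpload plus assetMetadataUpload, the asset is handed over to the queue together with its metadata.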


Conclusion

Upload isn't a complex operation. Yes, it requires more than one REST call, but it gives you enhanced stability in complex cases (like large file uploads). Once again, please do not use the legacy upload calls (we keep them for backward compatibility only), and choose the right approach for your case (queued or direct upload) based on criteria like file size and file type - that's it!

You can create your own upload or migration tools with ease. Also, do not forget to take a look at the OpenAPI / Swagger UI that is available for each service at {service name}/swagger (or by calling the service without any arguments). It provides descriptions of the methods and samples of the objects to pass.