Author:
Fluent Commerce
Changed on:
2 Sept 2024
A topic on how to run Inventory batches in the OMS.
An Inventory Batch is a bulk upload of Inventory Updates, typically ranging from 10,000 updates to in excess of 1 million. By using the Batch API you can leverage Fluent Order Management's ability to scale, enabling faster processing and faster Inventory Availability.
The primary use cases where you should use Inventory Batches:
After using this guide, you will be able to run an Inventory Batch. If you need to extend or build on these capabilities, the pages below link to the core capabilities used.
| Capability | Description |
| --- | --- |
| Batch API | API to create & manage batch loading of information |
After you have followed this guide and understand how to use Inventory Batches, here are a few more guides we suggest reading to enhance your Inventory implementation.
| Guide | Description |
| --- | --- |
| Inventory Updates via Events | Learn how to send Inventory Updates via events |
| Delta Inventory Updates via Events | Learn how to send delta inventory updates via events |
Please follow the Authentication API page for how to authenticate against a Retailer.
The returned token will be used to authenticate all subsequent API calls.
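As a rough illustration, a token request might look like the sketch below. It assumes the OAuth2 password grant flow described on the Authentication API page; the host name, path, and credential placeholders are illustrative, so confirm them against the Authentication API reference for your account.

```typescript
// Minimal sketch of obtaining a bearer token, assuming the OAuth2
// password grant flow described on the Authentication API page.
// The host and path are illustrative -- confirm them against the
// Authentication API reference for your account.
const ACCOUNT = "myaccount"; // hypothetical account name

async function getToken(): Promise<string> {
  const params = new URLSearchParams({
    username: "{retailer username}",
    password: "{retailer password}",
    client_id: "{client id}",
    client_secret: "{client secret}",
    grant_type: "password",
  });
  const res = await fetch(
    `https://${ACCOUNT}.api.fluentretail.com/oauth/token?${params}`,
    { method: "POST" }
  );
  if (!res.ok) throw new Error(`Authentication failed: ${res.status}`);
  const body = await res.json();
  return body.access_token; // sent as "Authorization: Bearer <token>"
}
```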
Using the Job API you can create a new Batch Job.
Example payload:

```json
{
  "name": "Batch Inventory Job",
  "retailerId": "{retailer id}"
}
```
Example response:

```json
{
  "id": "199"
}
```
This id is the Job id that has just been created. Save it, as it is used in the following requests for sending a Batch. In a standard implementation, this Job is used to run a single group of Batches.
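As a hedged sketch, creating the Job programmatically could look like the following. It reuses the ACCOUNT constant and token from the authentication sketch above; the /api/v4.1/job path is an assumption, so check the Job API reference for the exact endpoint.

```typescript
// Illustrative sketch of creating a Batch Job via the Job API.
// The /api/v4.1/job path is an assumption -- confirm the exact
// endpoint in the Job API reference.
async function createJob(token: string, retailerId: string): Promise<string> {
  const res = await fetch(
    `https://${ACCOUNT}.api.fluentretail.com/api/v4.1/job`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ name: "Batch Inventory Job", retailerId }),
    }
  );
  if (!res.ok) throw new Error(`Job creation failed: ${res.status}`);
  const { id } = await res.json(); // e.g. { "id": "199" }
  return id; // save this Job id for the Batch calls below
}
```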
Using the Job API you can create an Inventory Batch to be processed under the previously created Job.
The URL uses the Job id of 199 returned by the previous API call.
Example payload:

```json
{
  "action": "UPSERT",
  "entityType": "INVENTORY",
  "entities": [
    {
      "retailerId": "{retailer id}",
      "locationRef": "LOC1",
      "skuRef": "SKU123",
      "qty": 100,
      "correctedQty": 0
    }
  ]
}
```
Example response:

```json
{
  "id": "331"
}
```
The returned id will be used to check the Batch status in a later call. If you need to send multiple groups of updates, run this call for each Batch in the group. For example, to send 15,000 updates you would run this call 3 times, once for each 5,000 records, all under the same Job.
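A sketch of sending one Batch under the Job, reusing the helpers above. The /api/v4.1/job/{jobId}/batch path is an assumption; confirm it against the Job API reference.

```typescript
// Illustrative sketch of sending one Batch of up to 5,000 inventory
// updates under an existing Job. The path is an assumption --
// confirm it against the Job API reference.
interface InventoryUpdate {
  retailerId: string;
  locationRef: string;
  skuRef: string;
  qty: number;
  correctedQty: number;
}

async function createBatch(
  token: string,
  jobId: string,
  entities: InventoryUpdate[] // at most 5,000 per Batch
): Promise<string> {
  const res = await fetch(
    `https://${ACCOUNT}.api.fluentretail.com/api/v4.1/job/${jobId}/batch`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        action: "UPSERT",
        entityType: "INVENTORY",
        entities,
      }),
    }
  );
  if (!res.ok) throw new Error(`Batch creation failed: ${res.status}`);
  const { id } = await res.json(); // e.g. { "id": "331" }
  return id; // save this Batch id for status checks
}
```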
You can use the Job API again to check the status of an existing Job; this returns the status of the Batches that are part of that Job.
Example response:

```json
{
  "jobId": "199",
  "status": "OPEN",
  "batch": [
    {
      "batchId": "331",
      "status": "COMPLETE",
      "createdOn": "2022-10-06T00:31:08.104+0000"
    }
  ],
  "createdOn": "2022-10-06T00:31:08.104+0000"
}
```
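A sketch of checking the Job and polling until all of its Batches report COMPLETE, reusing the helpers above. The GET path is an assumption, and treating any other status as "still pending" is a simplification; see the Job API reference for the full status set.

```typescript
// Illustrative sketch of polling a Job until every Batch in it is
// COMPLETE. The GET path is an assumption, and treating any other
// status as pending is a simplification.
async function getJobStatus(token: string, jobId: string): Promise<any> {
  const res = await fetch(
    `https://${ACCOUNT}.api.fluentretail.com/api/v4.1/job/${jobId}`,
    { headers: { Authorization: `Bearer ${token}` } }
  );
  if (!res.ok) throw new Error(`Job status check failed: ${res.status}`);
  return res.json(); // { jobId, status, batch: [...], createdOn }
}

async function waitForJob(token: string, jobId: string): Promise<void> {
  for (;;) {
    const job = await getJobStatus(token, jobId);
    const pending = (job.batch ?? []).filter(
      (b: any) => b.status !== "COMPLETE"
    );
    if (pending.length === 0) return;
    await new Promise((resolve) => setTimeout(resolve, 30_000)); // poll every 30s
  }
}
```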
You can also view the status of a specific Batch; this provides a few more details than the previous Job status check.
The URL uses the Job id and the Batch id returned by the previous API calls.
Example response:

```json
{
  "batchId": "331",
  "entityType": "INVENTORY",
  "status": "COMPLETE",
  "start": 1,
  "count": 10,
  "total": 0,
  "results": [],
  "createdOn": "2022-10-06T00:36:09.344+0000"
}
```
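A sketch of fetching a single Batch's status, which returns the more detailed shape shown above (start, count, total, results). The GET path is an assumption; confirm it against the Job API reference.

```typescript
// Illustrative sketch of fetching a single Batch's status. The GET
// path is an assumption -- confirm it against the Job API reference.
async function getBatchStatus(
  token: string,
  jobId: string,
  batchId: string
): Promise<any> {
  const res = await fetch(
    `https://${ACCOUNT}.api.fluentretail.com/api/v4.1/job/${jobId}/batch/${batchId}`,
    { headers: { Authorization: `Bearer ${token}` } }
  );
  if (!res.ok) throw new Error(`Batch status check failed: ${res.status}`);
  return res.json(); // { batchId, entityType, status, results, ... }
}
```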
This section provides common practices to ensure successful Inventory Batch runs.
Batch files have an upper limit of 5,000 records per file. When uploading large amounts of data, split the updates into Batches of 5,000. For example, 1,000,000 updates should be split into 200 Batch files.
A Job is how Batches are grouped together; the intention is that a Job tracks each distinct group of Batches sent together. While a single Job stays open until midnight on the day of creation, you should create a new Job for each group of Batches. This enables greater visibility into the success or failure of each group of Batches.
For example, if you intend to send Inventory Updates every hour, you should create a new Job for each of those hourly Batch runs.
While individual Batch files have an upper limit of 5,000 records per file, this is only the upper limit. When multiple files are sent at the same time, the Batch API spreads them across the platform to be processed asynchronously; each individual Batch file is processed sequentially, which results in lower performance when sending a small number of Batch files. We suggest each Job contain a minimum of 10 Batch files to correctly spread the load within the platform.
For example, if you want to send 15,000 Inventory Updates, instead of splitting them into 3 files of 5,000 records, split them into 10 files of 1,500. Your integration can scale this upwards, keeping Jobs at 10 files until each file reaches 5,000 records, at which point it should split into more files.
When you run a Job with more than 10 Batch files, you would otherwise end up with a number of Batches at 5,000 records and a final Batch at less than 5,000. To provide more consistent processing of each individual Batch, it is suggested to spread the total number of records evenly across all the files. E.g. if you have 53,900 items to update, spread them across 11 files of 4,900 each instead of ten files of 5,000 and one of 3,900.
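These splitting rules (at most 5,000 records per file, at least 10 files per Job, records spread evenly) are simple to implement. A minimal sketch:

```typescript
// Sketch of the splitting rules above: at most 5,000 records per
// file, at least 10 files per Job, and records spread evenly so
// every file ends up roughly the same size.
const MAX_PER_FILE = 5000;
const MIN_FILES = 10;

function splitIntoBatches<T>(updates: T[]): T[][] {
  const files = Math.max(MIN_FILES, Math.ceil(updates.length / MAX_PER_FILE));
  const perFile = Math.ceil(updates.length / files);
  const batches: T[][] = [];
  for (let i = 0; i < updates.length; i += perFile) {
    batches.push(updates.slice(i, i + perFile));
  }
  return batches;
}

// 15,000 updates    -> 10 files of 1,500
// 53,900 updates    -> 11 files of 4,900
// 1,000,000 updates -> 200 files of 5,000
```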
If you apply both the Batch file size and Batch file count best practices and find that you can't reasonably satisfy both, the Batch API may not be the best solution. E.g. if you only have 1,500 updates, making 10 files would result in Batch files of only 150 records.
In this case you are best served by sending Inventory Update events directly, as the number of events will be minimal compared to the processing time cost of sending these updates via the Batch API. The integration that sends Inventory Updates can switch between batches and direct events depending on the number of updates it has to send.
Look out for a guide on this coming soon.
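Until that guide is available, here is a minimal sketch of such a switch, reusing the helpers from the earlier sketches. The 15,000-update threshold and the sendEvent callback are illustrative assumptions; tune them to your own integration.

```typescript
// Sketch of switching between the Batch API and direct Inventory
// Update events based on volume. The threshold and the sendEvent
// callback are illustrative assumptions.
const BATCH_THRESHOLD = 15_000;

async function sendInventoryUpdates(
  token: string,
  retailerId: string,
  updates: InventoryUpdate[],
  sendEvent: (update: InventoryUpdate) => Promise<void> // your event integration
): Promise<void> {
  if (updates.length >= BATCH_THRESHOLD) {
    // High volume: one Job, evenly split Batch files.
    const jobId = await createJob(token, retailerId);
    for (const chunk of splitIntoBatches(updates)) {
      await createBatch(token, jobId, chunk);
    }
  } else {
    // Low volume: send individual Inventory Update events.
    for (const update of updates) {
      await sendEvent(update);
    }
  }
}
```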
The current best tool for visibility is the orchestration_audit events, which let you understand the successes and failures of Inventory Updates sent via Batch. These provide visibility into the individual items included in the Batches.
Note: currently only Failure & Exception audit events are made visible for Inventory.