
How to use Inventory batches


Author:

Fluent Commerce staff

Changed on:

11 Apr 2024

Overview

A guide on how to run Inventory Batches in the OMS.

Inventory Batches - Introduction


Overview

An Inventory Batch is a bulk upload of Inventory Updates, typically ranging from 10,000 to in excess of 1 million updates. By using the Batch API you can leverage Fluent Order Management's ability to scale, enabling faster processing and faster Inventory Availability.

Key points

  • Use cases for Inventory Batches
  • Capabilities of the platform this process uses
  • Supplementary guides

Use cases for Inventory Batches

The primary use cases for Inventory Batches are:

  • Sending periodic updates of stock from the Inventory source of truth. In typical architectures this will be an ERP that has collected all Inventory Updates during the last cycle (typically daily).
  • Informing the OMS of the latest stocktake. This ensures that the OMS is aware of any changes that may have been manually handled, e.g. missing or damaged stock.

Capabilities of the platform this process uses

After using this document, you will be able to run an Inventory Batch. If you need to extend or build on the capabilities in this document, the pages below link to the core capabilities used.

  • Job API: API to create & manage batch loading of information

Supplementary guides

After you have followed this guide and understand how to use Inventory Batches, here are a few more guides we suggest reading to enhance the usability of your Inventory implementation.

  • How to use Direct Inventory updates: Learn how to send Inventory Updates via events
  • How to send Delta Inventory updates: Learn how to send delta Inventory Updates via events

Send Inventory Batches


Key Points

  • A step-by-step guide on how to send Inventory Batches into the OMS

Steps

Step: API Authentication

Authenticate against the Retailer you are sending the Inventory Batch to 

Please follow the Authentication API page for details on how to authenticate against a Retailer.

The returned token will be used to authenticate all the following API calls.
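
For reference, a minimal sketch of this step in Python is shown below. The host name, token endpoint, and credential parameter names are assumptions for illustration; the authoritative values are on the Authentication API page.

import requests

# Illustrative values only: the real host and Retailer credentials come from
# your Fluent account and the Authentication API page.
ACCOUNT_HOST = "https://example.api.fluentcommerce.com"  # assumption
AUTH_PARAMS = {
    "grant_type": "password",              # assumption: OAuth-style password grant
    "client_id": "{client id}",
    "client_secret": "{client secret}",
    "username": "{retailer api user}",
    "password": "{retailer api password}",
}

# Request a token for the Retailer you will send the Inventory Batch to.
response = requests.post(f"{ACCOUNT_HOST}/oauth/token", params=AUTH_PARAMS)
response.raise_for_status()
token = response.json()["access_token"]

# The token is sent as a Bearer token on all following Job API calls.
HEADERS = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

Language: python

Name: Authentication sketch (illustrative)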

Step: Create a new Job to run your Batches against

Using the Job API you can create a new Batch Job.

{
    "name": "Batch Inventory Job",
    "retailerId": "{retailer id}"
}

Language: json

Name: Example payload

{
    "id": "199"
}

Language: json

Name: Example response


This is the id of the Job that has been created. Save it, as it is used in the following requests for sending a Batch. In a standard implementation, this Job id and the created Job are used to run a single group of Batches.
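
As an illustration, the request above could be sent with a small script like the one below. The endpoint path and host are assumptions (confirm the exact URL in the Job API reference); the headers reuse the token from the authentication step.

import requests

ACCOUNT_HOST = "https://example.api.fluentcommerce.com"   # assumption, as in the authentication sketch
HEADERS = {"Authorization": "Bearer {access token}"}       # token from the authentication step

payload = {
    "name": "Batch Inventory Job",
    "retailerId": "{retailer id}",
}

# Assumed endpoint shape for creating a Job; confirm the exact path in the Job API reference.
response = requests.post(f"{ACCOUNT_HOST}/api/v4.1/job", json=payload, headers=HEADERS)
response.raise_for_status()
job_id = response.json()["id"]   # e.g. "199": used for all Batches sent under this Job

Language: python

Name: Create Job sketch (illustrative)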

Step: Create and send a new Inventory Batch

Using the Job API you can create an Inventory Batch to be processed under the previously created Job.

The URL uses the Job id of 199 returned by the previous API call.

{
  "action": "UPSERT",
  "entityType": "INVENTORY",
  "entities": [
    {"retailerId":"{retailer id}","locationRef":"LOC1","skuRef":"SKU123","qty":100,"correctedQty":0}
  ]
}

Language: json

Name: Example payload

{
  "id": "331"
}

Language: json

Name: Example response


The returned id is used to check the Batch status in a later call. If you need to send more updates than fit in a single Batch, run this call once for each Batch, all under the same Job. For example, to send 15,000 updates you would run this call three times with 5,000 records each.
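
A sketch of sending one Batch under the Job is shown below. The endpoint path is an assumption based on the Job id placement described above, so confirm it against the Job API reference.

import requests

ACCOUNT_HOST = "https://example.api.fluentcommerce.com"   # assumption, as in the earlier sketches
HEADERS = {"Authorization": "Bearer {access token}"}
JOB_ID = "199"                                             # Job id returned when the Job was created

payload = {
    "action": "UPSERT",
    "entityType": "INVENTORY",
    "entities": [
        {"retailerId": "{retailer id}", "locationRef": "LOC1",
         "skuRef": "SKU123", "qty": 100, "correctedQty": 0},
    ],
}

# Assumed endpoint shape: the Batch is created under the existing Job.
response = requests.post(f"{ACCOUNT_HOST}/api/v4.1/job/{JOB_ID}/batch", json=payload, headers=HEADERS)
response.raise_for_status()
batch_id = response.json()["id"]   # e.g. "331": used to check the Batch status later

Language: python

Name: Create Batch sketch (illustrative)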

Step: Check Job status

You can use the Job API again to check the status of an existing Job. The response includes the status of each Batch that is part of that Job.

{
  "jobId": "199",
  "status": "OPEN",
  "batch": [
    {
      "batchId": "331",
      "status": "COMPLETE",
      "createdOn": "2022-10-06T00:31:08.104+0000"
    }
  ],
  "createdOn": "2022-10-06T00:31:08.104+0000"
}

Language: json

Name: Example response

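A sketch of polling the Job status is shown below. The endpoint path is again an assumption; the fields read from the response match the example above.

import requests

ACCOUNT_HOST = "https://example.api.fluentcommerce.com"   # assumption
HEADERS = {"Authorization": "Bearer {access token}"}
JOB_ID = "199"

# Assumed endpoint shape: GET the Job to see the status of the Batches it contains.
response = requests.get(f"{ACCOUNT_HOST}/api/v4.1/job/{JOB_ID}", headers=HEADERS)
response.raise_for_status()
job = response.json()

print(job["status"])                            # e.g. "OPEN"
for batch in job["batch"]:
    print(batch["batchId"], batch["status"])    # e.g. "331 COMPLETE"

Language: python

Name: Check Job status sketch (illustrative)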

Step: Check Inventory Batch status

You can also view the status of a specific Batch. This provides a few more details than the previous Job status check.

The URL uses the Job id and the Batch id returned by previous APIs.

{
  "batchId": "331",
  "entityType": "INVENTORY",
  "status": "COMPLETE",
  "start": 1,
  "count": 10,
  "total": 0,
  "results": [],
  "createdOn": "2022-10-06T00:36:09.344+0000"
}

Language: json

Name: Example response

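A matching sketch for checking a single Batch is shown below; the endpoint path combining the Job id and Batch id is an assumption to be confirmed against the Job API reference.

import requests

ACCOUNT_HOST = "https://example.api.fluentcommerce.com"   # assumption
HEADERS = {"Authorization": "Bearer {access token}"}
JOB_ID, BATCH_ID = "199", "331"

# Assumed endpoint shape: GET a single Batch under its Job.
response = requests.get(f"{ACCOUNT_HOST}/api/v4.1/job/{JOB_ID}/batch/{BATCH_ID}", headers=HEADERS)
response.raise_for_status()
batch = response.json()

print(batch["status"])    # e.g. "COMPLETE"
print(batch["results"])   # per-record results; empty in the example above

Language: python

Name: Check Batch status sketch (illustrative)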

Best Practices for Inventory Batches


Overview

Provides common practices to ensure a successful Inventory Batch run.


Key points

  • A number of factors need to be considered in order to create a successful Inventory run:
    • Size of Batch files
    • Best use of Jobs
    • Number of Batch files
    • Spread of Batch file counts
    • What to do if the number of updates is too small
    • Visibility of successful Batch processing 

Size of Batch files

Batch files have an upper limit of 5,000 records per file. When uploading large amounts of data, split the records into Batches of at most 5,000 updates. For example, 1,000,000 updates should be split into 200 Batch files.
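
As a simple illustration of this rule, the helper below (hypothetical, not part of the platform) splits a list of updates into files of at most 5,000 records.

def chunk_updates(updates, max_per_batch=5_000):
    """Split a list of inventory update records into Batches of at most 5,000."""
    return [updates[i:i + max_per_batch] for i in range(0, len(updates), max_per_batch)]

# 1,000,000 updates split into 200 Batch files of 5,000 records each.
updates = [{"skuRef": f"SKU{i}", "locationRef": "LOC1", "qty": 1} for i in range(1_000_000)]
batches = chunk_updates(updates)
assert len(batches) == 200

Language: python

Name: Batch size sketch (illustrative)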

Best use of Jobs

A Job groups Batches together; it is intended to track each distinct group of Batches that are sent together. Although a single Job stays open until midnight on the day of creation, you should create a new Job for each group of Batches. This enables greater visibility into the success or failure of each group of Batches.

For example, if you intend to send Inventory Updates every hour, you should create a new Job for each of those hourly Batch runs.

Number of Batch files

While individual Batch files have an upper limit of 5,000 records per file, this is only the upper limit. When multiple files are sent at the same time, the Batch API spreads them across the platform to be processed asynchronously. Each individual Batch file is processed in sequence, so sending only a small number of Batch files results in lower performance. We suggest that each Job should have a minimum of 10 Batch files to correctly spread the load within the platform.

For example, if you want to send 15,000 Inventory Updates, instead of splitting them into 3 files of 5,000 records, split them into 10 files of 1,500 records. An integration can scale this upwards so that a Job always contains at least 10 files, splitting into more files once each file would exceed 5,000 records (the sizing sketch at the end of the next section illustrates this).

Spread of Batch file counts

When you run a Job with more than 10 Batch files, you would otherwise end up with a number of Batches of 5,000 records and a final Batch of fewer than 5,000. To process each individual Batch more consistently, spread the total number of records evenly across all the files. For example, if you have 53,900 items to update, spread them across 11 files of 4,900 records each instead of 10 files of 5,000 and one of 3,900.
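
One way to apply both the minimum-of-10-files rule and the even spread is sketched below; the function is illustrative, not part of the platform.

import math

def plan_batches(total_records, max_per_file=5_000, min_files=10):
    """Work out how many Batch files to create and how many records to put in each:
    at least 10 files per Job, at most 5,000 records per file, records spread evenly."""
    files = max(min_files, math.ceil(total_records / max_per_file))
    per_file = math.ceil(total_records / files)
    return files, per_file

print(plan_batches(15_000))   # (10, 1500): 10 files of 1,500 instead of 3 files of 5,000
print(plan_batches(53_900))   # (11, 4900): 11 files of 4,900 instead of 10 x 5,000 and 1 x 3,900

Language: python

Name: Batch sizing and spread sketch (illustrative)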

What to do if the number of updates is too small

If you apply both the Batch file size and the number of Batch files best practices and find that you cannot reasonably satisfy both, the Batch API may not be the best solution. For example, if you only have 1,500 updates, splitting them into 10 files results in Batch files of only 150 records.

In this case you are better served sending Inventory Update events directly, as the number of events will be minimal compared to the processing time cost of sending these updates via the Batch API. The integration sending Inventory Updates can switch between sending Batches and direct events depending on the number of updates it has to send.

Look out for a guide on this coming soon.
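
In the meantime, a rough rule of thumb for that switch is sketched below; the 500-records-per-file threshold is an illustrative assumption, not a platform limit.

def should_use_batch_api(update_count, min_files=10, min_records_per_file=500):
    """Use the Batch API only when the volume justifies at least 10 reasonably sized files;
    otherwise fall back to direct Inventory Update events. The 500-record threshold is illustrative."""
    return update_count >= min_files * min_records_per_file

print(should_use_batch_api(1_500))    # False: send direct Inventory Update events instead
print(should_use_batch_api(60_000))   # True: split into Batches under a single Job

Language: python

Name: Batch vs direct events sketch (illustrative)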

Visibility of successful Batch processing

The current best tool for visibility is the orchestration_audit events, which let you understand the successes and failures of Inventory Updates sent via Batch. This provides visibility into the individual items included in the Batches.

Note: right now only Failures & Exception audit events are made visible for Inventory.

