
How to use Inventory batches

Topic

Author:

Fluent Commerce

Changed on:

2 Sept 2024

Overview

A topic on how to run Inventory batches in the OMS. 

Inventory Batches - Introduction

Overview

An Inventory Batch is a bulk upload of Inventory Updates, typically ranging from 10,000 updates to in excess of 1 million. By using the Batch API you are able to leverage Fluent Order Management's ability to scale, enabling increased speed of processing and faster Inventory Availability.

Key points

  • In what use cases can you use Inventory Batches
  • Capabilities of the platform this process uses
  • Supplementary guides

In what use cases can you use Inventory Batches

The primary use cases in which you should use Inventory Batches are:
  • Sending periodic updates of stock from the Inventory source of truth. In typical architectures this is an ERP that has collected all Inventory Updates during the last cycle (typically daily).
  • Informing the OMS of the latest stocktake. This ensures that the OMS is aware of any changes that may have been manually handled, e.g. missing or damaged stock.

Capabilities of the platform this process uses

After using this document, you will be able to run an Inventory Batch. If you need to extend or build on the capabilities covered here, the pages below link to the core capabilities used.
  • Job API: API to create & manage batch loading of information

Supplementary guides

After you have followed this guide and understand how to use Inventory Batches, here are a few more guides we suggest reading to enhance the usability of your Inventory implementation.
  • How to use Direct Inventory updates: Learn how to send Inventory Updates via events
  • How to send Delta Inventory updates: Learn how to send delta Inventory Updates via events

Send Inventory Batches

Key Points

  • A step-by-step guide on how to send Inventory Batches into the OMS

Steps

Best Practices for Inventory Batches

Overview

Provides common practices to ensure a successful Inventory Batch run.

Key points

  • A number of factors need to be considered in order to create a successful Inventory run:
    • Size of Batch files
    • Best use of Jobs
    • Number of Batch files
    • Spread of Batch file counts
    • What to do if the number of updates is too small
    • Visibility of successful Batch processing 

Size of Batch files

Batch files have an upper limit of 5,000 records per file. When uploading large amounts of data, the updates need to be split into Batch files of at most 5,000 records each. For example, 1,000,000 updates should be split into 200 Batch files.
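
As a hedged illustration of the 5,000-record limit, the sketch below splits a list of update records into chunks of at most 5,000, one chunk per Batch file. The record fields and function name are illustrative assumptions, not part of the Batch API.

```python
# Minimal sketch (not Fluent-specific): split inventory update records into
# chunks of at most 5,000 records, one chunk per Batch file.
MAX_RECORDS_PER_FILE = 5000

def split_into_batch_files(records):
    """Yield successive chunks of at most MAX_RECORDS_PER_FILE records."""
    for start in range(0, len(records), MAX_RECORDS_PER_FILE):
        yield records[start:start + MAX_RECORDS_PER_FILE]

# Illustrative records only; the real payload shape depends on your integration.
updates = [{"skuRef": f"SKU-{i}", "qty": 10} for i in range(1_000_000)]
print(len(list(split_into_batch_files(updates))))  # 200
```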

Best use of Jobs

A Job is how Batches are grouped together; the intention is that a Job is used to track each distinct group of Batches that are sent together. While a single Job will stay open until midnight on the day of creation, you should create a new Job for each group of Batches. This enables greater visibility into the success or failure of each group of Batches. For example, if you intend to send Inventory Updates every hour, create a new Job for each of those hourly Batch runs.
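
A small sketch of the one-Job-per-run pattern, assuming only what is stated above: each hourly run gets its own distinct Job so that its Batches can be tracked separately. The reference format is an illustrative assumption; the Job itself is still created via the Job API.

```python
from datetime import datetime, timezone

def job_reference_for_run(prefix="inventory-hourly"):
    """Build a distinct, human-readable reference for each hourly Batch run,
    so every group of Batches is tracked by its own Job."""
    now = datetime.now(timezone.utc)
    return f"{prefix}-{now:%Y%m%d-%H}00"

print(job_reference_for_run())  # e.g. "inventory-hourly-20240902-1300"
```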

Number of Batch files

While individual Batch files have an upper limit of 5,000 records per file, this is only the upper limit. When multiple files are sent to the Batch API at the same time, they are spread across the platform and processed asynchronously; each individual Batch file is processed sequentially, so sending only a small number of Batch files results in lower performance. We suggest that each Job contain a minimum of 10 Batch files to correctly spread the load within the platform. For example, if you want to send 15,000 Inventory Updates, instead of splitting them into 3 files of 5,000 records, split them into 10 files of 1,500 records. Your integration can scale this upwards, keeping a minimum of 10 files per Job and adding more files only once each file reaches 5,000 records.
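
A minimal sketch of the file-count rule above, assuming only the two constraints stated here (at least 10 files per Job, at most 5,000 records per file); the function name is illustrative.

```python
import math

MIN_FILES_PER_JOB = 10
MAX_RECORDS_PER_FILE = 5000

def batch_file_count(total_updates):
    """Use at least 10 files per Job; add files only once each would otherwise exceed 5,000 records."""
    return max(MIN_FILES_PER_JOB, math.ceil(total_updates / MAX_RECORDS_PER_FILE))

print(batch_file_count(15_000))  # 10 files of 1,500 records
print(batch_file_count(80_000))  # 16 files of 5,000 records
```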

Spread of Batch file counts

When you run a Job with more than 10 Batch files, a naive split produces a number of files at 5,000 records and a final file with fewer than 5,000. For more consistent processing of each individual Batch, spread the total number of records evenly across all the files. For example, 53,900 items should be spread across 11 files of 4,900 records each, rather than 10 files of 5,000 and one file of 3,900.
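
The sketch below combines the file-count rule with the even-spread suggestion, distributing records so that file sizes differ by at most one; the figures match the 53,900-item example above, and the helper name is illustrative.

```python
import math

MIN_FILES_PER_JOB = 10
MAX_RECORDS_PER_FILE = 5000

def spread_batch_sizes(total_updates):
    """Pick the file count (>= 10 files, <= 5,000 records per file), then
    spread the records so that file sizes differ by at most one."""
    file_count = max(MIN_FILES_PER_JOB, math.ceil(total_updates / MAX_RECORDS_PER_FILE))
    base, remainder = divmod(total_updates, file_count)
    return [base + 1 if i < remainder else base for i in range(file_count)]

print(spread_batch_sizes(53_900))  # eleven files of 4,900 records each
```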

What to do if the number of updates is too small

If you apply both the file-size and file-count best practices and find that you can't reasonably satisfy both, the Batch API may not be the best solution. For example, if you only have 1,500 updates, splitting them into 10 files results in Batch files of only 150 records. In this case you are better served sending Inventory Update events directly, as the number of events is minimal compared to the processing-time cost of sending these updates via the Batch API. The integration sending Inventory Updates can switch between sending Batches and direct events depending on the number of updates it has to send. Look out for a guide on this coming soon.
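
A hedged sketch of the switching logic described above: if spreading the updates across the minimum 10 files would leave each file too small, fall back to direct Inventory Update events. The 1,000-record threshold is an illustrative assumption, not a platform limit.

```python
MIN_FILES_PER_JOB = 10
MIN_USEFUL_RECORDS_PER_FILE = 1000  # illustrative threshold, not a platform limit

def should_use_batch_api(total_updates):
    """Use the Batch API only when 10 files would still be reasonably full;
    otherwise send direct Inventory Update events instead."""
    return total_updates // MIN_FILES_PER_JOB >= MIN_USEFUL_RECORDS_PER_FILE

print(should_use_batch_api(1_500))   # False -> send direct Inventory Update events
print(should_use_batch_api(50_000))  # True  -> send via the Batch API
```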

Visibility of successful Batch processing

The current best tool for visibility is the orchestration_audit events, which can be used to understand the successes and failures of Inventory Updates sent via Batch. These events provide visibility into the individual items included in the Batches. Note: at present only Failure and Exception audit events are made visible for Inventory.
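
As a loose illustration only: the sketch below tallies Failure and Exception counts from a list of already retrieved orchestration_audit events. The field names ("name", "type") and sample event names are assumptions about the payload shape rather than a documented schema, and how you retrieve the events is left to your implementation.

```python
from collections import Counter

def summarise_inventory_audits(events):
    """Count Failure/Exception audit events per event name.

    `events` is assumed to be a list of dicts with "name" and "type" keys;
    only Failure and Exception audit events are currently visible for Inventory.
    """
    counts = Counter()
    for event in events:
        if event.get("type") in ("FAILURE", "EXCEPTION"):
            counts[event.get("name", "unknown")] += 1
    return counts

sample = [  # illustrative payloads
    {"name": "InventoryUpdate", "type": "FAILURE"},
    {"name": "InventoryUpdate", "type": "EXCEPTION"},
]
print(summarise_inventory_audits(sample))  # Counter({'InventoryUpdate': 2})
```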