The Batches API lets you submit multiple requests in a single call, maximizing throughput while staying under rate limits. For production environments, we strongly recommend pairing this API with Webhooks: this minimizes resource usage, since no polling has to be done.

Submitting a batch

A batch is an API abstraction that accepts a set of data items and an operation that corresponds to an HTTP method and a URL. Processing a batch is analogous to making one request per item to that operation, using the item as the body of the request.

To submit a batch of transactions, set the operation to the appropriate endpoint and format each item as you would the body of that request:
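A minimal sketch of such a submission. The `sdk.batches.create` method name, the `operation` string, and the transaction fields shown here are assumptions based on this document's description, not a verbatim API reference:

```python
from ntropy_sdk import SDK

sdk = SDK("cd1H...Wmhl")

# Assumed creation call: one operation, one item per intended request
batch = sdk.batches.create(
    operation="POST /v3/transactions",  # assumed operation format
    data=[
        {
            # assumed transaction fields for illustration
            "id": "tx-1",
            "description": "AMAZON WEB SERVICES",
            "amount": 12.01,
            "currency": "USD",
            "entry_type": "outgoing",
            "date": "2024-05-01",
        },
    ],
)
```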

Since submitting a batch is an asynchronous operation, the response contains an id and some metadata:
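A sketch of inspecting that response; `batch.id` appears in this document, but `sdk.batches.create` and the `status` field are assumptions:

```python
from ntropy_sdk import SDK

sdk = SDK("cd1H...Wmhl")
batch = sdk.batches.create(...)  # assumed creation call

print(batch.id)      # e.g. "f203613d2-83c8-4130-8809-d14206eeec20"
print(batch.status)  # assumed metadata field, e.g. "processing"
```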

Viewing the progress of a batch

As noted above, for production systems we recommend webhooks to receive notifications on batch progress and status changes. However, you can always poll for the batch status by querying the batch API directly with its id:
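A simple polling sketch using only the `sdk.batches.get`, `is_completed`, and `is_error` calls shown in this document; the 5-second interval is an arbitrary choice:

```python
import time

from ntropy_sdk import SDK

sdk = SDK("cd1H...Wmhl")
batch = sdk.batches.get("f203613d2-83c8-4130-8809-d14206eeec20")

# Poll until the batch reaches a terminal state
while not (batch.is_completed() or batch.is_error()):
    time.sleep(5)  # back off between polls to stay under rate limits
    batch = sdk.batches.get(batch.id)
```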

Retrieving the results of a batch

When the batch has finished processing, its results are ready to be retrieved:
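A retrieval sketch using the `sdk.batches.results` call shown later in this document; iterating over `batch_result.results` is the pattern from that example:

```python
from ntropy_sdk import SDK

sdk = SDK("cd1H...Wmhl")
batch = sdk.batches.get("f203613d2-83c8-4130-8809-d14206eeec20")
if batch.is_completed():
    batch_result = sdk.batches.results(batch.id)
    for tx in batch_result.results:
        # each result corresponds to one submitted item
        ...
```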

Error handling

A batch is a collection of requests, so even if some requests result in an error, those errors remain local to the resources they pertain to and do not affect the overall status of the batch:

from ntropy_sdk import SDK

sdk = SDK("cd1H...Wmhl")
batch = sdk.batches.get("f203613d2-83c8-4130-8809-d14206eeec20")
if batch.is_completed():
    batch_result = sdk.batches.results(batch.id)
    not_ok = [tx for tx in batch_result.results if tx.error is not None]
    # Handle specific transactions
    ...
elif batch.is_error():
    # Handle batch error
    ...

When the batch has an error status, it means there was an unexpected error during processing. These errors are rare and can often be resolved by retrying the batch. Results are not available in these cases.
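The retry-on-error pattern can be sketched generically. Here `submit` is a hypothetical zero-argument helper that creates the batch and blocks until it finishes (the SDK itself is not shown); only `is_error` comes from this document:

```python
def run_with_retries(submit, max_retries=3):
    """Submit a batch and resubmit on unexpected batch-level errors.

    `submit` is a zero-argument callable (hypothetical helper) that
    creates the batch and returns it once processing has finished.
    """
    last = None
    for _ in range(max_retries):
        last = submit()
        if not last.is_error():
            return last
    raise RuntimeError(f"batch still in error after {max_retries} attempts")
```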