Batch Message Consumption
To support advanced scenarios, such as splitting the processing of a single message across many messages (potentially thousands) or combining a batch of high-volume smaller messages into a single atomic consumer, a batching design is being considered for implementation.
Merge Batches
Merge batches combine multiple messages into a single consumer invocation. By specifying a window, such as a message count (batch size), a time period, or a combination of both, a batch of messages can be delivered to a single Consume method call.
It is best to use both limits together, so the batch is delivered either when the size limit is reached or when the time window expires, whichever comes first.
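The intent can be sketched as follows. This is a minimal, hypothetical sketch of the merge semantics, not the MassTransit implementation; BatchCollector and its members are illustrative names only. Messages are buffered and the buffer is flushed as one batch when either the size limit is reached or the time window, measured from the first buffered message, has elapsed.

using System;
using System.Collections.Generic;

class BatchCollector<T>
{
    readonly object _sync = new object();
    readonly List<T> _buffer = new List<T>();
    readonly int _sizeLimit;
    readonly TimeSpan _timeLimit;
    readonly Action<IReadOnlyList<T>> _deliver;
    DateTime _firstMessageReceived;

    public BatchCollector(int sizeLimit, TimeSpan timeLimit, Action<IReadOnlyList<T>> deliver)
    {
        _sizeLimit = sizeLimit;
        _timeLimit = timeLimit;
        _deliver = deliver;
    }

    public void Add(T message)
    {
        List<T> batch = null;

        lock (_sync)
        {
            if (_buffer.Count == 0)
                _firstMessageReceived = DateTime.UtcNow;

            _buffer.Add(message);

            bool sizeReached = _buffer.Count >= _sizeLimit;
            bool windowExpired = DateTime.UtcNow - _firstMessageReceived >= _timeLimit;

            // flush on the size limit or on an expired time window, whichever comes first
            if (sizeReached || windowExpired)
            {
                batch = new List<T>(_buffer);
                _buffer.Clear();
            }
        }

        if (batch != null)
            _deliver(batch);   // the whole batch goes to a single consumer invocation
    }
}

A real implementation would also use a timer so the time window can expire and flush the buffer even when no further message arrives; the sketch above only checks the window when a new message is added.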
Size
The size limit specifies the maximum number of messages that can fit into a single batch; the batch is delivered as soon as that many messages are ready to be consumed.
The batch size must be less than or equal to any prefetch counts or concurrent message delivery limits in order to reach the size limit. If other limits prevent the batch size from being reached, the consumer will never be called.
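For example, assuming the RabbitMQ transport (PrefetchCount is the transport's prefetch setting) together with the proposed Collect configuration shown later in this issue, the prefetch count should be at least as large as the size limit:

x.ReceiveEndpoint(host, "input_queue", e =>
{
    // prefetch at least as many messages as the batch size limit,
    // otherwise the size limit can never be reached
    e.PrefetchCount = 100;

    e.Collect<MyEvent>(c =>
    {
        c.SizeLimit = 100;
        c.TimeLimit = TimeSpan.FromSeconds(10);
        c.Consumer<MyEventConsumer>(() => new MyEventConsumer());
    });
});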
Time
The time limit specifies how long to wait for additional messages after the first message is ready; once it expires, the messages received within that window are delivered as a single batch.
The time limit should be well within the lock time of a message, leaving enough time to process the batch. For example, with a lock time of five minutes, setting the window to two minutes leaves three minutes for processing. Otherwise, message lock timers might expire before the batch completes. And honestly, if the system cannot handle the load of one consumer invocation every two minutes, the real bottleneck lies elsewhere.
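As a quick sanity check, the processing budget is simply the lock time minus the batch window:

TimeSpan lockTime = TimeSpan.FromMinutes(5);
TimeSpan timeLimit = TimeSpan.FromMinutes(2);

// time remaining to process the batch before the message locks expire
TimeSpan processingBudget = lockTime - timeLimit;   // 3 minutes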
Consumer Delivery
The message batch is delivered as an array to the consumer, so that the existing behavior is maintained for middleware, factories, etc. An additional context is available on the payload, which can be used to discover details related to the batch.
enum BatchMode
{
    // the time window expired
    Time = 0,

    // the batch size limit was reached
    Size,
}
interface Batch<T>
{
    BatchMode Mode { get; }

    DateTime FirstMessageReceived { get; }
    DateTime LastMessageReceived { get; }

    ConsumeContext<T> this[int index] { get; }

    // the message batch length
    int Length { get; }
}
The consumer can then wire itself up to the batch of messages.
public class MyEventConsumer :
    IConsumer<Batch<MyEvent>>
{
    public Task Consume(ConsumeContext<Batch<MyEvent>> context)
    {
        // context.Message is the Batch<MyEvent> described above
        return Task.CompletedTask;
    }
}
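Inside Consume, the batch context can be used to see why the batch was delivered and to walk the individual messages. A sketch using the Batch<T> members defined above (HandleEvent is a hypothetical per-message handler):

public async Task Consume(ConsumeContext<Batch<MyEvent>> context)
{
    Batch<MyEvent> batch = context.Message;

    // Mode indicates whether the time window expired or the size limit was reached
    Console.WriteLine("Received {0} messages ({1})", batch.Length, batch.Mode);

    for (int i = 0; i < batch.Length; i++)
    {
        // each element is the original ConsumeContext for that message
        ConsumeContext<MyEvent> message = batch[i];

        await HandleEvent(message.Message);
    }
}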
The receive endpoint is then configured using a middleware adapter:
x.ReceiveEndpoint(host, "input_queue", e =>
{
    e.Collect<MyEvent>(c =>
    {
        // the time window to wait for additional messages
        // (shown here in both the method and property forms)
        c.SetTimeLimit(s: 10);
        c.TimeLimit = TimeSpan.FromSeconds(10);

        // the maximum number of messages in a single batch
        c.SizeLimit = 100;

        c.Consumer<MyEventConsumer>(() => new MyEventConsumer());
    });
});
So I think there are a few things I need to do before this is ready for use.
Merged to develop with experimental status; it works and I’ve tested it with RabbitMQ as well. Run with scissors and be careful.