Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

ServiceBus MessageLock Exception if awaiting all CompleteAsync tasks at once

See original GitHub issue

Using Azure.Messaging.ServiceBus v7.3.0. I call ServiceBusReceiver.CompleteMessageAsync() for each message, store the resulting tasks in a collection, and then await them all with await Task.WhenAll(completionTasks), e.g.:

var completionTasks = new List<Task>();
foreach (var message in messagesToComplete)
{
  completionTasks.Add(ServiceBusReceiver.CompleteMessageAsync(message));
}
await Task.WhenAll(completionTasks);

This results in a MessageLockLost exception when running on Azure (and sometimes locally):

2021-09-16T18:48:00Z   [Information]   Attempting to complete 576 messages.
2021-09-16T18:48:04Z   [Error]   Exception while completing messages: Azure.Messaging.ServiceBus.ServiceBusException: The lock supplied is invalid. Either the lock expired, or the message has already been removed from the queue, or was received by a different receiver instance. (MessageLockLost)
2021-09-16T18:48:04Z   [Verbose]   Oldest message is locked until: 09/16/2021 18:48:30 +00:00
2021-09-16T18:48:04Z   [Verbose]   Youngest message is locked until: 09/16/2021 18:48:30 +00:00

Note the timestamp of the failure and the locked-until entries below it: the messages still had 26 seconds before their locks expired.
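
For reference, a fuller, self-contained version of the failing pattern might look like the sketch below. The connection string, queue name, and batch size are placeholders, not values taken from the issue:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

// Sketch of the failing pattern: receive a large batch, then fire off every
// completion at once and await them together.
await using var client = new ServiceBusClient("<connection-string>");
await using ServiceBusReceiver receiver = client.CreateReceiver("<queue-name>");

IReadOnlyList<ServiceBusReceivedMessage> messagesToComplete =
  await receiver.ReceiveMessagesAsync(maxMessages: 600, maxWaitTime: TimeSpan.FromSeconds(10));

var completionTasks = new List<Task>();
foreach (var message in messagesToComplete)
{
  completionTasks.Add(receiver.CompleteMessageAsync(message));
}
await Task.WhenAll(completionTasks);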

Switching to looping through the collection of messages and awaiting each completion individually, e.g.:

foreach (var message in messagesToComplete)
{
  await ServiceBusReceiver.CompleteMessageAsync(message);
}

This solved the issue, and the locking behaved as expected.
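
If completing one message at a time turns out to be too slow for large batches, one possible middle ground (not something proposed in the issue itself, so treat it as a hedged sketch) is to cap how many completions are in flight at once with a SemaphoreSlim. The class name, method name, and concurrency limit below are illustrative:

using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

public static class MessageSettlement
{
  // Completes messages with at most maxConcurrency settlement calls in flight.
  public static async Task CompleteWithLimitedConcurrencyAsync(
    ServiceBusReceiver receiver,
    IReadOnlyCollection<ServiceBusReceivedMessage> messagesToComplete,
    int maxConcurrency = 10) // arbitrary illustrative cap
  {
    using var throttle = new SemaphoreSlim(maxConcurrency);
    var completionTasks = new List<Task>();

    foreach (var message in messagesToComplete)
    {
      await throttle.WaitAsync();
      completionTasks.Add(CompleteAndReleaseAsync(receiver, message, throttle));
    }

    await Task.WhenAll(completionTasks);
  }

  private static async Task CompleteAndReleaseAsync(
    ServiceBusReceiver receiver, ServiceBusReceivedMessage message, SemaphoreSlim throttle)
  {
    try
    {
      await receiver.CompleteMessageAsync(message);
    }
    finally
    {
      throttle.Release();
    }
  }
}

Completing sequentially remains the simplest workaround; the sketch above only shows how one might trade off between the two extremes.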

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 12 (10 by maintainers)

Top GitHub Comments

1 reaction
JoshLove-msft commented, Dec 4, 2021

The dev investigating this is currently out of the office, but we should have an update next week.

1 reaction
JoshLove-msft commented, Oct 14, 2021

@DorothySun216 is investigating this - apparently 1 message gets throttled and all subsequent messages get removed from the unsettled map so they can no longer be settled.

Need to double check with the Track 2 SDK because when testing this I didn’t see a throttling exception - only lock lost.
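
As an aside, when reproducing this it can help to log which failure reason the client actually reports, since throttling and lock loss surface as different ServiceBusFailureReason values. A minimal sketch (the helper name is made up):

using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

public static class SettlementDiagnostics
{
  // Logs whether a failed completion was due to a lost lock or to throttling.
  public static async Task TryCompleteAsync(ServiceBusReceiver receiver, ServiceBusReceivedMessage message)
  {
    try
    {
      await receiver.CompleteMessageAsync(message);
    }
    catch (ServiceBusException ex) when (ex.Reason == ServiceBusFailureReason.MessageLockLost)
    {
      // The lock expired or was invalidated; the message will be redelivered.
      Console.WriteLine($"Lock lost for message {message.MessageId}: {ex.Message}");
    }
    catch (ServiceBusException ex) when (ex.Reason == ServiceBusFailureReason.ServiceBusy)
    {
      // The service is throttling; ex.IsTransient is true, so a delayed retry may succeed.
      Console.WriteLine($"Throttled while completing message {message.MessageId}: {ex.Message}");
    }
  }
}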

Read more comments on GitHub >

Top Results From Across the Web

  • Complete Async throws message lock exception?
    The client is too busy that it was unable to complete the request in a timely manner or very slow network; There was...
  • Getting MessageLockLostException while using ...
    I am using await brokeredMessage.CompleteAsync(); to complete the message. The issue is - the message is marked complete and removed from Queue ...
  • Calling CompleteAsync(lockToken) in SubscriptionClient. ...
    Calling CompleteAsync(lockToken) in SubscriptionClient.RegisterMessageHandler() throws a lock lost exception if you take longer than 10 minutes #378.
  • Randomly getting MessageLockLostException when ...
    await rec.CompleteAsync(res.SystemProperties.LockToken);; Most of the time randomly I get MessageLockLostException, with the text: "The lock ...
  • Azure Service Bus client library for .NET
    Azure Service Bus allows you to build applications that take advantage of asynchronous messaging patterns using a highly-reliable service to broker messages ...
