
Singleton lock renewal failed

See original GitHub issue

Hi, I have a long-running, timer-triggered WebJob (it takes about 3-4 hours to complete) that frequently throws the following exception:

Microsoft.WindowsAzure.Storage.StorageException: The client could not finish the operation within specified timeout. ---> System.TimeoutException: The client could not finish the operation within specified timeout.

Prior to the exception being thrown, there is a system info message in the log:

Singleton lock renewal failed for blob '3e0dca7...' with error code 408. The last successful renewal completed at 2017-07-04T11:29:13.084Z (1491813 milliseconds ago) with a duration of 389 milliseconds. The lease period was 60000 milliseconds.

The WebJob trigger is:

[TimerTrigger("06:00:00", RunOnStartup = true)] TimerInfo timerInfo

I assume there is a timeout setting that needs to be increased? I have experimented with the JobHostConfiguration.Singleton properties, but have not found one that solves my problem. Is there documentation I may have missed, or other tips that could point me in the right direction?
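For reference, the singleton lease settings live on JobHostConfiguration.Singleton in the WebJobs SDK 2.x. A minimal sketch follows; the property names are from the SDK's SingletonConfiguration class, but the values shown are illustrative, and note that these tune the lease length rather than the underlying storage-client timeout that produced the 408:

```csharp
using System;
using Microsoft.Azure.WebJobs;

// Sketch for WebJobs SDK 2.x; verify the properties and defaults
// against the SingletonConfiguration class in your SDK version.
var config = new JobHostConfiguration();

// How long each function-level singleton lease lasts before renewal.
config.Singleton.LockPeriod = TimeSpan.FromSeconds(30);

// Lease period used for listener (trigger) locks, such as the
// TimerTrigger's lock blob from the log message above.
config.Singleton.ListenerLockPeriod = TimeSpan.FromSeconds(60);

// How long to keep retrying lock acquisition before giving up.
config.Singleton.LockAcquisitionTimeout = TimeSpan.FromMinutes(1);

var host = new JobHost(config);
host.RunAndBlock();
```

If the renewal timer itself is being starved (for example by the long-running function blocking threads), lengthening the lease only delays the failure rather than fixing it.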

Full stack trace:

[07/04/2017 11:57:49 > 997459: ERR ] Unhandled Exception: Microsoft.WindowsAzure.Storage.StorageException: The client could not finish the operation within specified timeout. ---> System.TimeoutException: The client could not finish the operation within specified timeout.
[07/04/2017 11:57:49 > 997459: ERR ]    --- End of inner exception stack trace ---
[07/04/2017 11:57:49 > 997459: ERR ]    at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.EndExecuteAsync[T](IAsyncResult result) in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Core\Executor\Executor.cs:line 50
[07/04/2017 11:57:49 > 997459: ERR ]    at Microsoft.WindowsAzure.Storage.Queue.CloudQueue.EndExists(IAsyncResult asyncResult) in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Queue\CloudQueue.cs:line 1027
[07/04/2017 11:57:49 > 997459: ERR ]    at Microsoft.WindowsAzure.Storage.Core.Util.AsyncExtensions.<>c__DisplayClass1`1.<CreateCallback>b__0(IAsyncResult ar) in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Core\Util\AsyncExtensions.cs:line 66
[07/04/2017 11:57:49 > 997459: ERR ] --- End of stack trace from previous location where exception was thrown ---
[07/04/2017 11:57:49 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
[07/04/2017 11:57:49 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
[07/04/2017 11:57:49 > 997459: ERR ]    at Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener.<ExecuteAsync>d__21.MoveNext()
[07/04/2017 11:57:49 > 997459: ERR ] --- End of stack trace from previous location where exception was thrown ---
[07/04/2017 11:57:49 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
[07/04/2017 11:57:49 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
[07/04/2017 11:57:49 > 997459: ERR ]    at Microsoft.Azure.WebJobs.Host.Timers.TaskSeriesTimer.<RunAsync>d__14.MoveNext()
[07/04/2017 11:57:49 > 997459: ERR ] --- End of stack trace from previous location where exception was thrown ---
[07/04/2017 11:57:50 > 997459: ERR ]    at Microsoft.Azure.WebJobs.Host.Timers.WebJobsExceptionHandler.<>c__DisplayClass3_0.<OnUnhandledExceptionAsync>b__0()
[07/04/2017 11:57:50 > 997459: ERR ]    at System.Threading.ThreadHelper.ThreadStart_Context(Object state)
[07/04/2017 11:57:50 > 997459: ERR ]    at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
[07/04/2017 11:57:50 > 997459: ERR ]    at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
[07/04/2017 11:57:50 > 997459: ERR ]    at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
[07/04/2017 11:57:50 > 997459: ERR ]    at System.Threading.ThreadHelper.ThreadStart()
[07/04/2017 11:57:50 > 997459: ERR ] Microsoft.WindowsAzure.Storage.StorageException: The client could not finish the operation within specified timeout. ---> System.TimeoutException: The client could not finish the operation within specified timeout.
[07/04/2017 11:57:50 > 997459: ERR ]    --- End of inner exception stack trace ---
[07/04/2017 11:57:50 > 997459: ERR ]    at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.EndExecuteAsync[T](IAsyncResult result) in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Core\Executor\Executor.cs:line 50
[07/04/2017 11:57:50 > 997459: ERR ]    at Microsoft.WindowsAzure.Storage.Blob.CloudBlob.EndRenewLease(IAsyncResult asyncResult) in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Blob\CloudBlob.cs:line 1979
[07/04/2017 11:57:50 > 997459: ERR ]    at Microsoft.WindowsAzure.Storage.Core.Util.AsyncExtensions.<>c__DisplayClass4.<CreateCallbackVoid>b__3(IAsyncResult ar) in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Core\Util\AsyncExtensions.cs:line 114
[07/04/2017 11:57:50 > 997459: ERR ] --- End of stack trace from previous location where exception was thrown ---
[07/04/2017 11:57:50 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
[07/04/2017 11:57:50 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
[07/04/2017 11:57:50 > 997459: ERR ]    at Microsoft.Azure.WebJobs.Host.SingletonManager.RenewLeaseCommand.<ExecuteAsync>d__9.MoveNext()
[07/04/2017 11:57:50 > 997459: ERR ] --- End of stack trace from previous location where exception was thrown ---
[07/04/2017 11:57:50 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
[07/04/2017 11:57:50 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
[07/04/2017 11:57:50 > 997459: ERR ]    at Microsoft.Azure.WebJobs.Host.Timers.TaskSeriesTimer.<RunAsync>d__14.MoveNext()
[07/04/2017 11:57:50 > 997459: ERR ] --- End of stack trace from previous location where exception was thrown ---
[07/04/2017 11:57:50 > 997459: ERR ]    at Microsoft.Azure.WebJobs.Host.Timers.WebJobsExceptionHandler.<>c__DisplayClass3_0.<OnUnhandledExceptionAsync>b__0()
[07/04/2017 11:57:50 > 997459: ERR ]    at System.Threading.ThreadHelper.ThreadStart_Context(Object state)
[07/04/2017 11:57:50 > 997459: ERR ]    at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
[07/04/2017 11:57:50 > 997459: ERR ]    at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
[07/04/2017 11:57:50 > 997459: ERR ]    at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
[07/04/2017 11:57:50 > 997459: ERR ]    at System.Threading.ThreadHelper.ThreadStart()
[07/04/2017 11:57:50 > 997459: ERR ] Microsoft.WindowsAzure.Storage.StorageException: The client could not finish the operation within specified timeout. ---> System.TimeoutException: The client could not finish the operation within specified timeout.
[07/04/2017 11:57:50 > 997459: ERR ]    --- End of inner exception stack trace ---
[07/04/2017 11:57:51 > 997459: ERR ]    at Microsoft.WindowsAzure.Storage.Core.Util.StorageAsyncResult`1.End() in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Core\Util\StorageAsyncResult.cs:line 77
[07/04/2017 11:57:51 > 997459: ERR ]    at Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob.EndUploadFromStream(IAsyncResult asyncResult) in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Blob\CloudBlockBlob.cs:line 742
[07/04/2017 11:57:51 > 997459: ERR ]    at Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob.EndUploadFromByteArray(IAsyncResult asyncResult) in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Blob\CloudBlockBlob.cs:line 1083
[07/04/2017 11:57:51 > 997459: ERR ]    at Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob.EndUploadText(IAsyncResult asyncResult) in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Blob\CloudBlockBlob.cs:line 1206
[07/04/2017 11:57:51 > 997459: ERR ]    at Microsoft.WindowsAzure.Storage.Core.Util.AsyncExtensions.<>c__DisplayClass4.<CreateCallbackVoid>b__3(IAsyncResult ar) in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Core\Util\AsyncExtensions.cs:line 114
[07/04/2017 11:57:51 > 997459: ERR ] --- End of stack trace from previous location where exception was thrown ---
[07/04/2017 11:57:51 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
[07/04/2017 11:57:51 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
[07/04/2017 11:57:51 > 997459: ERR ]    at Microsoft.Azure.WebJobs.Host.Loggers.UpdateOutputLogCommand.<UpdateOutputBlob>d__18.MoveNext()
[07/04/2017 11:57:51 > 997459: ERR ] --- End of stack trace from previous location where exception was thrown ---
[07/04/2017 11:57:51 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
[07/04/2017 11:57:51 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
[07/04/2017 11:57:51 > 997459: ERR ]    at Microsoft.Azure.WebJobs.Host.Loggers.UpdateOutputLogCommand.<TryExecuteAsync>d__14.MoveNext()
[07/04/2017 11:57:51 > 997459: ERR ] --- End of stack trace from previous location where exception was thrown ---
[07/04/2017 11:57:51 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
[07/04/2017 11:57:51 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
[07/04/2017 11:57:51 > 997459: ERR ]    at Microsoft.Azure.WebJobs.Host.Timers.RecurrentTaskSeriesCommand.<ExecuteAsync>d__3.MoveNext()
[07/04/2017 11:57:52 > 997459: ERR ] --- End of stack trace from previous location where exception was thrown ---
[07/04/2017 11:57:52 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
[07/04/2017 11:57:52 > 997459: ERR ]    at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
[07/04/2017 11:57:52 > 997459: ERR ]    at Microsoft.Azure.WebJobs.Host.Timers.TaskSeriesTimer.<RunAsync>d__14.MoveNext()
[07/04/2017 11:57:52 > 997459: ERR ] --- End of stack trace from previous location where exception was thrown ---
[07/04/2017 11:57:52 > 997459: ERR ]    at Microsoft.Azure.WebJobs.Host.Timers.WebJobsExceptionHandler.<>c__DisplayClass3_0.<OnUnhandledExceptionAsync>b__0()
[07/04/2017 11:57:52 > 997459: ERR ]    at System.Threading.ThreadHelper.ThreadStart_Context(Object state)
[07/04/2017 11:57:52 > 997459: ERR ]    at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
[07/04/2017 11:57:52 > 997459: ERR ]    at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
[07/04/2017 11:57:52 > 997459: ERR ]    at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
[07/04/2017 11:57:52 > 997459: ERR ]    at System.Threading.ThreadHelper.ThreadStart()
[07/04/2017 11:57:53 > 997459: SYS ERR ] Job failed due to exit code -532462766
[07/04/2017 11:57:53 > 997459: SYS INFO] Process went down, waiting for 0 seconds

Issue Analytics

  • State: open
  • Created 6 years ago
  • Comments: 24 (2 by maintainers)

Top GitHub Comments

3 reactions
asadiq commented, Apr 30, 2020

I’m troubleshooting some bizarre problems with serverless functions we recently introduced into the mix, and we seem to have the same problem!

It also looks like this issue has been open for a very long time… quite surprising that no one at Microsoft has provided any feedback or guidance here! 😦 Does this affect all Azure Functions runtime versions?

Our case is a storage-queue-triggered function, which can sometimes run for 5-10 minutes. We’re on v1 of the Functions runtime SDK.

Host log:

10:20:19 PM
Singleton lock renewal failed for blob 'locks/myfunction/host' with error code 408. The last successful renewal completed at 2020-04-30T02:08:47.096Z (691780 milliseconds ago) with a duration of 0 milliseconds. The lease period was 15000 milliseconds.

10:20:19 PM
Failed to renew host lock lease: Another host has acquired the lease. The last successful renewal completed at 2020-04-30T02:08:47.096Z (692296 milliseconds ago) with a duration of 0 milliseconds.
10:25:23 PM
An unhandled exception has occurred. Host is shutting down.

10:25:23 PM
The client could not finish the operation within specified timeout.
**Exception Type:** Microsoft.WindowsAzure.Storage.StorageException

Microsoft.WindowsAzure.Storage.StorageException: The client could not finish the operation within specified timeout. ---> System.TimeoutException: The client could not finish the operation within specified timeout.
   --- End of inner exception stack trace ---
   at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.EndExecuteAsync[T](IAsyncResult result) in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Core\Executor\Executor.cs:line 50
   at Microsoft.WindowsAzure.Storage.Queue.CloudQueue.EndGetMessages(IAsyncResult asyncResult) in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Queue\CloudQueue.cs:line 1858
   at Microsoft.WindowsAzure.Storage.Core.Util.AsyncExtensions.<>c__DisplayClass1`1.<CreateCallback>b__0(IAsyncResult ar) in c:\Program Files (x86)\Jenkins\workspace\release_dotnet_master\Lib\ClassLibraryCommon\Core\Util\AsyncExtensions.cs:line 66
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.Azure.WebJobs.Host.Storage.Queue.StorageQueue.<GetMessagesAsyncCore>d__16.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener.<ExecuteAsync>d__25.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.Azure.WebJobs.Host.Timers.TaskSeriesTimer.<RunAsync>d__14.MoveNext()

host.json settings for the queue trigger:

"queues": {
    "maxPollingInterval": 10000,      // 10 seconds between polls at the lowest polling frequency
    "visibilityTimeout": "00:00:30",  // default 0; the interval between retries when processing of a message fails
    "batchSize": 1,                   // default 16; set to 1 to limit parallel execution
    "maxDequeueCount": 5,
    "newBatchThreshold": 0            // default 8 (batchSize / 2)
}
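For Functions v1, the lease settings that correspond to the failing host lock live in a "singleton" section of host.json rather than in code. A sketch under the assumption of the v1 host.json schema; the durations below are illustrative examples, not recommendations:

```json
"singleton": {
    "lockPeriod": "00:00:15",
    "listenerLockPeriod": "00:01:00",
    "listenerLockRecoveryPollingInterval": "00:01:00",
    "lockAcquisitionTimeout": "00:01:00",
    "lockAcquisitionPollingInterval": "00:00:03"
}
```

The 15000-millisecond lease period in the log above matches the default lockPeriod, which suggests these settings govern the lock that failed to renew.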

@ThomasArdal - are you using the v1 runtime as well?

1 reaction
markvl2 commented, Feb 18, 2019

Hi all, we are also getting the following WebJob errors. The log says:

[02/18/2019 02:55:01 > 2f460e: INFO] dbug: Host.Singleton[0]
[02/18/2019 02:55:01 > 2f460e: INFO] Unable to acquire Singleton lock (5e372e70ee678958c6f655d625888fee/NP.WebJob.EmailLoader.Functions.Run.Listener).
[02/18/2019 02:55:01 > 2f460e: INFO] Application started. Press Ctrl+C to shut down.

We have a number of WebJobs, and only one of them acquires the singleton lock on startup; the remaining ones get this error and therefore do not run as expected.

It would be great to have some clarification on the cause. E.g., can more than one WebJob use the same default host? Can I set the WebJob host on startup?
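One possible workaround for the multiple-WebJobs case, sketched under the assumption that the lock blobs collide because all jobs share the same default host path: JobHostConfiguration exposes a HostId property, and giving each WebJob its own ID should give each its own lock blob. Verify the behavior against your SDK version.

```csharp
using Microsoft.Azure.WebJobs;

var config = new JobHostConfiguration();

// Give this WebJob a distinct host ID so its singleton/listener
// lock blobs no longer collide with those of other WebJobs that
// share the same storage account and default host ID.
// "emailloader" is an illustrative value, not a required name.
config.HostId = "emailloader";

new JobHost(config).RunAndBlock();
```

Note that if two instances were intentionally meant to compete for the same singleton lock (e.g. for failover), giving them distinct host IDs defeats that, so this only fits independent jobs.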

Read more comments on GitHub >

Top Results From Across the Web

Singleton lock renewal failed for blob 'xxxx/host' with error ...
Hi, I have a question about Azure Functions. I made an Azure Function (HTTP trigger), but at some point the [Singleton lock] error happened...
Read more >
Error: 409 LeaseIdMismatchWithLeaseOperation (Listener)
The cause is almost always functions blocking threads, which prevents our lease renewal timers from firing, so the leases expire...
Read more >
Avoid Lock - Pointing Two WebJobs to The Same Azure ...
The WebJobs runtime uses a Singleton lock approach to coordinate execution of the job across the farm (if you have scaled out your application).
Read more >
Azure functions. Limiting throughput and scalability of a ...
The Singleton lock can be observed in App Insights. P.S. There is one more limit that can be helpful: the App Service parameter DailyMemoryTimeQuota...
Read more >
Azure Function - Slow to acquire singleton lock and why is it ...
The "Singleton lock acquired" message is part of the Functions reliable event processing policy. Here is a description of a queue-triggered function that might help you...
Read more >
