2.22 Deadlock still present (possibly from ASM functionality)
Describe the bug
2.22 is deadlocking similarly to #3625, possibly related to ASM (see below).
To Reproduce
Random (possibly time-based; slow exhaustion of resources?)
Expected behavior
For our application to not hard-deadlock 😂
Runtime environment (please complete the following information):
- NuGet package + App Service extension (Azure)
- 2.22.0; same issue in 2.21.0
- Windows, running on Azure App Service
- .NET Framework 4.7.2
Additional context
Since we started using Datadog, we found that our instrumented app would randomly stop responding to requests, with response times of 5+ minutes, and would require a hard restart. We enabled ASM as soon as we got APM running for our app.
Initially thought to be related to #3625, so the following steps were taken:
- Set DD_PROFILING_ENABLED = 0: the issue would still occur randomly.
- Suspected a version mismatch between the extension (2.22.0) and the NuGet package (2.21.0), but upgrading to 2.22.0 to match the extension didn't resolve the issue.
- Now removed DD_APPSEC_ENABLED = 1 (see the verification sketch after this list): so far, no hard lock (24h+).
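For reference, a minimal sketch (illustrative, not part of the tracer) to confirm which of these toggles the worker process actually sees; on App Service, application settings are surfaced to the process as environment variables:

```csharp
// Illustrative check: print the Datadog toggles as the process sees them.
// Setting names are the ones referenced above; a missing setting reads as null.
using System;

class DatadogSettingsCheck
{
    static void Main()
    {
        foreach (var name in new[] { "DD_PROFILING_ENABLED", "DD_APPSEC_ENABLED" })
        {
            var value = Environment.GetEnvironmentVariable(name);
            Console.WriteLine($"{name} = {value ?? "<not set>"}");
        }
    }
}
```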
Dump file from an instance where the hang was left uncaught / not manually restarted:
```
========================================================
Dump Analysis for <redacted>
========================================================
Below is the list of all the threads active in the process
{
  "CallStack": [
    "ntdll!NtDelayExecution+0xc",
    "KERNELBASE!SleepEx+0x8a",
    "clr!EESleepEx+0x52",
    "clr!CExecutionEngine::ClrSleepEx+0xe",
    "clr!ClrSleepEx+0x1d",
    "clr!Thread::UserSleep+0xbb",
    "clr!CRWLock::StaticAcquireReaderLock+0x1ae",
    "clr!CRWLock::StaticAcquireReaderLockPublic+0x8f",
    "Datadog.Trace.AppSec.Concurrency.ReaderWriterLock.EnterReadLock()",
    "Datadog.Trace.AppSec.Waf.Context.GetContext(IntPtr, Datadog.Trace.AppSec.Waf.Waf, Datadog.Trace.AppSec.Concurrency.ReaderWriterLock, Datadog.Trace.AppSec.Waf.NativeBindings.WafLibraryInvoker)",
    "Datadog.Trace.AppSec.Waf.Waf.CreateContext(Datadog.Trace.AppSec.Concurrency.ReaderWriterLock)",
    "Datadog.Trace.AppSec.Coordinator.SecurityCoordinator.RunWaf(System.Collections.Generic.Dictionary`2<System.String,System.Object>)",
    "Datadog.Trace.AppSec.Coordinator.SecurityCoordinator.CheckAndBlock(System.Collections.Generic.Dictionary`2<System.String,System.Object>)",
    "Datadog.Trace.AspNet.TracingHttpModule.OnEndRequest(System.Object, System.EventArgs)",
    "System.Web.HttpApplication+SyncEventExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()",
    "System.Web.HttpApplication+<>c__DisplayClass285_0.<ExecuteStepImpl>b__0()",
    "System.Web.HttpApplication+StepInvoker.Invoke(System.Action)",
    "System.Web.HttpApplication+StepInvoker+<>c__DisplayClass4_0.<Invoke>b__0()",
    "System.Web.HttpApplication+<>c__DisplayClass284_0.<OnExecuteRequestStep>b__0(System.Action)",
    "System.Web.HttpApplication+StepInvoker.Invoke(System.Action)",
    "System.Web.HttpApplication.ExecuteStepImpl(IExecutionStep)",
    "System.Web.HttpApplication.ExecuteStep(IExecutionStep, Boolean ByRef)",
    "System.Web.HttpApplication+PipelineStepManager.ResumeSteps(System.Exception)",
    "System.Web.HttpApplication.BeginProcessRequestNotification(System.Web.HttpContext, System.AsyncCallback)",
    "System.Web.HttpRuntime.ProcessRequestNotificationPrivate(System.Web.Hosting.IIS7WorkerRequest, System.Web.HttpContext)",
    "System.Web.Hosting.PipelineRuntime.ProcessRequestNotificationHelper(IntPtr, IntPtr, IntPtr, Int32)",
    "System.Web.Hosting.PipelineRuntime.ProcessRequestNotification(IntPtr, IntPtr, IntPtr, Int32)",
    "webengine4!W3_MGD_HANDLER::ProcessNotification+0x62",
    "webengine4!W3_MGD_HANDLER::DoWork+0x32a",
    "webengine4!RequestDoWork+0x3a7",
    "webengine4!CMgdEngHttpModule::OnEndRequest+0x18",
    "iiscore!NOTIFICATION_CONTEXT::RequestDoWork+0x2e3",
    "iiscore!NOTIFICATION_CONTEXT::CallModulesInternal+0x4af",
    "iiscore!NOTIFICATION_CONTEXT::CallModules+0x2b",
    "iiscore!NOTIFICATION_MAIN::DoWork+0x105",
    "iiscore!W3_CONTEXT_BASE::ContinueNotificationLoop+0x32",
    "iiscore!W3_CONTEXT_BASE::IndicateCompletion+0xa0",
    "webengine4!W3_MGD_HANDLER::IndicateCompletion+0x45",
    "webengine4!MgdIndicateCompletion+0x22",
    "DomainBoundILStubClass.IL_STUB_PInvoke(IntPtr, System.Web.RequestNotificationStatus ByRef)",
    "System.Web.Hosting.PipelineRuntime.ProcessRequestNotificationHelper(IntPtr, IntPtr, IntPtr, Int32)",
    "System.Web.Hosting.PipelineRuntime.ProcessRequestNotification(IntPtr, IntPtr, IntPtr, Int32)",
    "clr!UM2MThunk_Wrapper+0x76",
    "clr!Thread::DoADCallBack+0xbc",
    "clr!UM2MDoADCallBack+0x92",
    "webengine4!W3_MGD_HANDLER::ProcessNotification+0x62",
    "webengine4!ProcessNotificationCallback+0x33",
    "picohelper!DllMain+0x6c26",
    "clr!UnManagedPerAppDomainTPCount::DispatchWorkItem+0x1a4",
    "clr!ThreadpoolMgr::ExecuteWorkRequest+0x4f",
    "clr!ThreadpoolMgr::WorkerThreadStart+0x36c",
    "clr!Thread::intermediateThreadProc+0x58",
    "kernel32!BaseThreadInitThunk+0x24",
    "ntdll!__RtlUserThreadStart+0x2f",
    "ntdll!_RtlUserThreadStart+0x1b"
  ],
  "Count": 461
}
```
Issue Analytics
- Created 7 months ago
- Comments: 7 (4 by maintainers)
Top GitHub Comments
I’ll do it on Monday; I don’t really want the thought of getting an email or Slack alert sitting in the back of my mind during the weekend.
I’ll keep you posted though, promise 🤞
That’s great, thanks for the follow-up @Recio!