
Possible bug in 3.31.2: Cannot access a closed Stream. The CosmosClient is still active and NOT disposed of.

See original GitHub issue

We are continuously addressing and improving the SDK. If possible, make sure the problem persists in the latest SDK version.

Describe the bug An inner exception occurred when a 429 (TooManyRequests) exception was thrown, resulting in the SDK's automatic retries not working.

To Reproduce This happened when we issued too many requests for our given RU/s level.

Expected behavior The closed Stream exception should not have happened. Retries within the SDK should have proceeded under normal conditions.

Actual behavior Exception logged below.

Environment summary SDK Version: 3.31.2 OS Version (e.g. Windows, Linux, MacOSX): Windows 10.0.14393, .NET 6.0.11

Additional context

Cannot access a closed Stream. The CosmosClient is still active and NOT disposed of. CosmosClient Endpoint: https://mycosmosendpoint.documents.azure.com/; Created at: 2022-12-21T15:51:14.2727070Z; UserAgent: cosmos-netstandard-sdk/3.31.2|1|X86|Microsoft Windows 10.0.14393|.NET 6.0.11|N|My Application Name;

CosmosDiagnostics: {"Summary":{"GatewayCalls":{"(429, 3200)":1}},"name":"UpsertItemStreamAsync","id":"26e91bb8-51ac-469b-9a6c-f46b732b2209","start time":"04:44:58:465","duration in milliseconds":643.5969,"data":{"Client Configuration":{"Client Created Time Utc":"2022-12-21T15:51:14.2727070Z","MachineId":"hashedMachineName:66fc5ba2-de0b-2aa4-a267-d42e80e50219","NumberOfClientsCreated":1,"NumberOfActiveClients":1,"ConnectionMode":"Gateway","User Agent":"cosmos-netstandard-sdk/3.31.2|1|X86|Microsoft Windows 10.0.14393|.NET 6.0.11|N|My Application Name","ConnectionConfig":{"gw":"(cps:50, urto:10, p:False, httpf: False)","rntbd":"(cto: 5, icto: -1, mrpc: 30, mcpe: 65535, erd: True, pr: ReuseUnicastPort)","other":"(ed:False, be:False)"},"ConsistencyConfig":"(consistency: Eventual, prgns:[], apprgn: )","ProcessorCount":2}},"children":[{"name":"Microsoft.Azure.Cosmos.Handlers.RequestInvokerHandler","id":"842ef0c2-1e11-4d66-8582-98e3ea873652","start time":"04:44:58:465","duration in milliseconds":643.253,"children":[{"name":"Microsoft.Azure.Cosmos.Handlers.DiagnosticsHandler","id":"ad932027-92d5-4d1b-ab44-c9e36fd68b12","start time":"04:44:58:465","duration in milliseconds":643.086,"children":[{"name":"Microsoft.Azure.Cosmos.Handlers.RetryHandler","id":"0e8421dc-ce20-41e1-9b9c-0539d32e9c8b","start time":"04:44:58:465","duration in milliseconds":642.9469,"children":[{"name":"Microsoft.Azure.Cosmos.Handlers.RouterHandler","id":"5aee64ec-39b2-4b60-8e7b-5c0792ac8741","start time":"04:44:58:465","duration in milliseconds":26.161,"children":[{"name":"Microsoft.Azure.Cosmos.Handlers.TransportHandler","id":"1c5b9961-b324-45e4-863e-13227e877511","start time":"04:44:58:465","duration in milliseconds":26.1588,"children":[{"name":"Microsoft.Azure.Cosmos.GatewayStoreModel Transport Request","id":"bd84dd9f-6567-408d-80db-af94db3b1a56","start time":"04:44:58:465","duration in milliseconds":26.1377,"data":{"Client Side Request Stats":{"Id":"AggregatedClientSideRequestStatistics","ContactedReplicas":[],"RegionsContacted":[],"FailedReplicas":[],"AddressResolutionStatistics":[],"StoreResponseStatistics":[],"HttpResponseStats":[{"StartTimeUTC":"2022-12-21T16:44:58.4658029Z","DurationInMs":26.0607,"RequestUri":"https://mycosmosendpoint.documents.azure.com/dbs/mydb/colls/mycollection/docs","ResourceType":"Document","HttpMethod":"POST","ActivityId":"ff550ac3-3bda-47fd-9cb5-2d5e489629de","StatusCode":"TooManyRequests","ReasonPhrase":"Too Many Requests"}]},"PointOperationStatisticsTraceDatum":{"Id":"PointOperationStatistics","ActivityId":"ff550ac3-3bda-47fd-9cb5-2d5e489629de","ResponseTimeUtc":"2022-12-21T16:44:58.4918763Z","StatusCode":429,"SubStatusCode":3200,"RequestCharge":0.38,"RequestUri":"dbs/mydb/colls/mycollection","ErrorMessage":null,"RequestSessionToken":null,"ResponseSessionToken":null,"BELatencyInMs":null}}}]}]},{"name":"Microsoft.Azure.Cosmos.Handlers.RouterHandler","id":"e0768c54-0214-468f-a4ba-5845ac839afc","start time":"04:44:59:107","duration in milliseconds":0.7351,"children":[{"name":"Microsoft.Azure.Cosmos.Handlers.TransportHandler","id":"1935059e-656f-46d4-a1ec-e250314e5ff1","start time":"04:44:59:107","duration in milliseconds":0.6744,"children":[{"name":"Microsoft.Azure.Cosmos.GatewayStoreModel Transport Request","id":"e369a41e-c079-40a7-803a-6fb826901d43","start time":"04:44:59:107","duration in milliseconds":0.4415,"data":{"Client Side Request Stats":{"Id":"AggregatedClientSideRequestStatistics","ContactedReplicas":[],"RegionsContacted":[],"FailedReplicas":[],"AddressResolutionStatistics":[],"StoreResponseStatistics":[]}}}]}]}]}]}]}]}

StackTrace:
   at System.IO.MemoryStream.WriteTo(Stream stream)
   at Microsoft.Azure.Cosmos.GatewayStoreClient.PrepareRequestMessageAsync(DocumentServiceRequest request, Uri physicalAddress)
   at Microsoft.Azure.Cosmos.CosmosHttpClientCore.SendHttpHelperAsync(Func`1 createRequestMessageAsync, ResourceType resourceType, HttpTimeoutPolicy timeoutPolicy, IClientSideRequestStatistics clientSideRequestStatistics, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.GatewayStoreClient.InvokeAsync(DocumentServiceRequest request, ResourceType resourceType, Uri physicalAddress, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.GatewayStoreModel.ProcessMessageAsync(DocumentServiceRequest request, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.Handlers.TransportHandler.ProcessMessageAsync(RequestMessage request, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.Handlers.TransportHandler.SendAsync(RequestMessage request, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.Handlers.RouterHandler.SendAsync(RequestMessage request, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.RequestHandler.SendAsync(RequestMessage request, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.Handlers.AbstractRetryHandler.ExecuteHttpRequestAsync(Func`1 callbackMethod, Func`3 callShouldRetry, Func`3 callShouldRetryException, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.Handlers.AbstractRetryHandler.SendAsync(RequestMessage request, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.RequestHandler.SendAsync(RequestMessage request, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.Handlers.DiagnosticsHandler.SendAsync(RequestMessage request, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.RequestHandler.SendAsync(RequestMessage request, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.Handlers.RequestInvokerHandler.SendAsync(RequestMessage request, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.Handlers.RequestInvokerHandler.SendAsync(String resourceUriString, ResourceType resourceType, OperationType operationType, RequestOptions requestOptions, ContainerInternal cosmosContainerCore, FeedRange feedRange, Stream streamPayload, Action`1 requestEnricher, ITrace trace, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.ContainerCore.ProcessItemStreamAsync(Nullable`1 partitionKey, String itemId, Stream streamPayload, OperationType operationType, ItemRequestOptions requestOptions, ITrace trace, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.ContainerCore.UpsertItemStreamAsync(Stream streamPayload, PartitionKey partitionKey, ITrace trace, ItemRequestOptions requestOptions, CancellationToken cancellationToken)
   at Microsoft.Azure.Cosmos.ClientContextCore.RunWithDiagnosticsHelperAsync[TResult](ITrace trace, Func`2 task, Func`2 openTelemetry, String operationName, RequestOptions requestOptions)
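The top frame, MemoryStream.WriteTo, suggests the request payload stream had already been disposed when the retry pipeline tried to re-send it. The same failure mode is easy to reproduce in any language; here is a Python analogy with `io.BytesIO` standing in for MemoryStream (illustrative only, not the SDK's internals):

```python
import io

# First attempt consumes the payload; some layer then disposes it.
payload = io.BytesIO(b'{"id": "1"}')
payload.read()
payload.close()

try:
    payload.read()  # the retry: analogous to ObjectDisposedException in .NET
except ValueError as e:
    error = str(e)  # "I/O operation on closed file."

# The safe retry pattern: keep the stream open and rewind before each attempt.
payload2 = io.BytesIO(b'{"id": "1"}')
for attempt in range(2):
    payload2.seek(0)
    body = payload2.read()
```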

Issue Analytics

  • State: closed
  • Created 9 months ago
  • Comments: 9 (5 by maintainers)

Top GitHub Comments

1 reaction
TimPosey2 commented, Jan 6, 2023

We’ve since had tens of millions of requests go through fine, with no other issues, so you can close the issue for now. I was just reporting it in case you were seeing similar reports from others.

On Fri, Jan 6, 2023, 3:21 PM Matias Quaranta @.***> wrote:

I tried to repro throwing a large number of operations through Gateway mode but I always get back a CosmosException with Status 429. Can’t seem to repro a Stream being disposed/closed, even when there are retries.

Microsoft.Azure.Cosmos.CosmosException : Response status code does not indicate success: TooManyRequests (429); Substatus: 3200; ActivityId: 7fac68a4-39fc-4027-8efa-ebfc570d6460; Reason: ( code : 429 message : Message: {"Errors":["Request rate is large. More Request Units may be needed, so no changes were made. Please retry this request later. Learn more: http://aka.ms/cosmosdb-error-429"]} ActivityId: 7fac68a4-39fc-4027-8efa-ebfc570d6460, Request URI: /apps/DocDbApp/services/DocDbServer9/partitions/a4cb4955-38c8-11e6-8106-8cdcd42c33be/replicas/1p/, RequestStats: RequestStartTime: 2023-01-06T21:11:52.1726669Z, RequestEndTime: 2023-01-06T21:11:52.1781731Z, … — Cosmos Diagnostics — {"Summary":{"GatewayCalls":{"(429, 3200)":10}},"name":"CreateItemAsync","start time":"09:11:44:821","duration in milliseconds":7359.1927,"data":{"Client Configuration":{"Client Created Time Utc":"2023-01-06T21:11:44.3678888Z","MachineId":"hashedMachineName:06dc52f2-4008-193c-0eaf-ba28e15ea9e5","NumberOfClientsCreated":1,"NumberOfActiveClients":1,"ConnectionMode":"Gateway","User Agent":"cosmos-netstandard-sdk/3.31.2|1|X64|Microsoft Windows 10.0.22621|.NET 6.0.12|N|F 00000010|","ConnectionConfig":{"gw":"(cps:50, urto:10, p:False, httpf: True)","rntbd":"(cto: 5, icto: -1, mrpc: 30, mcpe: 65535, erd: True, pr: ReuseUnicastPort)","other":"(ed:False, be:False)"},"ConsistencyConfig":"(consistency: NotSet, prgns:[], apprgn: )","ProcessorCount":20}},"children":…

Is your application maybe exiting or closing while these requests are executing? Are you awaiting all calls?

— Reply to this email directly, or view it on GitHub: https://github.com/Azure/azure-cosmos-dotnet-v3/issues/3628#issuecomment-1374145726
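Matias' question about awaiting all calls matters because a fire-and-forget request can still be in flight when the client (or the process) tears down, and that surfaces exactly as a disposed-object / closed-stream error. A Python asyncio analogy (the `Client` class here is hypothetical, not the Cosmos SDK):

```python
import asyncio

class Client:
    """Hypothetical client whose calls fail once it has been disposed."""
    def __init__(self):
        self.closed = False

    async def upsert(self, doc):
        await asyncio.sleep(0.01)  # simulated network round trip
        if self.closed:
            raise RuntimeError("Cannot access a closed client")
        return "ok"

    def dispose(self):
        self.closed = True

async def fire_and_forget():
    c = Client()
    task = asyncio.create_task(c.upsert({"id": 1}))  # not awaited yet
    c.dispose()  # client torn down while the request is still in flight
    return await asyncio.gather(task, return_exceptions=True)

async def awaited():
    c = Client()
    results = await asyncio.gather(c.upsert({"id": 1}))  # awaited first
    c.dispose()
    return results
```

The un-awaited variant observes the disposed client mid-request; the awaited one completes cleanly before disposal.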

1 reaction
TimPosey2 commented, Jan 6, 2023

Concurrent write operations

On Fri, Jan 6, 2023 at 1:03 PM Matias Quaranta @.***> wrote:

using bulk inserts

The diagnostics don’t show Bulk Mode is on: "other":"(ed:False, be:False)"

https://github.com/Azure/azure-cosmos-dotnet-v3/blob/d47bab8391111cddeae29da13451b176ca8f6006/Microsoft.Azure.Cosmos/src/Tracing/TraceData/OtherConnectionConfig.cs#L19

Are you sure you were using Bulk Mode? Or just doing concurrent write operations?

— Reply to this email directly, or view it on GitHub: https://github.com/Azure/azure-cosmos-dotnet-v3/issues/3628#issuecomment-1374016036
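Whether bulk mode is on or not, uncapped concurrent writes are what drive sustained 429s. A common client-side mitigation is to bound the number of in-flight operations, sketched here in Python with a semaphore (`upsert` is a hypothetical stand-in for the real write call):

```python
import asyncio

async def upsert(doc):
    await asyncio.sleep(0)  # stands in for a network call
    return doc["id"]

async def upsert_all(docs, max_in_flight=10):
    """Run upserts concurrently, but never more than max_in_flight at once."""
    sem = asyncio.Semaphore(max_in_flight)

    async def guarded(doc):
        async with sem:
            return await upsert(doc)

    # gather preserves input order in its results
    return await asyncio.gather(*(guarded(d) for d in docs))
```

Capping concurrency keeps the aggregate request rate near the provisioned RU/s instead of overshooting it and relying entirely on retries.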


Top Results From Across the Web

MemoryStream - Cannot access a closed Stream
The best way to fix this is: don't use using and don't dispose of the StreamReader and StreamWriter .
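The advice above is about ownership: in .NET, disposing a StreamWriter or StreamReader also disposes the stream it wraps. Python's `io.TextIOWrapper` behaves the same way, which makes it a convenient way to demonstrate both the pitfall and the `detach()` escape hatch:

```python
import io

# A text wrapper (like .NET's StreamWriter) closes its underlying stream:
buf = io.BytesIO()
writer = io.TextIOWrapper(buf, encoding="utf-8")
writer.write("hello")
writer.close()  # closes buf too
# buf.getvalue() would now raise ValueError ("I/O operation on closed file")

# To keep the inner stream usable, detach it instead of closing the wrapper:
buf2 = io.BytesIO()
writer2 = io.TextIOWrapper(buf2, encoding="utf-8")
writer2.write("hello")
data = writer2.detach().getvalue()  # flushes; buf2 stays open
```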
Cannot access a closed Stream when running this ...
ObjectDisposedException : Cannot access a closed Stream when running this statement MemoryStream stream = (MemoryStream)data.
Cannot access a closed stream - Microsoft Q&A
I am noticing lots of httpclient errors in my app. This one says: System.ObjectDisposedException: Cannot access a closed Stream. Copy.
Cannot access a closed Stream
Solution 2: You close it through closing of the xmlwr. You need different approach. All problem is your "Other codes". Instead of...
How to get Stream From LeadDocument
The "cannot access the closed stream" error typically occurs when you are trying to work on or access a MemoryStream that has already...
