
Lightrun Answers was designed to reduce the constant googling that comes with debugging 3rd party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.


"Too many open files" when trying to send many parallel requests

See original GitHub issue

Hi,

I’m getting “Too many open files” when trying to execute InventoryCustomBatch with parallel requests (up to 400).

The stack trace:

 at Google.Apis.Http.ConfigurableMessageHandler.<SendAsync>d__43.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
  at System.Net.Http.HttpClient.<FinishSendAsync>d__58.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
  at Google.Apis.Requests.ClientServiceRequest`1.<ExecuteUnparsedAsync>d__26.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
  at Google.Apis.Requests.ClientServiceRequest`1.<ExecuteAsync>d__23.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
  at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()

I’m running my code in AWS Lambda, which is a UNIX-like OS, and UNIX opens a file descriptor for every request, so the HTTP message needs to be disposed to close that descriptor.

I’m not sure about this, but I think that this code

https://github.com/google/google-api-dotnet-client/blob/master/Src/Support/Google.Apis.Core/Http/ConfigurableMessageHandler.cs#L395

does not dispose the message if cancellation is requested.

Can you guys help me?
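For illustration, the disposal pattern being asked about looks roughly like this. This is a hypothetical sketch (the method name `SendAndDisposeAsync` is made up for the example), not the library’s actual code:

```csharp
using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

static class HttpSketch
{
    // Hypothetical sketch of response disposal; not the actual
    // ConfigurableMessageHandler implementation.
    static async Task<string> SendAndDisposeAsync(
        HttpClient client, HttpRequestMessage request, CancellationToken ct)
    {
        // Wrapping the response in `using` ensures the socket-backed
        // content stream is released even if an exception (including
        // cancellation) is thrown before the response is fully read.
        using (HttpResponseMessage response = await client.SendAsync(request, ct))
        {
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
    }
}
```

If a response (or the request message, when it owns unmanaged content) is never disposed, the underlying socket stays open until the object is finalized, which on Linux counts against the per-process file descriptor limit.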

Issue Analytics

  • State: closed
  • Created 5 years ago
  • Comments: 11

Top GitHub Comments

1 reaction
chrisdunelm commented, Apr 16, 2018

Thanks for the extra details.

Each HttpRequestMessage is disposed of at the end of a using statement in ClientServiceRequest. It looks like all your requests are using the same client, so there’s only one HttpClient instance.

I suspect that you’re trying to use too high a level of concurrency; if you limit how many concurrent requests are active, this should solve the problem. Are you able to test this?
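One common way to apply the suggested limit is to gate the requests behind a SemaphoreSlim. This is a minimal sketch, assuming each request is wrapped as a `Func<Task>` (the `RunThrottledAsync` name and the delegate shape are invented for the example):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

static class ThrottleSketch
{
    // Run all requests, but allow at most `maxConcurrency` in flight
    // at any one time, keeping the open-socket count bounded.
    static async Task RunThrottledAsync(
        IEnumerable<Func<Task>> requests, int maxConcurrency)
    {
        using var gate = new SemaphoreSlim(maxConcurrency);
        var tasks = requests.Select(async send =>
        {
            await gate.WaitAsync();          // block when the cap is reached
            try { await send(); }
            finally { gate.Release(); }      // always free the slot
        }).ToList();                         // materialize so every task starts
        await Task.WhenAll(tasks);
    }
}
```

With a cap of, say, 20–50 in-flight requests instead of 400, each batch still completes in parallel but the process stays well under the default UNIX file descriptor limit.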

0 reactions
chrisdunelm commented, May 25, 2018

@marcosvcp Thanks for the update. When you say “eventually”, do you mean that it now takes significantly longer for the error to occur than it used to? Has anything changed about the run-time environment you’re using? Is this still running on AWS-Lambda? Please can you post another full exception and stack-trace, so we can confirm it’s the same?

Read more comments on GitHub >

Top Results From Across the Web

Too many open files when using requests package python
"Too many open files" is likely a reference to the fact that each Session and its single POST request hogs a TCP socket...

How to get rid of the "too many open files" error by tuning ...
When facing a Too many open files error, you must first analyze your application design to see if there's no bad design causing...

DNS error when sending >1024 parallel requests with ...
Usually it default to 1024 max file handles per process. That means that you are trying to open too many sockets, which is...

How to Fix the 'Too Many Open Files' Error in Linux
It means that a process has opened too many files (file descriptors) and cannot open new ones. On Linux, the “max open file...

Too Many Open Files in Bitbucket Server
This error indicates that the limit has been reached and Bitbucket Server is unable to open additional files to complete the on-going operations ......
