
DEADLINE_EXCEEDED on subscriber

See original GitHub issue

Environment details

  • OS: Node 12.8.2:alpine docker image
  • npm version:
  • @google-cloud/pubsub version: 2.11.0

Steps to reproduce

Every message received by the subscriber throws this error:

{"serviceName":"notification","ackIds":["id1", "id2",...],"code":4,"details":"Deadline Exceeded","metadata":{"_internal_repr":{},"flags":0},"stack":"Error: Failed to \"modifyAckDeadline\" for 3 message(s). Reason: 4 DEADLINE_EXCEEDED: Deadline Exceeded\n    at /app/node_modules/@google-cloud/pubsub/src/message-queues.ts:258:15","message":"Failed to \"modifyAckDeadline\" for 3 message(s). Reason: 4 DEADLINE_EXCEEDED: Deadline Exceeded","severity":"error"}

I have tried both new PubSub({ grpc }) with grpc version “^1.24.7” and new PubSub({}), and I get the same error in both cases. I have made sure that the IAM policies are correct.
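
For reference, a minimal sketch of the two client setups tried above (variable names are placeholders; the default client uses the pure-JS @grpc/grpc-js transport, while the first variant injects the legacy C-core grpc package):

    const { PubSub } = require('@google-cloud/pubsub');
    const grpc = require('grpc'); // legacy C-core gRPC, "^1.24.7" in this report

    // Variant 1: inject the C-core gRPC implementation into the client.
    const pubsubWithGrpc = new PubSub({ grpc });

    // Variant 2: default options, which use @grpc/grpc-js under the hood.
    const pubsubDefault = new PubSub({});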

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Reactions: 1
  • Comments: 19 (3 by maintainers)

Top GitHub Comments

1 reaction
kamalaboulhosn commented, Jul 7, 2021

It sounds to me like this is a case of the client getting overwhelmed and not being able to send acks and modacks back. Let me mention some things about the properties discussed so far:

maxStreams controls the number of streaming pull streams that are open to the client. It should only be set to a higher value if you believe the client itself can handle more messages but a single stream is not able to deliver enough. A single stream can deliver 10MB/s, as per the resource limits, and limiting yourself to a single stream does not limit you to receiving a single message at a time.
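
As a rough sketch (the subscription name is a placeholder, pubsub is assumed to be an existing PubSub client, and defaults may differ between client versions), maxStreams is set through the subscriber's streamingOptions:

    const subscription = pubsub.subscription('my-subscription', {
      streamingOptions: {
        // Number of concurrent StreamingPull connections kept open by this
        // client. Lower it if the subscriber is being overwhelmed; raise it
        // only if a single stream cannot deliver enough throughput.
        maxStreams: 2,
      },
    });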

It is possible that the subscriber is becoming overwhelmed with the amount of data it is receiving. If this is the case, then increasing maxStreams will likely make the problem worse. What you want to do is limit the number of messages your subscriber is handling at once. Reducing maxStreams is a more advanced way to do this; the best way is to change the flow control settings. With flow control settings reduced, fewer messages would be delivered to the client at the same time. This would help if overload from how the messages are being processed is what is causing the failed RPCs to the server.
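
For illustration only (the numbers below are placeholders rather than recommended values, and pubsub is again assumed to be an existing client), reduced flow control settings look roughly like this:

    const subscription = pubsub.subscription('my-subscription', {
      flowControl: {
        // Maximum number of unacked messages the client will hold at once.
        maxMessages: 50,
        // Maximum total bytes of outstanding messages.
        maxBytes: 10 * 1024 * 1024,
      },
    });

    subscription.on('message', (message) => {
      // Slow processing here is what backs up acks/modacks and can surface
      // as DEADLINE_EXCEEDED on the modifyAckDeadline calls.
      message.ack();
    });

    subscription.on('error', (err) => {
      console.error('Subscriber error:', err);
    });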

Many different things could contribute to an overloaded subscriber. It may not even be the subscriber itself that is overloaded; it could be running on an overloaded machine. Does the Docker container contain anything else that is running and could be using up CPU, RAM, or network resources? What about the machine it runs on? Also, depending on where and how you are running the subscriber, the type of VM may have limited throughput, so the requests are unable to be sent.

There are some Medium posts that may be of interest:

1 reaction
feywind commented, Jul 7, 2021

I was asking one of our service folks, and he said that the publisher and subscriber should ideally be completely decoupled. So disregard my publisher flow control comment, sorry.


Top Results From Across the Web

'504 Deadline Exceeded' response from 'SubscriberClient.pull ...
google-cloud-pubsub 1.0.0 returns 504 Deadline Exceeded. When I downgrade the version to 0.45.0, the error does not appear.

Google Cloud PubSub throws 504 Deadline Exceeded error ...
After being idle for 10 minutes and not receiving any messages, PubSub throws a 504 Deadline Exceeded error. The error always occurs after about...

Troubleshoot Cloud Spanner deadline exceeded errors
When accessing Spanner APIs, requests may fail due to DEADLINE_EXCEEDED errors. This error indicates that a response has not been received within the...

4 DEADLINE_EXCEEDED: Deadline exceeded when trying...
We are using Firestore for Datastore and trying to bulk load data from CSV into Datastore. So we are making use of this...

ContainerGCFailed - context deadline exceeded
Nodes running Vormetric secfs kernel module. (Subscriber-exclusive Red Hat knowledgebase article.)
