Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging 3rd party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

API google.pubsub.v1.Publisher exceeded 5000 milliseconds when running on Cloud Run

See original GitHub issue

When the image is run locally on Docker, the application successfully publishes messages to Cloud Pub/Sub.

When the same image is deployed to Cloud Run, it won’t publish a single message. All attempts fail with the error:

GoogleError: Total timeout of API google.pubsub.v1.Publisher exceeded 5000 milliseconds before any response was received.
    at repeat (/usr/root/node_modules/google-gax/build/src/normalCalls/retries.js:66:31)
    at Timeout._onTimeout (/usr/root/node_modules/google-gax/build/src/normalCalls/retries.js:101:25)
    at listOnTimeout (internal/timers.js:557:17)
    at processTimers (internal/timers.js:500:7)
code 4

Sample Code:

import { mongoose } from '@shared/connections'
import MongoDbModel from '@shared/models'
import { PubSub } from '@google-cloud/pubsub'

import {
  REMINDER_COPY_SMS, // '' by default
  REMINDER_MAX_DAYS_SINCE_LAST_MSG, // 0 by default
  REMINDER_MAX_DAYS_SINCE_LAST_READING, // 0 by default
  REMINDER_MESSAGE_TEMPLATE, // 'reading-reminder' by default
  TOPICS_MESSAGE_DISPATCH,
  KEYFILE_PATH,
} from '../constants.js'

const pubsubClient = new PubSub({
  projectId: process.env.GOOGLE_CLOUD_PROJECT,
  keyFilename: KEYFILE_PATH,
})

const msMaxSinceLastReading = REMINDER_MAX_DAYS_SINCE_LAST_READING * 24 * 60 * 60 * 1000
const msMaxSinceLastMsg = REMINDER_MAX_DAYS_SINCE_LAST_MSG * 24 * 60 * 60 * 1000

const { model: Patient } = new MongoDbModel(mongoose, 'Patient')

const findInactivePatients = () => Patient.find({
  status: 'ACTIVE',
  $and: [{
    $or: [{
      last_reading_at: { $exists: false },
    }, {
      last_reading_at: { $lt: new Date(Date.now() - msMaxSinceLastReading) },
    }],
  }, {
    $or: [{
      last_reading_reminder_at: { $exists: false },
    }, {
      last_reading_reminder_at: { $lt: new Date(Date.now() - msMaxSinceLastMsg) },
    }],
  }, {
    // Test with known patients only
    $or: [{
      first_name: { $regex: /Bruno/i }, last_name: { $regex: /Soares/i },
    }],
  }],
}, {
  first_name: 1,
  last_name: 1,
  gender: 1,
  last_reading_at: 1,
  phones: 1,
}, {
  lean: true,
})

const updateLastReadingReminderAt = (patient) => Patient.findOneAndUpdate({
  _id: patient._id,
}, {
  last_reading_reminder_at: Date.now(),
})

const createMessagePayload = (patient) => ({
  body: REMINDER_COPY_SMS,
  channels: [{
    name: 'sms',
    contacts: patient.phones.map((phone) => phone.E164),
    specifications: {
      template: REMINDER_MESSAGE_TEMPLATE,
    },
  }],
})

const dispatchMessage = async (patient) => {
  if (!patient.phones || !patient.phones.length) {
    return
  }
  const payload = createMessagePayload(patient)
  console.info('Worker reminder-report-vitals', 'event payload', JSON.stringify(payload))
  try {
    await pubsubClient.topic(TOPICS_MESSAGE_DISPATCH).publishMessage({ json: payload })
    console.info('Worker reminder-report-vitals', 'event published to cloudPubSub')
    pubsubClient.close()
    await updateLastReadingReminderAt(patient)
    console.info('Worker reminder-report-vitals', 'last_reading_reminder_at updated for patient', patient._id)
  } catch (err) {
    console.error(err)
  }
}

const handler = async () => {
  console.info('Worker reminder-report-vitals', 'execution started')
  try {
    const inactiveList = await findInactivePatients()
    inactiveList.forEach(dispatchMessage)
  } catch (err) {
    console.error(err)
  }
}

export default handler

The code hangs after logging this line: console.info('Worker reminder-report-vitals', 'event payload', JSON.stringify(payload))

Sample Event Payload:

{"body":"This is {{ORG}}. We haven't been receiving your vitals. Reply \"start\" to report your vitals now. Reply \"stop\" at any time to opt-out from automated reminders.","channels":[{"name":"sms","contacts":["+15555555555"],"specifications":{"template":"reading-reminder"}}]}

Environment details

  • OS: Linux Alpine
  • Node.js version: 14.16.1
  • @google-cloud/pubsub version: 2.18.4

Steps to reproduce

  1. Create a Pub/Sub topic to receive messages and set its name as TOPICS_MESSAGE_DISPATCH in constants.js.
  2. Build a Node.js application image from the sample code. To reproduce only this issue, the database operations may be removed and the sample event payload used instead (see the minimal sketch after these steps).
  3. Run the container locally with Docker.
  4. Push the image to GCP Container Registry.
  5. Deploy a Cloud Run service using the image.
  6. See the timeout error in Cloud Logs.
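
For step 2, one possible minimal reproduction is sketched below. It drops the database logic, publishes a trimmed version of the sample event payload once per incoming HTTP request, and listens on $PORT as Cloud Run requires; the topic name is a placeholder and should be replaced with the topic created in step 1.

import http from 'http'
import { PubSub } from '@google-cloud/pubsub'

// Placeholder topic name; replace with the topic created in step 1.
const TOPICS_MESSAGE_DISPATCH = 'message-dispatch'

const pubsubClient = new PubSub({ projectId: process.env.GOOGLE_CLOUD_PROJECT })

// Trimmed version of the sample event payload shown above.
const payload = {
  body: 'Reading reminder',
  channels: [{
    name: 'sms',
    contacts: ['+15555555555'],
    specifications: { template: 'reading-reminder' },
  }],
}

// Cloud Run expects the container to serve HTTP on $PORT, so the publish is
// triggered from a request handler and awaited before responding.
http.createServer(async (req, res) => {
  try {
    const messageId = await pubsubClient
      .topic(TOPICS_MESSAGE_DISPATCH)
      .publishMessage({ json: payload })
    res.writeHead(200).end(`published ${messageId}`)
  } catch (err) {
    console.error(err)
    res.writeHead(500).end(String(err))
  }
}).listen(process.env.PORT || 8080)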

Issue Analytics

  • State: open
  • Created 2 years ago
  • Reactions: 14
  • Comments: 48 (1 by maintainers)

Top GitHub Comments

4 reactions
gorziza commented, Dec 23, 2021

@apettiigrew Remove the ^… use the exact version only: "@google-cloud/pubsub": "2.17.0"
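
In package.json terms, pinning to that exact release (the version quoted in the comment) would mean dropping the caret from the dependency entry, roughly:

{
  "dependencies": {
    "@google-cloud/pubsub": "2.17.0"
  }
}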

1 reaction
linde12 commented, Jun 20, 2022

I have tried both pubsub 2.17 and gax 2.28.1, but it doesn’t seem to work for this simple use case:

const { PubSub } = require('@google-cloud/pubsub');
// The client construction is not shown in the original snippet; this is the standard form.
const pubsub = new PubSub();

exports.main = async (message, context) => {
  const data = {
    event: "AccountCreated"
  };
  const topicName = 'some valid topic name'
  const pubMessage = Buffer.from(JSON.stringify(data));
  const topic = pubsub.topic(topicName);

  topic.publish(pubMessage);
};

I keep getting the same error when the await on publishMessage hangs for a bit and then crashes:

GoogleError: Total timeout of API google.pubsub.v1.Publisher exceeded 60000 milliseconds before any response was received.
    at repeat (/Users/user/path/app/server/node_modules/@google-cloud/pubsub/node_modules/google-gax/build/src/normalCalls/retries.js:66:31)
    at Timeout._onTimeout (/Users/user/path/app/server/node_modules/@google-cloud/pubsub/node_modules/google-gax/build/src/normalCalls/retries.js:101:25)
    at listOnTimeout (node:internal/timers:564:17)
    at process.processTimers (node:internal/timers:507:7) {
  code: 4
}

Are you calling it concurrently? I had a similar issue to this post when running on GCP, and my issue was that I was sending lots of requests concurrently without awaiting the promises, causing my service to scale down. Instead, you can do multiple publishes in a single request and await the responses (if doing scale-to-zero).
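
Applied to the sample code from the issue above, that suggestion might look like the sketch below. It reuses findInactivePatients and dispatchMessage as defined earlier and only changes the handler so the publish promises are awaited before it returns; this is an illustration of the comment's idea, not a confirmed fix.

const handler = async () => {
  console.info('Worker reminder-report-vitals', 'execution started')
  try {
    const inactiveList = await findInactivePatients()
    // map + Promise.all instead of forEach: forEach discards the returned
    // promises, so the worker could finish (and be scaled down) while the
    // Pub/Sub publishes are still in flight.
    await Promise.all(inactiveList.map(dispatchMessage))
    console.info('Worker reminder-report-vitals', 'all dispatches settled')
  } catch (err) {
    console.error(err)
  }
}

export default handler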

Read more comments on GitHub >

Top Results From Across the Web

Troubleshooting | Cloud Pub/Sub Documentation
Learn about troubleshooting steps that you might find helpful if you run into problems using Pub/Sub. Cannot create a subscription.
Read more >
Cloud Run PubSub high latency - node.js - Stack Overflow
pubsub.v1.Publisher exceeded 60000 milliseconds before any response was received. In this case a message is not sent at all or is highly delayed....
Read more >
Google Cloud Pub/Sub Reliability User Guide: Part 1 Publishing
Any unavailability of the publish API can create risk of data loss. So, as an application designer, you must strike a balance between...
Read more >
GCP pipeline: pub/sub-lookup-storage (part 2/2) | Syntio
Google Cloud, in addition to Cloud functions, offers Cloud Run as serverless option. ... And since GCP charges by execution time in milliseconds, ......
Read more >
