Should we add some logic to retry 502 Server errors of the GitHub API?
I think one of the first occurrences of this error was on October 20th: https://github.com/prisma/prisma/actions/runs/3288801017/jobs/5419574155
And since then it happens for all Pull Requests 😢
The API call that errors is https://api.github.com/repos/prisma/prisma/actions/workflows/2176058/runs?per_page=100
So this is not an issue with this action itself but with the GitHub API (I just reported it via https://support.github.com/contact/bug-report; the private ticket is https://support.github.com/ticket/personal/0/1845305).
Examples:
- https://github.com/prisma/prisma/actions/runs/3288801017/jobs/5419574155
- https://github.com/prisma/prisma/actions/runs/3311618156/jobs/5467280417
- https://github.com/prisma/prisma/actions/runs/3300486832/jobs/5445032786
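To make the question concrete: retrying here would mean catching transient 5xx responses and re-issuing the request with a backoff delay. The sketch below is hypothetical (the `HttpError`, `withRetry`, and `flakyRequest` names are mine, not the action's code) and only illustrates the idea:

```javascript
// Hypothetical sketch: retry a request function on transient server errors
// (502/503/504) with exponential backoff. None of these names come from
// skip-duplicate-actions itself.
class HttpError extends Error {
  constructor(status) {
    super("Server Error");
    this.status = status;
  }
}

async function withRetry(requestFn, { retries = 3, baseDelayMs = 1000 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await requestFn();
    } catch (err) {
      // Only retry transient server errors, and stop once the
      // retry budget is exhausted; 4xx errors are rethrown immediately.
      const transient =
        err instanceof HttpError && [502, 503, 504].includes(err.status);
      if (!transient || attempt >= retries) throw err;
      const delay = baseDelayMs * 2 ** attempt; // 1s, 2s, 4s, ...
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}

// Example: a fake API call that fails twice with 502, then succeeds.
let calls = 0;
const flakyRequest = async () => {
  calls++;
  if (calls < 3) throw new HttpError(502);
  return { status: 200, data: { workflow_runs: [] } };
};

withRetry(flakyRequest, { retries: 3, baseDelayMs: 10 }).then((res) => {
  console.log(res.status, calls); // succeeds on the third attempt
});
```

A fixed small retry budget matters here: the failures above lasted for days, so retries would not have fixed those runs, but they would absorb the occasional one-off 502.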
https://github.com/prisma/prisma/actions/runs/3311727554/jobs/5467525141#step:2:15, copy-pasted below:
/home/runner/work/_actions/fkirc/skip-duplicate-actions/v5/dist/index.js:4784
const error = new requestError.RequestError(toErrorMessage(data), status, {
^
RequestError [HttpError]: Server Error
at /home/runner/work/_actions/fkirc/skip-duplicate-actions/v5/dist/index.js:4784:21
at processTicksAndRejections (node:internal/process/task_queues:96:5) {
status: 502,
response: {
url: 'https://api.github.com/repos/prisma/prisma/actions/workflows/2176058/runs?per_page=100',
status: 502,
headers: {
connection: 'close',
'content-length': '32',
'content-type': 'application/json',
date: 'Mon, 24 Oct 2022 09:53:59 GMT',
etag: '"63565dfc-20"',
server: 'GitHub.com',
vary: 'Accept-Encoding, Accept, X-Requested-With',
'x-github-request-id': '0682:4473:81E091B:10A5941C:635660AC'
},
data: { message: 'Server Error' }
},
request: {
method: 'GET',
url: 'https://api.github.com/repos/prisma/prisma/actions/workflows/2176058/runs?per_page=100',
headers: {
accept: 'application/vnd.github.v3+json',
'user-agent': 'octokit-core.js/3.6.0 Node.js/16.13.0 (linux; x64)',
authorization: 'token [REDACTED]'
},
request: {
agent: Agent {
_events: [Object: null prototype] {
free: [Function (anonymous)],
newListener: [Function: maybeEnableKeylog]
},
_eventsCount: 2,
_maxListeners: undefined,
defaultPort: 443,
protocol: 'https:',
options: [Object: null prototype] { path: null },
requests: [Object: null prototype] {},
sockets: [Object: null prototype] {
'api.github.com:443:::::::::::::::::::::': [ [TLSSocket] ]
},
freeSockets: [Object: null prototype] {},
keepAliveMsecs: 1000,
keepAlive: false,
maxSockets: Infinity,
maxFreeSockets: 256,
scheduling: 'lifo',
maxTotalSockets: Infinity,
totalSocketCount: 1,
maxCachedSessions: 100,
_sessionCache: {
map: {
'api.github.com:443:::::::::::::::::::::': [Buffer [Uint8Array]]
},
list: [ 'api.github.com:443:::::::::::::::::::::' ]
},
[Symbol(kCapture)]: false
},
hook: [Function: bound bound register]
}
}
}
Top GitHub Comments
I can confirm it’s working for us again as well. Thanks for looking into this.
Thank you very much for your feedback! Great to hear that it is working again! 🎉 Nevertheless, I've still implemented simple request retrying (using plugin-retry.js) to increase reliability in case such problems happen again in the future.
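For reference, wiring up `@octokit/plugin-retry` (the npm package behind the plugin-retry.js mentioned above) typically looks roughly like the following. This is a sketch of the published Octokit plugin API, not the action's actual code, and it requires the `@octokit/core` and `@octokit/plugin-retry` packages to be installed:

```javascript
// Client-setup sketch only; depends on npm packages, not runnable standalone.
const { Octokit } = require("@octokit/core");
const { retry } = require("@octokit/plugin-retry");

const RetryingOctokit = Octokit.plugin(retry);

// By default the plugin retries failed requests (such as the 502s above)
// a few times with increasing delays; most 4xx client errors are excluded.
const octokit = new RetryingOctokit({
  auth: process.env.GITHUB_TOKEN,
});

// The failing call from the stack trace, now retried transparently:
// await octokit.request(
//   "GET /repos/{owner}/{repo}/actions/workflows/{workflow_id}/runs",
//   { owner: "prisma", repo: "prisma", workflow_id: 2176058, per_page: 100 }
// );
```

The appeal of the plugin over a hand-rolled loop is that retry behavior applies uniformly to every request the client makes.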