Cannot run jobs due to Mongo duplication error
See original GitHub issue

Hi! Great package! We use it at Lookback for running email jobs for different timezones.
However, this has magically stopped working. From the logs, I read:
SyncedCron: Not running "Run Daily Timezone Jobs" again.
from line 134, each time a new job is about to run.
The thing is, we're only running the app on one instance, and there can't be any pre-existing records, since the name and intendedAt properties form a unique index.
I even cleared the cronHistory collection, but after a couple of hours, it kept logging the duplication message.
Any idea what I’m doing wrong? I’ve debugged everything up until this.
Dump
db.cronHistory.find({name: 'Run Daily Timezone Jobs'}).sort({intendedAt: -1})
{ "_id" : "MR3zMKuyCtiYNHDNf", "finishedAt" : ISODate("2014-11-19T09:00:00.038Z"), "intendedAt" : ISODate("2014-11-19T09:00:00Z"), "name" : "Run Daily Timezone Jobs", "result" : null, "startedAt" : ISODate("2014-11-19T09:00:00.001Z") }
{ "_id" : "TeQDcCSukmJRujNRj", "finishedAt" : ISODate("2014-11-19T08:00:00.110Z"), "intendedAt" : ISODate("2014-11-19T08:00:00Z"), "name" : "Run Daily Timezone Jobs", "result" : null, "startedAt" : ISODate("2014-11-19T08:00:00.001Z") }
{ "_id" : "2MwJfZpiCbDs4Wgnp", "finishedAt" : ISODate("2014-11-19T07:00:00.232Z"), "intendedAt" : ISODate("2014-11-19T07:00:00Z"), "name" : "Run Daily Timezone Jobs", "result" : null, "startedAt" : ISODate("2014-11-19T07:00:00.002Z") }
{ "_id" : "t2YvJk5FMSpAEL2ha", "finishedAt" : ISODate("2014-11-19T06:00:01.580Z"), "intendedAt" : ISODate("2014-11-19T06:00:00Z"), "name" : "Run Daily Timezone Jobs", "result" : null, "startedAt" : ISODate("2014-11-19T06:00:00.002Z") }
{ "_id" : "tjz5mxZfnmCa5cw6Q", "finishedAt" : ISODate("2014-11-19T05:00:00.067Z"), "intendedAt" : ISODate("2014-11-19T05:00:00Z"), "name" : "Run Daily Timezone Jobs", "result" : null, "startedAt" : ISODate("2014-11-19T05:00:00.001Z") }
{ "_id" : "84jbCFdbjwYFdnanC", "finishedAt" : ISODate("2014-11-19T04:00:00.057Z"), "intendedAt" : ISODate("2014-11-19T04:00:00Z"), "name" : "Run Daily Timezone Jobs", "result" : null, "startedAt" : ISODate("2014-11-19T04:00:00Z") }
Issue Analytics
- State:
- Created 9 years ago
- Comments: 12 (6 by maintainers)
Top Results From Across the Web

E11000 duplicate key error collection - MongoDB
I have an index in collection “persons” which shall avoid inserting duplicate documents with the same person name.

Duplicate Error - MongoDB Developer Community Forums
Hi, I'm getting a duplicate error when creating a new record. ... Indexed fields like user_id can not be duplicated. The same will...

Handling "Duplicated key error" in bulk insert retry scenarios
I'm trying bulk insert a huge amount of documents with a retry mechanism in case there is connection disruption issue. The problem is,...

How to avoid "duplicate id" error during restoring collections ...
When restoring collections on destination database mongorestore is giving “duplicate id” error. Should we treat this as serious Error or ...

let MongoDB return errors instead of preventing them
The MongoDB NodeJS Driver Error classes has MongoServerError as one of its sub-classes. The "E11000 duplicate key error collection" is defined ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@HemalR @pagesrichie check out https://github.com/percolatestudio/meteor-synced-cron/issues/134
I know this is an old issue, but in case anyone else is experiencing it: I hit this after deleting the cronHistory collection in the database. The collection gets recreated automatically, but if the unique key index is not recreated along with it, this error will happen.
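For anyone in the scenario above, a sketch of checking and restoring the guard in the mongo shell, run against the app's database. This assumes the default cronHistory collection name and, to my understanding, the unique compound index on intendedAt and name that SyncedCron normally creates on startup (field names match the dump above):

```javascript
// List existing indexes; after dropping the collection, only _id_ remains
// until the app (or you) recreates the unique index.
db.cronHistory.getIndexes()

// Recreate the unique compound index so duplicate-key errors can again
// prevent a job from running twice for the same intendedAt slot.
db.cronHistory.createIndex({ intendedAt: 1, name: 1 }, { unique: true })
```

Restarting the Meteor app should also recreate the index, since the package ensures it at startup.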