Paged Queries in Parallel
If you do multiple queries against the same client (or client connection pool) with paged: true, the two searches seem to corrupt one another, resulting in:

ProtocolError: paged results cookie is invalid

As long as you limit yourself to one search at a time, you're OK.
Am I doing something wrong? Is this expected behaviour? I need to know whether I have to abandon the persistent connection pool and build/tear down a connection for every search.
var _LDAPSearch = function (opts) {
  return new Promise(function (resolve, reject) {
    var client = opts.client;
    var base = opts.base;
    var options = opts.options;
    return client.search(base, options, function (err, res) {
      if (err) return reject(err);
      var entries = [];
      res.on('searchEntry', function (entry) {
        return entries.push(entry);
      });
      res.on('error', function (err) {
        if (err) sails.log.warn(err);
      });
      res.on('end', function (result) {
        if (result.errorMessage) return reject(result.errorMessage);
        return resolve(entries);
      });
    });
  });
};
Issue Analytics
- State:
- Created 8 years ago
- Comments: 6 (2 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I suspect that the LDAP server you’re communicating with does not support concurrent paged requests for the same search criteria on the same connection. Paged searches in LDAP function by passing an opaque cookie back and forth between the server and the client. This cookie identifies the specific search to the server, allowing it to continue serving results where it left off in the search. Since you have two searches for the exact same thing going on simultaneously, it could be that the server believes the cookie(s) you’re passing back are invalid.
You will likely need to ensure, via your own custom logic, that two identical paged searches do not occur on the same client connection simultaneously.
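One way to implement that custom logic, short of tearing down the pool, is to serialize searches per connection: chain each new search onto a tail promise so a paged search only starts after the previous one has fully drained. This is a minimal sketch, not ldapjs API; enqueueSearch and the fake task below are illustrative names, and the task would be your actual ldapjs search call.

```javascript
// Minimal sketch: serialize paged searches on a single connection by keeping
// one "tail" promise per client and chaining each new search onto it.
const queues = new Map(); // client -> tail promise

function enqueueSearch(client, task) {
  const tail = queues.get(client) || Promise.resolve();
  // Swallow the previous search's error so one failure does not poison
  // every search queued after it, then run the new task.
  const next = tail.catch(() => {}).then(task);
  queues.set(client, next);
  return next;
}

// Demo with fake async "searches": even when issued "in parallel",
// they execute one at a time in submission order.
const fakeClient = {};
const order = [];
const search = (name) => () =>
  new Promise((resolve) => setTimeout(() => { order.push(name); resolve(name); }, 10));

Promise.all([
  enqueueSearch(fakeClient, search('a')),
  enqueueSearch(fakeClient, search('b')),
]).then(() => console.log(order.join(','))); // prints "a,b"
```

The same idea generalizes to a pool: key the queue by whichever connection the search will use, so distinct connections still run concurrently while each single connection stays strictly sequential.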
Alright, good to know. I’ll abandon the persistent connection pool and just build/teardown a client for every query.