DataLoader used by `suspend` function fails to complete
Library Version
1.4.2
Describe the bug
When I use a `BatchLoader` to resolve a property defined as a `suspend` function, the request completes maybe once or twice and then subsequently fails to complete. The `DataLoader` body itself is defined in the example branch linked under To Reproduce below. If I instead define the property as a function that returns a `CompletableFuture`, it returns successfully every time.
This may be expected behavior. If so, I'd like to update the example code.
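For context, here is a minimal sketch of the two resolver shapes being compared. It assumes a `Course` type with a `universityId` field and a data loader registered as `"universityLoader"`; apart from the `university`/`universityFuture` field names used in the queries below, these names are illustrative, not the branch's actual code.

```kotlin
import graphql.schema.DataFetchingEnvironment
import java.util.concurrent.CompletableFuture
import kotlinx.coroutines.future.await

data class University(val id: Long, val name: String)

class Course(val id: Long, private val universityId: Long) {

    // suspend variant: awaits the DataLoader's future inside a coroutine.
    // This is the shape that completes once or twice and then fails to complete.
    suspend fun university(environment: DataFetchingEnvironment): University =
        environment.getDataLoader<Long, University>("universityLoader")
            .load(universityId)
            .await()

    // CompletableFuture variant: returns the future directly so graphql-java
    // can track it and dispatch the batch. This is the shape that always succeeds.
    fun universityFuture(environment: DataFetchingEnvironment): CompletableFuture<University> =
        environment.getDataLoader<Long, University>("universityLoader")
            .load(universityId)
}
```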
To Reproduce
- Check out the following branch https://github.com/rharriso/graphql-kotlin/tree/demostrate-batch-issue
- Run the spark example (a sketch of the batch-loader wiring appears after these steps)
- Request the property resolved by the `suspend` function multiple times:
```graphql
query {
  searchCourses(params: { ids: [1,2,3] }) {
    university {
      id
      name
    }
  }
}
```
- See it complete once or twice and then fail afterwards.
- Query the property supplied by `CompletableFuture`:
```graphql
query {
  searchCourses(params: { ids: [1,2,3] }) {
    universityFuture {
      id
      name
    }
  }
}
```
- This query should always succeed.
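For orientation, the batch-loader wiring behind an example like this generally looks something like the sketch below. The `UniversityService` stub, the `University` type from the earlier sketch, and the `"universityLoader"` name are assumptions for illustration, not the spark example's actual code.

```kotlin
import graphql.ExecutionInput
import org.dataloader.BatchLoader
import org.dataloader.DataLoader
import org.dataloader.DataLoaderRegistry
import java.util.concurrent.CompletableFuture

// Stub data source standing in for the real service (assumption).
object UniversityService {
    fun findByIds(ids: List<Long>): List<University> =
        ids.map { University(it, "University $it") }
}

// Batch function: one call fetches every university requested in this query.
val universityBatchLoader = BatchLoader<Long, University> { ids ->
    CompletableFuture.supplyAsync { UniversityService.findByIds(ids) }
}

// A fresh registry per request; resolvers look the loader up by this name.
fun newRegistry(): DataLoaderRegistry =
    DataLoaderRegistry()
        .register("universityLoader", DataLoader.newDataLoader(universityBatchLoader))

// The registry is attached to the execution input so graphql-java can dispatch batches.
fun executionInput(query: String): ExecutionInput =
    ExecutionInput.newExecutionInput()
        .query(query)
        .dataLoaderRegistry(newRegistry())
        .build()
```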
Expected behavior
Not sure. If this is a bad usage of a batch loader, I'd like to know.
Issue Analytics
- State:
- Created 3 years ago
- Reactions: 1
- Comments: 6 (4 by maintainers)
Top GitHub Comments
The example provided by the original poster is contrived indeed, but the benefit here is simple: I get some result from another service through a loader, then I use it differently in different resolvers. Obviously, I don't want to call that external service multiple times in the same query; that's why I need some mechanism for batching and caching.
I can't just move the logic from the resolvers into a data loader. I have one external service, but the result is used differently by different resolvers, so that would result in different data loaders, thus defeating the purpose. The only way of making this work that I can see is abandoning coroutines altogether and transforming the `CompletableFuture` from the data loader via `thenApplyAsync` or something similar. Obviously, this doesn't exactly thrill me, as it's neither consistent nor convenient. My use case makes a network request to another service over gRPC.
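As a rough illustration of that workaround (not the commenter's actual gRPC code, and reusing the assumed `University` type and loader name from the sketches above), each resolver stays on `CompletableFuture` and reshapes the shared, batched result itself:

```kotlin
import graphql.schema.DataFetchingEnvironment
import java.util.concurrent.CompletableFuture

class CourseSummary(private val universityId: Long) {

    // Reuses the same batched/cached loader result but reshapes it for this
    // particular resolver, transforming the future instead of suspending on it.
    fun universityName(environment: DataFetchingEnvironment): CompletableFuture<String> =
        environment.getDataLoader<Long, University>("universityLoader")
            .load(universityId)
            .thenApplyAsync { it.name }
}
```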
I'll update the spark example to include an asynchronous example.