AttributeError: Type "execute_async" not found in the Schema.
Hi there, I am using graphene 2.1.8 with the tornado framework. I am trying to set up an async graphene mutation, but graphene does not seem to think the function exists on the Schema class. My code is roughly set up like this:
import graphene

graphql = graphene.Schema(query=Query, mutation=Mutations, subscription=Subscriptions)

async def query_async(self, query, queueLayer, variables, context):
    # Fails on graphene 2.1.8: Schema has no execute_async method.
    return await graphql.execute_async(query, variables=variables, context=context)
And I get this error:
2019-10-02 20:16:48,976 tornado.application ERROR Uncaught exception POST /async (192.168.80.1)
HTTPServerRequest(protocol='http', host='localhost:8081', method='POST', uri='/async', version='HTTP/1.1', remote_ip='192.168.80.1')
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/tornado/web.py", line 1699, in _execute
result = await result
File "/usr/local/lib/python3.7/site-packages/tornado/gen.py", line 742, in run
yielded = self.gen.throw(*exc_info) # type: ignore
File "/work/server/handlers.py", line 152, in post
response = yield asyncio.ensure_future(self.session.exec_query_async(self.get_request_token(), query, variables, self.queueLayer))
File "/usr/local/lib/python3.7/site-packages/tornado/gen.py", line 735, in run
value = future.result()
File "/work/datastore/session/sessions.py", line 52, in exec_query_async
result = await Schema.query_async(session, query, queueLayer, variables, dict(session=session, auth=token))
File "/work/schema/__init__.py", line 22, in query_async
return await graphql.execute_async(query, variables=variables, context=context, middleware=[QueueLayerMiddleware(queueLayer)])
File "/usr/local/lib/python3.7/site-packages/graphene/types/schema.py", line 98, in __getattr__
raise AttributeError('Type "{}" not found in the Schema'.format(type_name))
AttributeError: Type "execute_async" not found in the Schema
EDIT: I tried setting up a very basic example of execute_async based on graphene/tests_asyncio/test_relay_mutation.py, and it still gives me a similar error.
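For context: in graphene 2.x, Schema.__getattr__ treats any unknown attribute as a lookup for a type defined in the schema, which is why a missing method surfaces as the confusing 'Type "execute_async" not found' error rather than a plain AttributeError. execute_async does not exist in the 2.x series; on graphene 2 with graphql-core 2, async execution instead goes through an AsyncioExecutor, as in the repository's tests_asyncio examples. A minimal sketch of that pattern, assuming graphene 2.1.8 and a placeholder Query type:

import asyncio

import graphene
from graphql.execution.executors.asyncio import AsyncioExecutor


class Query(graphene.ObjectType):
    hello = graphene.String()

    async def resolve_hello(self, info):
        await asyncio.sleep(0)  # stand-in for real async work
        return "world"


schema = graphene.Schema(query=Query)


async def query_async(query, variables=None, context=None):
    # graphene 2 has no execute_async; instead pass graphql-core 2's
    # AsyncioExecutor and ask for a promise, which can be awaited.
    return await schema.execute(
        query,
        variables=variables,
        context=context,
        executor=AsyncioExecutor(),
        return_promise=True,
    )


result = asyncio.get_event_loop().run_until_complete(query_async("{ hello }"))
print(result.data)  # {'hello': 'world'}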
Issue Analytics
- Created 4 years ago
- Reactions: 1
- Comments: 6
Top GitHub Comments
@mariaines I ended up having to create a separate route that is completely async. It parses the incoming GraphQL request manually (yuck, I know), runs the async tasks needed, stuffs the results into a middleware function, and calls the original schema.execute() passing in the middleware. It's a super hacky solution, I know, but the only one I could think of short of avoiding graphene completely.
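A rough sketch of what that workaround might look like. QueueLayerMiddleware, queue_data, and fetch_queue_state are hypothetical names reconstructed from the comment; the resolve(next, root, info, **args) method is graphene 2's documented middleware protocol:

class QueueLayerMiddleware:
    """Hypothetical middleware: exposes results of async work done up
    front (outside graphene) to every resolver via the context."""

    def __init__(self, queue_data):
        self.queue_data = queue_data

    def resolve(self, next, root, info, **args):
        # Stash the pre-fetched results on the context, then continue
        # with the normal synchronous resolution chain.
        info.context["queue_data"] = self.queue_data
        return next(root, info, **args)


# In the fully async route: await everything first, then run graphene
# synchronously with the results injected through the middleware.
# queue_data = await fetch_queue_state(queueLayer)   # hypothetical helper
# result = schema.execute(query, variables=variables, context={},
#                         middleware=[QueueLayerMiddleware(queue_data)])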
@uSpike whelp that will do it! Thanks so much for the info.
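The underlying point: Schema.execute_async was only introduced in graphene 3 (built on graphql-core 3) and does not exist in the 2.x series the reporter was running; the asyncio examples on the repository's master branch were written against graphene 3. Once on graphene 3, the original call works essentially as written. A minimal sketch, assuming graphene 3 or later:

import asyncio

import graphene  # graphene >= 3, where Schema.execute_async exists


class Query(graphene.ObjectType):
    hello = graphene.String()

    async def resolve_hello(root, info):
        await asyncio.sleep(0)  # stand-in for real async work
        return "world"


schema = graphene.Schema(query=Query)

# execute_async returns a coroutine; graphene 3 forwards the `variables`
# and `context` keyword arguments to graphql-core 3's graphql().
result = asyncio.run(schema.execute_async("{ hello }"))
print(result.data)  # {'hello': 'world'}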