Serverless Framework and Node-API: `invalid type: boolean true, expected a string`
Bug description
I’m attempting to use Prisma 3.x inside a backend built with the Serverless Framework, using aws-lambda-graphql to create a serverless GraphQL service with subscription support.
I’ve run into intermittent crashes when Prisma initializes, with the following error:
```
Error: invalid type: boolean `true`, expected a string
    at LibraryEngine.loadEngine (/Users/grant/git/workshops/node_modules/@prisma/client/runtime/index.js:24969:27)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
    at async LibraryEngine.instantiateLibrary (/Users/grant/git/workshops/node_modules/@prisma/client/runtime/index.js:24913:7)
```
Workaround
I found that disabling the Node-API engine got past the problem.
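For reference, this is a sketch of how the fallback can be configured, assuming Prisma 3's `engineType` generator option (the equivalent environment variable is `PRISMA_CLIENT_ENGINE_TYPE=binary`); verify against your Prisma version:

```prisma
generator client {
  provider   = "prisma-client-js"
  // Fall back to the binary query engine instead of the Node-API library
  engineType = "binary"
}
```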
Setup
Here’s my full, unredacted `serverless.yml` file for reference. I’m afraid I can’t share the codebase at the moment, but I could take some time to make a reproduction repo if the problem isn’t apparent to those familiar with the internals of Prisma.
```yaml
app: backend
service: backend
frameworkVersion: '2'
useDotenv: false

package:
  individually: true

custom:
  dynamodb:
    stages:
      - dev
    start:
      inMemory: true
      migrate: true
      noStart: false
  dynamodbStream:
    host: localhost
    port: 8000
    region: us-east-1
    pollForever: true
    streams:
      - table: Events
        functions:
          - graphqlEvents
  serverless-offline:
    host: localhost
    httpPort: 4000
    websocketPort: 4001
    noPrependStageInUrl: true
    allowCache: true
    useWorkerThreads: true
    location: .webpack/service
  bundle:
    linting: false
    forceExclude:
      - "_http_common"
    externals:
      - '@prisma/client'
      - '.prisma/client'
    copyFiles:
      - from: '../node_modules/.prisma/client/*'
        to: './src/handlers'
      - from: '../node_modules/@prisma/client/*'
        to: './src/handlers'
      - from: './.env'
        to: './src/backend'
    packager: yarn

provider:
  name: aws
  runtime: nodejs12.x
  lambdaHashingVersion: 20201221
  iamRoleStatements:
    - Effect: Allow
      Action:
        - execute-api:ManageConnections
      Resource: 'arn:aws:execute-api:*:*:*/development/POST/@connections/*'
    - Effect: Allow
      Action:
        - dynamodb:DeleteItem
        - dynamodb:GetItem
        - dynamodb:PutItem
        - dynamodb:UpdateItem
      Resource: !GetAtt ConnectionsDynamoDBTable.Arn
    - Effect: Allow
      Action:
        - dynamodb:DescribeStream
        - dynamodb:GetRecords
        - dynamodb:GetShardIterator
        - dynamodb:ListStreams
      Resource: !GetAtt EventsDynamoDBTable.StreamArn
    - Effect: Allow
      Action:
        - dynamodb:PutItem
      Resource: !GetAtt EventsDynamoDBTable.Arn
    - Effect: Allow
      Action:
        - dynamodb:BatchWriteItem
        - dynamodb:DeleteItem
        - dynamodb:GetItem
        - dynamodb:PutItem
        - dynamodb:Query
        - dynamodb:Scan
      Resource: !GetAtt SubscriptionsDynamoDBTable.Arn
    - Effect: Allow
      Action:
        - dynamodb:BatchWriteItem
        - dynamodb:DeleteItem
        - dynamodb:GetItem
        - dynamodb:PutItem
      Resource: !GetAtt SubscriptionOperationsDynamoDBTable.Arn

functions:
  hello:
    handler: src/handlers/hello.handler
    events:
      - http:
          path: /hello
          method: get
  graphqlHttp:
    handler: src/handlers/graphql.httpHandler
    events:
      - http:
          path: /graphql
          method: any
  graphqlSubscriptions:
    handler: src/handlers/graphql.webSocketHandler
    events:
      - websocket:
          route: $connect
      - websocket:
          route: $disconnect
      - websocket:
          route: $default
  graphqlEvents:
    handler: src/handlers/graphql.eventHandler
    events:
      - stream:
          enabled: true
          type: dynamodb
          arn:
            Fn::GetAtt: [EventsDynamoDBTable, StreamArn]

resources:
  Resources:
    ConnectionsDynamoDBTable:
      Type: AWS::DynamoDB::Table
      Properties:
        # see DynamoDBConnectionManager
        TableName: Connections
        AttributeDefinitions:
          - AttributeName: id
            AttributeType: S
        BillingMode: PAY_PER_REQUEST
        KeySchema:
          # connection id
          - AttributeName: id
            KeyType: HASH
        # This one is optional (all connections have 2 hours of lifetime in ttl field but enabling TTL is up to you)
        TimeToLiveSpecification:
          AttributeName: ttl
          Enabled: true
    SubscriptionsDynamoDBTable:
      Type: AWS::DynamoDB::Table
      Properties:
        # see DynamoDBSubscriptionManager
        TableName: Subscriptions
        AttributeDefinitions:
          - AttributeName: event
            AttributeType: S
          - AttributeName: subscriptionId
            AttributeType: S
        BillingMode: PAY_PER_REQUEST
        KeySchema:
          - AttributeName: event
            KeyType: HASH
          - AttributeName: subscriptionId
            KeyType: RANGE
        # This one is optional (all subscriptions have 2 hours of lifetime in ttl field but enabling TTL is up to you)
        TimeToLiveSpecification:
          AttributeName: ttl
          Enabled: true
    SubscriptionOperationsDynamoDBTable:
      Type: AWS::DynamoDB::Table
      Properties:
        # see DynamoDBSubscriptionManager
        TableName: SubscriptionOperations
        AttributeDefinitions:
          - AttributeName: subscriptionId
            AttributeType: S
        BillingMode: PAY_PER_REQUEST
        KeySchema:
          - AttributeName: subscriptionId
            KeyType: HASH
        # This one is optional (all subscription operations have 2 hours of lifetime in ttl field but enabling TTL is up to you)
        TimeToLiveSpecification:
          AttributeName: ttl
          Enabled: true
    EventsDynamoDBTable:
      Type: AWS::DynamoDB::Table
      Properties:
        # see DynamoDBEventStore
        TableName: Events
        KeySchema:
          - AttributeName: id
            KeyType: HASH
        BillingMode: PAY_PER_REQUEST
        # see ISubscriptionEvent
        AttributeDefinitions:
          - AttributeName: id
            AttributeType: S
        StreamSpecification:
          StreamViewType: NEW_IMAGE
        # This one is optional (all events have 2 hours of lifetime in ttl field but enabling TTL is up to you)
        TimeToLiveSpecification:
          AttributeName: ttl
          Enabled: true

plugins:
  - serverless-bundle
  - serverless-plugin-monorepo
  - serverless-dynamodb-local
  - serverless-plugin-offline-dynamodb-stream
  - serverless-offline
  - serverless-dotenv-plugin
```
How to reproduce
1. Set up a Serverless project with the above configuration within a Yarn-workspaces-powered monorepo.
2. Initialize a Prisma client within a Serverless function.
Expected behavior
I expected Prisma to work normally when initialized inside a Serverless function.
Prisma information
I don’t think the schema and queries are relevant yet, as the crash happens before any queries are invoked.
Environment & setup
- OS: macOS
- Database: PostgreSQL
- Node.js version: v16.6.2
- Serverless:
  - Framework Core: 2.56.0
  - Plugin: 5.4.4
  - SDK: 4.3.0
  - Components: 3.16.0
Prisma Version
```
prisma                  : 3.0.2
@prisma/client          : 3.0.2
Current platform        : darwin
Query Engine (Node-API) : libquery-engine 2452cc6313d52b8b9a96999ac0e974d0aedf88db (at ../node_modules/@prisma/engines/libquery_engine-darwin.dylib.node)
Migration Engine        : migration-engine-cli 2452cc6313d52b8b9a96999ac0e974d0aedf88db (at ../node_modules/@prisma/engines/migration-engine-darwin)
Introspection Engine    : introspection-core 2452cc6313d52b8b9a96999ac0e974d0aedf88db (at ../node_modules/@prisma/engines/introspection-engine-darwin)
Format Binary           : prisma-fmt 2452cc6313d52b8b9a96999ac0e974d0aedf88db (at ../node_modules/@prisma/engines/prisma-fmt-darwin)
Default Engines Hash    : 2452cc6313d52b8b9a96999ac0e974d0aedf88db
Studio                  : 0.423.0
```
Issue Analytics
- State:
- Created: 2 years ago
- Comments: 20 (11 by maintainers)
Top GitHub Comments
Indeed, you need to install both `prisma@3.2.0-dev.6` and `@prisma/client@3.2.0-dev.6` for this to possibly work in that project.

So I have some theories about why this breaks for you and why we haven’t been able to reproduce it yet. Due to how certain Node.js libraries (such as Jest) work, we cannot read environment variables directly using `std::env::var` in Rust. Jest, as a good example, overrides `process.env` so that it is not possible to modify the real environment, making it hard to test code that loads values from the system environment. Therefore the Rust query engine is initialized by sending `process.env` down as a map, which is converted to a `HashMap<String, String>` on initialization.

As we know very well, Rust is quite pedantic about types, and what we see here is Rust telling us it expected a string but we sent down a boolean, leading to a deserialization error.
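The difference is easy to demonstrate in Node itself: the real `process.env` stringifies everything assigned to it, while a plain object substituted for it by a framework keeps raw types. A minimal sketch:

```javascript
// The real process.env coerces every assigned value to a string:
process.env.REAL_FLAG = true;
console.log(typeof process.env.REAL_FLAG); // "string"

// A plain object standing in for process.env keeps the raw types,
// so a boolean survives and would reach the engine as a boolean:
const fakeEnv = { ...process.env, FAKE_FLAG: true };
console.log(typeof fakeEnv.FAKE_FLAG); // "boolean"
```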
What I suspect here is that the Serverless Framework in use does something to `process.env`. I was not able to reproduce anything by just running my test script with `env FOO=true node index.js`; the value of `FOO` came back in Rust as the string `"true"`, as I expected. But if Jest can override `process.env`, so can any other framework, and even though this is just guesswork, it is quite possible that in your case `process.env` sends us down values that are not strings.

This is my reproduction that demonstrates a similar crash:
And when we run this:
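The original snippet and its output are not shown above; purely as an illustration, a check that mimics what `HashMap<String, String>` deserialization enforces (the `findNonStrings` helper is hypothetical, not Prisma code) would flag the offending value like this:

```javascript
// Hypothetical helper mimicking the engine's strict expectation that
// every env value is a string (this is NOT Prisma's actual code):
function findNonStrings(env) {
  return Object.entries(env).filter(([, value]) => typeof value !== 'string');
}

// An env map in which some framework has left a raw boolean:
const env = { DATABASE_URL: 'postgresql://localhost/db', SOME_FLAG: true };

for (const [key, value] of findNonStrings(env)) {
  console.log(`invalid type: ${typeof value} \`${value}\`, expected a string (key: ${key})`);
}
// → invalid type: boolean `true`, expected a string (key: SOME_FLAG)
```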
So, I’m going to do a little experiment and merge it into the next dev versions after we do a release today: instead of directly converting the env to a `HashMap<String, String>`, it first deserializes each value as `serde_json::Value` (an enumeration of all possible JSON values) and then converts that in the Rust code to the final `HashMap<String, String>` before continuing initialization. Then I can maybe ping you, @a-type, and you can try out the dev version to see if you still witness a crash.
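In JavaScript terms, the proposed change amounts to accepting any JSON value and coercing it to a string before building the final map. The actual fix lives in the Rust engine and uses `serde_json::Value`; this sketch, with the hypothetical name `normalizeEnv`, only illustrates the idea:

```javascript
// Hypothetical illustration of the proposed engine-side fix: accept any
// JSON value for each env entry, then coerce it to a string before
// building the final string-to-string map (done in Rust in reality).
function normalizeEnv(env) {
  const out = {};
  for (const [key, value] of Object.entries(env)) {
    if (value !== undefined && value !== null) {
      out[key] = String(value);
    }
  }
  return out;
}

console.log(normalizeEnv({ FLAG: true, PORT: 4000, NAME: 'dev' }));
// → { FLAG: 'true', PORT: '4000', NAME: 'dev' }
```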