
'Unable to expand length of this stream beyond its capacity.' when using DefaultLambdaJsonSerializer


Description

After migrating from Amazon.Lambda.Serialization.Json.JsonSerializer to Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer, there is a significant increase in the size of the serialised response.

I’m generating JSON and returning it in the body of APIGatewayProxyResponse. Previously the quotes in the JSON were encoded as \" (2 characters); now they are encoded as \u0022 (6 characters). I could previously return more than 5MB of JSON before hitting the Lambda response payload limit of 6MB; now I can only return around 3.5MB of JSON.

The error I get is:

Error converting the response object of type Amazon.Lambda.APIGatewayEvents.APIGatewayProxyResponse from the Lambda function to JSON: Unable to expand length of this stream beyond its capacity.

I’m raising this as a bug because it’s not clear that moving to the new serialiser has a significant impact, and there doesn’t seem to be an easy way to override the behaviour so that it produces JSON the way it used to.
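The escaping difference can be seen in isolation with plain System.Text.Json, independent of the Lambda serialiser. A minimal sketch (the output comments reflect the default JavaScriptEncoder behaviour versus UnsafeRelaxedJsonEscaping):

```csharp
using System;
using System.Text.Encodings.Web;
using System.Text.Json;

class EscapingDemo
{
    static void Main()
    {
        // A string value that itself contains JSON, as in the APIGatewayProxyResponse.Body
        string inner = "{\"a\":\"test\"}";

        // Default encoder escapes each quote as \u0022 (6 characters)
        string strict = JsonSerializer.Serialize(inner);
        Console.WriteLine(strict);  // "{\u0022a\u0022:\u0022test\u0022}"

        // Relaxed encoder keeps the shorter \" escape (2 characters)
        var options = new JsonSerializerOptions
        {
            Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping
        };
        string relaxed = JsonSerializer.Serialize(inner, options);
        Console.WriteLine(relaxed); // "{\"a\":\"test\"}"
    }
}
```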

Reproduction Steps

Sample code showing the difference:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Mime;
using System.Text.Json;
using Amazon.Lambda.APIGatewayEvents;
using Amazon.Lambda.Core;
using Amazon.Lambda.Serialization.SystemTextJson;
using Microsoft.Net.Http.Headers;

namespace testserialiser
{
    class Program
    {
        static void Main(string[] args)
        {
            var o = new
            {
                a = "test",
                b = 27,
                c = new
                {
                    a = "foobar"
                }
            };
            string body = System.Text.Json.JsonSerializer.Serialize(o);
            int qc = body.Count(x => x == '\"');
            Console.WriteLine($"Source:{body} Length:{body.Length}, QuoteCount:{qc}");
            //Source:{"a":"test","b":27,"c":{"a":"foobar"}} Length:38, QuoteCount:12

            var response = new APIGatewayProxyResponse
            {
                StatusCode = (int)HttpStatusCode.OK,
                Headers = new Dictionary<string, string> { { HeaderNames.ContentType, MediaTypeNames.Application.Json } },
                Body = body
            };

            //Use DefaultLambdaJsonSerializer
            //A MemoryStream over a fixed byte array is non-expandable, mimicking Lambda's capped response stream
            var ms = new MemoryStream(new byte[6 * 1000 * 1000]);
            ILambdaSerializer ser = new DefaultLambdaJsonSerializer();
            ser.Serialize(response, ms);
            LogMS(ms);
            //Actual:{"statusCode":200,"headers":{"Content-Type":"application/json"},"body":"{\u0022a\u0022:\u0022test\u0022,\u0022b\u0022:27,\u0022c\u0022:{\u0022a\u0022:\u0022foobar\u0022}}","isBase64Encoded":false} Length:196

            //Use DefaultLambdaJsonSerializer, attempt to override JsonEscaping
            ms.Position = 0;
            JsonSerializerOptions options = null;
            ser = new DefaultLambdaJsonSerializer(o =>
            {
                options = o;
                o.Encoder = System.Text.Encodings.Web.JavaScriptEncoder.UnsafeRelaxedJsonEscaping;
                o.WriteIndented = false;
            });
            ser.Serialize(response, ms);
            LogMS(ms);
            //Actual:{"statusCode":200,"headers":{"Content-Type":"application/json"},"body":"{\u0022a\u0022:\u0022test\u0022,\u0022b\u0022:27,\u0022c\u0022:{\u0022a\u0022:\u0022foobar\u0022}}","isBase64Encoded":false} Length:196


            //Plain Serialize using same options as DefaultLambdaJsonSerializer
            var js = JsonSerializer.Serialize(response, options);
            Console.WriteLine($"Expected:{js} Length:{js.Length}");
            //Expected:{"statusCode":200,"headers":{"Content-Type":"application/json"},"body":"{\"a\":\"test\",\"b\":27,\"c\":{\"a\":\"foobar\"}}","isBase64Encoded":false} Length:148

            //Serialize using same options as DefaultLambdaJsonSerializer and Utf8writer
            ms.Position = 0;
            using (var writer = new Utf8JsonWriter(ms))
            {
                JsonSerializer.Serialize(writer, response, options);
            }
            LogMS(ms);
            //Actual:{"statusCode":200,"headers":{"Content-Type":"application/json"},"body":"{\u0022a\u0022:\u0022test\u0022,\u0022b\u0022:27,\u0022c\u0022:{\u0022a\u0022:\u0022foobar\u0022}}","isBase64Encoded":false} Length:196

            //Serialize using old JsonSerializer
            ms.Position = 0;
            ser = new Amazon.Lambda.Serialization.Json.JsonSerializer();
            ser.Serialize(response, ms);
            LogMS(ms);
            //Actual:{"statusCode":200,"headers":{"Content-Type":"application/json"},"multiValueHeaders":null,"body":"{\"a\":\"test\",\"b\":27,\"c\":{\"a\":\"foobar\"}}","isBase64Encoded":false} Length:173

        }

        private static void LogMS(MemoryStream ms)
        {
            long len = ms.Position;
            ms.SetLength(len);
            ms.Position = 0;
            var sr = new StreamReader(ms);
            var j = sr.ReadToEnd();
            Console.WriteLine($"Actual:{j} Length:{j.Length}");
            ms.Position = 0;
        }
    }
}

Logs

Error converting the response object of type Amazon.Lambda.APIGatewayEvents.APIGatewayProxyResponse from the Lambda function to JSON: Unable to expand length of this stream beyond its capacity.: JsonSerializerException at Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer.Serialize[T](T response, Stream responseStream)

at System.IO.UnmanagedMemoryStream.WriteCore(ReadOnlySpan`1 buffer)
at System.IO.UnmanagedMemoryStream.Write(ReadOnlySpan`1 buffer)
at System.Text.Json.Utf8JsonWriter.Flush()
at System.Text.Json.JsonSerializer.WriteCore(Utf8JsonWriter writer, Object value, Type type, JsonSerializerOptions options)
at System.Text.Json.JsonSerializer.WriteValueCore(Utf8JsonWriter writer, Object value, Type type, JsonSerializerOptions options)
at System.Text.Json.JsonSerializer.Serialize[TValue](Utf8JsonWriter writer, TValue value, JsonSerializerOptions options)
at Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer.Serialize[T](T response, Stream responseStream)

Environment

  • Build Version: 2.0.1
  • OS Info: Ubuntu
  • Build Environment: VSCode
  • Targeted .NET Platform: dotnetcore3.1

Resolution


This is a 🐛 bug-report

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Reactions: 2
  • Comments: 9 (5 by maintainers)

Top GitHub Comments

2 reactions
hvanbakel commented, Aug 14, 2020

Ok, so this issue is actually a whole list of issues. First of all, the documentation here: https://github.com/aws/aws-lambda-dotnet/tree/master/Libraries/src/Amazon.Lambda.Serialization.SystemTextJson mentions the assembly-level attribute. However, this no longer works because of this code: https://github.com/aws/aws-lambda-dotnet/blob/master/Libraries/src/Amazon.Lambda.AspNetCoreServer/AbstractAspNetCoreFunction.cs#L426-L430 As the method-level serializer takes precedence, you can now only override it at the method level.
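For illustration, the two attribute placements look like this (the handler and the custom serializer class are hypothetical names, not part of the library). With the linked code in place, the method-level attribute is the one that takes effect:

```csharp
using Amazon.Lambda.APIGatewayEvents;
using Amazon.Lambda.Core;
using Amazon.Lambda.Serialization.SystemTextJson;

// Assembly-level registration (documented, but overridden per the code linked above)
[assembly: LambdaSerializer(typeof(DefaultLambdaJsonSerializer))]

namespace MyFunction
{
    public class Function // hypothetical handler class
    {
        // The method-level serializer takes precedence, so a custom
        // serializer must be attached here, not at assembly level.
        // MyCustomSerializer is a placeholder for your own ILambdaSerializer.
        [LambdaSerializer(typeof(MyCustomSerializer))]
        public APIGatewayProxyResponse Handler(APIGatewayProxyRequest request, ILambdaContext context)
        {
            return new APIGatewayProxyResponse { StatusCode = 200 };
        }
    }
}
```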

Then there’s the serializer doing the additional encoding. As the code stands right now, there is no way to fix it other than copying the serializer code into your own solution with a few modifications. You then need to override the serializer at the method level, as above.

So what changes are needed in the serializer? First of all, you want to set the encoder, which you can do through the constructor that takes an action. You probably want to set it to JavaScriptEncoder.UnsafeRelaxedJsonEscaping. The “unsafe” refers to the fact that this allows characters that could lead to exploits in HTML, but that is probably not the typical use case here. You can read the caution here though.

All of this could still be done without your own implementation, but unfortunately the instantiation of the writer here should use the overload that takes JsonWriterOptions, to again set the same encoder.

I’ve tested the above, and it gives me the same response sizes as I was seeing on 2.1 with Newtonsoft. I might be able to write this up as a formal PR over the weekend, but this should at least unblock the other people on the thread.
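Putting the comment’s two changes together, a custom serializer might look like the sketch below. This is not the eventual PR, just an illustration under the comment’s assumptions; the key point is that the relaxed encoder is set on both the JsonSerializerOptions and the JsonWriterOptions passed to Utf8JsonWriter, since the writer applies its own encoder when escaping strings:

```csharp
using System.IO;
using System.Text.Encodings.Web;
using System.Text.Json;
using Amazon.Lambda.Core;

// Hypothetical replacement for DefaultLambdaJsonSerializer that avoids
// the \u0022 escaping by using UnsafeRelaxedJsonEscaping throughout.
public class RelaxedLambdaJsonSerializer : ILambdaSerializer
{
    private readonly JsonSerializerOptions _options = new JsonSerializerOptions
    {
        Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping
    };

    // The writer options must ALSO carry the encoder; this is the part
    // the built-in serializer does not currently expose.
    private readonly JsonWriterOptions _writerOptions = new JsonWriterOptions
    {
        Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping
    };

    public T Deserialize<T>(Stream requestStream)
    {
        return JsonSerializer.DeserializeAsync<T>(requestStream, _options)
            .AsTask().GetAwaiter().GetResult();
    }

    public void Serialize<T>(T response, Stream responseStream)
    {
        using (var writer = new Utf8JsonWriter(responseStream, _writerOptions))
        {
            JsonSerializer.Serialize(writer, response, _options);
        }
    }
}
```

It would then be registered at the method level with `[LambdaSerializer(typeof(RelaxedLambdaJsonSerializer))]`, per the precedence behaviour described above.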

2 reactions
timhill-iress commented, Jul 2, 2020

Thanks for the suggestion, but my test code above is really focused on showing that there is a significant increase in the size of the serialised JSON. It’s that, combined with the hard AWS limit of 6MB (https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-limits.html), that causes a problem for me. I can work around it by reverting to the old serialiser, Amazon.Lambda.Serialization.Json.JsonSerializer, but I understood the recommendation was to use the new one as it improves cold-start performance. I didn’t understand the downside of the new serialiser.
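The interim workaround mentioned here can be applied with the standard assembly-level attribute (assuming no method-level serializer is also registered, which would take precedence):

```csharp
using Amazon.Lambda.Core;

// Revert to the Newtonsoft.Json-based serializer to restore the
// shorter \" escaping, at the cost of the new serializer's
// cold-start improvements.
[assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.Json.JsonSerializer))]
```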
