
EFCore based Memory Leak

See original GitHub issue

Description

There is a memory leak caused by the way EF Core caches queries, especially queries whose parameters are generated automatically by a library such as this one.

The following test (in GetTests.cs) shows just that: cachedQueryCount ends up being 1000 instead of 1.

        [Fact]
        public async Task Memory_Leak()
        {
            for (int i = 0; i < 1000; i++)
            {
                await _fixture.SendAsync("GET", $"/api/v1/courses?filter[number]:eq={i}", null);
            }

            int cachedQueryCount = 0;

            // Reflect into MemoryCache's private entries to count the query plans EF Core has cached.
            var cache = _fixture.Server.GetService<IMemoryCache>() as MemoryCache;
            var entriesProperty = typeof(MemoryCache).GetProperty("EntriesCollection", BindingFlags.NonPublic | BindingFlags.Instance);
            var entries = entriesProperty.GetValue(cache) as ICollection;
            if (entries != null)
            {
                foreach (var item in entries)
                {
                    var methodInfoVal = item.GetType().GetProperty("Value");
                    var val = methodInfoVal.GetValue(item) as ICacheEntry;
                    if (val?.Value == null)
                    {
                        continue;
                    }
                    var contentType = val.Value.GetType();
                    var gens = contentType.GenericTypeArguments;
                    // Count entries whose value type looks like an EF Core compiled query for CourseEntity.
                    if (gens.Length == 2 && gens[0] == typeof(QueryContext) && gens[1] == typeof(IAsyncEnumerable<CourseEntity>))
                    {
                        cachedQueryCount++;
                    }
                }
            }

            Assert.Equal(1, cachedQueryCount); // Expected 1, but it's actually 1000
        }

Environment

  • JsonApiDotNetCore Version: master

Issue Analytics

  • State: closed
  • Created 5 years ago
  • Comments: 5 (4 by maintainers)

Top GitHub Comments

1 reaction
Poltuu commented, Dec 3, 2018

I used the server and fixture from the file GetTests.cs. I’m developing a very similar framework, and we did observe this behavior in production.

The problem comes from the expression generation using Expression.Constant. EF Core no longer recognizes those as parameters (unlike EF 6), so the EF query plan gets cached again for each new value.

The solution is to replace Expression.Constant(value, typeof(T)) with something like ((Expression<Func<T>>)(() => value)).Body, so that EF interprets the value as a parameter rather than a constant, but I don’t know json-api-dotnet well enough to fix it.
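To illustrate the suggested replacement, here is a minimal sketch. It assumes a CourseEntity with an int Number property (as in the test above) and is illustrative only, not JsonApiDotNetCore’s actual filter-building code:

using System;
using System.Linq.Expressions;

public class CourseEntity
{
    public int Number { get; set; }
}

public static class FilterExpressionSketch
{
    // Embeds the value as a ConstantExpression. Every distinct value produces a structurally
    // different expression tree, so EF Core caches a separate query plan per value.
    public static Expression<Func<CourseEntity, bool>> WithConstant(int value)
    {
        ParameterExpression entity = Expression.Parameter(typeof(CourseEntity), "e");
        BinaryExpression body = Expression.Equal(
            Expression.Property(entity, nameof(CourseEntity.Number)),
            Expression.Constant(value, typeof(int)));
        return Expression.Lambda<Func<CourseEntity, bool>>(body, entity);
    }

    // Wraps the value in a closure and uses the body of that lambda, as suggested above. The value
    // now shows up as a member access on a captured closure, which EF Core extracts as a query
    // parameter, so all values share a single cached query plan.
    public static Expression<Func<CourseEntity, bool>> WithParameter(int value)
    {
        ParameterExpression entity = Expression.Parameter(typeof(CourseEntity), "e");
        Expression wrappedValue = ((Expression<Func<int>>)(() => value)).Body;
        BinaryExpression body = Expression.Equal(
            Expression.Property(entity, nameof(CourseEntity.Number)),
            wrappedValue);
        return Expression.Lambda<Func<CourseEntity, bool>>(body, entity);
    }
}

Both methods build a predicate that can be passed to Queryable.Where; only the second keeps EF Core’s query cache at a single entry as the value changes.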

0 reactions
bart-degreed commented, Feb 19, 2020

Yes, I can reproduce it on master with the test below, which I temporarily added to ContentNegotiation.cs:

[Fact]
public async Task Memory_Leak()
{
    for (int i = 0; i < 1000; i++)
    {
        var request = new HttpRequestMessage(HttpMethod.Get, $"/api/v1/todoItems?filter[ordinal]:eq={i}");
        var response = await _fixture.Client.SendAsync(request);
        Assert.Equal(HttpStatusCode.OK, response.StatusCode);
    }

    int cachedQueryCount = 0;

    // Reflect into MemoryCache's private entries to count the query plans EF Core has cached.
    var cache = _fixture.GetService<IMemoryCache>() as MemoryCache;
    var entriesProperty = typeof(MemoryCache).GetProperty("EntriesCollection", BindingFlags.NonPublic | BindingFlags.Instance);
    var entries = entriesProperty.GetValue(cache) as ICollection;
    if (entries != null)
    {
        foreach (var item in entries)
        {
            var methodInfoVal = item.GetType().GetProperty("Value");
            var val = methodInfoVal.GetValue(item) as ICacheEntry;
            if (val?.Value == null)
            {
                continue;
            }
            var contentType = val.Value.GetType();
            var gens = contentType.GenericTypeArguments;
            // Count entries whose value type looks like an EF Core compiled query for TodoItem.
            if (gens.Length == 2 && gens[0] == typeof(QueryContext) && gens[1] == typeof(IAsyncEnumerable<TodoItem>))
            {
                cachedQueryCount++;
            }
        }
    }

    Assert.Equal(1, cachedQueryCount); // Expected 1, but it's actually 1000
}

But to make the problem visible, you also need to make EF Core use a shared cache (due to a breaking change in EF Core 3.0), which can be done in Startup.cs:

public virtual void ConfigureServices(IServiceCollection services)
{
    // Share a single MemoryCache instance between the DI container and EF Core,
    // so the test above can inspect EF's cached query plans through IMemoryCache.
    var cacheInstance = new MemoryCache(new MemoryCacheOptions());
    services.AddSingleton<IMemoryCache, MemoryCache>(x => cacheInstance);

    services
        .AddDbContext<AppDbContext>(options =>
        {
            options.UseMemoryCache(cacheInstance);
...

I did not run a load test to see whether there is an actual memory leak, but in any case it is an inefficient use of cache space. Each filter value in the query string restarts the query parse process and creates a new cache entry, because the value is embedded as a different constant. By changing the constant into a parameter, the parsed query can be reused by EF Core, and probably by database servers too.
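The same effect can be seen at the LINQ level. Below is a minimal sketch; the TodoItem and AppDbContext stand-ins are simplified assumptions rather than the real test-project types, and the SQL shown in the comments is only indicative:

using System.Linq;
using Microsoft.EntityFrameworkCore;

// Simplified stand-ins for the types used in this thread.
public class TodoItem
{
    public int Id { get; set; }
    public int Ordinal { get; set; }
}

public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }

    public DbSet<TodoItem> TodoItems => Set<TodoItem>();
}

public static class QueryCachingSketch
{
    public static void Run(AppDbContext dbContext)
    {
        // Inline literal: the value sits in the expression tree as a constant, so EF Core compiles
        // and caches a separate query per value and inlines it into the SQL,
        // e.g. WHERE "Ordinal" = 5.
        var withConstant = dbContext.TodoItems
            .Where(todoItem => todoItem.Ordinal == 5)
            .ToList();

        // Captured local: the value surfaces as a closure member access, which EF Core extracts into
        // a database parameter, e.g. WHERE "Ordinal" = @__ordinal_0, so every value reuses the same
        // cached query plan, and the database server can reuse its execution plan as well.
        int ordinal = 5;
        var withParameter = dbContext.TodoItems
            .Where(todoItem => todoItem.Ordinal == ordinal)
            .ToList();
    }
}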

Read more comments on GitHub >

Top Results From Across the Web

Memory leak when using Entity Framework
Based on Chintan shah answer I made an extension method and an example. public static class DbContextExtensions { /// <summary> /// Set all ......
Read more >
How you should easily prevent this memory leak with EF ...
Once spotted, the leak is easy to fix: DBContext should have a short life span! We have 2 solutions here, either configure the...
Read more >
Is this called memory leak ? · Issue #27876 · dotnet/efcore
I've created a small asp.net core app to learn insert in efcore but I notice the issue about memory leak when I insert...
Read more >
Debug a memory leak in .NET Core
A memory leak may happen when your app references objects that it no longer needs to perform the desired task.
Read more >
Memory management and patterns in ASP.NET Core
Run MemoryLeak. Allocated memory slowly increases until a GC occurs. Memory increases because the tool allocates custom object to capture data.
Read more >
