
Memory usage seems to increase when a table.insert() error occurs

See original GitHub issue

I am the original reporter of issue #1313. Unfortunately, I’m still facing the issue. The cause of the problem seems to lie elsewhere than first suspected, and I was able to reproduce it, so I’d like you to take a look.

As olavloite mentioned in #1313 (https://github.com/googleapis/nodejs-spanner/issues/1313#issuecomment-790019913), memory usage is stable with his script. However, memory appears to leak slightly when table.insert() throws an error, and I think this is the root cause of our problem.

The results below are with table.upsert(). [memory-usage chart omitted]

The results below are with table.insert() causing an error. [memory-usage chart omitted]

Environment details

  • OS: macOS
  • Node.js version: v14.17.1
  • npm version: 7.20.5
  • @google-cloud/spanner version: 5.13.1

Steps to reproduce

  1. Cause an error in table.insert() (e.g. insert a row whose primary key already exists)
  2. Repeat many times (see the test script below)

My test script:

const {Spanner} = require('@google-cloud/spanner');

const spanner = new Spanner();
const instance = spanner.instance('my-instance');
const database = instance.database('my-database');
const table = database.table('Singers');

main().then(() => console.log('[INFO] Finished'));

async function main() {
    for (let row = 1; row <= 50000; row++) {
        const newRow = {
            SingerId: 1,
            FirstName: 'firstname',
            LastName: 'lastname',
        };
        //await table.upsert(newRow);
        try {
          await table.insert(newRow);
        } catch(err) {
          if (err.code != 6) {    // ignore grpc AlreadyExists
            throw err;
          }
        }
        if (row % 25 === 0) {
            console.log(`[INFO] Rows updated so far: ${row}`);
            if (global.gc) {
                global.gc();
                const used = process.memoryUsage().heapUsed / 1024 / 1024;
                console.log(`[INFO] Memory usage: ${used} MB`);
            }
        }
    }
}
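
To make the in-loop memory readings work, the script must be started with the V8 flag that exposes global.gc(); assuming it is saved as leak-test.js (the filename is only for illustration), it can be run as:

node --expose-gc leak-test.js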

Output (with table.upsert()):

[INFO] Rows updated so far: 25
[INFO] Memory usage: 17.850181579589844 MB
[INFO] Rows updated so far: 50
[INFO] Memory usage: 17.832420349121094 MB
[INFO] Rows updated so far: 75
[INFO] Memory usage: 17.85736846923828 MB
[INFO] Rows updated so far: 100
[INFO] Memory usage: 17.891868591308594 MB
[INFO] Rows updated so far: 125
[INFO] Memory usage: 17.89563751220703 MB
[INFO] Rows updated so far: 150
[INFO] Memory usage: 17.9027099609375 MB
...
[INFO] Rows updated so far: 16150
[INFO] Memory usage: 18.864845275878906 MB
[INFO] Rows updated so far: 16175
[INFO] Memory usage: 18.887779235839844 MB
[INFO] Rows updated so far: 16200
[INFO] Memory usage: 18.87274932861328 MB
[INFO] Rows updated so far: 16225
[INFO] Memory usage: 18.86701202392578 MB
...
[INFO] Memory usage: 18.895301818847656 MB
[INFO] Rows updated so far: 49925
[INFO] Memory usage: 18.900909423828125 MB
[INFO] Rows updated so far: 49950
[INFO] Memory usage: 18.90435028076172 MB
[INFO] Rows updated so far: 49975
[INFO] Memory usage: 18.941001892089844 MB
[INFO] Rows updated so far: 50000
[INFO] Memory usage: 18.904075622558594 MB

Output (with table.insert() causing an error):

[INFO] Rows updated so far: 25
[INFO] Memory usage: 17.84894561767578 MB
[INFO] Rows updated so far: 50
[INFO] Memory usage: 17.83715057373047 MB
[INFO] Rows updated so far: 75
[INFO] Memory usage: 17.863059997558594 MB
[INFO] Rows updated so far: 100
[INFO] Memory usage: 17.921218872070312 MB
[INFO] Rows updated so far: 125
[INFO] Memory usage: 17.93242645263672 MB
[INFO] Rows updated so far: 150
[INFO] Memory usage: 17.938697814941406 MB
...
[INFO] Rows updated so far: 16150
[INFO] Memory usage: 20.609634399414062 MB
[INFO] Rows updated so far: 16175
[INFO] Memory usage: 20.611862182617188 MB
[INFO] Rows updated so far: 16200
[INFO] Memory usage: 20.61297607421875 MB
[INFO] Rows updated so far: 16225
[INFO] Memory usage: 20.616310119628906 MB
[INFO] Rows updated so far: 16250
[INFO] Memory usage: 20.620315551757812 MB
...
[INFO] Memory usage: 23.255882263183594 MB
[INFO] Rows updated so far: 49925
[INFO] Memory usage: 23.259368896484375 MB
[INFO] Rows updated so far: 49950
[INFO] Memory usage: 23.264991760253906 MB
[INFO] Rows updated so far: 49975
[INFO] Memory usage: 23.25994873046875 MB
[INFO] Rows updated so far: 50000
[INFO] Memory usage: 23.261062622070312 MB

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Reactions: 1
  • Comments: 8 (3 by maintainers)

Top GitHub Comments

1 reaction
junichi-tanaka commented, Oct 4, 2021

Today I deployed our functions with nodejs-spanner v5.x to our production environment. The session pool settings are the same as in my previous comment.

const database = instance.database(databaseId, { min: 10, max: 20, incStep: 2, writes: 0.5, concurrency: 1, fail: true });
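
For readability, here is the same configuration with comments; the field descriptions are my reading of the nodejs-spanner SessionPoolOptions and should be checked against the library documentation:

const database = instance.database(databaseId, {
  min: 10,        // keep at least 10 sessions open in the pool
  max: 20,        // never grow the pool beyond 20 sessions
  incStep: 2,     // create 2 sessions at a time when the pool needs to grow
  writes: 0.5,    // fraction of sessions prepared for write transactions
  concurrency: 1, // limit concurrent session-creation requests
  fail: true,     // fail fast instead of queueing when the pool is exhausted
});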

The results are as shown in the chart below [per-call memory chart from Cloud Functions, 2021-10-04, omitted], and it looks good, with no trend of increasing memory utilization.

So, I am going to close this issue.

I appreciate your cooperation.

1 reaction
junichi-tanaka commented, Sep 3, 2021

Just to be sure: That means that you configure max 1 Cloud Functions instance (and not 1 Spanner instance, or max 1 session), right?

Yes. It means gcloud functions deploy ... --max-instances=1.

Could it be that the graph you are seeing in that case is just a question of an instance that is receiving heavy load, and therefore has no time to do incremental garbage collections, until it does a major one?

There is an error below. I don’t think it is garbage collection; rather, the Cloud Functions instance restarted because of the error.

$ gcloud functions logs read my-function-name --start-time="2021-09-03T03:00:00Z" --min-log-level=error --region=asia-northeast1
LEVEL  NAME                    EXECUTION_ID  TIME_UTC                 LOG
E      my-functions-name       8rx60hamh30j  2021-09-03 03:14:52.651  Function invocation was interrupted. Error: memory limit exceeded.

As you said, it seems to work well if there are enough instances, so I will check this with our actual workload.
