
Orca hanging when large JSONs are piped in


I have a dataframe with 20 years of daily data. iplot can render plots of it without any problem.

However, I can only use orca when I slice the dataframe down to less than 4 years of data. Beyond that it fails, both in the notebook and on the command line from a dumped JSON file, with the following error:

A JavaScript error occurred in the main process
Uncaught Exception:
TypeError: path must be a string or Buffer
    at Object.fs.mkdirSync (fs.js:891:18)
    at main (/usr/local/lib/node_modules/orca/bin/graph.js:105:8)
    at Object.<anonymous> (/usr/local/lib/node_modules/orca/bin/orca_electron.js:73:25)
    at Object.<anonymous> (/usr/local/lib/node_modules/orca/bin/orca_electron.js:99:3)
    at Module._compile (module.js:569:30)
    at Object.Module._extensions..js (module.js:580:10)
    at Module.load (module.js:503:32)
    at tryModuleLoad (module.js:466:12)
    at Function.Module._load (module.js:458:3)
    at loadApplicationPackage (/usr/local/lib/node_modules/electron/dist/resources/default_app.asar/main.js:287:12)

The JSON files are about 250 KB for the 20 years of data (attached: data.zip).
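
For reference, the command-line route mentioned above presumably boils down to something like the following; the file names are placeholders, and the orca graph <json file> -o <output> usage is an assumption about the CLI rather than anything quoted in this issue:

    # Hypothetical reproduction: render a dumped plotly figure JSON with orca.
    # data.json stands in for the dumped figure file; figure.png is the output.
    orca graph data.json -o figure.png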

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 22 (18 by maintainers)

Top GitHub Comments

jonmmease commented, Aug 8, 2018 (1 reaction)

Ok, I figured out a solution based on this article: http://veithen.github.io/2014/11/16/sigterm-propagation.html

In our wrapper bash script we basically just need to prefix the call to orca with exec. Then the bash process becomes the orca process and the signals sent from Python make it to orca.

Since we haven’t merged it yet, I’ll update this in my conda build PR.
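
As an illustration of the fix described above, a wrapper built around exec might look roughly like this; the orca path is a placeholder, and this is a sketch of the idea rather than the actual conda wrapper script:

    #!/bin/bash
    # Hypothetical wrapper around the orca binary (the path is a placeholder).
    # Without exec, bash would fork a child for orca and keep its own PID, so a
    # signal sent from Python to the wrapper would stop at bash. With exec, the
    # bash process is replaced in place by orca, so SIGTERM/SIGINT reach orca.
    exec /path/to/orca "$@"

Because exec never returns, any environment setup the wrapper needs has to happen before that line.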

sdrap commented, Aug 7, 2018 (1 reaction)

Oh yes! That’s great, I got it working the way you did. I will use this solution with temporary files, since it doesn’t work in the notebook. It’s just a matter of writing a small script to handle all the temp files.

Many thanks, I had been waiting for a long time for this export solution and it is really nice 😃.
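
Since the workaround above amounts to dumping each figure to a temporary JSON file and pointing orca at it, the "small script" could be a throwaway helper along these lines; the /tmp/plotly-figures location and the orca graph usage are assumptions, not something spelled out in the thread:

    #!/bin/bash
    # Hypothetical helper: render every dumped figure JSON in a scratch
    # directory with orca, then delete each temp file once its PNG exists.
    shopt -s nullglob        # the loop simply does nothing if no temp files exist
    cd /tmp/plotly-figures || exit 1
    for f in *.json; do
        orca graph "$f" -o "${f%.json}.png" && rm -f "$f"
    done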

