RangeError: Invalid string length
This is related to issue #313.
I'm encountering the same issue with Node v6.9.1 and elasticsearch-js v11.0.1.

Node options: --expose-gc --max-old-space-size=8192 --max-semi-space-size=1024

I increased the young space hoping it would fix the problem. The Elasticsearch response is about 77 MB.
Error details:
error RangeError: Invalid string length
at IncomingMessage.<anonymous> (/usr/src/app/node_modules/elasticsearch/src/lib/connectors/http.js:181:19)
at emitOne (events.js:96:13)
at IncomingMessage.emit (events.js:188:7)
at readableAddChunk (_stream_readable.js:176:18)
at IncomingMessage.Readable.push (_stream_readable.js:134:10)
at HTTPParser.parserOnBody (_http_common.js:123:22)
at Socket.socketOnData (_http_client.js:363:20)
at emitOne (events.js:96:13)
at Socket.emit (events.js:188:7)
at readableAddChunk (_stream_readable.js:176:18)
at Socket.Readable.push (_stream_readable.js:134:10)
at TCP.onread (net.js:548:20)
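For context, V8 throws RangeError: Invalid string length when a string would exceed its maximum length, and the line the stack trace points at accumulates the response body by string concatenation. A minimal sketch of a safer pattern is to collect raw Buffer chunks and join them once at the end; collectBody here is a hypothetical helper for illustration, not the library's actual code:

```javascript
// Sketch: accumulate raw Buffer chunks instead of growing one string.
// `collectBody` is a hypothetical helper, not part of elasticsearch-js.
function collectBody(incoming, cb) {
  const chunks = [];
  incoming.on('data', (chunk) => chunks.push(chunk));
  incoming.on('end', () => cb(null, Buffer.concat(chunks)));
  incoming.on('error', cb);
}
```

Buffer.concat allocates the result once from the known total length; note that converting the final buffer to a single string would still be subject to the same string-length limit, so a very large body ultimately needs streaming or paginated handling.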
I also got these errors:
2016-10-31T16:12:06.567Z - error: [ELASTICSEARCH] Request error, retrying
POST http://xx.xx.xx.xx:9200/_msearch => Parse Error
Unhandled rejection Error: Request Timeout after 180000ms
at /usr/src/app/node_modules/elasticsearch/src/lib/transport.js:336:15
at Timeout.<anonymous> (/usr/src/app/node_modules/elasticsearch/src/lib/transport.js:365:7)
at ontimeout (timers.js:365:14)
at tryOnTimeout (timers.js:237:5)
at Timer.listOnTimeout (timers.js:207:5)
Since the request timed out, I suspected the client got an invalid chunk, so I made a trace:
[…]
[pid 27313] read(10, "[…]"..., 65536) = 65536
[pid 27313] read(10, "[…]"..., 65536) = 65536
[pid 27313] read(10, "[…]"..., 65536) = 52824
[pid 27313] read(10, "[…]"..., 65536) = 65536
[pid 27313] read(10, "[…]"..., 65536) = 65536
[pid 27313] read(10, "[…]"..., 65536) = 47032
[pid 27313] read(10, "[…]"..., 65536) = 65536
[pid 27313] read(10, "[…]"..., 65536) = 65536
[pid 27313] read(10, "[…]"..., 65536) = 47032
[pid 27313] read(10, "[…]"..., 65536) = 65536
[pid 27313] read(10, "[…]"..., 65536) = 65536
[pid 27313] read(10, "[…]"..., 65536) = 42688
[pid 27313] read(10, "[…]"..., 65536) = 65536
[pid 27313] read(10, "[…]"..., 65536) = 41616
[pid 27313] read(10, "[…]"..., 65536) = 65536
[pid 27313] write(2, "/usr/src/app/node_modules/elasticsearch/src/lib/connectors/http.js:180\n incoming.on('data', func"..., 811) = 811
[…]
The write call is where the exception output begins. The chunks seem to have consistent lengths. I had to cut out the content, but there was nothing special: just partial JSON objects as strings.
Any ideas?
Thanks in advance, Sven.
Issue Analytics
- Created 7 years ago
- Comments: 6 (2 by maintainers)
Top GitHub Comments
I get this issue on aggregations, which cannot be paginated. It would be nice to at least wrap this in a try/catch:
incoming.on('data', function (d) { response += d; });
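The suggestion above can be sketched as follows. This is a hypothetical wrapper, not the library's actual code; it shows how a RangeError thrown during the append could be routed to the request callback instead of crashing the process:

```javascript
// Sketch: guard the string append so a RangeError surfaces as a request
// error rather than an uncaught exception. Names are illustrative.
function collectString(incoming, cb) {
  let response = '';
  let finished = false;
  const finish = (err, body) => {
    if (finished) return; // make sure the callback fires only once
    finished = true;
    cb(err, body);
  };
  incoming.on('data', function (d) {
    try {
      response += d;
    } catch (err) {
      finish(err); // e.g. RangeError: Invalid string length
    }
  });
  incoming.on('end', () => finish(null, response));
  incoming.on('error', finish);
}
```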
Paginating the request fixed this issue. Thanks.
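For completeness, pagination like this can be driven by a small generic loop. fetchPage below is a hypothetical adapter you would implement on top of the client's search/scroll calls, and the shape of its result ({ hits, done }) is an assumption made for this sketch:

```javascript
// Sketch: pull results page by page so no single response body has to be
// buffered whole. `fetchPage(prevPage)` is a hypothetical adapter over the
// client; it resolves to { hits: [...], done: boolean }.
function scrollAll(fetchPage, onHits) {
  return fetchPage(null).then(function step(page) {
    onHits(page.hits);
    if (page.done) return;
    return fetchPage(page).then(step);
  });
}
```

Because each page is handed to onHits and then dropped, memory use stays proportional to one page rather than the whole result set.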