Unable to import x-ndjson file, got error "Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"
Hey,
I’m trying to import data into Elasticsearch from a JSON file (x-ndjson) that contains one document per line.
Here is how I’m creating the index and inserting one document manually:
DELETE /tests
PUT /tests
{}
PUT /tests/test/_mapping
{
  "test": {
    "properties": {
      "env": { "type": "keyword" },
      "uid": { "type": "keyword" },
      "ok": { "type": "boolean" }
    }
  }
}
POST /tests/test
{"env":"dev", "uid":12346, "ok":true}
GET /tests/_search
{"query":{"match_all":{}}}
Everything works fine: no errors, the document is indexed correctly and can be found in ES.
Now let’s try to do the same using elasticdump.
Here is the content of the file I’m trying to import:
cat ./data.json
{"env":"prod","uid":1111,"ok":true}
{"env":"prod","uid":2222,"ok":true}
Here is how I’m trying to import it:
elasticdump \
--input="./data.json" \
--output="http://elk:9200" \
--output-index="tests/test" \
--debug \
--limit=10000 \
--headers='{"Content-Type": "application/x-ndjson"}' \
--type=data
But I get the error Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes.
Here is the full output:
root@node-tools:/data# elasticdump \
> --input="./data.json" \
> --output="http://elk:9200" \
> --output-index="tests/test" \
> --debug \
> --limit=10000 \
> --headers='{"Content-Type": "application/x-ndjson"}' \
> --type=data
Tue, 16 Apr 2019 16:26:28 GMT | starting dump
Tue, 16 Apr 2019 16:26:28 GMT | got 2 objects from source file (offset: 0)
Tue, 16 Apr 2019 16:26:28 GMT [debug] | discovered elasticsearch output major version: 6
Tue, 16 Apr 2019 16:26:28 GMT [debug] | thisUrl: http://elk:9200/tests/test/_bulk, payload.body: "{\"index\":{\"_index\":\"tests\",\"_type\":\"test\"}}\nundefined\n{\"index\":{\"_index\":\"tests\",\"_type\":\"test\"}}\nundefined\n"
{ _index: 'tests',
_type: 'test',
_id: 'ndj4JmoBindjidtNmyKf',
status: 400,
error:
{ type: 'mapper_parsing_exception',
reason: 'failed to parse',
caused_by:
{ type: 'not_x_content_exception',
reason:
'Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes' } } }
{ _index: 'tests',
_type: 'test',
_id: 'ntj4JmoBindjidtNmyKf',
status: 400,
error:
{ type: 'mapper_parsing_exception',
reason: 'failed to parse',
caused_by:
{ type: 'not_x_content_exception',
reason:
'Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes' } } }
Tue, 16 Apr 2019 16:26:28 GMT | sent 2 objects to destination elasticsearch, wrote 0
Tue, 16 Apr 2019 16:26:28 GMT | got 0 objects from source file (offset: 2)
Tue, 16 Apr 2019 16:26:28 GMT | Total Writes: 0
Tue, 16 Apr 2019 16:26:28 GMT | dump complete
What am I doing wrong? Why does the manual insert work fine while the _bulk import throws errors? Any ideas?
The payload.body looks very suspicious, especially the undefined entries where the document bodies should be:
payload.body: "{\"index\":{\"_index\":\"tests\",\"_type\":\"test\"}}\nundefined\n{\"index\":{\"_index\":\"tests\",\"_type\":\"test\"}}\nundefined\n"
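My guess is that elasticdump expects each input line to already be wrapped in hit metadata with a _source field (the same shape it produces when exporting), rather than the raw documents I have, which would explain why the document bodies come back as undefined. If that is the case, each line would need to look roughly like this (the _id below is made up):
{"_index":"tests","_type":"test","_id":"AWqXyZ12345","_score":1,"_source":{"env":"prod","uid":1111,"ok":true}}
If so, something like this jq one-liner should reshape my raw x-ndjson into that format:
jq -c '{_index:"tests",_type:"test",_source:.}' ./data.json > ./data.wrapped.json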
Thank you!