
Sending dynamic headers (values) to kafka while reading records from file


AC1:

Documentation and usage examples link: IN PROGRESS…

AC2:

Here is an example of the data file (test_data_json.json) I am using to drive my test and produce messages into Kafka:

{"key":"111","value":{"id":121,"name":"Jey"},"headers": {"batchId": "${$.initiate_batch.response.body.id}","test": "tester"}}
{"key":"222","value":{"id":122,"name":"Krep"},"headers": {"batchId": "${$.initiate_batch.response.body.id}","test": "tester"}}

Note the placeholder ${$.initiate_batch.response.body.id}, which I expect to be replaced with the value from the prior step of my test file.

Then I run the following scenario: the first step, "initiate_batch", makes a call and retrieves the batchId value, which should replace the header placeholder in the produce_step block before the message is sent to Kafka. However, I notice the value replacement is not happening, and the placeholder is passed along as-is as part of the message header. (Please see below the response from the consumer, which shows that the batchId header value is not replaced.)

I think it will be common for many of us to take a value from a prior step and apply it to the file content, replacing the placeholder before the message is sent. In fact, I think this applies not just to Kafka but also to HTTP API calls, where we would want to read the request body from a file and substitute placeholder values from a prior step into its content.
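The substitution being requested can be sketched in a few lines. Below is a minimal, hypothetical Python illustration (not zerocode's actual implementation): it resolves `${$.stepName.response.body.field}` tokens in a record line against responses captured from earlier steps, before the record would be produced. The `step_results` shape and the `batch-42` value are assumptions for the example.

```python
import json
import re

# Responses captured from prior steps, keyed by step name (assumed shape).
step_results = {
    "initiate_batch": {"response": {"body": {"id": "batch-42"}}}
}

# Matches ${$.dotted.path} placeholders, capturing the dotted path.
TOKEN = re.compile(r"\$\{\$\.([^}]+)\}")

def resolve(line: str) -> str:
    """Replace each ${$.path} token with the value found in step_results."""
    def repl(match):
        value = step_results
        for part in match.group(1).split("."):
            value = value[part]
        return str(value)
    return TOKEN.sub(repl, line)

record = '{"key":"111","headers":{"batchId":"${$.initiate_batch.response.body.id}"}}'
resolved = json.loads(resolve(record))
print(resolved["headers"]["batchId"])  # -> batch-42
```

Applied to each line of the record file before producing, this would yield the behavior described above for both Kafka and HTTP steps.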

This is the scenario I am running:

{
    "scenarioName": "Produce a message - Sync - From File",
    "steps": [
     {
            "name": "initiate_batch",
            "url": "${server.endpoint.host}/batches",
            "method": "POST",
            "request": {
            	"headers": {
          			"Authorization": "Bearer ${token}",
          			"Content-Type": "application/json",
          			"Client-Id":"${api.key}"
        		},
        		"body" : "${JSON.FILE:request/batch_template.json}"
           },
            "verify": {
                "status": 201
            }
        },
        {
            "name": "produce_step",
            "url": "kafka-topic:${kafka.topic}",
            "operation": "produce",
            "request": {
                "async": false,
                "recordType" : "JSON",
                "file": "reusable_content/samples/request/test_data_json.json"
            },
            "assertions": {
                "status" : "Ok",
                "recordMetadata" : {
                    "topicPartition" : {
                        "topic" : "${kafka.topic}"
                    }
                }
            }
        }
    ]
}

Following is the output of the consumer. I am expecting ${$.initiate_batch.response.body.id} to be replaced with the value derived from the prior step.

Response:

{
  "records" : [ {
    "key" : "111",
    "jsonKey" : null,
    "value" : {
      "id" : 121,
      "name" : "Jey"
    },
    "headers" : {
      "test" : "tester",
      "batchId" : "${$.initiate_batch.response.body.id}"
    }
  }, {
    "key" : "222",
    "jsonKey" : null,
    "value" : {
      "id" : 122,
      "name" : "Krep"
    },
    "headers" : {
      "test" : "tester",
      "batchId" : "${$.initiate_batch.response.body.id}"
    }
  } ],
  "size" : 2
}

Let me know if you have any questions or require clarification.

This is a key requirement for our project, as we generate the JSON data we send to Kafka dynamically, which I believe is a common need among users.

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 13 (10 by maintainers)

Top GitHub Comments

nirmalchandra commented, Oct 28, 2020 (2 reactions)

I had a quick look at the Kafka test @RDBreed added. It works as expected. 👍

You can find the CI logs for this test here: https://api.travis-ci.org/v3/job/739273244/log.txt (search for the text files/test_data_json_with_vars.json).

Here is the local log for reference only.


-------------------------- BDD: Scenario:Produce a message - Sync - From File -------------------------

***Step PASSED - Scenario:Produce a message - Sync - From File -> load_kafka
2020-10-28 13:02:33,060 [main] WARN  org.jsmart.zerocode.core.runner.ZeroCodeMultiStepsScenarioRunnerImpl - 
--------- TEST-STEP-CORRELATION-ID: 300e6bc5-3db5-4009-b573-dcc52dd00c72 ---------
*requestTimeStamp:2020-10-28T13:02:32.372
step:load_kafka
id:null
url:kafka-topic:demo-file-3
method:PRODUCE
request:
{
  "async" : false,
  "recordType" : "JSON",
  "file" : "kafka/pfiles/test_data_json.json"
} 
--------- TEST-STEP-CORRELATION-ID: 300e6bc5-3db5-4009-b573-dcc52dd00c72 ---------
Response:
{
  "status" : "Ok",
  "recordMetadata" : {
    "offset" : 1,
    "timestamp" : 1603890153043,
    "serializedKeySize" : 13,
    "serializedValueSize" : 24,
    "topicPartition" : {
      "hash" : -1231673587,
      "partition" : 0,
      "topic" : "demo-file-3"
    }
  }
}
*responseTimeStamp:2020-10-28T13:02:33.054 
*Response delay:682.0 milli-secs 
---------> Expected Response: <----------
{
  "status" : "Ok",
  "recordMetadata" : {
    "topicPartition" : {
      "topic" : "demo-file-3"
    }
  }
} 
 
-done-



***Step PASSED - Scenario:Produce a message - Sync - From File -> load_kafka_with_ref
2020-10-28 13:02:33,090 [main] WARN  org.jsmart.zerocode.core.runner.ZeroCodeMultiStepsScenarioRunnerImpl - 
--------- TEST-STEP-CORRELATION-ID: 60692c16-b575-4dbe-ac87-f61ab82c702b ---------
*requestTimeStamp:2020-10-28T13:02:33.062
step:load_kafka_with_ref
id:null
url:kafka-topic:demo-file-3
method:PRODUCE
request:
{
  "async" : false,
  "recordType" : "JSON",
  "file" : "kafka/pfiles/test_data_json_with_vars.json"  <-------------- With JSON Path variables inside the record file
} 
--------- TEST-STEP-CORRELATION-ID: 60692c16-b575-4dbe-ac87-f61ab82c702b ---------
Response:
{
  "status" : "Ok",
  "recordMetadata" : {
    "offset" : 3,
    "timestamp" : 1603890153083,
    "serializedKeySize" : 13,
    "serializedValueSize" : 42,
    "topicPartition" : {
      "hash" : -1231673587,
      "partition" : 0,
      "topic" : "demo-file-3"
    }
  }
}
*responseTimeStamp:2020-10-28T13:02:33.089 
*Response delay:27.0 milli-secs 
---------> Expected Response: <----------
{
  "status" : "Ok",
  "recordMetadata" : {
    "topicPartition" : {
      "topic" : "demo-file-3"
    }
  }
} 
 
-done-


***Step PASSED - Scenario:Produce a message - Sync - From File -> consume_raw
2020-10-28 13:02:43,259 [main] WARN  org.jsmart.zerocode.core.runner.ZeroCodeMultiStepsScenarioRunnerImpl - 
--------- TEST-STEP-CORRELATION-ID: 3b39d3cf-4d7c-4e61-88ab-8ed7d6d1a592 ---------
*requestTimeStamp:2020-10-28T13:02:33.091
step:consume_raw
id:null
url:kafka-topic:demo-file-3
method:UNLOAD
request:
{
  "consumerLocalConfigs" : {
    "recordType" : "JSON",
    "commitSync" : true,
    "showRecordsConsumed" : true,
    "maxNoOfRetryPollsOrTimeouts" : 3
  }
} 
--------- TEST-STEP-CORRELATION-ID: 3b39d3cf-4d7c-4e61-88ab-8ed7d6d1a592 ---------
Response:
{
  "records" : [ {
    "key" : "1546955346669",
    "jsonKey" : null,
    "value" : {
      "id" : 121,
      "name" : "Jey"
    },
    "headers" : { }
  }, {
    "key" : "1546955346670",
    "jsonKey" : null,
    "value" : {
      "id" : 122,
      "name" : "Krep"
    },
    "headers" : { }
  }, {
    "key" : "1546955346669",
    "jsonKey" : null,
    "value" : {
      "id" : 121,
      "name" : "demo-file-3-My-Value-1"
    },
    "headers" : { }
  }, {
    "key" : "1546955346670",
    "jsonKey" : null,
    "value" : {
      "id" : 122,
      "name" : "demo-file-3-My-Value-2"
    },
    "headers" : { }
  } ],
  "size" : 4
}
*responseTimeStamp:2020-10-28T13:02:43.257 
*Response delay:10166.0 milli-secs 
---------> Expected Response: <----------
{
  "size" : 4
} 
 
-done-


**FINISHED executing all Steps for [Produce a message - Sync - From File] **.
Steps were:[load_kafka, load_kafka_with_ref, consume_raw]

Architecture:x86_64, Java:1.8.0_91, Vendor:Oracle Corporation

Process finished with exit code 0

dandalavinod commented, Oct 27, 2020 (2 reactions)

@RDBreed apologies for the late response; I have not got a chance to work on this. Please go ahead!


