
Load JSON document from CSV file

See original GitHub issue

The Lambda function I want to test expects a complex JSON payload, and as I understand it, that payload needs to go into the CSV file. My problem is that I have tried various ways to load the JSON from the CSV but keep getting errors. I am not sure if this is even possible. My sample CSV document looks like:

column1
{ "profile":{"name":"irfan","email":"irfan@email.com"},"address":["address1","address2"]}
{ "profile":{"name":"Tomas","email":"tomas@email.com"},"address":["address1","address2"]}
{ "profile":{"name":"Joel","email":"joel@email.com"},"address":["address1","address2"]}

I only have one column because all I want is for this JSON document to be passed as the request body to my hello-world Lambda so I can load test it.

My Artillery script file looks like:

config:
  target: "https://api-gateway-load-testing-tst-ap-southeast-2.xxxxxxxxxxx.com"
  phases:
    - 
      duration: 5
      arrivalRate: 1
  defaults:
    headers:
      x-api-key: "xxxxxxxxxxxxxxxxxxxxxxxxxxx"
      Content-Type: "application/json"
  payload:
    # path is relative to the location of the test script
    path: "post-data.csv"
    fields:
      - "column1"
    order: sequence
    delimiter: "~"
    skipHeader: true
    cast: false
  plugins:
    cloudwatch:
      namespace: "serverless-artillery-loadtest"
scenarios:
  - flow:
      - post:
          url: "/v1/hello-world"
          json:
            data: {{ column1 }}

When I put double quotes around the keys and values in the JSON, I get an error saying "Error executing task: ERROR exception encountered while executing load from 1579692773908 in 1579692773908: Artillery exited with non-zero code:

Is there any way to load the JSON from the CSV so that my hello-world Lambda function receives the request body as JSON in the following format:

{ "data": { "profile":{"name":"irfan","email":"irfan@email.com"},"address":["address1","address2"]}}

Any help would be appreciated.

Issue Analytics

  • State:open
  • Created 4 years ago
  • Comments:8 (1 by maintainers)

Top GitHub Comments

6 reactions
irfanatopt commented, Jan 27, 2020

I managed to get around this issue myself by writing custom JavaScript to load the JSON payload instead of reading it from a CSV file. I used config.processor along with the beforeScenario hook to define my custom logic.

For anyone who may be facing a similar problem, here is my solution:

script.yml

config:
  target: "https://api-ap-southeast-2.aws.my-domain.com"
  processor: "./post-body.js"
  # The following phases test a scenario where 0 tps ramps up to 50 tps in 1 minute,
  # and then ramps up to 1000 tps in 50 tps increments every 30 seconds

  phases:
    - 
      duration: 60
      arrivalRate: 10
      rampTo: 50
    -
      duration: 30
      arrivalRate: 50
    -
      duration: 30
      arrivalRate: 100
    -
      duration: 30
      arrivalRate: 150
    -
      duration: 30
      arrivalRate: 200
    -
      duration: 30
      arrivalRate: 250
    -
      duration: 30
      arrivalRate: 300
    -
      duration: 30
      arrivalRate: 350
    -
      duration: 30
      arrivalRate: 400
    -
      duration: 30
      arrivalRate: 450
    -
      duration: 30
      arrivalRate: 500
    -
      duration: 30
      arrivalRate: 550
    -
      duration: 30
      arrivalRate: 600
    -
      duration: 30
      arrivalRate: 650
    -
      duration: 30
      arrivalRate: 700
    -
      duration: 30
      arrivalRate: 750
    -
      duration: 30
      arrivalRate: 800
    -
      duration: 30
      arrivalRate: 850
    -
      duration: 30
      arrivalRate: 900
    -
      duration: 30
      arrivalRate: 950
    -
      duration: 270
      arrivalRate: 1000
  defaults:
    headers:
      x-api-key: "fake-x-api-key"
      Content-Type: "application/json"
  plugins:
    cloudwatch:
      namespace: "my-service-name"
    influxdb:
      testName: "my-service Load Test Results"
      influx:
        host: "fake-ip-address"
        username: "fake-username"
        password: "fake-password"
        database: "influx"

scenarios:
  - name: my-service-name load test with varying load
    beforeScenario: generatePostBody
    flow:
      - post:
          url: "/my-fake-endpoint"
          json:
            "{{ data }}"

The following post-body.js contains my custom JS logic. I have introduced a new text file, post-data.txt, which replaces the CSV file I mentioned in the question; it holds thousands of rows, where each row is a request payload as JSON. Every time a scenario is executed, a random JSON payload string is picked, converted to a JSON object, and sent as the body of the POST request. I am also using CloudWatch and InfluxDB to output the results.

post-body.js

const fs = require("fs");
const csv = require("csv-parser"); // only needed by the unused loadDataFromCsv helper below
const filePath = "./post-data.txt";
let postData;

/**
 * Generates post body
 */
const generatePostBody = async (userContext, events, done) => {
  try{
    // add variables to virtual user's context:
    if(postData === undefined || postData === null || postData.length === 0) {
      postData = await loadDataFromTxt(filePath);
    }
    const postBodyStr = postData[Math.floor(Math.random()*postData.length)];
    userContext.vars.data = JSON.parse(postBodyStr);

    // continue with executing the scenario:
    return done();

  } catch(err) {
    console.log(`Error occurred in function generatePostBody. Detail: ${err}`);
    throw(err);
  }
}

/**
 * Loads post body rows from a csv file
 * (unused alternative to loadDataFromTxt; requires the csv-parser package)
 * @param {string} filePath - The path of the csv file
 */
const loadDataFromCsv = async filePath => {
  const rows = [];

  return new Promise((resolve, reject) => {
    fs.createReadStream(filePath)
      .pipe(csv({ delimiter: "||" }))
      .on("data", row => rows.push(row))
      .on("end", () => {
        return resolve(rows);
      })
      .on("error", error => {
        return reject(error);
      });
  });
};

/**
 * Loads post body from text file
 * @param {object} filePath - The path of text file
 */
const loadDataFromTxt = async (path) => {
  return new Promise((resolve, reject) => {
    fs.readFile(path, "utf8", function (err, data) {
      if (err) {
        return reject(err);
      }
      resolve(data.toString().split("\n"));
    });
  });
}

// Load data from txt file once at the start of the execution
// and save the results in a global variable
(async () => {
  try {
    postData = await loadDataFromTxt(filePath);
  } catch (error) {
    console.log(`Error occurred in main. Detail: ${error}`);
  }
})();

module.exports.generatePostBody = generatePostBody; 

post-data.txt

{ "profile":{"name":"irfan","email":"irfan@email.com"},"address":["address1","address2"]}
{ "profile":{"name":"Tomas","email":"tomas@email.com"},"address":["address1","address2"]}
{ "profile":{"name":"Joel","email":"joel@email.com"},"address":["address1","address2"]}
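To sanity-check the random-pick-and-parse logic outside Artillery, here is a small standalone sketch of the same approach; the sample rows are inlined rather than read from post-data.txt, so it can run on its own:

```javascript
// Inline two sample rows in the same format as post-data.txt.
const fileContents = [
  '{ "profile":{"name":"irfan","email":"irfan@email.com"},"address":["address1","address2"]}',
  '{ "profile":{"name":"Tomas","email":"tomas@email.com"},"address":["address1","address2"]}'
].join("\n");

// Same logic as generatePostBody: split into lines, drop blanks,
// pick a random line and parse it into an object.
const postData = fileContents.split("\n").filter(line => line.trim().length > 0);
const body = JSON.parse(postData[Math.floor(Math.random() * postData.length)]);

console.log(body.profile.name);
```

Filtering out blank lines also guards against the trailing newline most editors add at the end of the data file, which would otherwise produce an empty string that crashes JSON.parse.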

HTH

4 reactions
hassy commented, Jan 29, 2020

Hi @ianonymousdev, something like this should work:

  payload:
    - path: post-data.csv
      delimiter: "~"
      fields:
        - column1
      options:
        quote: ""

I'd suggest running your scripts with Artillery locally first; that way you will see the actual error message rather than a generic error message from AWS Lambda.
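For example, assuming Artillery is installed via npm and the test script is saved as script.yml (both names are illustrative), a local run looks like:

```shell
# Run the load test from your machine; any YAML or processor
# errors are printed directly to the terminal.
npx artillery run script.yml
```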


