
JMeter does not fail build on Beanshell Assertion failure

I have a few Beanshell assertions which fail if a threshold has been exceeded. I clearly see the failure in the logs, but somehow the build isn't failing.

I understand the build can fail based on errorRateThresholdInPercent, but that is calculated/evaluated from actual failed requests rather than from failed assertions.

Is there a way to ensure the build fails based on failed assertions alone?

I have a Beanshell Assertion with the following code:

import org.apache.jmeter.util.JMeterUtils;
import java.lang.Math;

// round the stored 90th percentile to two decimal places
double addresses90thPercentile = Math.round(Double.parseDouble(vars.get("addresses90thPercentile")) * 100.0) / 100.0;
log.info("addresses90thPercentile = " + addresses90thPercentile + "ms\n");
System.out.println("addresses90thPercentile = " + addresses90thPercentile + "ms\n");

// assert percentile: Failure and FailureMessage are the special variables
// a Beanshell Assertion uses to mark the sample result as failed
if (addresses90thPercentile > Integer.parseInt(vars.get("addresses_response_threshold"))) {
    String failMsg = "/addresses 90th percentile was > " + vars.get("addresses_response_threshold") + "ms threshold. Actual 90th percentile => " + Double.toString(addresses90thPercentile) + "ms";
    log.error(failMsg);
    System.out.println(failMsg);
    Failure = true;
    FailureMessage = failMsg;
}

You can see in the logs that the assertion was triggered, via the failMsg value from the Beanshell snippet above:

[INFO] policyDetails90thPercentile = 25282.0ms
[INFO] 
[INFO] /policydetails 90th percentile was > 15000ms threshold.  Actual 90th percentile => 25282.0ms
[INFO] addresses90thPercentile = 30699.0ms
[INFO] 
[INFO] /addresses 90th percentile was > 4000ms threshold.  Actual 90th percentile => 30699.0ms
[INFO] summary +      2 in 00:00:01 =    2.0/s Avg: 15218 Min:    15 Max: 30422 Err:     2 (100.00%) Active: 0 Started: 1 Finished: 1
[INFO] summary =    170 in 00:02:45 =    1.0/s Avg:  8411 Min:     1 Max: 59513 Err:    13 (7.65%)
[INFO] Tidying up ...    @ Tue Jun 09 12:01:04 PDT 2020 (1591729264040)
[INFO] ... end of run
[INFO] Completed Test: /Users/me/github/css-api-test/performance/target/jmeter/testFiles/css_api_perf_build.jmx
[INFO]  
[INFO] 
[INFO] --- jmeter-maven-plugin:3.1.0:results (jmeter-check-results) @ css-api-perf-test ---
[INFO]  
[INFO] -------------------------------------------------------
[INFO] S C A N N I N G    F O R    R E S U L T S
[INFO] -------------------------------------------------------
[INFO]  
[INFO] Will scan results using format: CSV
[INFO]  
[INFO] Parsing results file '/Users/me/github/css-api-test/performance/target/jmeter/results/css_api_perf_build.csv' as type: CSV
[INFO] Number of failures in 'css_api_perf_build.csv': 13
[INFO] Number of successes in 'css_api_perf_build.csv': 157
[INFO]  
[INFO] -------------------------------------------------------
[INFO] P E R F O R M A N C E    T E S T    R E S U L T S
[INFO] -------------------------------------------------------
[INFO]  

I also tried disabling all the assertions within the test, leaving only the Beanshell assertion. I also removed the results goal, since I realized that when it is enabled it verifies the test using errorRateThresholdInPercent. Doing this, I noticed that nothing was evaluating the assertions.

My configuration looks like this:


<properties>
    <maven.compiler.target>1.8</maven.compiler.target>
    <maven.compiler.source>1.8</maven.compiler.source>
    <jmeter.maven.plugin>3.1.0</jmeter.maven.plugin>
    <jmeter.parallel>0.4</jmeter.parallel>
    <kg.apc.jmeter.plugins.standard>1.4.0</kg.apc.jmeter.plugins.standard>
    <kg.apc.jmeter.plugins.extras.libs>1.4.0</kg.apc.jmeter.plugins.extras.libs>
    <kg.apc.jmeter.plugins.graphs.ggl>2.0</kg.apc.jmeter.plugins.graphs.ggl>
    <kg.apc.jmeter.plugins.cmn.jmeter>0.6</kg.apc.jmeter.plugins.cmn.jmeter>
    <kg.apc.jmeter.plugins.manager>1.4</kg.apc.jmeter.plugins.manager>
    <kg.apc.jmeter.plugins.cmd>2.2</kg.apc.jmeter.plugins.cmd>
    <kg.apc.cmdrunner>2.2.1</kg.apc.cmdrunner>
    <com.google.code.gson>2.8.6</com.google.code.gson>
    <commons.codec>1.14</commons.codec>
    <commons.math3>3.6.1</commons.math3>
    <jmx.script.name>${script.name}</jmx.script.name>
</properties>

<build>
    <plugins>
        <plugin>
            <groupId>com.lazerycode.jmeter</groupId>
            <artifactId>jmeter-maven-plugin</artifactId>
            <version>${jmeter.maven.plugin}</version>
            <configuration>
                <suppressJMeterOutput>false</suppressJMeterOutput>
                <generateReports>true</generateReports>
                <scanResultsForFailedRequests>true</scanResultsForFailedRequests>
                <ignoreResultFailures>false</ignoreResultFailures>
                <errorRateThresholdInPercent>50</errorRateThresholdInPercent>
                <jmeterExtensions>
                    <artifact>com.blazemeter:jmeter-parallel:${jmeter.parallel}</artifact>
                    <artifact>kg.apc:jmeter-plugins-standard:${kg.apc.jmeter.plugins.standard}</artifact>
                    <artifact>kg.apc:jmeter-plugins-extras-libs:${kg.apc.jmeter.plugins.extras.libs}</artifact>
                </jmeterExtensions>
                <downloadExtensionDependencies>false</downloadExtensionDependencies>
                <testPlanLibraries>
                    <artifact>com.google.code.gson:gson:${com.google.code.gson}</artifact>
                    <artifact>com.farmers.css.api:css-api-core:1.0.7</artifact>
                    <artifact>com.farmers.css.api:css-api-test:1.0.4</artifact>
                    <artifact>com.farmers.css.api:css-api-salesforce-clients:1.0.4</artifact>
                    <artifact>com.farmers.css.util:css-encryption-utils:1.0.0</artifact>
                </testPlanLibraries>
                <downloadLibraryDependencies>false</downloadLibraryDependencies>
                <excludedArtifacts>
                    <!-- exclusion>org.apache.logging.log4j:log4j-slf4j-impl</exclusion-->
                    <exclusion>org.slf4j:slf4j-nop</exclusion>
                </excludedArtifacts>
                <overrideRootLogLevel>info</overrideRootLogLevel>
                <resultsFileFormat>xml</resultsFileFormat>
                <testResultsTimestamp>false</testResultsTimestamp>
                <propertiesUser>
                    <thread_count>${thread.count}</thread_count>
                    <ramp_up>${ramp.up}</ramp_up>
                    <peak>${peak}</peak>
                    <duration>${duration}</duration>
                    <css_api_hostname>${css.api.hostname}</css_api_hostname>
                    <tdm_backbone_hostname>${tdm.backbone.hostname}</tdm_backbone_hostname>
                    <tdm_backbone_port>${tdm.backbone.port}</tdm_backbone_port>
                    <environment>${environment}</environment>
                    <proxy_host>${proxy.host}</proxy_host>
                    <proxy_port>${proxy.port}</proxy_port>
                    <jwt_dstr_pct>${jwt.dstr.pct}</jwt_dstr_pct>
                    <policies_summary_dstr_pct>${policies.summary.dstr.pct}</policies_summary_dstr_pct>
                    <policy_details_dstr_pct>${policy.details.dstr.pct}</policy_details_dstr_pct>
                    <addresses_dstr_pct>${addresses.dstr.pct}</addresses_dstr_pct>
                    <jwt_response_threshold>${jwt.response.threshold}</jwt_response_threshold>
                    <policies_summary_response_threshold>${policies.summary.response.threshold}</policies_summary_response_threshold>
                    <policy_details_response_threshold>${policy.details.response.threshold}</policy_details_response_threshold>
                    <addresses_response_threshold>${addresses.response.threshold}</addresses_response_threshold>
                    <project_build_directory>${project.build.directory}</project_build_directory>
                </propertiesUser>
                <propertiesJMeter>
                    <jmeter.reportgenerator.exporter.html.series_filter>((^/jwt HTTP Request)|(^/policies/summary HTTP Request)|(^/policydetails HTTP Request)|(^/addresses HTTP Request))(-success|-failure)?$</jmeter.reportgenerator.exporter.html.series_filter>
                    <jmeter.reportgenerator.apdex_satisfied_threshold>1000</jmeter.reportgenerator.apdex_satisfied_threshold>
                    <jmeter.reportgenerator.apdex_tolerated_threshold>3000</jmeter.reportgenerator.apdex_tolerated_threshold>
                    <jmeter.save.saveservice.autoflush>true</jmeter.save.saveservice.autoflush>
                    <jmeter.save.saveservice.output_format>xml</jmeter.save.saveservice.output_format>
                    <jmeter.save.saveservice.assertion_results_failure_message>true</jmeter.save.saveservice.assertion_results_failure_message>
                    <jmeter.save.saveservice.data_type>true</jmeter.save.saveservice.data_type>
                    <jmeter.save.saveservice.label>true</jmeter.save.saveservice.label>
                    <jmeter.save.saveservice.response_code>true</jmeter.save.saveservice.response_code>
                    <jmeter.save.saveservice.response_message>true</jmeter.save.saveservice.response_message>
                    <jmeter.save.saveservice.successful>true</jmeter.save.saveservice.successful>
                    <jmeter.save.saveservice.thread_name>true</jmeter.save.saveservice.thread_name>
                    <jmeter.save.saveservice.time>true</jmeter.save.saveservice.time>
                    <jmeter.save.saveservice.connect_time>true</jmeter.save.saveservice.connect_time>
                    <jmeter.save.saveservice.assertions>true</jmeter.save.saveservice.assertions>
                    <jmeter.save.saveservice.latency>true</jmeter.save.saveservice.latency>
                    <jmeter.save.saveservice.bytes>true</jmeter.save.saveservice.bytes>
                    <jmeter.save.saveservice.url>true</jmeter.save.saveservice.url>
                    <jmeter.save.saveservice.thread_counts>true</jmeter.save.saveservice.thread_counts>
                    <jmeter.save.saveservice.sample_count>true</jmeter.save.saveservice.sample_count>
                    <!-- note: timestamp_format is defined twice; only one of the two values will take effect -->
                    <jmeter.save.saveservice.timestamp_format>ms</jmeter.save.saveservice.timestamp_format>
                    <jmeter.save.saveservice.timestamp_format>yyyy/MM/dd HH:mm:ss</jmeter.save.saveservice.timestamp_format>
                    <httpclient4.retrycount>6</httpclient4.retrycount>
                    <httpsampler.await_termination_timeout>60</httpsampler.await_termination_timeout>
                    <httpclient.timeout>30000</httpclient.timeout>
                    <http.socket.timeout>30000</http.socket.timeout>
                    <summariser.interval>30</summariser.interval>
                </propertiesJMeter>
                <testFilesIncluded>
                    <jMeterTestFile>${jmx.script.name}.jmx</jMeterTestFile>
                </testFilesIncluded>
            </configuration>
            <executions>
                <execution>
                    <id>configuration</id>
                    <goals>
                        <goal>configure</goal>
                    </goals>
                </execution>
                <execution>
                    <id>jmeter-tests</id>
                    <phase>verify</phase>
                    <goals>
                        <goal>jmeter</goal>
                    </goals>
                </execution>
                <execution>
                    <id>jmeter-check-results</id>
                    <goals>
                        <goal>results</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <artifactId>maven-antrun-plugin</artifactId>
            <executions>
                <execution>
                    <phase>verify</phase>
                    <goals>
                        <goal>run</goal>
                    </goals>
                    <configuration>
                        <tasks>
                            <echo>${project.basedir}</echo>
                            <echo>${project.build.directory}</echo>
                        </tasks>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

Top GitHub Comments

1 reaction
Ardesco commented, Jun 11, 2020

Also, switching the format to CSV is expected if you have <generateReports>true</generateReports>.
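
For context: JMeter's HTML report generator only consumes CSV results, which is why the plugin switches the results format to CSV whenever reports are enabled, even with <resultsFileFormat>xml</resultsFileFormat> set, as in the configuration above. A minimal sketch of the two interacting options, assuming you wanted to keep XML output and could live without the HTML report:

<configuration>
    <!-- HTML dashboard generation forces CSV results; disable it to keep XML -->
    <generateReports>false</generateReports>
    <!-- honoured only while report generation is disabled -->
    <resultsFileFormat>xml</resultsFileFormat>
</configuration>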

1 reaction
Ardesco commented, Mar 29, 2021

If you are logging failures in your csv/jtl file, you don't need to try to make the Beanshell script modify the exit code. You just need to set up the thresholds you want via config. (Ideally the build shouldn't fail just because you managed to run a test successfully; you will want to parse the results and work out what counts as a failure for you.)
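
In the meantime, that threshold-based approach can be expressed with options already present in the configuration above. A minimal sketch; since a failed assertion marks its sample as unsuccessful in the results file, a threshold of 0 should make any assertion failure fail the build:

<configuration>
    <!-- do not ignore failed samples when the results goal runs -->
    <ignoreResultFailures>false</ignoreResultFailures>
    <!-- fail the build when the failure rate exceeds this percentage; 0 fails on any failure -->
    <errorRateThresholdInPercent>0</errorRateThresholdInPercent>
</configuration>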

We could probably look at filtering our search for failures based on a specific failureMessage to help you target specific issues more easily. It also looks like we have a bug with case sensitivity: we are scanning the file for false, not FALSE. That's something we need to fix.

So I think we have the following:

  • Make the search for failures case-insensitive
  • Add the ability to supply a list of failureMessage matches so that you can fail only on targeted failures (an interim workaround along these lines is sketched below)
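
Until those options exist, a possible interim workaround is a small post-processing step wired into the build (for example via exec-maven-plugin, or an antrun <java> task alongside the antrun execution already shown above) that rescans the results CSV itself. The class below is a sketch, not part of the plugin: it assumes JMeter's default CSV header names (success, failureMessage) and a header row in the file, and it splits rows naively on commas, so it would need hardening for failure messages that themselves contain commas.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Arrays;
import java.util.List;

// Hypothetical post-processing check: exits non-zero only when a sample
// failed AND its failureMessage matches a target string, with the success
// flag compared case-insensitively so that both "false" and "FALSE" count.
public class AssertionFailureScanner {

    public static void main(String[] args) throws IOException {
        String resultsFile = args[0];    // e.g. target/jmeter/results/css_api_perf_build.csv
        String targetMessage = args[1];  // e.g. "90th percentile"

        // assumes the CSV was saved with a header row (JMeter's default)
        List<String> lines = Files.readAllLines(Paths.get(resultsFile));
        List<String> header = Arrays.asList(lines.get(0).split(","));
        int successCol = header.indexOf("success");
        int messageCol = header.indexOf("failureMessage");
        if (successCol < 0 || messageCol < 0) {
            throw new IllegalStateException("Unexpected CSV header: " + lines.get(0));
        }

        long failures = lines.stream()
                .skip(1)                           // skip the header row
                .map(line -> line.split(",", -1))  // naive split: assumes no commas inside fields
                .filter(cols -> cols.length > Math.max(successCol, messageCol))
                .filter(cols -> "false".equalsIgnoreCase(cols[successCol]))
                .filter(cols -> cols[messageCol].toLowerCase().contains(targetMessage.toLowerCase()))
                .count();

        if (failures > 0) {
            System.err.println(failures + " targeted assertion failure(s) in " + resultsFile);
            System.exit(1);  // non-zero exit fails the build step that launched this check
        }
    }
}

Running it against the results file with a message fragment such as "90th percentile", and letting the non-zero exit code break the build, would give roughly the targeted, case-insensitive behaviour described in the two points above.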
