
Negative gasPrice and gasLimit

See original GitHub issue

Hello and thank you for creating this very useful open source project!

I started using it yesterday and ran into a couple of issues; I'm hoping you can point me in the right direction. I'm including an example app here for reproducing the problems.

My setup:

  • Ubuntu 16.04 (64-bit)
  • Scala 2.11.7
  • Spark 2.2.1 (Hadoop 2.7) running in local mode
  • Geth 1.7.3-stable

My Geth node is synced to the mainnet and, following the advice here, I created multiple files holding the exported blockchain (up to block number 5M).
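For reference, slicing the export like this is done by passing a first and last block number to geth export (a sketch only; flags such as --datadir depend on your setup, and the file names and ranges below are illustrative, mirroring the listing that follows):

```shell
# Export the chain in 200k-block slices (paths and ranges are illustrative).
geth export blocks-0-200000 0 200000
geth export blocks-200001-400000 200001 400000
# ...and so on for each subsequent range, up to block 5000000
```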

Here’s a snippet of the directory, with the file names using a slightly different nomenclature than the example, but still produced with geth export:

➜  eth-blockchain ls -lah
total 25G
drwxrwxr-x  2 sancho sancho 4.0K Feb 13 16:22 .
drwxr-xr-x 43 sancho sancho 4.0K Feb 13 16:22 ..
-rwxrwxr-x  1 sancho sancho 126M Feb 13 03:01 blocks-0-200000
-rwxrwxr-x  1 sancho sancho 240M Feb 13 03:32 blocks-1000001-1200000
-rwxrwxr-x  1 sancho sancho 274M Feb 13 03:33 blocks-1200001-1400000
-rwxrwxr-x  1 sancho sancho 299M Feb 13 03:33 blocks-1400001-1600000
-rwxrwxr-x  1 sancho sancho 307M Feb 13 03:40 blocks-1600001-1800000
-rwxrwxr-x  1 sancho sancho 290M Feb 13 03:40 blocks-1800001-2000000
-rwxrwxr-x  1 sancho sancho 301M Feb 13 03:41 blocks-2000001-2200000
-rwxrwxr-x  1 sancho sancho 148M Feb 13 03:02 blocks-200001-400000
-rwxrwxr-x  1 sancho sancho 332M Feb 13 03:41 blocks-2200001-2400000
...

The code for the app I’m using:

package analytics

import collection.JavaConverters._
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.BytesWritable
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.zuinnote.hadoop.ethereum.format.common.EthereumBlock
import org.zuinnote.hadoop.ethereum.format.mapreduce.EthereumBlockFileInputFormat


object TestApp {

  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf()
      .setAppName("test-app")
      .setMaster("local[*]")

    val spark = SparkSession.builder
      .config(sparkConf)
      .getOrCreate()

    val hadoopConf = new Configuration()
    hadoopConf.set("hadoopcryptoledger.ethereumblockinputformat.usedirectbuffer", "false")

    val rdd = spark.sparkContext.newAPIHadoopFile(
      "/home/sancho/eth-blockchain",
      classOf[EthereumBlockFileInputFormat], classOf[BytesWritable], classOf[EthereumBlock], hadoopConf
    )

    println("Number of transactions with negative gas price: " + rdd
      .flatMap(_._2.getEthereumTransactions.asScala)
      .filter(_.getGasPrice < 0)
      .count()
    )

    println("Number of transactions with negative gas limit: " + rdd
      .flatMap(_._2.getEthereumTransactions.asScala)
      .filter(_.getGasLimit < 0)
      .count()
    )

    val blockNumber = 4800251

    println(s"Number of transactions with negative gas price in block $blockNumber: " + rdd
        .filter(_._2.getEthereumBlockHeader.getNumber == blockNumber)
        .flatMap(_._2.getEthereumTransactions.asScala)
        .filter(_.getGasPrice < 0)
        .count()
    )

    println(s"Number of transactions with negative gas limit in block $blockNumber: " + rdd
      .filter(_._2.getEthereumBlockHeader.getNumber == blockNumber)
      .flatMap(_._2.getEthereumTransactions.asScala)
      .filter(_.getGasLimit < 0)
      .count()
    )
  }
}

This is the build.sbt file:

lazy val commonSettings = Seq(
  scalaVersion := "2.11.7",
  test in assembly := {}
)

lazy val ethBlockchainAnalytics = (project in file(".")).
  settings(commonSettings).
  settings(
    name := "EthBlockchainAnalytics",
    version := "0.1",
    libraryDependencies ++= Seq(
      "com.github.zuinnote" %% "spark-hadoopcryptoledger-ds" % "1.1.2",
      "org.apache.spark" %% "spark-core" % "2.2.1" % "provided",
      "org.apache.spark" %% "spark-sql" % "2.2.1" % "provided"),
    assemblyJarName in assembly := s"${name.value}_${scalaBinaryVersion.value}-${version.value}.jar",
    assemblyMergeStrategy in assembly := {
      case PathList("META-INF", xs@_*) => MergeStrategy.discard
      case PathList("javax", "servlet", xs@_*) => MergeStrategy.last
      case PathList("org", "apache", xs@_*) => MergeStrategy.last
      case x =>
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(x)
    }
  )

The launcher script I’m using:

#!/bin/sh

JAR=$1

/usr/local/lib/spark/bin/spark-submit \
    --class analytics.TestApp \
    --driver-memory 20G \
$JAR

And finally the command I’m using to run it:

➜  EthBlockchainAnalytics src/main/resources/launcher.sh /home/sancho/IdeaProjects/EthBlockchainAnalytics/target/scala-2.11/EthBlockchainAnalytics_2.11-0.1.jar

The output of the above application when run like this is:

Number of transactions with negative gas price: 8732263
Number of transactions with negative gas limit: 25699923
Number of transactions with negative gas price in block 4800251: 2
Number of transactions with negative gas limit in block 4800251: 8

As a quick sanity check, I ran the following:

➜  ~ geth attach
Welcome to the Geth JavaScript console!

instance: Geth/v1.7.3-stable-4bb3c89d/linux-amd64/go1.9
 modules: admin:1.0 debug:1.0 eth:1.0 miner:1.0 net:1.0 personal:1.0 rpc:1.0 txpool:1.0 web3:1.0

> var txs = eth.getBlock(4800251).transactions
undefined
> for (var i=0; i<txs.length; i++) { if (eth.getTransaction(txs[i]).gasPrice < 0) console.log(txs[i]) }
undefined

The loop printed nothing, i.e. according to Geth there are no transactions with a negative gas price in that block. Any idea why I'm seeing so many negative gas prices and gas limits when using Spark?
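As an aside: when a 64-bit field has been misread as signed, the original unsigned value is still recoverable from the bit pattern. A sketch using only the standard library, assuming the getter returned a signed Long (the value here is made up):

```scala
// A signed Long whose bit pattern actually encodes an unsigned number.
val misread: Long = -1L // bit pattern 0xFFFFFFFFFFFFFFFF

// java.lang.Long.toUnsignedString renders the 64-bit pattern as an unsigned decimal.
val recovered = BigInt(java.lang.Long.toUnsignedString(misread))

println(recovered) // 18446744073709551615, i.e. 2^64 - 1
```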

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 13 (9 by maintainers)

Top GitHub Comments

1 reaction
jornfranke commented, Feb 13, 2018

Fixed (see the latest commit). As suspected, the issue was that Java's numeric types are signed, whereas Ethereum's values are unsigned. I will release the fix in the coming days.

Thank you a lot for reporting.

1 reaction
jornfranke commented, Feb 13, 2018

I think it is a bug in the library. We convert the values to a Java datatype, but Java's numeric types are signed, while these Ethereum fields should be unsigned (a kind of type that does not exist in Java). I will prepare a test case to verify this and, if confirmed, publish a fix this week.

Thx for reporting.
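The signed/unsigned mismatch described above can be sketched as follows. This is a minimal illustration with a made-up two-byte value, not the library's actual decoding code:

```scala
import java.math.BigInteger

// Ethereum encodes gasPrice and gasLimit as unsigned big-endian byte strings (RLP).
// Take a made-up two-byte value whose leading bit is set:
val raw: Array[Byte] = Array(0x85.toByte, 0x4a.toByte)

// Signed interpretation (the bug): BigInteger(bytes) reads the top bit as a
// sign bit, so the value comes out negative.
val signedVal = new BigInteger(raw)       // -31414

// Unsigned interpretation (the fix): signum = 1 treats all bits as magnitude.
val unsignedVal = new BigInteger(1, raw)  // 0x854a = 34122

println(s"signed=$signedVal, unsigned=$unsignedVal")
```

Any byte array whose leading bit happens to be set is affected, which is why so many transactions showed negative values.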


Read more comments on GitHub >

Top Results From Across the Web

  • Guide to Ethereum: What is Gas, Gas Limit and Gas Price?
    Everything you need to know about ETH gas price, what is gas limit, what is gwei, why ETH transactions need gas, and why...
  • Guide to Ethereum: What is Gas, Gas Limit and Gas Price?
    This article breaks down the concept of gas, gas limit and gas price, which is a central feature of the Ethereum (ETH) Blockchain...
  • ethers gasLimit gasPrice - Ethereum Stack Exchange
    I am trying to set gasPrice and gasLimit since I am getting a gas estimation error. I just don't understand where/how to insert...
  • How Gas Fees Work on the Ethereum Blockchain - Investopedia
    A higher gas limit usually means the user believes the transaction will require more work. "Gas price" is the price per unit of...
  • Gas and fees - Ethereum.org
    In the transaction, the gas limit is 21,000 units, and the gas price is 200 gwei. Total fee would have been: Gas units...
