Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Failed to evaluate job outputs - IOException: Could not read from s3...

See original GitHub issue

While testing cromwell-36 with AWS Batch, I was able to reproduce this error:

2019-02-25 09:38:52,508 cromwell-system-akka.dispatchers.engine-dispatcher-24 ERROR - WorkflowManagerActor Workflow b6b9322c-3929-4b72-9598-45d97dfb858d failed (during ExecutingWorkflowState): cromwell.backend.standard.StandardAsyncExecutionActor$$anon$2: Failed to evaluate job outputs:
Bad output 'print_nach_nachman_meuman.out': [Attempted 1 time(s)] - IOException: Could not read from s3://nrglab-cromwell-genomics/cromwell-execution/run_multiple_tests/b6b9322c-3929-4b72-9598-45d97dfb858d/call-test_cromwell_on_aws/shard-61/SingleTest.test_cromwell_on_aws/f8ecf673-ed61-4b06-b1d6-c20f7efe986e/call-print_nach_nachman_meuman/print_nach_nachman_meuman-stdout.log: Cannot access file: s3://s3.amazonaws.com/nrglab-cromwell-genomics/cromwell-execution/run_multiple_tests/b6b9322c-3929-4b72-9598-45d97dfb858d/call-test_cromwell_on_aws/shard-61/SingleTest.test_cromwell_on_aws/f8ecf673-ed61-4b06-b1d6-c20f7efe986e/call-print_nach_nachman_meuman/print_nach_nachman_meuman-stdout.log
        at cromwell.backend.standard.StandardAsyncExecutionActor.$anonfun$handleExecutionSuccess$1(StandardAsyncExecutionActor.scala:867)

The error occurs when running many sub-workflows within a single wrapping workflow. The environment is configured correctly, and the same test usually passes when running fewer than 30 sub-workflows.
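A quick way to tell whether the object is genuinely missing or only transiently unreadable is to read it back directly with a retry loop. Here is a minimal sketch using boto3; the bucket and key are taken from the log above, and the retry count and backoff values are arbitrary:

import time

import boto3
from botocore.exceptions import ClientError

BUCKET = "nrglab-cromwell-genomics"
KEY = ("cromwell-execution/run_multiple_tests/"
       "b6b9322c-3929-4b72-9598-45d97dfb858d/call-test_cromwell_on_aws/"
       "shard-61/SingleTest.test_cromwell_on_aws/"
       "f8ecf673-ed61-4b06-b1d6-c20f7efe986e/"
       "call-print_nach_nachman_meuman/print_nach_nachman_meuman-stdout.log")

s3 = boto3.client("s3", region_name="us-east-1")

# Retry the read with exponential backoff: if the object turns up after a
# delay, the IOException looks transient rather than a missing output.
for attempt in range(5):
    try:
        body = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()
        print(f"read {len(body)} bytes on attempt {attempt + 1}")
        break
    except ClientError as err:
        print(f"attempt {attempt + 1}: {err.response['Error']['Code']}")
        time.sleep(2 ** attempt)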

Here are the workflows:

run_multiple_test.wdl

import "three_task_sequence.wdl" as SingleTest

workflow run_multiple_tests {
    scatter (i in range(30)) {
        call SingleTest.three_task_sequence
    }
}

three_task_sequence.wdl

workflow three_task_sequence {
    call print_nach

    call print_nach_nachman {
        input:
            previous = print_nach.out
    }

    call print_nach_nachman_meuman {
        input:
            previous = print_nach_nachman.out
    }

    output {
        Array[String] out = print_nach_nachman_meuman.out
    }
}

task print_nach {
    command {
        echo "nach"
    }
    output {
        Array[String] out = read_lines(stdout())
    }
    runtime {
        docker: "ubuntu:latest"
        maxRetries: 3
    }
}

task print_nach_nachman {
    Array[String] previous

    command {
        echo ${sep=' ' previous} " nachman"
    }
    output {
        Array[String] out = read_lines(stdout())
    }
    runtime {
        docker: "ubuntu:latest"
        maxRetries: 3
    }
}

task print_nach_nachman_meuman {
    Array[String] previous

    command {
        echo ${sep=' ' previous} " meuman"
    }
    output {
        Array[String] out = read_lines(stdout())
    }
    runtime {
        docker: "ubuntu:latest"
        maxRetries: 3
    }
}
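For reference, this is roughly how the wrapper gets submitted to Cromwell in server mode. A minimal sketch, assuming the server from the config below is reachable on localhost:8001; the imported WDL is shipped as a zip of dependencies:

import io
import zipfile

import requests

# The import in run_multiple_test.wdl is resolved from a zip passed as
# workflowDependencies, so bundle three_task_sequence.wdl into one.
deps = io.BytesIO()
with zipfile.ZipFile(deps, "w") as zf:
    zf.write("three_task_sequence.wdl")

with open("run_multiple_test.wdl", "rb") as wf:
    resp = requests.post(
        "http://localhost:8001/api/workflows/v1",  # port from the config below
        files={
            "workflowSource": wf,
            "workflowDependencies": ("deps.zip", deps.getvalue()),
        },
    )
print(resp.json())  # e.g. {"id": "...", "status": "Submitted"}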

Here is the cromwell-conf:

// aws.conf
include required(classpath("application"))

webservice {
  port = 8001
  interface = 0.0.0.0
}

aws {
  application-name = "cromwell"
  auths = [{
      name = "default"
      scheme = "default"
  }]
  region = "us-east-1"
}

engine {
  filesystems {
    s3 { auth = "default" }
  }
}

backend {
  default = "AWSBATCH"
  providers {
    AWSBATCH {
      actor-factory = "cromwell.backend.impl.aws.AwsBatchBackendLifecycleActorFactory"
      config {
        root = "s3://nrglab-cromwell-genomics/cromwell-execution"
        auth = "default"

        numSubmitAttempts = 3
        numCreateDefinitionAttempts = 3

        concurrent-job-limit = 100

        default-runtime-attributes {
          queueArn: "arn:aws:batch:us-east-1:66:job-queue/GenomicsDefaultQueue"
        }

        filesystems {
          s3 {
            auth = "default"
          }
        }
      }
    }
  }
}

system {
  job-rate-control {
    jobs = 1
    per = 1 second
  }
}
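With the server running on port 8001 as configured above, the per-shard failures can also be pulled from Cromwell's metadata endpoint instead of grepping the logs. A minimal sketch, assuming a local server and using the workflow id from the error above; expandSubWorkflows is needed to see inside the scattered sub-workflows:

import requests

WORKFLOW_ID = "b6b9322c-3929-4b72-9598-45d97dfb858d"  # id from the error above

# Fetch the full metadata tree, expanding sub-workflows so the scattered
# three_task_sequence calls are visible.
meta = requests.get(
    f"http://localhost:8001/api/workflows/v1/{WORKFLOW_ID}/metadata",
    params={"expandSubWorkflows": "true"},
).json()

def report_failures(metadata, prefix=""):
    """Walk the call tree and print every shard that did not finish cleanly."""
    for call_name, attempts in metadata.get("calls", {}).items():
        for attempt in attempts:
            status = attempt.get("executionStatus")
            if status != "Done":
                print(f"{prefix}{call_name} shard {attempt.get('shardIndex')}: {status}")
            if "subWorkflowMetadata" in attempt:
                report_failures(attempt["subWorkflowMetadata"], prefix + "  ")

report_failures(meta)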

I would appreciate help with this. Has Cromwell ever been tested with many parallel sub-workflows running on AWS?

Thanks!

Issue Analytics

  • State: open
  • Created: 5 years ago
  • Comments: 23 (2 by maintainers)

Top GitHub Comments

2 reactions
ptdtan commented, Jun 8, 2021

Still getting this error today.

2 reactions
sscho commented, May 13, 2021

Hmmm, still stuck on this. Any updates from your end? I tried cloning and resubmitting, still getting the same error.

Read more comments on GitHub >

Top Results From Across the Web

Spark 1.6.1 S3 MultiObjectDeleteException - Stack Overflow
I'm getting the same issue with a non-spark streaming job. Spark 1.6.2, Hadoop 2.6. Doesn't work with the direct output committer either. –...

Resolve errors uploading data to or downloading data from ...
I want to download data from Amazon Aurora and upload it to Amazon S3. How can I resolve an error I received while...

Committing work to S3 with the S3A Committers
Magic output committer task fails “The specified upload does not exist” “Error Code: NoSuchUpload”; Job commit fails “java.io.

May 18, 2022•Knowledge 000148254 - Search
In Data Engineering Integration (BDM), S3 to S3 mapping fails with the following error messages in spark mode when mapping is run for...

Troubleshooting - nf-core
nextflow run nf-core/<pipeline_name> -profile test,docker ... This does not tell you why the job failed to submit, but is often due to...
