
how actually does one run your atac pipeline? super frustrated

See original GitHub issue

Hi, glad to see that there is this ENCODE ATAC-seq pipeline. We are using an HPC to perform the computing and, like most users, we are not admins, so it is often very difficult to install packages. Anyway, I’ve git cloned your ‘atac-seq-pipeline’ repository into a folder of the same name on our HPC, and we have also git cloned Caper. I modified your minimum JSON input specification file to this [input.json]:

{
    "atac.pipeline_type" : "atac",
    "atac.genome_tsv" : "https://storage.googleapis.com/encode-pipeline-genome-data/mm10_caper.tsv",
    "atac.fastqs_rep1_R1" : [
        "primary_seq/repa_1.fastq.gz",
        "primary_seq/repb_1.fastq.gz",
    ],
    "atac.fastqs_rep1_R2" : [
        "primary_seq/repa_2.fastq.gz",
        "primary_seq/repb_2.fastq.gz",
    ],
    "atac.paired_end" : true,
    "atac.multimapping" : 4,
    "atac.auto_detect_adapter" : true,
    "atac.trim_adapter.cpu" : 2,
    "atac.bowtie2_cpu" : 8,
    "atac.bowtie2_mem_mb" : 16000,
    "atac.filter_cpu" : 2,
    "atac.filter_mem_mb" : 12000,
    "atac.macs2_mem_mb" : 16000,
    "atac.smooth_win" : 73,
    "atac.enable_idr" : true,
    "atac.idr_thresh" : 0.05,
    "atac.enable_xcor" : true,
    "atac.title" : "Project1",
    "atac.description" : "Project1"
}

And then I submitted a job by running this line: caper run atac.wdl -i $outdir/input.json --deepcopy --use-singularity, and I got this error immediately:

usage: caper run [-h] [--dry-run] [-i INPUTS] [-o OPTIONS] [-l LABELS]
                 [-p IMPORTS] [-s STR_LABEL] [--hold]
                 [--singularity-cachedir SINGULARITY_CACHEDIR] [--no-deepcopy]
                 [--deepcopy-ext DEEPCOPY_EXT]
                 [--docker [DOCKER [DOCKER ...]]]
                 [--singularity [SINGULARITY [SINGULARITY ...]]]
                 [--no-build-singularity] [--slurm-partition SLURM_PARTITION]
                 [--slurm-account SLURM_ACCOUNT]
                 [--slurm-extra-param SLURM_EXTRA_PARAM] [--sge-pe SGE_PE]
                 [--sge-queue SGE_QUEUE] [--sge-extra-param SGE_EXTRA_PARAM]
                 [--pbs-queue PBS_QUEUE] [--pbs-extra-param PBS_EXTRA_PARAM]
                 [-m METADATA_OUTPUT] [--java-heap-run JAVA_HEAP_RUN]
                 [--db-timeout DB_TIMEOUT] [--file-db FILE_DB] [--no-file-db]
                 [--mysql-db-ip MYSQL_DB_IP] [--mysql-db-port MYSQL_DB_PORT]
                 [--mysql-db-user MYSQL_DB_USER]
                 [--mysql-db-password MYSQL_DB_PASSWORD] [--cromwell CROMWELL]
                 [--max-concurrent-tasks MAX_CONCURRENT_TASKS]
                 [--max-concurrent-workflows MAX_CONCURRENT_WORKFLOWS]
                 [--max-retries MAX_RETRIES] [--disable-call-caching]
                 [--backend-file BACKEND_FILE] [--out-dir OUT_DIR]
                 [--tmp-dir TMP_DIR] [--gcp-prj GCP_PRJ]
                 [--gcp-zones GCP_ZONES] [--out-gcs-bucket OUT_GCS_BUCKET]
                 [--tmp-gcs-bucket TMP_GCS_BUCKET]
                 [--aws-batch-arn AWS_BATCH_ARN] [--aws-region AWS_REGION]
                 [--out-s3-bucket OUT_S3_BUCKET]
                 [--tmp-s3-bucket TMP_S3_BUCKET] [--use-gsutil-over-aws-s3]
                 [-b BACKEND] [--http-user HTTP_USER]
                 [--http-password HTTP_PASSWORD] [--use-netrc]
                 wdl
caper run: error: argument --deepcopy-ext: expected one argument

What seems to be the problem? It seems the caper run options are different from the documentation on the landing page of this project.

We have “python3/3.6.4” and “singularity/2.5.2” loaded on our HPC.

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 9 (5 by maintainers)

Top GitHub Comments

1 reaction
akundaje commented, Sep 18, 2019

That’s not how the installation works. All dependencies are automatically installed. Jin will reply with the correct way to install the pipeline.
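
For context, the “automatic” installation referred to here generally means installing the Caper CLI itself and cloning the pipeline repository for its WDL; the pipeline’s tool dependencies are then pulled in at runtime via Conda, Docker, or Singularity rather than installed by hand on the cluster. The exact commands are not quoted in this thread, so the following is only a rough sketch, assuming a pip-based user install of Caper (it is distributed on PyPI) since HPC users usually lack admin rights:

pip install --user caper                                    # installs the caper CLI for the current user
git clone https://github.com/ENCODE-DCC/atac-seq-pipeline   # provides atac.wdl and example inputs
cd atac-seq-pipeline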

1 reaction
leepc12 commented, Sep 17, 2019

We will make a release with fixed documentation sometime this week. Please remove --deepcopy from the command line and use --singularity instead of --use-singularity.
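
Applying that fix to the command from the question, the invocation would look roughly like this (a sketch only; $outdir and the input file are the asker’s own paths, and everything else is left unchanged):

caper run atac.wdl -i $outdir/input.json --singularity      # --deepcopy dropped, --use-singularity replaced by --singularity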

Read more comments on GitHub.

Top Results From Across the Web

Single-Library Analysis with cellranger-atac count - Support
Cell Ranger ATAC's pipelines analyze sequencing data produced from Chromium Single Cell ATAC libraries. This involves the following steps: Run cellranger-atac ...

From reads to insight: a hitchhiker's guide to ATAC-seq data ...
Pre-analysis: quality control and alignment. The first step of ATAC-seq analysis involves pre-alignment QC, read alignment to a reference ...

I-ATAC: interactive pipeline for the management and pre ...
It is an interactive large-scale platform for NGS data analysis, which takes Illumina-generated FASTQ files as input and produces BED files as ...

Analysis of ATAC-seq data in R and Bioconductor
We will process one sample of the Greenleaf data from fastq to BAM to allow us to review some of the features of ...

ATAC-seq Data Standards and Processing Pipeline – ENCODE
The ATAC-seq pipeline was developed by Anshul Kundaje's lab at Stanford University. Upon revision and full implementation, it will be a part ...
