how does one actually run your atac pipeline? super frustrated
Hi,
Glad to see that there is an ENCODE ATAC-seq pipeline.
We are using an HPC cluster for the computing, and like most users we are not the admins, so it is often very difficult to install packages.
Anyway, I have git-cloned your ‘atac-seq-pipeline’ repository to a folder with the same name on our HPC. We have also git-cloned Caper.
I modified your minimal JSON input specification file to this (input.json):
{
    "atac.pipeline_type" : "atac",
    "atac.genome_tsv" : "https://storage.googleapis.com/encode-pipeline-genome-data/mm10_caper.tsv",
    "atac.fastqs_rep1_R1" : [
        "primary_seq/repa_1.fastq.gz",
        "primary_seq/repb_1.fastq.gz"
    ],
    "atac.fastqs_rep1_R2" : [
        "primary_seq/repa_2.fastq.gz",
        "primary_seq/repb_2.fastq.gz"
    ],
    "atac.paired_end" : true,
    "atac.multimapping" : 4,
    "atac.auto_detect_adapter" : true,
    "atac.trim_adapter.cpu" : 2,
    "atac.bowtie2_cpu" : 8,
    "atac.bowtie2_mem_mb" : 16000,
    "atac.filter_cpu" : 2,
    "atac.filter_mem_mb" : 12000,
    "atac.macs2_mem_mb" : 16000,
    "atac.smooth_win" : 73,
    "atac.enable_idr" : true,
    "atac.idr_thresh" : 0.05,
    "atac.enable_xcor" : true,
    "atac.title" : "Project1",
    "atac.description" : "Project1"
}
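(As a quick sanity check before submitting, and assuming python3 is on the PATH, the file can be validated for JSON syntax slips such as trailing commas, which Cromwell's parser rejects:
python3 -m json.tool input.json
This prints the parsed JSON on success or an error message pointing at the offending line.)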
Then I submitted a job running this command:
caper run atac.wdl -i $outdir/input.json --deepcopy --use-singularity
and I got this error immediately:
usage: caper run [-h] [--dry-run] [-i INPUTS] [-o OPTIONS] [-l LABELS]
[-p IMPORTS] [-s STR_LABEL] [--hold]
[--singularity-cachedir SINGULARITY_CACHEDIR] [--no-deepcopy]
[--deepcopy-ext DEEPCOPY_EXT]
[--docker [DOCKER [DOCKER ...]]]
[--singularity [SINGULARITY [SINGULARITY ...]]]
[--no-build-singularity] [--slurm-partition SLURM_PARTITION]
[--slurm-account SLURM_ACCOUNT]
[--slurm-extra-param SLURM_EXTRA_PARAM] [--sge-pe SGE_PE]
[--sge-queue SGE_QUEUE] [--sge-extra-param SGE_EXTRA_PARAM]
[--pbs-queue PBS_QUEUE] [--pbs-extra-param PBS_EXTRA_PARAM]
[-m METADATA_OUTPUT] [--java-heap-run JAVA_HEAP_RUN]
[--db-timeout DB_TIMEOUT] [--file-db FILE_DB] [--no-file-db]
[--mysql-db-ip MYSQL_DB_IP] [--mysql-db-port MYSQL_DB_PORT]
[--mysql-db-user MYSQL_DB_USER]
[--mysql-db-password MYSQL_DB_PASSWORD] [--cromwell CROMWELL]
[--max-concurrent-tasks MAX_CONCURRENT_TASKS]
[--max-concurrent-workflows MAX_CONCURRENT_WORKFLOWS]
[--max-retries MAX_RETRIES] [--disable-call-caching]
[--backend-file BACKEND_FILE] [--out-dir OUT_DIR]
[--tmp-dir TMP_DIR] [--gcp-prj GCP_PRJ]
[--gcp-zones GCP_ZONES] [--out-gcs-bucket OUT_GCS_BUCKET]
[--tmp-gcs-bucket TMP_GCS_BUCKET]
[--aws-batch-arn AWS_BATCH_ARN] [--aws-region AWS_REGION]
[--out-s3-bucket OUT_S3_BUCKET]
[--tmp-s3-bucket TMP_S3_BUCKET] [--use-gsutil-over-aws-s3]
[-b BACKEND] [--http-user HTTP_USER]
[--http-password HTTP_PASSWORD] [--use-netrc]
wdl
caper run: error: argument --deepcopy-ext: expected one argument
What seems to be the problem? It seems the caper run options are different from the documentation on the landing page of this project?
We have “python3/3.6.4” and “singularity/2.5.2” loaded on our HPC.
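(For reference, on HPC systems that use Environment Modules or Lmod, and assuming these exact module names, those would have been loaded with something like:
module load python3/3.6.4
module load singularity/2.5.2
)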
Top GitHub Comments
That’s not how the installation works. All dependencies are automatically installed. Jin will reply with the correct way to install the pipeline.
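(For context: Caper itself is distributed on PyPI, so a typical no-root install on an HPC looks something like the following, where --user keeps the install in the home directory:
pip3 install caper --user
)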
We will make a release with fixed documentation this week. Please remove --deepcopy from the command line and use --singularity instead of --use-singularity.
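Applying both of those changes to the command above would give something like:
caper run atac.wdl -i $outdir/input.json --singularity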