Exception in task_file_mover when ingressing files from other batch tasks
I’ve set up a job that contains several fetch tasks and a single processing task that depends on them. For convenience, I tried using the Azure Batch input_data type in the processing task to pull in all of the data from the preceding fetch tasks, but I’m running into this exception from task_file_mover:
Traceback (most recent call last):
  File "task_file_mover.py", line 148, in <module>
    main()
  File "task_file_mover.py", line 123, in main
    batch_client = _create_credentials()
  File "task_file_mover.py", line 60, in _create_credentials
    ba, url, bakey = os.environ['SHIPYARD_BATCH_ENV'].split(';')
ValueError: not enough values to unpack (expected 3, got 2)
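The failure can be reproduced in isolation: _create_credentials expects SHIPYARD_BATCH_ENV to contain three semicolon-separated fields, but in this KeyVault scenario only two appear to be present (presumably because the account key is not materialized into the environment variable). A minimal sketch of the problem and a tolerant parse; the field names and env value here are illustrative, not the actual Batch Shipyard internals:

```python
import os

# Simulate the env var as it appears to be set when the account key
# comes from KeyVault: account and service URL only, no key (assumption).
os.environ['SHIPYARD_BATCH_ENV'] = (
    'myaccount;https://myaccount.westus.batch.azure.com'
)

fields = os.environ['SHIPYARD_BATCH_ENV'].split(';')

try:
    ba, url, bakey = fields  # expects exactly 3 values -> raises here
except ValueError as exc:
    print(exc)  # "not enough values to unpack (expected 3, got 2)"

# A tolerant parse that allows the key to be absent:
ba, url, *rest = fields
bakey = rest[0] if rest else None
```

This mirrors the traceback above: `str.split(';')` returns two items, and the three-target unpacking raises ValueError.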
I’m using KeyVault for supplying the batch credentials, like:
{
  "credentials": {
    "batch": {
      "account": "myaccount",
      "account_key_keyvault_secret_id": "https://myvault.vault.azure.net/secrets/batchkey",
      "account_service_url": "https://myaccount.westus.batch.azure.com"
    }
  }
}
Issue Analytics
- Created: 7 years ago
- Comments: 12 (3 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I have validated that your fix in the “develop” branch has solved my problem - thank you very much! I will also remember to use “depends_on” - thank you for pointing it out! (Perhaps you can consider updating the documentation at http://batch-shipyard.readthedocs.io/en/latest/14-batch-shipyard-configuration-jobs/ where input_data -> azure_batch is mentioned?)
For the third point, you will want to enforce task dependencies. You are getting “lucky” in this case, as the download_data task is getting scheduled ahead of the other task that requires those files while that task is blocked from scheduling for reasons such as insufficient available slots on compute nodes.
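Enforcing the ordering explicitly means giving each fetch task an id and listing those ids in the processing task's depends_on, so the scheduler never starts the consumer before its producers complete. A sketch of the relevant fragment of a jobs configuration, in the same JSON style as the credentials config above; the job/task ids and commands are placeholders, not taken from the original issue:

```json
{
  "job_specifications": [
    {
      "id": "myjob",
      "tasks": [
        {
          "id": "fetch1",
          "command": "/bin/bash -c \"download the first dataset\""
        },
        {
          "id": "fetch2",
          "command": "/bin/bash -c \"download the second dataset\""
        },
        {
          "id": "process",
          "depends_on": ["fetch1", "fetch2"],
          "command": "/bin/bash -c \"process the fetched data\""
        }
      ]
    }
  ]
}
```

With the dependency declared, the "lucky" scheduling order described above stops being a prerequisite for correctness.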