
Question: Manage multiple progress bars in a multiprocessing environment

See original GitHub issue

Hi,

First, thanks a lot for this beautiful library. I wanted to know whether it is possible to manage a Progress object with multiple progress bars in a multiprocessing environment. I just want a very basic progress bar for every process. My first attempts did not succeed because the Progress object cannot be pickled (due to its internal RLock?): TypeError: can't pickle _thread.RLock objects.
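
For reference, a minimal sketch of the kind of call that triggers this error (the names and the explicit spawn start method here are illustrative, not from the issue): passing the Progress instance as an argument to a child process forces it, internal RLock and all, through pickle.

import multiprocessing as mp

from rich.progress import Progress


def child(progress, task_id):
    # hypothetical worker that tries to advance a bar owned by the parent
    progress.advance(task_id)


if __name__ == "__main__":
    # spawn (the default on Windows and macOS) pickles the child's arguments
    ctx = mp.get_context("spawn")
    with Progress() as progress:
        task_id = progress.add_task("work", total=10)
        # Raises: TypeError: can't pickle _thread.RLock objects,
        # because Progress holds an RLock internally.
        proc = ctx.Process(target=child, args=(progress, task_id))
        proc.start()
        proc.join()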

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Reactions: 1
  • Comments: 6

Top GitHub Comments

2 reactions
ChristianMichelsen commented, Jan 25, 2021

@willmcgugan Thanks a lot for your answer. After a bit of research, debugging, and generally playing around with the multiprocessing module, I got the following to work:

from multiprocessing import Pool, Queue

from rich.progress import Progress

# NUM_CORES, very_slow_initialisation, fast_function and get_filenames
# are defined elsewhere in the author's code.


def worker(queue_in, queue_out):
    # Runs once per pool process (as the pool initializer): do the slow
    # per-process setup, then loop over incoming filenames until a None
    # sentinel arrives.
    init = very_slow_initialisation()
    while True:
        filename = queue_in.get(block=True)
        if filename is None:
            break
        result = fast_function(filename, init)
        queue_out.put((filename, result))


def main(df):

    queue_in = Queue()
    queue_out = Queue()
    # worker is passed as the initializer, so every pool process runs the loop above
    the_pool = Pool(NUM_CORES, worker, (queue_in, queue_out))

    filenames = get_filenames()
    N = len(filenames)

    with Progress() as progress:
        task_id = progress.add_task("Task", total=N)

        for filename in filenames:
            queue_in.put(filename)

        # collect the results, advancing the bar once per finished file
        results = []
        for _ in range(N):
            results.append(queue_out.get())
            progress.advance(task_id)

    # send the sentinels that make the worker loops exit
    for _ in range(N):
        queue_in.put(None)

    # prevent adding anything more to the queue and wait for the queue to empty
    queue_in.close()
    queue_in.join_thread()

    # prevent adding anything more to the process pool and wait for all processes to finish
    the_pool.close()
    the_pool.join()

    return results

2 reactions
jpfeuffer commented, Jul 8, 2020

Thanks a lot, it did help. It worked fine after restructuring: I used a Pipe and let the spawned processes send their updates, tagged with their task_id, to the main process.
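
A rough sketch of that Pipe-based pattern (the worker body and the workloads below are illustrative assumptions, not the commenter's actual code): each spawned process gets the write end of a Pipe plus its own task_id, sends (task_id, advance) tuples back, and only the main process ever touches the Progress object.

import multiprocessing as mp
from multiprocessing.connection import wait

from rich.progress import Progress


def worker(conn, task_id, n_items):
    # Hypothetical work loop: report one unit of progress per item, then close.
    for _ in range(n_items):
        ...  # one unit of real work would go here
        conn.send((task_id, 1))
    conn.close()


if __name__ == "__main__":
    workloads = [10, 20, 15]  # illustrative item counts, one per process

    with Progress() as progress:
        readers, processes = [], []
        for n_items in workloads:
            recv_conn, send_conn = mp.Pipe(duplex=False)
            task_id = progress.add_task(f"worker ({n_items} items)", total=n_items)
            proc = mp.Process(target=worker, args=(send_conn, task_id, n_items))
            proc.start()
            send_conn.close()  # the parent only reads; the child keeps the write end
            readers.append(recv_conn)
            processes.append(proc)

        # Only the main process updates the bars.
        while readers:
            for conn in wait(readers):
                try:
                    task_id, advance = conn.recv()
                except EOFError:
                    readers.remove(conn)  # that worker finished and closed its pipe
                else:
                    progress.advance(task_id, advance)

    for proc in processes:
        proc.join()

Because all rendering stays in the parent, the Progress object never has to cross a process boundary, which is what caused the original pickling error.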

Read more comments on GitHub >

Top Results From Across the Web

need help with multiprocessing · Issue #121 · Textualize/rich
Sorry, I am trying to use this code to have progress reporting on a very long task, but I need to use Pool.starmap()...
Read more >
Python multiprocessing progress approach - Stack Overflow
What I'm looking for could be: Simple. Each time a process finishes a file it sends a 'finished' message; The main code keeps...
Read more >
Progress Bars for Python Multiprocessing Tasks - Lei Mao
Introduction. It is natural that we would like to employ progress bars in our programs to show the progress of tasks. tqdm is...
Read more >
Track your loop using tqdm: 7 ways progress bars in Python ...
Track your Python loops with a real-time progress bar. The 4th one is interesting. tqdm is a Python library used for creating smart...
Read more >
Multithreading and Concurrency - Java Programming Tutorial
A multi-thread program has an initial entry point (the main() method), followed by many entry and exit points, which are run concurrently with...
Read more >
