Using progress bar with multiprocessing
This is not a bug report; I just can't figure out how to do what I want.
Basically, I have a multiprocessing_func, which is the target I put in the processes, where I iterate over i (the different board sizes for chess):
```python
#!./env/bin/python3
import csv
import subprocess
from time import perf_counter
from multiprocessing import Process
from alive_progress import alive_bar

BOARD_SIZES = range(4, 29)
PLAYER_TYPES_W = ['0', '2']
PLAYER_TYPES_B = range(0, 3)
QUEEN_ROOK = ['q', 'r']
MAX_GAME_LENGTH = 100
SEED = 123
PRINT = 0
SIMULATIONS = 100
CONFIGCOUNT = len(BOARD_SIZES)

# ...
# ...

def simulateCombinations(size):
    for type_w in PLAYER_TYPES_W:
        for type_b in PLAYER_TYPES_B:  # black
            for queen_or_rook in QUEEN_ROOK:
                sim = {"bin_path": "./chess",
                       "size": size,
                       "simulations": SIMULATIONS,
                       "max_game_length": MAX_GAME_LENGTH,
                       "type_W": type_w,
                       "type_B": type_b,
                       "queen_or_rook": queen_or_rook,
                       "print": PRINT,
                       "seed": SEED
                       }
                simulation = [str(x) for x in sim.values()]
                subprocess.run(simulation)

def multiprocessing_func(size):
    with alive_bar(CONFIGCOUNT) as bar:
        start = perf_counter()
        simulateCombinations(size)
        end = perf_counter()
        print("Done simulating {2} games for each player on a {0}x{0} board in {1}s".format(
            size, round(end - start, 4), SIMULATIONS))
        bar()

if __name__ == '__main__':
    initCSV()
    start = perf_counter()
    processes = []
    for i in BOARD_SIZES:
        p = Process(target=multiprocessing_func, args=(i,))
        processes.append(p)
        p.start()
    for process in processes:
        process.join()  # note: the original had `process.join` without parentheses, a no-op
    end = perf_counter()
    print("That took {0}s".format(round(end - start, 4)))
    orderCSV()
```
I would like to have an alive_bar whose total is the length of the BOARD_SIZES range, and each time a process finishes I want to call bar() to advance it. Clearly, the way I have coded it now, I get 25 bars that each only reach 1 fill out of a possible 25, like so:
```
....
|█▋⚠︎ | (!) 1/25 [4%] in 31.5s (0.03/s)
on 0: Done simulating 100 games for each player on a 27x27 board in 36.1741s
|█▋⚠︎ | (!) 1/25 [4%] in 36.2s (0.03/s)
on 0: Done simulating 100 games for each player on a 26x26 board in 36.4138s
|█▋⚠︎ | (!) 1/25 [4%] in 36.4s (0.03/s)
on 0: Done simulating 100 games for each player on a 28x28 board in 41.364s
|█▋⚠︎ | (!) 1/25 [4%] in 41.4s (0.02/s)
```
How should I do this for my use case?
Issue Analytics
- Created 2 years ago
- Comments: 9 (6 by maintainers)
Top GitHub Comments
Not even an answer? Just closed it, after my thorough response, into which I've invested time and even searched the Python docs for you?
Yes @Tim1808, as @TheTechRobo has said, you are calling multiprocessing_func several times in different processes, which creates a new alive_bar inside each process. Another error: in that same function you call simulateCombinations, which by itself runs all the combinations, so when it returns, the bar is incremented only once. To make it work, you should delete multiprocessing_func, move alive_bar to __main__, and use a Pool of processes: https://docs.python.org/3/library/multiprocessing.html#module-multiprocessing.pool Whenever the pool returns a result, the bar is incremented. There's an example there in the Python docs. I think I have some examples here in issues too, but I can't look for them now; please look for closed issues about this.