+ 1
Concurrency issue with Python multiprocessing
I'm calling a function in parallel with a list of folder names to process the files in each folder. The function writes an entry to a list file for each folder, but more than one instance of the function writes to the file at the same time, so the entries overlap. The function is called with Python's pool map async function. How can I avoid this and ensure the entries don't overlap?
5 Answers
+ 1
How about posting your code? It's always easier to discuss a specific example.
+ 1
pool.map_async(my_func, arguments).get()  # pass the function itself, not my_func()
pool.close()
pool.join()
This is the piece of code. I also tried map instead of map_async, since I thought map would wait for the output of each instance and then finally return a list, but it didn't make any difference. Even the logging output written with the logging module gets overlapped in the log file, and we have one more file that is written by the instances of the function. There is overlap everywhere, but not always; it happens roughly once a week.
+ 1
Also, I see locks documented only for Process, not for Pool. Can you share a link that explains how to use locks with a Pool?