Hey Guys,

I’d like to tell you about Multiprocessing with Python 3.7, which is slightly different from Multithreading with Python 2.7. I already wrote something about Multithreading with Python 2.7, which you can read here.

Basics

By definition, a process is a collection of one or more threads that share memory, code segments and rights with each other, but do not share them with other processes.

According to the prior paragraph, the default use case for multiprocessing is when your program can be divided into several tasks that run concurrently and independently of each other. Multithreading, in contrast, is normally used when a shared data structure is written and read by various threads within one process.
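To illustrate the multiprocessing side of that distinction, here is a minimal sketch of independent tasks with no shared state (the function `square` is my own example, not part of the code below):

```python
import multiprocessing

def square(x):
    ## A small, independent task: no state shared with other workers
    return x * x

if __name__ == "__main__":
    ## Each input is handled in its own worker process from the pool
    with multiprocessing.Pool(processes=4) as pool:
        results = pool.map(square, [1, 2, 3, 4])
    print(results)  ## [1, 4, 9, 16]
```

Because the tasks never touch each other's data, splitting them across processes is straightforward; with a shared data structure, threads (plus locking) would be the more natural fit.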

I prepared two examples to show you how to use a Multiprocessing pool synchronously or asynchronously.

Examples

The following code block shows the basic code I used in both examples. I only changed the way the parent waits until all child processes are done.

```python
#!/usr/bin/env python3

## Importing all needed modules
import multiprocessing
import random
import time
import timeit

print("### Starting Multiprocessing")

## Starting timer which keeps track of the runtime
start_time = timeit.default_timer()

## Define the function which will be executed within the pool
def asyncProcess(processRuntime, processID):
    """
    Sleeps the amount of time as seconds from variable processRuntime
    returns: <STRING>
    """
    time.sleep(processRuntime)
    return("%i: This Process ran %i seconds" % (processID, processRuntime))

if __name__ == '__main__':
    ## Define Pool Size of maximal concurrent processes
    pool_size = 10
    ## Define an empty list where we store child processes
    processes = []
    ## Define an empty list to store finished processes
    finished_processes = []
    ## Define an empty pool with maximal concurrent processes
    pool = multiprocessing.Pool(processes=pool_size)
    ## Firing a total of 10 processes
    for i in range(0, pool_size):
        ## Append the processes list with an AsyncResult object
        processes.append(pool.apply_async(asyncProcess, args=(random.randint(0, 10), i,)))
    ## Close the pool to stop accepting new processes
    pool.close()
```

Synchronous Pool

Sometimes you want to execute some tasks but wait until all child processes are finished. To achieve this you call pool.join(), which simply waits until all child processes have finished their job. This is similar to Multithreading in Python 2.7.

```python
    pool.join()
    ## Iterate through the processes list
    for process in processes:
        ## Print the process's returned value
        print(process.get())
    print("Parent: this Process ran %s seconds" % str(timeit.default_timer() - start_time))
```
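As a side note, if you do not need the individual AsyncResult handles, the same synchronous behaviour can be had with Pool.starmap, which unpacks each argument tuple and blocks until every task has finished. A minimal sketch, with the sleep left out for brevity:

```python
import multiprocessing
import random

def asyncProcess(processRuntime, processID):
    ## Same worker as above, without the sleep
    return "%i: This Process ran %i seconds" % (processID, processRuntime)

if __name__ == "__main__":
    with multiprocessing.Pool(processes=10) as pool:
        ## starmap blocks until all results are ready and preserves input order
        args = [(random.randint(0, 10), i) for i in range(10)]
        for result in pool.starmap(asyncProcess, args):
            print(result)
```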

Demo

Asynchronous Pool

You are also able to, for example, print the result as soon as a child process is done. Alternatively, you could do something further with the data while everything else finishes.

```python
    ## Iterate through processes as long as a process is running
    while True:
        ## Iterate through the processes list
        for process in processes:
            ## Check if process is done and not in finished_processes
            if(process.ready() and process not in finished_processes):
                ## Print the returned value
                print(process.get())
                ## Append the finished process to finished_processes
                finished_processes.append(process)
        ## Break while loop when finished_processes length equals processes length
        if len(finished_processes) == len(processes):
            break
    print("Parent: this Process ran %s seconds" % str(timeit.default_timer() - start_time))
```
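Instead of polling ready() in a busy loop, apply_async also accepts a callback parameter: the parent process runs the callback as soon as a task finishes. A sketch of that variant (the names worker and report are my own):

```python
import multiprocessing
import random
import time

def worker(runtime, pid):
    ## Simulate some work, like asyncProcess above
    time.sleep(runtime)
    return "%i: done after %i seconds" % (pid, runtime)

def report(result):
    ## Runs in the parent process as soon as a task finishes
    print(result)

if __name__ == "__main__":
    with multiprocessing.Pool(processes=10) as pool:
        for i in range(10):
            pool.apply_async(worker, args=(random.randint(0, 3), i), callback=report)
        pool.close()
        pool.join()
```

The results arrive in completion order rather than submission order, which is exactly the asynchronous behaviour the polling loop above emulates by hand.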

Demo

Conclusion

I hope you enjoyed this small insight into Python's Multiprocessing. If you have any questions or improvements, just leave a comment below this post.

If you want to read more about Multiprocessing and Processes in general, take a look at these links:

Title Image via LINK