How to Run Large-Scale Educational Workshops in Deep Learning & Data Science

01/10/2018

This post is authored by Gopi Kumar, Principal Program Manager, and Paul Shealy, Senior Software Engineer at Microsoft.

With the rise of Artificial Intelligence, the need to rapidly train a large number of data scientists and AI developers has never been more urgent. Microsoft is always looking for efficient ways to educate employees and customers on AI and make them more productive with these new capabilities. Aside from the numerous technical conferences we host and sponsor, we offer the AI School and a range of tools such as the Data Science Virtual Machine, Visual Studio Tools for AI, Azure Machine Learning, Microsoft ML Server, and Batch AI to help developers and data scientists become more productive at building intelligent, AI-infused apps.

Pulling together deep learning workshops for a large number of students, however, can be a time-consuming, error-prone, and costly exercise. Technical issues with environment setup and compatibility problems during the workshops impede learning and cause student dissatisfaction. These workshops typically have participants bring their own laptops and download and install new software. But with the wide range of laptop platforms (Windows, Mac, Linux), numerous configurations, and version conflicts with existing software, workshops can become frustrating for both presenters and attendees. The RAM and disk space available on laptops, and their lack of GPUs, limit the types of hands-on labs that can be offered, since deep learning workshops benefit heavily from specialized hardware such as GPUs. An alternative is to build new cloud-based custom VMs specifically for the training; this avoids compatibility issues but is, in our experience, quite time-consuming and often not reusable.

There is a better way.

Microsoft offers the Data Science/Deep Learning Virtual Machine (DSVM/DLVM) on Azure. The DSVM comes with many tools and frameworks for deep learning, machine learning, and data exploration, in addition to development environments such as Jupyter notebooks and IDEs. Azure also provides productivity tools to quickly provision many virtual machines, attach large disks, and monitor the VMs. Since mid-2017, our team has run several large deep learning workshops for both internal and external audiences. Attendees at these workshops receive a login to a DSVM with tutorial material and datasets pre-loaded. The workshops were highly successful and allowed us, the presenters, to focus more on content creation and testing and less on setup. We also spent less time dealing with technical setup issues that students would otherwise face.

We wish to share our approach so others can benefit from it, helping meet our larger goal of training as many data scientists and AI developers as possible.

At the Microsoft AI Immersion Workshop 2017, attendees successfully completed the training material using VMs created with the approach described in this post.

Our Approach

The approach and principles used to create the workshop infrastructure are as follows:

Use a shared pool of Ubuntu Data Science VMs (with GPUs if doing deep learning) with each student getting a separate account. We allocated about 5 students per VM for our deep learning workshops.

Automate the bulk creation of Data Science VMs and local access credentials for students. This happens with a post-install script run on each VM after it is created. We handed out logins on printed paper slips listing each student's server address and credentials.
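A minimal sketch of such a post-install step is below. The username scheme (`student1..N`), the credentials file name, and the password length are our illustrative assumptions, not the exact DSVM sample script:

```shell
#!/bin/bash
# Hypothetical post-install sketch: create N local accounts with random
# passwords on a DSVM, and record credentials for printing paper slips.
NUM_STUDENTS=${1:-5}
CRED_FILE=${2:-credentials.csv}

echo "username,password" > "$CRED_FILE"
for i in $(seq 1 "$NUM_STUDENTS"); do
    user="student$i"
    # 12 random alphanumeric characters drawn from /dev/urandom
    pass=$(head -c 200 /dev/urandom | tr -dc 'A-Za-z0-9' | head -c 12)
    echo "$user,$pass" >> "$CRED_FILE"
    # Actual account creation needs root; skip when run unprivileged.
    if [ "$(id -u)" -eq 0 ] && command -v useradd >/dev/null; then
        useradd -m -s /bin/bash "$user"
        echo "$user:$pass" | chpasswd
    fi
done
echo "Wrote $NUM_STUDENTS credentials to $CRED_FILE"
```

In a real deployment this script would be invoked once per VM by the ARM template's post-install step, with the credentials file collected for printing.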

Script the process to bring in the exercises and datasets needed for the workshop.

Share large datasets in a common directory so you avoid large simultaneous downloads.
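Staging a dataset once in a shared, world-readable directory might look like the sketch below; the directory path and file name are placeholders, and the download itself is shown only as a comment:

```shell
#!/bin/bash
# Hypothetical sketch: stage a large dataset once in a shared directory so
# every student account reads the same copy instead of downloading its own.
SHARED_DIR=${SHARED_DIR:-/tmp/workshop-data}   # e.g. /data/workshop on the VM
mkdir -p "$SHARED_DIR"

if [ ! -f "$SHARED_DIR/dataset.npz" ]; then
    # In a real workshop this would be a one-time download, e.g.:
    #   wget -q -O "$SHARED_DIR/dataset.npz" "$DATASET_URL"
    : > "$SHARED_DIR/dataset.npz"   # placeholder so the sketch runs end to end
fi

# Readable (but not writable) by all student accounts.
chmod -R a+rX "$SHARED_DIR"
echo "Dataset staged at $SHARED_DIR"
```

Notebooks and lab scripts then reference the shared path directly, so the download happens once per VM rather than once per student.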

All examples are runnable in Jupyter notebooks or in a terminal. The DSVM comes with Jupyter kernels for Python, R, Spark, and Julia. Students access Jupyter notebooks by logging in to JupyterHub with their VM username and password. We also leveraged the terminal functionality in Jupyter to open a bash shell right in a browser window.

Students use only a browser to access the VM with their unique login account. No other software is required on the student's laptop.

Further details about setting up a workshop, including step-by-step instructions and common pitfalls to avoid, are available in the Data Science VM GitHub repository. The important scripts to note are:

- An ARM template that creates multiple VMs and executes a post-install script.
- A sample post-install bash script that creates a user account with a random password for each student on every VM.
- Use the Azure CLI from your command prompt or bash shell to invoke the ARM template with the post-install script above.
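Invoking an ARM template with the Azure CLI generally looks like the fragment below (shown with the `az group deployment create` command current at the time of writing). The resource group name, template file name, and parameter names are placeholders; match them to the actual template you use. This is a sketch, not runnable without an Azure subscription:

```shell
# Authenticate and create a resource group for the workshop VMs.
az login
az group create --name workshop-rg --location eastus

# Deploy the ARM template that provisions multiple DSVMs and runs the
# post-install script on each. Parameter names here are placeholders.
az group deployment create \
    --resource-group workshop-rg \
    --template-file multi-dsvm-template.json \
    --parameters vmCount=30 adminUsername=workshopadmin
```

Once the deployment completes, a quick spot check of a couple of VMs (log in, open JupyterHub, run a notebook) confirms the pool is ready for students.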

It took us less than 30 minutes to create about 30 GPU-based Data Science VMs to support a class of over 200 people and spot-check a few of the VMs. The students were able to train deep neural network models on shared GPUs. Since then we have used this approach several times, saving a few days of setup time for every workshop. The shared infrastructure also saved the cost of creating separate VMs for every student. The Data Science VM is already a popular and robust environment among data scientists and AI developers for development and experimentation in the cloud. By using this standardized, familiar environment for training and education, the learning curve is greatly reduced, and students can continue developing their AI apps on the Azure Data Science VM after the workshop.

Wasn't this easy? Give it a shot for your next deep learning or data science training – we would love to learn from your experience and suggestions.

Gopi & Paul

Resources