Update 11/02: For urgent issues, please visit the Gitter channel and the Round 2 GitHub issue.

Update 11/01: Google Cloud Platform will also generously sponsor the second round of the challenge. The top 50 teams from Round 1 will be awarded $400 in cloud credits!

Update 10/31: Instructions for Round 2 submissions are available at: https://github.com/crowdAI/nips2018-ai-for-prosthetics-round2-starter-kit

Update 08/27: We’ve updated the osim-rl package to version 2.1. If you joined before 08/27/2018, please run the following in your conda environment:

pip install git+https://github.com/stanfordnmbl/osim-rl.git -U

Update 07/30: Watch our webinar to learn more about biomechanics, neuroscience, and reinforcement learning!

Update 07/27: Google Cloud Platform will sponsor participants of the challenge. The top 400 teams with a positive (>0) score will be awarded $250 in cloud credits!

Welcome to the AI for Prosthetics challenge, one of the official challenges in the NeurIPS 2018 Competition Track. In this competition, you are tasked with developing a controller that enables a physiologically-based human model with a prosthetic leg to walk and run. You are provided with a human musculoskeletal model, a physics-based simulation environment (OpenSim) in which you can synthesize physically and physiologically accurate motion, and datasets of normal gait kinematics. You are scored on how well your agent adapts to a requested velocity vector that changes in real time.
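To make the scoring idea concrete, here is a minimal sketch of a velocity-matching reward: a per-step bonus minus the squared deviation between the agent's velocity and the requested velocity vector. The function name and the constants are illustrative assumptions, not the official scoring formula.

```python
def velocity_matching_reward(v_actual, v_target, bonus=10.0):
    """Illustrative per-step reward: a survival bonus minus the squared
    deviation between the agent's (vx, vz) velocity and the requested
    velocity vector. Constants are assumptions, not the official scoring."""
    dx = v_actual[0] - v_target[0]
    dz = v_actual[1] - v_target[1]
    return bonus - (dx * dx + dz * dz)

# Perfect tracking earns the full bonus; any deviation is penalized.
print(velocity_matching_reward((1.25, 0.0), (1.25, 0.0)))  # 10.0
print(velocity_matching_reward((0.5, 0.3), (1.25, 0.0)))   # lower
```

Because the target velocity changes during the episode, a good controller must track it continuously rather than settle into a single fixed gait.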

Follow the instructions on our github repo to get started!
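osim-rl environments follow the familiar OpenAI Gym reset/step interface. The loop below sketches a random policy against a hypothetical stand-in class (the real ProstheticsEnv ships with osim-rl and requires an OpenSim installation; the observation contents, action dimension, and rewards here are placeholders).

```python
import random

class StubProstheticsEnv:
    """Hypothetical stand-in mimicking the Gym-style interface that
    osim-rl environments expose. The real ProstheticsEnv comes from
    the osim-rl package and requires OpenSim."""
    ACTION_DIM = 19  # placeholder: one excitation per muscle actuator

    def reset(self):
        self.t = 0
        return [0.0] * 10  # placeholder observation vector

    def step(self, action):
        self.t += 1
        reward = 1.0       # placeholder per-step reward
        done = self.t >= 5
        return [0.0] * 10, reward, done, {}

env = StubProstheticsEnv()
obs = env.reset()
total, done = 0.0, False
while not done:
    # Random policy: a muscle excitation in [0, 1] for each actuator.
    action = [random.random() for _ in range(env.ACTION_DIM)]
    obs, reward, done, info = env.step(action)
    total += reward
print(total)  # 5.0 with this stub
```

Swapping the stub for the real environment keeps the loop unchanged; only the observation, action dimension, and reward come from the physics simulation instead.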

Our objectives are to:

bring deep reinforcement learning to bear on problems in medicine,

promote open-source tools in RL research (the physics simulator OpenSim, the RL environment, and the competition platform are all open-source),

encourage RL research in computationally complex, stochastic environments with high-dimensional action spaces.


What’s new compared to NIPS 2017: Learning to run?

We took into account feedback from the last challenge and made several changes:

You can use experimental data (to greatly speed up the learning process)

We released a 3D OpenSim model (the model can now fall sideways)

We added a prosthetic leg – the goal is to address a medical challenge: modeling how walking changes after receiving a prosthesis. Your work can speed up the design, prototyping, and tuning of prosthetics!

You haven’t heard of NIPS 2017: Learning to run? Watch this video!