The robot (equipped with force sensors and cameras) immediately began prodding and poking the Jenga blocks with its two-pronged arm. Playing Jenga is a task that looks easy on the surface but, as the game gets progressively harder, can cause sweat-inducing panic in players.

Around 300 pushes down the line, the bot developed a physics model of the world. "The robot builds clusters and then learns models for each of these clusters, instead of learning a model that captures absolutely everything that could happen," said the paper's lead author, Nima Fazeli.
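The cluster-then-model idea can be sketched roughly like this (a minimal illustration in NumPy with synthetic data, not the authors' actual pipeline; the two "behavior modes," the feature choices, and the linear per-cluster models are all assumptions made for the sketch):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "push" outcomes: columns are [push force, block displacement].
# Two hypothetical behavior modes (assumed for illustration): blocks that
# slide freely when nudged, and blocks pinned in place by the tower's weight.
free = rng.normal([1.0, 0.80], 0.10, size=(150, 2))
stuck = rng.normal([1.0, 0.05], 0.05, size=(150, 2))
pushes = np.vstack([free, stuck])

def kmeans(X, k=2, iters=50):
    """Plain k-means with a deterministic init (lowest/highest displacement)."""
    centroids = X[[X[:, 1].argmin(), X[:, 1].argmax()]]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids) ** 2).sum(-1), axis=1)
        centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centroids

labels, centroids = kmeans(pushes)

# Instead of one global model of everything, fit a simple model per cluster:
# here, displacement as a linear function of force within each cluster.
models = {}
for j in range(2):
    cluster = pushes[labels == j]
    A = np.column_stack([cluster[:, 0], np.ones(len(cluster))])  # force, bias
    coeffs, *_ = np.linalg.lstsq(A, cluster[:, 1], rcond=None)
    models[j] = coeffs  # (slope, intercept)

def predict_displacement(force, cluster_id):
    """Predict how far a block moves, using that cluster's local model."""
    slope, intercept = models[cluster_id]
    return slope * force + intercept
```

The appeal of this structure is data efficiency: each local model only has to explain one mode of contact behavior, so a few hundred pushes can be enough, where a single monolithic model would need far more.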

This practical approach differs from the norm, whereby scientists train a neural network by feeding it troves of data. And, as we've seen recently, researchers are even ditching real-world interactions and turning to virtual simulations to train larger droids.

For now, the robot is only playing by itself. Its creators claim its newfound dexterity marks a significant step forward for robotic manipulation of real-world objects. The breakthrough could result in industrial machines that are less clumsy. In the future, the same arm could move beyond miniature blocks to building cars and furniture in factories and warehouses.