Just recently I gave a talk to the /dev/070/ meetup group titled “Understanding Understanding”. In the presentation I outlined a number of concepts from other domains that I felt had strong parallels with the thinking behind Simz: that for a person or a machine to truly understand and manage the behavior of other people or machines, it must somehow mirror and embody the observed behavior itself. The slides are available here.

One theorem presented in the talk is the Good Regulator theorem, which is central to cybernetics: every good regulator of a system must be a model of that system. But what exactly would such a model look like? What elements should the model contain, and how might they be related and reasoned about? The theorem itself does not address this, so I presented my own research findings, covering dramatism, observational learning, experiential learning, activity theory, simulation theory and mirror neurons, as well as software activity metering and software performance measurement. Yes, a very “unusual” talk for a software developer meetup, as many commented afterwards.

Essentially, what I presented was a model of human and software understanding based on activities actioned by actors within an environment that supports observation and perception of such acts, including the situational context surrounding them, both before and after. An actor (not in the sense of the actor programming model) produces, that is begins and ends, an action in response to, or in anticipation of, some stimulus (an action, signal or event). Such an actor could very well be mapped to a service, thread, process, system or human (by proxy).

I then followed on from this with a discussion of how such a model affords flexible mapping to any domain at multiple layers (or levels) of abstraction and modeling. More importantly, I demonstrated how software (behavioral) memories could be recorded (stored) and recalled (simulated). But for a machine to truly understand another machine, much like how we understand each other (with varying levels of success), all participants need to be able, to some degree, to sense and simulate the others for the purpose of action perception, intent exposition, and outcome prediction. I presented a number of very interesting software mirroring and simulation demonstrations, offering glimpses into a future that allows software developers and system designers to model, manage and manipulate the space and time dimensions of software execution for the purpose of post augmentation, performance monitoring, partitioned integration, and so on.

But I felt I could get even closer to the human model of social understanding if, instead of having many software machines project their behavior into a single machine (control) plane, as is the case with many of the Simz demonstrations I’ve videoed, I had two machines perceive and simulate the actions of each other and respond appropriately to such perceptions. Each machine would project the metered activities it performed to the other, and each machine would simulate the actions projected by the other and trigger internal state changes. Meet Ping-n-Pong!

Surprisingly, the diagram is far more complex than the actual code, because all the magic is performed by Satoris, for dynamic bytecode instrumentation and activity metering, and Simz, for mirroring and activity simulation. Let’s start with the Game class. This is the only class that is actually instrumented in both loosely coordinated Java runtimes. When the Game.ping method is invoked within the Ping runtime, a BEGIN event is transparently projected into a metering simulation running within the Pong runtime. Likewise, when the Game.pong method is invoked within the Pong runtime, a BEGIN event is transparently projected into a metering simulation running within the Ping runtime.
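The Game source itself is only shown on the slides, so here is a minimal sketch of what such an instrumented class might look like; the State enum, field name and exact wait conditions are my assumptions, not the demo's actual code. The point is that the class does nothing but hold state and wait: every state transition comes from the metering hooks.

```java
// Hypothetical sketch of the instrumented Game class (names and wait
// conditions assumed; the demo's actual source is on the slides).
class Game {
    public enum State { NONE, PING, PONG }

    // Volatile so a change made by the Percept hook on another thread
    // is immediately visible to the spinning caller.
    public static volatile State state = State.NONE;

    // The body only waits; the Percept flips the state once the mirrored
    // pong() event is perceived within this runtime's simulation.
    public static void ping() {
        while (state == State.PING) { Thread.onSpinWait(); }
    }

    public static void pong() {
        while (state == State.PONG) { Thread.onSpinWait(); }
    }
}
```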

The Ping class uses a Bootstrap class to embed a Simz server within the runtime and then calls the Game.ping method to start the game. Looking at the above code you might think this will simply spin, but I’ll come back to how the Game state is synchronized across runtimes by way of replicated behavior and a Percept class that is hooked into the metering engine in order to intercept call events, both in the application code as well as in the simulation.

The Pong class waits for the Game to begin and then it too goes into a loop, calling Game.pong with each iteration. Again, the above Game code would have you believe that Pong would simply spin in the first while loop, as nowhere is there any change in state. Hold on: the magic is about to be revealed.
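Since the Ping and Pong sources are likewise only on the slides, here is a rough sketch of the two drivers under the same assumptions as the Game sketch; the Bootstrap call is elided because the actual Simz embedding API is not shown in this post.

```java
// Hypothetical sketch of the two driver classes (Simz wiring elided, names assumed).
class Drivers {
    enum State { NONE, PING, PONG }
    static volatile State state = State.NONE;

    // Ping runtime: embeds the Simz server (elided) and serves first.
    static void pingMain() {
        // Bootstrap.boot();   // hypothetical: embed the Simz server in this runtime
        while (true) {
            ping();            // each metered call is mirrored into the Pong runtime
        }
    }

    // Pong runtime: waits for the game to begin, then returns every ping.
    static void pongMain() {
        // Bootstrap.boot();   // hypothetical
        awaitGameStart();
        while (true) {
            pong();
        }
    }

    // Exits once a mirrored event is perceived and the state leaves NONE.
    static void awaitGameStart() {
        while (state == State.NONE) { Thread.onSpinWait(); }
    }

    static void ping() { while (state == State.PING) Thread.onSpinWait(); }
    static void pong() { while (state == State.PONG) Thread.onSpinWait(); }
}
```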

The Percept class is the magic binding both runtimes together, transparently. It hooks into the metering engine invoked by the bytecode instrumentation dynamically injected into the Game class by the Satoris agent in both runtimes. It also hooks into the metering engine used by the Simz server embedded within each runtime, receiving mirrored metering events from the other metered runtime. There are in fact four metering engines: within each runtime, there is one metering engine for action and another for the perception of metered actions performed by other, external actors.
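The actual Satoris metering SPI is not shown in the post, so the hook below is a hypothetical stand-in: a single rule that records the most recently perceived action, whether performed locally or replayed by the embedded simulation. The probe naming and the simulated flag are assumptions for illustration only.

```java
// Hypothetical stand-in for the Percept metering hook (the real Satoris SPI
// and the demo's exact state rules are not shown in the post).
class Percept {
    enum State { NONE, PING, PONG }
    static volatile State state = State.NONE; // stands in for Game.state

    // Called on a BEGIN metering event; 'simulated' would be true when the event
    // was mirrored from the other runtime and replayed by the embedded Simz server.
    // In this simplified rule, performed and perceived actions are treated alike:
    // the state records the most recent action, which is what releases the
    // counterpart method from its wait loop.
    public void begin(String probe, boolean simulated) {
        if (probe.endsWith("ping")) state = State.PING;
        else if (probe.endsWith("pong")) state = State.PONG;
    }
}
```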

When Game.ping() is invoked by Ping, the Percept sets Game.state to PING within the Ping runtime. The thread then waits within the Game.ping method until Pong receives the Game.ping mirrored metering event within its simulation, causing it to exit from its initial wait loop (by way of the Percept setting Game.state to Game.PONG) and call Game.pong(), which in turn results in that call event being mirrored back into the Ping runtime, where it triggers the exit from the Game.ping method, and it all repeats over and over.

It is important to note that no state is being moved between runtimes, though Simz does support the transmission of environment state. Instead, software execution behavior is replicated to trigger the required state change. The actors, Ping and Pong, perceive each other's actions via the simulation and the Percept hook, and then respond by executing (or exiting from) methods that represent actions within the model. This is incredibly impressive because there is no explicit dependency on any actor programming stack such as Akka. It is clean and concise, devoid of framework and language pollution.
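The round trip described above can be imitated in a single process, with no Satoris or Simz involved: each "runtime" keeps its own Game state, and a queue per direction stands in for the projection of mirrored metering events. This is only an analogue of the mechanism, under the same assumed state rules as the sketches above, not the demo's implementation.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Single-process analogue of the Ping-n-Pong handshake: two "runtimes" that
// never share state, only the names of the actions they perform.
class PingPongAnalogue {
    enum State { NONE, PING, PONG }

    // One simulated runtime: its own state plus inbound/outbound mirror channels.
    static final class Side {
        volatile State state = State.NONE;
        final BlockingQueue<String> inbound = new LinkedBlockingQueue<>();
        BlockingQueue<String> outbound; // wired to the peer's inbound queue

        // Percept-like rule: record the most recently perceived action.
        void perceive(String action) {
            state = action.equals("ping") ? State.PING : State.PONG;
        }

        // Perform a local action: perceive it, project it, then wait until the
        // peer's mirrored counter-action flips the state (behavior replicated,
        // no data moved).
        void act(String action) throws InterruptedException {
            perceive(action);
            outbound.put(action);        // "projection" of the BEGIN event
            State held = state;
            while (state == held) {      // spin inside the method, as in Game.ping/pong
                perceive(inbound.take()); // mirrored event arrives in the "simulation"
            }
        }
    }

    // Drives the given number of complete ping/pong exchanges and returns the count.
    static int play(int rounds) {
        Side ping = new Side();
        Side pong = new Side();
        ping.outbound = pong.inbound;
        pong.outbound = ping.inbound;

        Thread pongSide = new Thread(() -> {
            try {
                pong.perceive(pong.inbound.take()); // wait for the game to begin
                while (true) pong.act("pong");      // return every perceived ping
            } catch (InterruptedException e) { /* game over */ }
        });
        pongSide.start();

        int exchanges = 0;
        try {
            for (int i = 0; i < rounds; i++) {
                ping.act("ping"); // returns once the mirrored pong is perceived
                exchanges++;
            }
            pongSide.interrupt();
            pongSide.join();
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
        return exchanges;
    }
}
```

Calling `play(5)` drives five complete exchanges; note that, as in the demo, nothing but the names of performed actions crosses between the two sides, and each side derives its own state change from what it perceives.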