I mentioned how Philip Rosedale aims to achieve extremely low latency to improve avatar-to-avatar interactions in his new Oculus Rift-compatible High Fidelity virtual world; but it's not just a matter of shortening ping time -- he's also working with a neuroscientist, using 3D brain scans, to refine that experience:

"Basically," Philip tells me, "you can see things like 'I feel a certain way toward you' in the scanner and we can look for that data and then test breaking it with various different transformations of person into avatar." Philip demonstrated this at South by Southwest last March with Dr. Adam Gazzaley of UCSF, but media coverage at the time didn't quite explain Philip's purpose, which is to improve the avatar-to-avatar sense of presence in High Fidelity:

"Adam and I have known each other for a while, and have been exploring ways to work together to use his expertise and lab to help us understand the experience of 'presence' between avatars/people." Here's how: