These days, I’m always prepared for two things whenever I slip on a virtual-reality headset: intermittent waves of nausea, and the lingering urge to grab at whatever I see. The first consumer version of the Oculus Rift still doesn’t entirely solve the first problem, though it improves on it quite a bit. But when paired with the Oculus Touch controllers, it takes a huge, satisfying step toward giving me what I want on the second count.

Check out the video above for a closer look at the final Oculus Rift hardware.

According to Nate Mitchell, VP of Product at Oculus, the Rift I put on during my E3 demo isn’t even the actual final version; its weight and ergonomics will still get slight tweaks between now and the visor’s Q1 2016 launch. But even if it’s not absolutely final, the Rift now feels dramatically better to wear. This version is much lighter than the dev kits and the Crescent Bay prototype, and has adequate room for glasses. I used to have to take off my glasses or let the nosepads jam painfully into my face during appointments; in both scenarios, I’d get headaches that led to queasiness after extended time in the visor. And that was on top of whatever motion sickness cropped up.

This time, the Rift fit easily around my glasses, despite being snug enough to finally stay in place. No headaches during or after; I only felt sick when my camera view was on the move. Once it stopped, the nausea stopped. Unfortunately, in Insomniac’s survival-horror platformer Edge of Nowhere, where you scramble from ledge to ledge often, my stomach revolted a lot. But in the hockey segment of VR Sports Challenge, where my viewpoint stayed in place, I was fine.

More exciting, and mind-bending, were the Oculus Touch controllers. Though these input devices will also be available next year (Oculus says “the first half” of 2016), the “Half-Moon” version I saw is still prototype hardware. But even taking that into account, they feel far more natural than I expected.

My demo took place in the Toybox test environment that Oculus mentioned last week at its pre-E3 press conference, the one Oculus uses internally to fine-tune the controls for the Touch. The scene I was dropped into was fairly barebones: a nondescript workshop with a variety of objects scattered over a workbench.
I spent the first few minutes acclimating to the Touch’s controls. Though the controllers feel pretty good to hold, with most of the input placements right where I’d want them, I didn’t take to them right away. The middle-finger trigger felt odd at first, and I had to get used to its position relative to where my actual middle finger sits.

After I’d adjusted to that, though, I just had to learn how to use the middle-finger trigger to pick up and hold things like blocks, Zippo lighters, slingshots, and guns. (Squeeze once to grab something; keep it depressed to hold onto the item; release to drop it.) Though it was a pretty basic mechanic, my brain still refused to accept the scheme at first, since I was looking at fingers opening and closing. I had to practice several times before I began equating the virtual fingers with pushing down on a button.

I also had to consciously remind myself that I didn’t need to press anything to make a fist; I just had to curl in my index fingers and lower my thumbs to the surface of the controllers, and the Touch’s sensors noted that and changed how my virtual hands looked. At first, I often left my thumbs up when I thought I had already lowered them.

But once I’d mastered what felt like the equivalent of patting my head and rubbing my stomach at the same time, it became much easier to mess with the objects in the room. I tossed blocks, hit ping pong balls, smashed a fist into a tetherball, failed miserably at skeet shooting, set off fireworks, shot my demo partner with a shrink ray, and bombed my demo partner’s toy tank with my own. The physics do have more of a game-like feel to them; they had to be learned, since I couldn’t really gauge the weight of whatever I was hefting.

Still, after spending about 10 minutes in this freeform demo, it began to feel normal. In fact, the most telling sign that my brain had begun to accept the virtual world as real came when my demo ended.
After the screen went blank, I knew I was done, and then I was surprised a moment later when I realized I was still holding the Touch controllers. For a few seconds, I’d assumed that when my virtual hands went away, my inputs had gone with them.

A few seconds isn’t enough to equal total immersion yet, though. There’s also that issue of motion sickness to solve, and people like me are going to be the hardest group to fix it for. When I spoke with founder Palmer Luckey and co-founder Nate Mitchell about it, they acknowledged that people with an athletic or dance background tend to be more sensitive, as they notice more strongly the difference between what their eyes see and what their bodies feel. And then, of course, the Rift and other virtual-reality headsets will live or die by the quality of the content released for them.

But with developers like Gunfire Games and Insomniac climbing aboard the virtual-reality train, and with Luckey and Mitchell saying their goal is to conquer motion sickness, it’s entirely possible I may one day no longer make a distinction between what’s physically real and what’s not, because I won’t be able to immediately tell.

Alaina Yee is IGN’s tech editor and resident cardboard fort maker. If she ever makes one that lasts, she’ll post about it on Twitter.