True story: back when I worked for Engadget it was always my dream to cover a Steve Jobs keynote. I knew how to liveblog, I knew how to do photos for a liveblog. I was capable. But I'm also a walking embodiment of Murphy's Law. Everything that can go wrong with technology, will go wrong in my hands. What if I can't connect to the internet? What if my camera's not working? What if I forget a cable, or an SD card reader? I was too obviously cursed with unreliability to earn a spot at a Stevenote. When Apple unveiled the iPhone, arguably the most important Apple keynote of all time, I was covering a Dell press conference in Las Vegas.

I think one of my biggest failings is my impatience. If something's not working, it's not "right," and therefore my first instinct is to reboot. Then, if that doesn't work, I try to update the firmware or flash the device. And then I just give up. Or maybe I brick it. I guess I just spaz out.

This happened literally a couple of weeks ago with my fancy new PocketCHIP review unit. I couldn't get it on the Wi-Fi, and instead of spending the time necessary to get my question answered on the support forums, I just went ahead and flashed the firmware — assuming that by messing around in the terminal I had gotten the OS into a nonworking state. Everything went wrong. It took me days to get it back in working order.

Yesterday I tried out a littleBits cloudBit, which is literally designed for children. It wouldn't work, possibly because I initially tried to plug it into my computer instead of the wall. I ended up on phone support, where I received instructions to flash the firmware. It's still not up and running, but I guess I've been avoiding it all day out of embarrassment.

Or, like, two days ago, when I tried to set up the HTC Vive on a livestream, and couldn't pull it off in an hour of struggling. I still don't actually know what I did wrong. But I'm sure it's my fault: Ross Miller set up the same gear successfully (though with a bit of hassle) earlier that same day.

Want to watch me fail? I knew you would. Here's an hour of my pain, documented forever on the internet.

In some ways I'm very good with technology. I like to figure out how things actually work. I know a bit of programming. I use a fair number of keyboard shortcuts. I know how to quit Vim (but not Emacs). I can use the command line beyond just copying and pasting install instructions from GitHub. I can muddle through most device or software installations on Mac or Windows, and have even gotten Ubuntu running on a laptop.

But maybe my passion for and interest in technology is my undoing. I can rarely keep myself from running unstable betas on my primary machine. I just destroyed a Surface Book with a preview release of the new Windows 10, and still need to fix it. I nearly had a nervous breakdown years ago trying to do a comparison feature on different Linux distros. I run Chrome Canary as my primary browser, which hard-freezes my Mac about once a day. I haven't tried the new macOS Sierra or iOS 10, but I did use the OS X El Capitan beta, which absolutely destroyed all my developer tools.

And even the little things aren’t safe from my curse. I use a cracked-screen iPhone (dropped it because I got a little buzzed after two drinks). I did finally pay my phone bill, but I had to fork over a bunch of money to T-Mobile to reactivate the line. I have forty-five unread text messages and tens of thousands of unread emails. Every Wi-Fi router I set up seems in some way broken. I can never get AirDrop to work.

Even non-digital technology stymies me sometimes. The box fan in my room smells like burning rubber. I've known how to drive a stick shift since high school, but I'm pretty sure I'm doing it wrong. My light bulbs always burn out too soon, and I can never find AA batteries when I need them.

Is this just the human condition, and am I simply more exposed to it because my job is gadgets?

My colleague Dieter thinks my problem might be the fact that I'm kind of a "hacker," and the technology I'm tinkering with is surprisingly fragile. Maybe! His line of thinking reminds me of this talk by Joe Armstrong, the creator of Erlang (which powers, among other things, WhatsApp):

He's talking about the incredible mess of modern computing. There are so many variables that it's hard to get any system back to a "clean" start, which makes one-off bugs and surprising behavior inevitable. It's nice to have someone as accomplished and smart as Joe Armstrong describe his own issues with computing, and it's even nicer that he has a theory of the problem and ideas toward a solution.

But in the meantime, I still suck.

And yes, I tried turning it off and on again.