As a Security Professional, If You’re Not Having Your Work Peer Reviewed, You’re Not Doing It Right.

Earlier in my career I bought into the notion that the ultimate goal of one’s career was to be the smartest person in the room. That being the smartest, or at least having people believe you were the smartest, was the pinnacle of it all. I’m happy to admit that I’ve since learned better. There are many bad things about thinking that way - you alienate peers by design, you fear discourse and analysis of your ideas, and you always have to keep playing the provocateur. Frankly, it’s miserable and exhausting.

Spending time in that mode of thinking is dangerous for a number of reasons, but most of all because it closes you off to new thinking and the crazy-sounding notion that you just may be wrong. You see, if you’re wrong that means you’re not the smartest person around. And you just can’t have that. Mental stagnation, inward retreat and unhealthy competition become your cellmates.

This is the reason that I’ve focused the last few years of my career on building a group that researches, channels and organizes knowledge for the betterment of the collective. There is a lot of amazing tribal knowledge out there - lots of very smart people with something to share but without a voice. Aggregating these ideas, forcing them to compete with each other in a positive way and fostering collaboration is an amazing job. Those who have worked with me recently will recall me saying how important it is to stay humble and to seek out those with brilliant minds to aggregate all that available knowledge and feed it back. The aggregate is truly much more powerful than the sum of its parts.

I’m not trying to be your Yoda, but I’m sure you can look around and recognize how far out of the mainstream ideas like peer review have fallen. Egos are king and everyone is an expert. Just ask them. When Sony Pictures was breached, there were ‘experts’ coming out of the woodwork. Many who wrote opinion pieces under the banner of factual analysis were looking for their fifteen minutes. Every other person had the dominant theory, everyone else was wrong, and if you argued you were clearly not smart enough to understand. Meanwhile, by my estimation there were about two dozen people, in total, who were even qualified to speak to the full scope of the incident and provide educated analysis. Those who knew couldn’t talk; those who didn’t couldn’t be quiet.

So back to peer review then. Whether you’re writing source code, an incident report or publishing the next great enterprise security framework, if you’re not working supremely hard to have your work peer reviewed you’re not doing it right. Peer review isn’t an opportunity for someone else to tear your work apart and make you feel small and stupid. In fact, it’s quite the opposite. It’s an opportunity to absorb the collective wisdom and experience of as many people as you can into your work. As long as you remember your attributions, the work becomes more brilliant and everyone involved grows. Peer review is about explaining your work, then listening to others talk themselves through it and hearing them find flaws. And if you’re like my team, people will find flaws and you’ll love it.

The fact is, in security there are very, very few things that are binary - right or wrong. There are shades of tolerable and risky. What works over there will catastrophically fail over here. Someone’s advice on one matter may be completely useless on another. It’s a little overwhelming the first few times you subject yourself and your work to this process. Think about it this way…

You spend five hundred hours researching and writing a model for a particular type of risk approach. You consider edge cases, you develop good core cases and you extrapolate to the edges and back. You test against situations you know of and everything feels good. It works. Then you put your model in front of a dozen people who each have a slightly different opinion than yours. Whether it’s based on situation, training, experience or just age, it’s different enough for them to notice. So your binary 1 becomes 12 slightly different shades between 1 and 0. Now what? The right approach is to run your model again, consider the peer review input and understand where your model has to stretch versus where it completely fails. Where it fails, you have to decide whether the failure is big enough to adjust your model for that case. The answer isn’t always yes.

Peer review is powerful because it tests your understanding, talent and intelligence. But it also tests your personal fortitude and integrity. You put yourself out there for criticism, and you have to understand that not everyone is good at being positive about their critique. Take it all in stride - it’s not personal. Incorporate what you feel is appropriate into your work, rationalize and explain the rest. Now your work is that much stronger.

The me from earlier in my career - if he had held a peer review session at all - would have invited only people who agreed with my line of thinking or shared my background and experience. The me that’s grown out of that finds as many contrarians and differing experiences and opinions as possible to throw darts at my work. My thinking is this: the work I produce will either be shown to be brilliant through that peer review process, or it’ll accumulate the collective knowledge of those who found flaws in it and be brilliant as a result. The net is that the work ends up better than if it were just me patting myself on the back.

Peer review, folks. Peer review everything you can. Find people who disagree… who have differing experience and vastly different situations than you. Test your models, your theories and your ideas on as wide a variety of minds as possible. If enterprise information security is to make strides in a positive direction, we need fewer self-aggrandizing “experts” and more people who can aggregate and distill tribal knowledge into wisdom.