Short answer: It didn't. Or more accurately, we'll never know if it did, because we don't really have a way of knowing what "Reddit" thinks, only what some people on Reddit seem to think.

Long answer: OK, let's back up. When the Boston Marathon bombing manhunt began, there was a Reddit forum (subreddit) devoted to finding the bombers. A lot of people had high hopes for this effort. But the main "suspect" to emerge out of Reddit was a guy named Sunil Tripathy, who had no relation whatsoever to the bombings. Meanwhile, in about the same amount of time, police found the real guys, Tamerlan and Dzhokhar Tsarnaev. If you're interested in the details of Reddit's epic fail, see here. (And more here.)

Which brings us to the question, which someone asked me on Twitter: why, exactly, did Reddit whiff so badly?

In recent decades, we've heard a lot about the "wisdom of crowds". James Surowiecki, who wrote an excellent book on the topic, mentions things like the stock market's identification of the reason for the Challenger disaster, or the ability of a group of non-experts to collectively outguess an expert on questions like "How many jelly beans are in this jar?" More recently, we've learned that prediction markets are more accurate than polls at predicting election outcomes, and that they beat sophisticated "expert" forecasts in many situations. Companies have experimented with internal prediction markets to tap the collective wisdom of their employees. In general, we have come to believe more and more in the ability of large groups of non-experts relative to the ability of small groups of experts.

Should that belief be challenged by the Sunil Tripathy fiasco?

Not necessarily. The key is that the "wisdom of crowds" may work very well in some cases, while in other cases it may give way to the "madness of mobs". We don't know exactly which case is which, but we do have a general idea of what sets them apart.
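The jelly-bean effect, by the way, is easy to simulate. Here's a minimal Python sketch (the jar count, noise level, and crowd size are all made-up numbers for illustration): each guesser makes an independent, noisy estimate, and the average of the crowd lands far closer to the truth than the typical individual does.

```python
import random

random.seed(0)
TRUE_COUNT = 850   # hypothetical number of jelly beans in the jar
NOISE = 200        # standard deviation of an individual's guessing error
CROWD = 1000       # number of independent guessers

# Each person estimates independently: the truth plus personal noise.
guesses = [TRUE_COUNT + random.gauss(0, NOISE) for _ in range(CROWD)]

crowd_estimate = sum(guesses) / CROWD
crowd_error = abs(crowd_estimate - TRUE_COUNT)

# How many individuals came closer to the truth than the crowd average did?
beat_crowd = sum(1 for g in guesses if abs(g - TRUE_COUNT) < crowd_error)

print(f"crowd estimate: {crowd_estimate:.0f} (error {crowd_error:.1f})")
print(f"individuals who beat the crowd: {beat_crowd} of {CROWD}")
```

Because independent errors cancel, the crowd's error shrinks like 1/√N, and typically only a few percent of individuals out-guess the average. Note that the whole trick depends on the errors being independent — which is exactly what breaks down in a herd.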
Surowiecki summarizes it well in his book, in fact. Basically, when the individuals in a crowd are diverse, when they form their beliefs independently, and when we have a method for aggregating those beliefs, crowds will perform very well. When the individuals in a crowd start paying attention to each other, however, diversity and independence break down, and crowds can pounce on the wrong answer.

We see this in finance experiments. A number of experiments, including classic work by Charles Plott, have established the ability of financial markets to aggregate the private information of diverse participants to arrive at the "right" price. However, other experiments, e.g. by Colin Camerer, have shown that when people pay attention to the actions of others instead of to their own private information, information can become "trapped", and markets can arrive at the wrong price. There are a number of different theoretical reasons why herd behavior might take over from efficient information aggregation; some of these explanations are "rational" and others are "irrational", but they all rely on individuals having some reason to ignore their private information and focus on what other people do.

You can definitely see the herding dynamic at work in the case of the Sunil Tripathy fiasco. A few guys started saying "It was Sunil Tripathy!" And a lot of other people on the subreddit started focusing on that name, and looking for information about Tripathy. The Tripathy idea was a wrong idea that was initially concentrated among a small group of individuals, who pushed that idea loudly and confidently. Meanwhile, a large number of people on the subreddit may have had small, weak pieces of information pointing to the Tsarnaev brothers. But since Reddit had no way of collecting and aggregating these dispersed small pieces of information, that information might have become "trapped", just like in a Colin Camerer experiment.

So let me return to the "short answer" at the beginning of the post.
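The "trapped information" story can be made concrete with a toy sequential-choice model in the spirit of the herding literature (this is my own minimal sketch, not Camerer's actual experimental design, and the 70% signal accuracy and crowd size are arbitrary). Each agent gets a private signal that is usually correct, but also sees everyone's earlier public choices — and once a couple of early agents happen to be wrong, everyone after them follows the herd and ignores their own information.

```python
import random

random.seed(1)

def run_cascade(n_agents=100, signal_accuracy=0.7, true_state=1):
    """Sequential agents each draw a private binary signal that is correct
    with probability signal_accuracy, observe all predecessors' public
    choices, and pick the state favored by (public choices + own signal)."""
    choices = []
    for _ in range(n_agents):
        signal = true_state if random.random() < signal_accuracy else 1 - true_state
        votes_for_1 = sum(choices) + signal
        votes_for_0 = len(choices) + 1 - votes_for_1
        if votes_for_1 > votes_for_0:
            choices.append(1)
        elif votes_for_0 > votes_for_1:
            choices.append(0)
        else:
            choices.append(signal)  # tie: fall back on own private signal
    return choices

# Count how often the crowd locks onto the WRONG answer.
trials = 1000
wrong_cascades = sum(
    1 for _ in range(trials) if sum(run_cascade()[-10:]) == 0
)
print(f"wrong cascades in {trials} trials: {wrong_cascades}")
```

Even though a large majority of the private signals point to the truth, in a meaningful fraction of runs (roughly a tenth to a fifth, at these parameters) the last agents all choose the wrong state, because two early wrong guesses are enough to make everyone after them rationally disregard their own signal. That's the subreddit in miniature.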
It's not really right to say that "Reddit" picked Sunil Tripathy. A few people on Reddit picked Tripathy, and their voices emerged loud and clear from the chaos, not because most people agreed with them, but because they were the loudest and most strident minority voice. So anyone paying attention to Reddit picked out a few shrill cries of "Tripathy!" rising above the cacophony, and concluded that this was Reddit's consensus verdict. Meanwhile, the attention of other Redditors was turned toward Tripathy, and they spent their time and effort evaluating the Tripathy hypothesis instead of generating alternative hypotheses. In other words, because it had no way of aggregating its members' dispersed private information, Reddit became less like a prediction market and more like a lynch mob.

Would Reddit have done better if people could have voted on who they thought did it? I doubt it, because the set of hypotheses was not properly mapped. In an election prediction market, you know the set of candidates. In a jellybean jar contest, you know the set of numbers of jellybeans that might be in the jar (i.e. the real line). But a "whodunit" poll can't list every human being as a potential culprit; it has to limit the choices to a few popular hypotheses. In Reddit's case, a poll would have included 1. Tripathy, and 2. Someone Else. Not very helpful. A prediction market would have suffered from the same problem.

So is there any hope for crowdsourcing terrorism investigations? I think that there already is such a method: the police tip hotline. Tips tend to be independent, since people usually don't know who else is calling in a tip. And in a high-profile case like a terrorist attack, people who call in tips tend to be fairly diverse, since so many different kinds of people are paying attention. Finally, police can tabulate the number of similar tips, which is a method of aggregation. So tip hotlines satisfy the loose, general criteria for the "wisdom of crowds" to overcome the "madness of mobs".
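The aggregation step, at least, is mechanically trivial — which is part of why hotlines work. Here's a toy sketch (the tips themselves are invented for illustration): tally the independent reports and see which description the most callers converge on.

```python
from collections import Counter

# Invented examples of independent tips phoned in to a hotline;
# each caller reports only what they personally saw.
tips = [
    "man in white cap with backpack",
    "loud argument downtown",
    "man in white cap with backpack",
    "man in black cap with backpack",
    "man in white cap with backpack",
    "suspicious van near the finish line",
    "man in white cap with backpack",
]

# Aggregation: count how many independent tips point the same way.
tally = Counter(tips)
for description, count in tally.most_common(3):
    print(f"{count}x {description}")
```

No single caller is reliable, but the count itself is the aggregate signal: independent observers converging on the same description is informative in a way that a loud minority repeating itself is not.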
I think it's no coincidence that in the Boston bombing case, a victim's tip ended up being hugely helpful to the police.

Anyway, it's worth pointing out that these criteria for "crowd wisdom" aren't clear-cut. How do you know how independent and diverse a crowd's members are? What is the optimal method of aggregating their beliefs? This is a large, important, open area of research. So have at it, smart people. Just don't pay too much attention to what others in the field are doing...