Facebook isn’t the only organization conducting research into how attitudes are affected by social media. The Department of Defense has invested millions of dollars over the past few years investigating social media, social networks, and how information spreads across them. While Facebook and Cornell University researchers manipulated what individuals saw in their social media streams, military-funded research—including projects under the Defense Advanced Research Projects Agency's (DARPA) Social Media in Strategic Communication (SMISC) program—has looked primarily at how messages from influential members of social networks propagate.

One study, funded by the Air Force Research Laboratory (AFRL), has gone a step further. “A less investigated problem is once you’ve identified the network, how do you manipulate it toward an end,” said Warren Dixon, who holds a Ph.D. in electrical and computer engineering and directs the University of Florida’s Nonlinear Controls and Robotics (NCR) research group. Dixon was the principal investigator on the AFRL-funded project, which published its findings in February in a paper entitled “Containment Control for a Social Network with State-Dependent Connectivity.”

The research demonstrates that the mathematical principles used to control groups of autonomous robots can be applied to social networks in order to control human behavior. If properly calibrated, the mathematical models developed by Dixon and his fellow researchers could be used to sway the opinions of a social network’s members toward a desired set of behaviors—perhaps in concert with some of the social media “effects” cyber-weaponry developed by the NSA and its British counterpart, GCHQ.

Soft Info-warfare

Military social network research has been ongoing over the past decade as part of the DOD’s efforts to leverage “open source intelligence” and use social network analysis (using mobile phone records, other electronic relationships, and physical-world relationships) to target manufacturers of improvised explosive devices and leaders of insurgent cells. Along the way, the research has shifted more toward “hearts and minds” goals than “search and destroy” ones.

DARPA launched its SMISC program in 2011 to examine ways social networks could be used for propaganda and for what broadly falls under the euphemistic title of Military Information Support Operations (MISO), formerly known as psychological operations. Early in July, DARPA published a list of research projects funded by the SMISC program. They included studies that analyzed the Twitter followings of Lady Gaga and Justin Bieber, among others; investigations into the spread of Internet memes; a study by the Georgia Tech Research Institute into automatically identifying deceptive content in social media with linguistic cues; and "Modeling User Attitude toward Controversial Topics in Online Social Media”—an IBM Research study that tapped into Twitter feeds to track responses to topics like “fracking” for natural gas.

The AFRL-sponsored research by Dixon, Zhen Kan, and Justin Klotz of the University of Florida’s NCR group and Eduardo L. Pasiliao of AFRL’s Munitions Directorate at Eglin Air Force Base was prompted by a meeting Dixon attended while preparing a “think piece” for the Defense Science Study Group. “I heard a presentation by a computer scientist about examining behaviors of people based on social data. The language that was being used to mathematically describe the interactions [between people and products] was the same language we use in controlling groups of autonomous vehicles.”

The social drone graph

That language was graph theory—the mathematical language that underpins Facebook’s Graph database and the “entity” databases at the heart of Google’s and Bing’s understanding of the context around searches. It has also become a fundamental part of control systems for directing swarms of autonomous robots. The parallel inspired Dixon to investigate further, he said. “Can you apply the same math from controlling autonomous vehicles to groups of people?”
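That shared language can be made concrete. Whether the nodes are autonomous vehicles holding a formation or people exchanging opinions, the coupling between them is captured by the same object: the graph Laplacian. The sketch below is purely illustrative (the four-node ring graph is invented for the example, not taken from the paper); it builds a Laplacian and checks its zero eigenvalue, which in consensus dynamics of the form x' = -Lx corresponds to the agreement state the whole group converges to.

```python
import numpy as np

# Hypothetical four-node influence graph (a simple undirected ring).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4

A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0  # symmetric coupling between neighbors

D = np.diag(A.sum(axis=1))  # degree matrix
L = D - A                   # graph Laplacian

# For a connected graph, L has exactly one zero eigenvalue; under the
# consensus dynamics x' = -L x, all node states converge to agreement.
eigvals = np.sort(np.linalg.eigvalsh(L))
print(eigvals)
```

The same matrix algebra applies unchanged whether the edge list describes radio links between drones or follow relationships between people, which is the observation that sparked Dixon's project.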

Dixon’s group had been doing other work for AFRL around robotics, and when he mentioned the idea to a contact there, he was connected with researchers within the same group at AFRL who were interested in social media topics. With funding in hand, the research team worked to model how collaboration between “key influencers” in social networks could affect the behavior of groups within the network by using the principle of “containment control.”

Dixon explained the concept this way: “There’s a group of leaders, each of which has their own objectives, and they have their own topic of emphasis. The goal is to have those people change the opinion or coerce the group of followers—people [who are] in the social group of these people but don’t know the high level objective.”
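The paper's full model involves continuous-time dynamics and state-dependent connectivity; as a much simpler illustration of the containment idea Dixon describes, the sketch below (the graph, opinion values, and gain are all invented for the example) holds two "leader" opinions fixed while "followers" repeatedly move toward the average of their neighbors. A standard containment-control result is that the followers converge into the convex hull of the leaders' states, here the interval between 0.0 and 1.0.

```python
import numpy as np

# Hypothetical 5-node influence graph: nodes 0-1 are "leaders" with fixed
# opinions; nodes 2-4 are "followers". A[i, j] = 1 means i listens to j.
A = np.array([
    [0, 0, 0, 0, 0],  # leader 0: influenced by no one
    [0, 0, 0, 0, 0],  # leader 1: influenced by no one
    [1, 0, 0, 1, 0],  # follower 2 listens to leader 0 and follower 3
    [0, 1, 1, 0, 1],  # follower 3 listens to leader 1, followers 2 and 4
    [0, 0, 0, 1, 0],  # follower 4 listens to follower 3
], dtype=float)

x = np.array([0.0, 1.0, 5.0, -3.0, 8.0])  # initial opinions, leaders first
gain = 0.2  # step size toward the neighborhood average

for _ in range(200):
    for i in range(2, 5):  # update followers only; leaders never move
        avg = A[i] @ x / A[i].sum()  # weighted average of i's neighbors
        x[i] += gain * (avg - x[i])

# Followers end up inside the convex hull of the leader opinions.
print(np.round(x, 3))
```

Even with followers starting far outside the leaders' range, every follower opinion is pulled between 0.0 and 1.0, which is the sense in which a small set of coordinated leaders can "contain" the opinions of the rest of the network.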

Using graph theory, Dixon and his fellow researchers created a model to quantify how much influence a social media “leader” needs in order to exert power and shift behavior. Dixon’s research, like many of the DARPA studies, did not include real-world experiments to confirm its findings—it was all simulation. And that’s a stumbling block for taking this work further, one that Cornell’s Social Media Lab researchers cleared with Facebook, creating an outcry in the process.

“The problem is, how do you perform a closed loop experiment? That’s something DARPA has struggled with,” said Dixon.

To that end, the SMISC program has pushed for experimental environments that use “closed” social networks. On the DARPA project page, the SMISC project team wrote, “SMISC researchers will create a closed and controlled environment where large amounts of data are collected, with experiments performed in support of development and testing. One example of such an environment might be a closed social media network of 2,000 to 5,000 people who have agreed to conduct social media-based activities in this network and agree to participate in required data collection and experiments. This network might be formed within a single organization, or span several. Another example might be a role-player game where use of social media is central to that game and where players have again agreed to participate in data collection and experiments.”

After initial reports linked the Facebook research to the military, DARPA published a fact sheet that not only denied a connection but also took the Facebook research to task.

“DARPA does not support research programs that aim to deceive unwitting people to see how they react (as the controversial Facebook study did). DARPA funds research on how groups form and influence each other and related dynamics—similar to social science research that has been conducted for decades with other kinds of communication,” a program representative wrote. “DARPA‐funded studies that have involved sending potentially deceptive information to see how people react have been conducted with closed groups of enrolled individuals who have volunteered/consented to be in social media studies. None of the social media data collected or analyzed by DARPA‐funded academic scientists is collected or saved by DARPA or the Government. Further, DARPA‐funded researchers must certify that no personally identifiable information (PII) is collected, stored or created in contravention to federal privacy laws, regulations or DOD policies, and SMISC researchers are not provided PII from any other government agency or outside source.”

Remote control

While DARPA’s research funding comes with specific ethical and legal constraints, other social media manipulation research is clearly being conducted without such oversight. As Ars reported earlier this week, documents provided to The Intercept’s Glenn Greenwald by former National Security Agency contractor Edward Snowden show that the British intelligence agency GCHQ has developed tools specifically aimed at leveraging social media for “effects” operations. The organization has shared its methods with the NSA, and these capabilities have been used in Afghanistan and elsewhere to “shape” the information available to members of targeted organizations online and via mobile phones.

The Defense Department already uses some social media manipulation techniques as part of its “information support” operations, targeting message boards and websites associated with foreign groups that it considers “extremist.” On at least one occasion, that has resulted in blowback in the United States. In 2012, a DOD contractor identified a website operated by a US citizen who had immigrated from Somalia as an “extremist” site associated with the Somali al-Shabab terrorist group, and the contractor recommended that DOD use it to conduct “messaging” campaigns to amplify anti-al-Shabab opinions.

Social media subversion doesn’t stop with DOD or the intelligence community. In April, the Associated Press uncovered a 2009 effort by the US Agency for International Development (USAID) to create ZunZuneo, a Twitter-like mobile social network aimed at undermining the Communist regime in Cuba. But the research DOD is financing has much broader cultural implications than its use as a tool of information warfare against adversaries abroad.

“People don’t like to think they’re manipulated,” Dixon said. “But every day we’re being manipulated—by advertisements, by government leaders, religious leaders, and even into going to work.” He believes the mathematical models could be applied to everything from marketing to workforce management. “We’re working because we’re getting paid for the most part, but how much do I have to pay someone to work? If I give them a sporadic bonus, can I pay them less overall? How do I get people to buy Levis or Wranglers?”