A couple of months ago, Laura Harper, a 44-year-old freelance writer and editor from Houston, Texas, got upset while reading a Jezebel story about a service called "Invisible Boyfriend."


The story described a new service from a St. Louis-based start-up that, for $25 a month, would give single people the appearance of being in a relationship by sending them texts from a lover "whose name, physical appearance, hobbies, and personality are distinctly curated by you." Like most of the reporters who signed up for the app, Jezebel writer Ellie Shechet initially assumed her new bae was sending automated replies to her texts, in the manner of Siri. But a few texts in, she realized that "Albert" wasn't a bot but a real person, and she decided she wanted to aggravate him into dumping her, sending him texts like, "YOU ARE GARBAGE." "Albert" responded with flustered yet affectionate texts, leading Shechet to describe him as "dumb," "boring" and a "dullard."

Harper was offended, because she had been Shechet's "boyfriend," at least for part of the conversation. She is one of 600 crowdsourced workers who play the role of boyfriend (or girlfriend) for the 7,000 significant others that have been created on the Invisible Boyfriend/Invisible Girlfriend platform. "I recognized the texts. I was, like, wait a minute, that's not how it went down," said Harper, a widow who prefers playing a boyfriend on the service because she knows what women want to hear. She almost commented on the article to defend "Albert," but Invisible Boyfriend discourages "breaking character," so she refrained.


Invisible Boyfriend is one of the many new online services that take advantage of the cheap, distributed labor forces provided by platforms like Crowdsource and Mechanical Turk. These companies provide what's called "workforce-as-a-service" to some of the largest companies in the world; Crowdsource, for instance, counts Target, Orbitz, and Coca-Cola as its clients. For a fee, Crowdsource allows these companies to tap an on-demand labor pool that consists of more than 8 million freelancers in 180 countries.

When you send a text to Invisible Boyfriend, your message is routed to one of a small army of workers using one of these platforms. When a worker accepts the task of responding to you, he or she is given your first name, city, gender, hobbies, a description of how you "met," and the last five texts in the conversation. This setup is designed to create the illusion of continuity; ideally, an Invisible Boyfriend would seem like a steady, stable presence in a user's life, even though it's actually a role being performed by a rotating cast of men and women.
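The hand-off described above can be sketched roughly like this. The field names and structure are illustrative only, not Invisible Boyfriend's actual schema; the point is the limited, rolling context window that lets a rotating cast of workers impersonate one steady persona:

```python
from dataclasses import dataclass


@dataclass
class WorkerTask:
    """The context packet handed to whichever worker accepts the next reply.

    Hypothetical fields, mirroring what the article says workers see:
    first name, city, gender, hobbies, the "how we met" story, and
    only the last five texts of the conversation.
    """
    first_name: str
    city: str
    gender: str
    hobbies: list
    how_we_met: str
    recent_texts: list


def build_task(profile: dict, conversation: list, window: int = 5) -> WorkerTask:
    """Assemble a worker's view of the relationship (illustrative sketch).

    The five-text window is what creates the illusion of continuity:
    each worker sees just enough history to stay in character, and no more.
    """
    return WorkerTask(
        first_name=profile["first_name"],
        city=profile["city"],
        gender=profile["gender"],
        hobbies=profile["hobbies"],
        how_we_met=profile["how_we_met"],
        recent_texts=conversation[-window:],  # only the most recent messages
    )
```

Anything older than those five messages is simply invisible to the next worker, which is why details like a pet's name have to be passed around out-of-band (as Harper describes later in this story).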

Invisible Boyfriend co-founder Kyle Tabor said that the company had originally planned to use bots to power its service, but that the software it tested often generated weird, nonsensical responses to users' messages. "You would text, 'I'm at the movies' and it would respond, 'I love dogs,'" said Tabor. Crowdsource, which is also based in St. Louis, approached them with the idea of getting humans to answer texts instead. The Invisible team handed Crowdsource requirements for the kinds of texts workers should send to users (uplifting, romantic but not sexual) and gave the green light. "We tested it out and it felt real," said Tabor. "There’s a personality to it and there’s emotion. It feels like a real person even though it's hundreds of people."


Tabor said that he has never met or talked to any of the 600 outsourced workers who power his service, and doesn't know how much they get paid. (The going rate is 5 cents per message sent, according to a post on TurkerNation, a Mechanical Turk forum.) "It’s interesting that we have hundreds of potential employees or subcontractors," Tabor said. "But I don’t think of it that way. I think of it as a partnership with Crowdsource."

Users have gotten surprisingly intimate with their outsourced pen pals. According to Tabor, Invisible Boyfriend has been especially popular with users who want to practice textual flirting, tell their secrets to a virtual sounding board, and get emotional support and companionship. One reporter noted that her Invisible Boyfriend was the most reliable person in her life—"the only number in my phone I can text and get a guaranteed response, one that is invariably affectionate." "A cancer patient reached out to us and told us how important it is for her," said Tabor. "Her boyfriend dumped her in the middle of chemo. The messages help her get through the day."


Invisible Boyfriend's mechanics are clear to the Crowdsource workers who power the service; they're less clear to the users sending the messages. The crowdsourced nature of Invisible Boyfriend's backend isn't disclosed when you sign up for the service, meaning that users might not understand that the people they are flirting with, yelling at, or pouring their hearts out to are real, flesh-and-blood humans. (Tabor dismisses this possibility: "We don’t explicitly say that there are humans, but it’s obvious once you start using it…people figure it out.")


Services like Invisible Boyfriend can raise privacy issues, especially when sensitive information is being sent to a larger (and more human) audience than a user intended. Last month, a Reddit user wrote about a job that popped up on another of these crowdsourcing platforms, CrowdFlower; it involved listening to recordings of people talking to their phones to verify that the text transcription matched the audio. "Everything you've ever said to Siri/Cortana has been recorded…and I get to listen to it," wrote the Reddit user. Motherboard reporter Kaleigh Rogers confirmed the story—and even included examples of the recordings—though she only found explicit evidence of voice commands to Samsung phones, not iPhones. "While it may be within the legal limits of the companies to farm out these short, anonymous voice clips to strangers online, it’s certainly not a well-known practice," wrote Rogers.



CrowdFlower says this was a violation of the platform's "Terms of Service"; midway down the lengthy document, which users may or may not have read, term #20 forbids publishing confidential information. "All information within a job on the CrowdFlower platform should be considered as confidential even if the information is not expressly designated as confidential," it says. Crowdsource, which powers Invisible Boyfriend, has a similar clause in its ToS. "Any worker that signs up on our platform agrees to abide by our terms of service/privacy policy that includes all projects they work on are confidential," says a Crowdsource spokesperson.

The onus, though, is ultimately on the company using a crowdsourcing platform to make sure its users know what's happening. "Disclosing the mechanics of the system is important," said Walter S. Lasecki, a PhD candidate at the University of Rochester who has started researching privacy and security around crowdsourcing platforms. "Companies should tell users that actual people might see or hear these things, and do so explicitly. If it says in the privacy policy that what I say to Siri might be listened to by a real person, I missed that. If it said information might be provided to 'third parties' and that translated to 'random people on the Internet,' that seems unexpected."


Tabor says that Invisible Boyfriend keeps its customers anonymous. But given that the service provides first names and hometowns to its Crowdsource workers, and that users often volunteer other personal information (such as where they work), it wouldn't be hard for a curious worker to figure out the identities of their make-believe significant others. Harper, the worker who blanched at Jezebel's story, says she would never research an Invisible Boyfriend customer. "It’s a betrayal of trust," she said. "The company has told them they’re anonymous. Even if I thought, this guy sounds great, you know they’re not what they say they are."

In a sense, crowdsourced services like Invisible Boyfriend are based on an illusion; they're meant to mimic the efficiency and consistency of robots by using faceless, distributed human workers. As a result, we often treat the people on them like robots, in the sense that we believe that we can speak to them freely without consequences or judgment, the same way we feel when we do a Google search. (The name of Amazon's crowdsourcing platform, Mechanical Turk, nods at this misdirection; it's named for "The Turk," a chess-playing automaton from the 1700s that everyone thought was a robot, but that was actually being operated by a human chess genius.) When you're talking to a Mechanical Turk worker over text message, you're much more likely to reveal sensitive information than you would to a stranger sitting next to you.


Lasecki expressed concern that crowdsourced workers might have a looser grasp of the legal restrictions on that information. "They probably don’t have the same [non-disclosure agreement] as an employee who’s full time," he said. "Most people want to behave ethically on these platforms and do good work. But it’s not a full-time employee where you know their name." Last year, Lasecki and two Microsoft researchers ran an ethics experiment on Amazon's Mechanical Turk: they paid workers five cents to transcribe credit card information from a photo of a credit card. They wanted to see how many workers would be willing to do a task that appeared to be against the law. They found that fewer than 30 percent of workers were willing to do the task, compared with 75 percent who were willing to transcribe a photo of an innocent string of numbers. That's good news for people who worry that a distributed worker will go rogue and abuse their personal data. (Because the credit card numbers were fake, the researchers couldn't run what would have been an even more revealing test: whether the workers tried to use any of the credit cards.)

"There’s a hesitation to complete tasks that don’t feel right," said Lasecki. "That’s reassuring. We can’t give people any task and they’ll do it. Otherwise crowdsourcing would be very dangerous."


In the future, the danger associated with crowdsourced services will rest, in part, on the integrity of people like Harper, who sends between 10 and 20 texts a night to her Invisible Boyfriend clients in addition to taking on other crowdsourced tasks like copy-editing and writing product descriptions for e-commerce sites. She likes the spontaneity and variety of being an Invisible Boyfriend, and the way it allows her to try on a different personality every few minutes. She also hangs out with her fellow Crowdsource workers on an online forum, where they pass around intel about their clients. "I'll give a heads up on the board that if they talk to a Joe in North Dakota, I told him I had a dog named Teddy," she said.

There are poignant moments, she says, when her heart aches for someone who is obviously lonely. Invisible Boyfriend users have talked to her about the death of a child or a parent. "I really hope they have someone in their real life to talk to, too," she said. One user in Florida had just broken up with his girlfriend and was depressed. He offered up his e-mail address and begged to continue the conversation off-platform.


"I didn't e-mail him," she said. "You can't break character."

When Invisible Boyfriend first came out, the founders were reluctant to let reporters talk to the crowdsourced workers, said Harper. "It was all veiled and hush-hush. They didn’t want us talking about it at first because it would ruin the mystery," she said. "But everyone knows they’re talking to a real person. They should know they’re talking to a real person."