In our Love App-tually series, Mashable shines a light into the foggy world of online dating. It is cuffing season after all.

“At one point, the bot was having maybe 200 conversations at a time...I think Tinder knew this and they banned me, of course, from the platform.”

This is Robert Winters, a computer programmer in Belgium, who is just one of many people who’ve used scripts made by other programmers in order to game Tinder — even more than the app has already gamified dating.

The script learns your preferences once you feed it data, for example, your first 100 swipes on Tinder. Customizations can be added on as well, such as programming the bot to have conversations for you. Once it knows what you want, it can essentially use the apps for you. Winters used a program called Tinderbox, later called Bernie A.I., but there are many others — such as this Github file.

We just left the decade that gave rise to dating on our phones. We’ve endured the so-called dating apocalypse and created buzzwords for every iteration of being inconsiderate to the potential suitors we’ve met on apps. It’s no secret that the majority of couples meet online now, and that dating apps have shifted how we find love.

These facts alone have led some people to wring their hands and mourn the ways of olde, like meeting at church or through friends at work. But others have embraced this new path and opted to push it to an even greater extreme by using bots and AI to help them find their perfect match.

Decoding the code

When Winters decided to game the Tinder system, he used Tinderbox, created by developer Justin Long, as his starting point. Jeffrey Li, who is currently a data scientist at DoorDash, also used Long's source code to create his own Tinder Automation, which he made available to the public on Github. Li cited two reasons for developing the code in an interview with Mashable: He wanted to develop his data science skills, and he wanted to use them to solve a problem in his own life — in this case, online dating. He said he was bored on dating apps, and the time commitment they demanded was, in his words, annoying.

“I've talked to a lot of female friends who were on dating apps, it tends to get overwhelming for them,” he said. “However, on the other side of it, if a guy doesn't have a great profile, you tend to get crickets.” Li said he was in that camp — putting time into the app but not getting a return on that investment.


“The seed of it came from saying ‘Hey, I want to improve my dating life, however, how can I do that in the most lazy way possible?’” Li said.

To develop a solution, he needed to understand Tinder’s algorithm. The algorithm (or model) needs training data — it needs to know the user’s preferences. Since Li didn’t swipe right on many Tinder profiles, there wasn’t enough data. So to gather more, he scraped Google data and used images of women he found attractive to help the algorithm learn his preferences. At that point, the model was pickier than he was. “It would actually reject some of the profiles that I actually thought were okay,” he said.
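Li's actual model isn't reproduced here, but the workflow he describes (label past swipes, train a classifier, score new profiles) can be sketched in a few lines. Everything below is invented for illustration: the random vectors stand in for features a real image model would extract from profile photos.

```python
import numpy as np

# Illustrative sketch only: a "swipe right" preference model trained on
# labeled examples via plain logistic regression. In a real setup the
# features would come from an image model run on profile photos.

rng = np.random.default_rng(0)

def train_preference_model(features, labels, lr=0.1, epochs=500):
    """Fit logistic-regression weights via gradient descent on log-loss."""
    n, d = features.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(features @ w + b)))  # like-probability
        grad = p - labels                              # log-loss derivative
        w -= lr * features.T @ grad / n
        b -= lr * grad.mean()
    return w, b

def like_probability(w, b, profile):
    """Score a single profile feature vector between 0 and 1."""
    return 1.0 / (1.0 + np.exp(-(profile @ w + b)))

# 200 synthetic "past swipes": feature vectors plus like/pass labels.
X = rng.normal(size=(200, 8))
y = (X @ rng.normal(size=8) > 0).astype(float)

w, b = train_preference_model(X, y)
score = like_probability(w, b, X[0])
```

A model like this will simply mirror whatever is in its training set, which is exactly the bias concern raised later in this piece.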

The next step was to set up an automated message that he could change every time he got a match. Li programmed his bot to be a screening service, in a way: it would do the swiping, and he would do the talking. He set the bot to 100 swipes per day and estimated that it liked about 20 of them. Li caveated that he did not have “a good profile” at the time, so the match yield was low. He estimated that he got around five matches per week.
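The screening setup Li describes (a daily cap, a like rate of roughly one in five, one canned opener per match) might look something like the following sketch. `FakeTinderClient`, the profile dicts, and the opener are all invented for illustration; real unofficial API wrappers work differently and violate Tinder's terms of service.

```python
# Hypothetical sketch of the "screening service" workflow: the bot swipes
# up to a daily cap, likes profiles a scoring function approves, and
# queues one canned opener per match for the human to follow up on.

DAILY_SWIPE_CAP = 100
OPENER = "Hey! Your travel photos are great. Where was the last one taken?"

class FakeTinderClient:
    """Stand-in for an unofficial API wrapper; serves canned profiles."""
    def __init__(self, profiles):
        self._profiles = iter(profiles)

    def next_profile(self):
        return next(self._profiles, None)

    def like(self, profile):
        # In Li's experience only some likes became matches.
        return profile.get("matches_back", False)

    def send_message(self, profile, text):
        return f"to {profile['name']}: {text}"

def run_daily_swipes(client, score_fn, like_threshold=0.8):
    """Swipe up to the cap; like high-scoring profiles; message matches."""
    sent = []
    for _ in range(DAILY_SWIPE_CAP):
        profile = client.next_profile()
        if profile is None:
            break
        if score_fn(profile) >= like_threshold and client.like(profile):
            sent.append(client.send_message(profile, OPENER))
    return sent

# Every 10th fake profile "matches back"; a constant score likes them all.
profiles = [{"name": f"profile_{i}", "matches_back": i % 10 == 0}
            for i in range(150)]
messages = run_daily_swipes(FakeTinderClient(profiles), score_fn=lambda p: 0.9)
```

The human-in-the-loop split (bot swipes, person talks) is the design choice that distinguishes Li's version from Winters's.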

Li did not end up meeting anyone serious using the bot, and he said that was part of the reason he stopped using it.

Winters, however, picked up where Li’s idea left off and took it even further. He programmed the bot to do the talking for him via conversation trees: rudimentary chats that would go in one of two directions, depending on how the person on the other end responded. This is what ultimately led to Winters being kicked off of Tinder. (The app's spokesperson did not have a comment, and instead pointed me to its community guidelines.) Apps have not been happy when users have attempted to "hack" their APIs like this, and they're unlikely to change their view in the future.
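Winters hasn't published his trees, but a two-branch conversation tree of the kind he describes is easy to sketch: each node holds the bot's line plus the next node for a positive or a negative reply. The nodes and the keyword "sentiment" check below are deliberately crude inventions for illustration.

```python
# Toy conversation tree: each node maps to (bot_line, next_if_positive,
# next_if_negative). "handoff" is where the human would take over.

POSITIVE_WORDS = {"yes", "yeah", "sure", "good", "great", "sounds", "haha"}

TREE = {
    "start": ("Hey! How's your week going?", "going_well", "rough_week"),
    "going_well": ("Glad to hear it! Up for a coffee this weekend?",
                   "suggest_time", "handoff"),
    "rough_week": ("Sorry to hear that. Coffee might help; free this weekend?",
                   "suggest_time", "handoff"),
    "suggest_time": ("Great, how about Saturday afternoon?", "handoff", "handoff"),
    "handoff": (None, None, None),
}

def classify(reply):
    """Crude positivity check: any positive keyword appears in the reply."""
    return any(word in reply.lower() for word in POSITIVE_WORDS)

def step(node, reply):
    """Advance one level down the tree based on the other person's reply."""
    _, pos, neg = TREE[node]
    return pos if classify(reply) else neg

# Walk the tree against two scripted replies.
node, transcript = "start", []
for reply in ["pretty good, yeah", "sure, sounds fun"]:
    transcript.append(TREE[node][0])
    node = step(node, reply)
```

Even this toy version makes the disclosure problem concrete: the person replying has no way to tell they are talking to a lookup table.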

There’s a lot to unpack here

Using AI and bots to “hack” dating apps sounds like a Silicon Valley wet dream, and perhaps it is. But how bad is it from an ethical perspective? There are several concerns here. One is unconscious (or conscious!) bias; one is disclosure; and one is data security.

Bias is a problem that plagues the tech and AI space in general, not just dating apps. We’re only starting to skim the surface about how bias plays out in dating app algorithms, and trying to make the algorithm adhere to your preferences with a certain amount of accuracy seems...problematic, to say the least.

"Generally, machine learning has a lot of flaws and biases already in it," said Caroline Sinders, a machine learning designer and user researcher. "So I would be interested in seeing these guys' results, but I imagine that they probably ended up with a lot of white or Caucasian looking faces" — because that's how heavily biased AI is. She pointed to the work of Joy Buolamwini, whose work at MIT's Media Lab looks at how different facial recognition systems cannot recognize Black features.

Disclosure can also pose a problem. How would you feel knowing that the person you hit it off with on Tinder or Hinge actually had their bot do all the talking for them? Using dating apps, just like dating in general, requires some time commitment. That’s what drove Li to write his script in the first place. So how would someone feel if they took the time to spruce up their profile, to swipe or “like” or what have you, to craft a witty first message — all while the person they’re talking to is actually a bot?

Sinders also noted the potential security issues with collecting data in order to use these scripts. "As a user, I don't expect other users to take my data and use it off the platform in different ways in experimental technology projects generally, even art projects," she said.

It's also extra inappropriate, Sinders argued, because the data is being used to create machine learning. "It's a security and privacy, a consensual tech problem," she said. "Did users agree to be in that?"

The problems associated with using people's data this way can, according to Sinders, range from mundane to horrific. An example of the former would be seeing a photo of yourself online that you never intended to be online. An example of the latter would be misuse by a stalker or a perpetrator of domestic violence.

A few more concerns

Dating apps may seem like a boon to people with social anxiety, as they remove a lot of IRL pressure. According to Kathryn D. Coduto, a PhD candidate at The Ohio State University researching the intersection of tech and interpersonal communication, however, this view of apps may be fraught. Coduto is co-author of the paper “Swiping for trouble: Problematic dating application use among psychosocially distraught individuals and the paths to negative outcomes,” which examines how apps could potentially be harmful to some users’ mental health.

Apps can let someone with anxiety feel more control over their dating prowess — they choose how they present themselves, with their photo and bio and the like. But what happens when using apps is as fruitless as trying to meet people in real life? “If you're still not getting matches, it probably hurts worse,” Coduto said.

Coduto studied Li’s Github file and wondered if anxiety might have played into its creation. “The idea of, ‘I haven't really been getting matches I want so I'm going to make an entire system that searches for me and then if it doesn't work, like it's not on me,’” she said.

“That's a scary thing that could happen with these with dating apps, the reduction of people to data,” Coduto said. “The big thing with [Li’s] GitHub is that these people are data points that you may or may not be attracted to. And the fact that it’s even set to say like, ‘oh, here's a percentage match, like how likely you'll like them.’”

Screenshot of Li's Github script description. Image: Jeffrey Li

“Feels a little skeezy,” said Coduto.

She was also uneasy about the idea that the “perfect partner” exists — and that you can simply find them with AI. If you want your partner to look exactly like Scarlett Johansson, why not use her image to teach your bot that exact preference? “If you're building this up and not finding it and you start to feel bad about yourself,” Coduto said, “Well then make a bot do it and maybe it feels better.”

A different kind of bot

Shane Mac, entrepreneur and co-founder of conversational platform Assist, had to grapple with that question when using a bot he created. Like Li’s code, his bot can be used with dating apps, but it works entirely differently. Mac described his creation on tech reporter Laurie Segall’s podcast First Contact, and subsequently in an interview with Mashable.

Mac implemented a different approach entirely, free of a bot learning preferences with photos of Scarlett Johansson. “What if it's more about the thoughts and the words and the language rather than the looks?” Mac said.

To him, the crux of dating apps was not photos but conversations. Getting a match is one thing, but everything after the swiping is conversational: the first message, perhaps based on a photo or bio, and then messaging back and forth. It was all about language.

What’s more, dating apps have begun to swing away from the model of endless swiping into the void. Hinge, which declined to comment on this story, is at the forefront of this shift — and Mac used his keyboard with Hinge — but more such apps are starting to show up on the App Store and in conversations. One example is Bounce, an app that only allows swiping for 15 minutes at a time, and requires you to be available for a date that night.

Mac did not develop a code to implement with a dating app to do the swiping for him. Rather, he developed a keyboard that one could install on their iPhone. Think of another language keyboard or the Bitmoji keyboard; you just toggle to it when typing. “I do believe it's inevitable that everyone has an assistant helping them write,” he said. “It's already happening. It's in Gmail. It's in Grammarly.”
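Mac's keyboard isn't open source, so the sketch below only illustrates the shape of the idea: suggest candidate replies rather than auto-send them, and let the user pick and edit. The keyword rules and canned replies are invented; a real assistant of this kind would use a trained language model, as Gmail's Smart Compose does.

```python
# Toy suggestion keyboard: keyword rules map an incoming message to a few
# candidate replies the user can pick from. Nothing is sent automatically.

RULES = [
    ({"hike", "hiking", "trail"},
     "That trail looks amazing. What's your favorite hike around here?"),
    ({"dog", "puppy", "cat"},
     "Okay, your pet officially wins this conversation. What's their name?"),
    ({"travel", "trip", "flight"},
     "Where was the best trip you've taken so far?"),
]

FALLBACK = "Ha, tell me more. I'm intrigued."

def suggest_replies(incoming_message, max_suggestions=3):
    """Return up to max_suggestions candidate replies for the user to edit."""
    words = set(incoming_message.lower().split())
    suggestions = [reply for keywords, reply in RULES if keywords & words]
    return (suggestions or [FALLBACK])[:max_suggestions]
```

Because the human still chooses and sends each message, this design sidesteps some, though not all, of the disclosure problems raised by fully automated bots.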


Mac said the problem with apps is not that there need to be more connections, which is what Li’s Github script and others like it set out to create. The solution is actually fewer — more finely tuned — connections. “I don't want to be on an inbox of 50 people talking and sending more messages,” he said. “I want three people that are great that I can talk to.”

But do those three people want a bot talking to them? Mac said that he leads with it, because it’s a conversation starter, but he did describe one instance where his date was offended by the concept. It was a second date with a woman working at a major dating app — but they met through friends — and when he showed her the keyboard, she was so put off that she walked out.

“She was so offended, and so mad at me,” he said. According to Mac, she said that the bot would be used to manipulate people. They never spoke again.

In Winters’s case, one woman he met through his bot thought it was an interesting concept, and it actually excited her. “She was very cool about it, but I can imagine that some people would be offended,” he said.

The future of dating with AI

To Mac, it’s bots like his keyboard, and not scripts like Li’s, that are the future of dating. “Don't even think of it as a bot,” he said. “Think of it as your friend who's your concierge who's going to find you a date. Right? That's the future.”

Looking forward, concierge bots will help us find love — and, more broadly, solve our problems in general. At least according to Mac. “It's going to be a more concierge-like thing that is helping you probably be a better version of yourself but then that helps you match better with someone else,” he said.

This leads to even more questions in terms of disclosure and ethics. If everyone has a bot, when is it disclosed that they’re the ones talking to each other? What are the ethics of bot to bot conversations? We obviously don’t have the answers to those questions yet, but these will be front of mind if — or when — this technology is further implemented.


Sinders said that in her view, all bots should be disclosed, but one like Mac's could be very helpful, especially for people who are shy or have anxiety. "It can be difficult to start a conversation," she said. "So having a keyboard that gives someone prompts I don't really have a problem with."

There is also the concern that this tech could end up like Tay, Microsoft’s bot that Twitter users taught to spew racist language in less than a day. But, Mac explained, that is not quite the future of machine learning he envisions. Rather, the bot will teach the human to be more empathetic, more curious — just a better person in general. He envisions the bot having a filter, telling its users what is helpful versus harmful.

Shortly before Mashable’s interview with Mac, he tweeted that he’s looking for someone to take over the keyboard. He expanded on this by saying he does not have time to maintain it himself, and wants someone passionate about dating to take it over. “I have the technological expertise to help them,” he said.

He reportedly has already gotten interest, so perhaps we will be using a keyboard to date sooner than we’d think.

So should we bet on swiping bots for now?

Coduto did not want to “demonize” bots completely, but among the people she’s interviewed she has found a hesitation to look for love on apps — despite the stigma around online dating decreasing — and technology like this may only perpetuate it. “There's still a yearning for really natural connections, meeting through friends, meeting through your environment,” she said.

Coduto said that, going off of her research, most people are not ready for a dating app landscape like Li’s or Winters’s. “We're in a time where people are very romantic, whether or not they want to admit it,” she said. While she said that Mac’s bot seems closer to what people would be comfortable using, there is still resistance to handing over your romantic fate to a machine.


Li himself was admittedly bearish about this technology; it’s why he did not program the bot to speak for him. “I actually wanted to see if there was a connection myself,” he said. “And it's a little bit hard for an AI to really gauge how well that connection is.”

He did comment, however, that this could change in the future with further developments in AI.

Winters and his girlfriend. Image: Courtesy of Robert Winters

Winters’ story has a more rom-com-ready ending. His bot actually matched with and spoke to his current girlfriend on Tinder — before he was banned. But that’s not how they actually started dating: they met in person, at a party, a few months later.

UPDATE: Feb. 5, 2020, 1:34 p.m. EST A previous version of this article stated that Winters used source code created by Jeffrey Li. He actually used code created by Justin Long, as did Li. The article has been edited to reflect the correction.