Early on, Growth identified a lot of low-hanging fruit to help Facebook’s numbers. One of them was search engine optimization (SEO), the practice of raising the visibility of content in Google search rankings.

In the previous year, 2007, Facebook had for the first time allowed its users’ profiles — or an abbreviated version of them — to appear in search results. But they weren’t appearing high in rankings, in part because they weren’t easily found within Facebook, and Google’s web crawlers would have to burrow deep into Facebook to find them. Schultz and Gleit put together a directory for Facebook that interlinked people’s profiles in a manner that was catnip for Google. It resulted in the profiles being ranked higher, and when people came across them, they could ask to friend those people right there in Google’s search results. It got Facebook some new users.

But the masterpiece of Growth is a feature that became almost as much a part of News Feed as weddings, vacations, and political outrage. It’s called People You May Know, referred to internally by the acronym PYMK. Officially launched in 2008, People You May Know is a feature that identifies prospective additions to one’s friend list. It wasn’t a Facebook invention — LinkedIn did it first — but PYMK proved to be one of Growth Circle’s most effective tools, and also one of its most controversial ones, a symbol of how the dark art of growth hacking can lead to unexpected consequences.

On its face PYMK seems innocuous enough: a carousel of profile pictures on Facebook presumably connected to you, but somehow not your Facebook friends. Its impetus was to address an imperative that the Growth team’s researchers had unearthed: a new Facebook user is likely to abandon the service if he or she doesn’t connect with seven new friends — fast.

Thus PYMK was essential for Facebook. Exposing potential friends is a way to improve a member’s experience; it increases the chances they will share more, and, most of all, it makes people less likely to bail on Facebook.

For many people, PYMK is a welcome feature: a helpful prompt to get in touch with connections who would help them get value from their Facebook experience. But sometimes PYMK can be unsettling, raising questions of what caused those cameo appearances on your News Feed by people whose connection to you was obscure, and sometimes downright unwelcome. A sex worker found Facebook recommending her clients, who did not know her true identity. A sperm donor got a suggestion for the biological child he never met. A psychiatrist learned that Facebook was recommending that some of her patients friend each other on the service. And millions of people went Ew! as Facebook suggested they develop relationships with friends of their children, spouses of their casual acquaintances, or disastrous blind dates of a decade ago.

Journalists who studied the feature — notably Gizmodo’s Kashmir Hill, who spent part of a year trying to get to the bottom of the mystery — were never able to get Facebook to divulge exactly how the product works. Hill unearthed the story of the woman who got a Facebook suggestion that she friend the mistress of her long-absent father. And Hill herself was stunned to find that someone on her own PYMK suggestions turned out to be a great-aunt she’d never met. Facebook did not provide her the information she requested on how it made this connection.

Later, Hill would also write about the psychiatrist who discovered that PYMK was suggesting that her patients make friend connections with each other — even though the psychiatrist did not friend her patients on Facebook. Once again, Facebook would not provide an explanation.

Neither would Facebook respond to Hill’s queries about whether PYMK’s instant suggestions for new users meant that it was storing data on people not signed up on Facebook, and making use of “shadow profiles” when someone joins. Years later, Mark Zuckerberg would testify in Congress that the company does not engage in that practice. It does keep some information on non-users, he said, but only for security purposes, to fight fake accounts. (Zuckerberg did not mention his early cogitations in the Book of Change about dark profiles.) In a more elaborate explanation provided later, Facebook said, “We do not create profiles for non-Facebook users,” though it also says it keeps certain data, like what device and operating system version a nonuser has, for things like “optimizing registration flow for the specific device” should someone decide to join.

But Palihapitiya now indicates that dark profiles did exist, and the Growth team took advantage of them. He says that Facebook would take out search ads on Google using the names of Facebook holdouts as keywords. The ads would link, he says, to those dark profiles of nonusers that supposedly do not exist. “You would search for your own name on the internet and you’d land on a dark profile on Facebook,” he says. “And then you’d be like well, fuck it, you’d fill it in and then PYMK would kick in and we would show you a bunch of your friends.”

Some of the mysteries of PYMK were addressed in a 2010 talk by Facebook data scientist and engineer Lars Backstrom. Reporting that the feature “accounts for a significant chunk of all friending on Facebook,” Backstrom went through the technical process of how Facebook chooses its suggestions. The most important hunting ground is the “friends of friends” region, according to the presentation. But that is a very large set.

The typical user has 40,000 friends of friends (FoFs), he said, and a power user with thousands of friends might have 800,000 FoFs. That’s where the other data comes in — to find signals like the number or closeness of mutual friends and mutual interests, along with “cheaply available data” to identify which ones are likely to cause someone to click when spotted in a PYMK list . . . As the data gets refined, Facebook uses machine learning to make the final suggestions.

Backstrom also revealed that one’s behavior on PYMK helped determine which suggestions Facebook would offer — and how often it would show you the list. Once Facebook determined you fell for the feature, it would keep coming back, stuffing your friend list with weak ties.
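The pipeline Backstrom outlines — gather the friends-of-friends pool, score candidates by signals like mutual-friend count, then hand the refined set to a learned ranker — can be sketched in miniature. This is a hypothetical illustration, not Facebook’s code: it uses mutual-friend count as the only signal, where the real system layered in many more, plus machine learning, and the function name `pymk_candidates` is invented for the example.

```python
from collections import Counter

def pymk_candidates(user, friends, top_n=5):
    """Rank friend-of-friend (FoF) candidates for `user`.

    `friends` maps each user to the set of that user's friends.
    A minimal sketch of the candidate-generation step: the only
    scoring signal here is mutual-friend count.
    """
    my_friends = friends[user]
    mutuals = Counter()
    for friend in my_friends:
        for fof in friends.get(friend, set()):
            # Skip the user themself and anyone already friended.
            if fof != user and fof not in my_friends:
                mutuals[fof] += 1  # each shared friend adds one mutual tie
    # Return the highest-scoring candidates, best first.
    return [name for name, _ in mutuals.most_common(top_n)]
```

Even this toy version shows why the FoF pool needs pruning: with 40,000 FoFs per typical user, only a handful with the strongest signals can ever be surfaced in the carousel.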

The Backstrom presentation omits any specific information about what data sources besides FoF analysis Facebook uses in the feature. To be sure, those sources have evolved steadily since Facebook introduced PYMK in 2008. It’s almost certain that Facebook watches your email and sees whom you are contacting. Probably your calendar as well, to see whom you’re meeting with. Other sources have indicated that if someone glanced at your profile, that act might increase the odds that the person might appear on your PYMK list. It’s doubtful that simply thinking of someone is enough to put that person on your PYMK lineup. It just seems that way.

As troubling as PYMK is, the scary thing is that it could have been worse. Facebook’s chief of privacy, Chris Kelly, says that he blocked the use of some questionable techniques that the Growth team had suggested. “There had to be some rules,” he says, declining to share the ideas he snuffed.

Other problems with PYMK are subtle but no less troubling. The early Facebook executive Dave Morin came to view PYMK as an insidious means of boosting retention numbers at the expense of a good user experience. Since a key goal of PYMK was to boost the value of Facebook for new users — making sure that they had enough friends to fill up their News Feed — the suggestions were tilted to help those newbies more than the people they friended. Particularly valuable to Facebook would be suggestions of users who posted promiscuously, because (as the “Feed Me” study proved) early exposure to super-active users will influence newcomers to share more throughout their Facebook life.

As Morin puts it, “When Facebook shows you people you should connect to, it can make a choice as to how that algorithm works. It can either show you people you’ll become closer to and who will make you happier if you add them to your world. Or it can show you people that are advantageous for Facebook, the system, to show you, because it increases Facebook’s value and wealth and it makes my system better.” He says that Facebook takes the latter course, benefiting itself at the expense of its users.

This might give the experienced user a worse experience. People view only a limited number of stories in the News Feed. Facebook would prioritize stories from your newer, weaker ties that it wanted to keep on the service. And you would see fewer things from people you did care about. “The system knew that if I said yes to you, you would become more engaged,” says Morin. “You’d be effectively stalking me because I’m like a person distant in your social graph who you want to know. It’s almost like watching a tabloid.” Morin says this semi-stalking factor “became the primary variable in PYMK.”

Some people pushed back on Palihapitiya on this issue, arguing that such behavior was not Facebook-ish. “He was basically like, Go fuck yourself, and he’d walk out of the meeting,” says Morin.

Zuckerberg defends PYMK, and the way he does it illuminates his thought process and product acumen. When I bring up the above conundrum to him, he gets very serious. “This gets to a really deep philosophical thing about how we run the product,” he says. He concedes that if users take the hint from PYMK and friend their weak ties, their experience might be somewhat degraded. But there is a more important issue at stake, he argues — the health of the network in general. “We don’t view your experience with the product as a single-player game,” he says. Yes, in the short run, some users might benefit more than others from PYMK friending. But, he contends, all users will benefit if everyone they know winds up on Facebook. We should think of PYMK as kind of a “community tax policy,” he says. Or a redistribution of wealth. “If you’re ramped up and having a good life, then you’re going to pay a little bit more in order to make sure that everyone else in the community can get ramped up. I actually think that that approach to building a community is part of why [we have] succeeded and is modeled in a lot of aspects of our society.”

Furthermore, Zuckerberg believes that by friending your weak ties — which includes people you hardly know — you become closer to them. Facebook might even violate the physics of social interaction by stretching the number of meaningful contacts that people can handle. “There’s this famous Dunbar’s number — humans have the capacity to maintain empathetic relationships with about 150 people,” he says. “I think Facebook extends that.”

In a social-science sense, that would be like surpassing the speed of light. But if anyone could do it, it would be Facebook’s Growth team.