There was one big problem: Facebook’s trending algorithms, which identify the most-talked-about terms, were not very good at discerning what was and was not news. Left to their own devices, roughly 40 percent of what Facebook’s algorithms dug up would be junk or “noise,” a result of many people using the same word at the same time across the network. The algorithm might pick up a sharp rise in the word “Skittles” and deem it a trending topic — not exactly the events Facebook had in mind.
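Facebook has never published the mechanics of its trend detection, but the failure mode described here is characteristic of simple burst detection: a term is flagged when its recent mention count spikes far above its baseline, with no notion of whether the spike is news. The sketch below is a hypothetical illustration of that idea, not Facebook's algorithm; the function name, thresholds and counts are invented for the example.

```python
def detect_trending(recent_counts, baseline_counts, spike_ratio=5.0, min_count=1000):
    """Flag terms whose recent mention count far exceeds their historical baseline.

    A burst detector like this has no notion of newsworthiness, so a sudden
    flood of posts containing "Skittles" scores exactly like a breaking story.
    """
    trending = []
    for term, count in recent_counts.items():
        baseline = baseline_counts.get(term, 1)  # avoid division by zero for new terms
        if count >= min_count and count / baseline >= spike_ratio:
            trending.append((term, count / baseline))
    # Highest spikes first; a human still has to decide which of these are news.
    return sorted(trending, key=lambda item: item[1], reverse=True)
```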

That is where humans came in. Facebook enlisted a set of 20-somethings as curators, copy editors and team leads, charged with sifting through the material the algorithms unearthed. They were crucial, they were told, to improving Facebook’s ability to discern, over time, what constitutes news.

“Even if you want to have computers do everything, for technical reasons, resource limitations and product positioning, you may want humans to oversee the algorithms,” Jonathan Koren, a former Facebook employee who worked on algorithmic ranking for Trending Topics, wrote in a LinkedIn post this week.

Mr. Fearnow, who was terminated from Facebook in April for breaking his nondisclosure agreement, said his job as a curator was to “massage the algorithm.” Managers were ambivalent about allowing staff members to identify themselves as curators or editors on their LinkedIn profiles, he said, given concerns that outsiders would notice the element of human judgment and ask questions about it.

Facebook declined to comment, citing employee confidentiality.

During each eight-hour shift, curators worked through a continuously refreshing list of trending terms, identifying each as junk or relevant and drafting descriptions for the relevant topics. After labeling a topic and checking whether it had been independently reported by a number of major news outlets, curators assigned it a value that made it more or less likely to show up on individual users’ pages. Each user saw a different, personalized list of topics based on their past actions on Facebook.
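The article does not describe the curators’ internal tooling, but the workflow it outlines, which involves labeling a detected term, confirming independent coverage and assigning a weight that feeds into personalized ranking, maps onto a simple review record. The sketch below is a hypothetical reconstruction under those assumptions; the field names and the three-outlet corroboration threshold are illustrative, not reported details.

```python
from dataclasses import dataclass

@dataclass
class ReviewedTopic:
    term: str
    label: str            # "junk" or "relevant"
    description: str      # short blurb drafted by the curator
    corroborated: bool    # independently reported by major news outlets?
    weight: float         # raises or lowers the odds of surfacing on a user's page

def review(term, is_news, description, corroborating_outlets, weight):
    """One curator pass over a single detected term."""
    label = "relevant" if is_news else "junk"
    corroborated = len(corroborating_outlets) >= 3  # assumed threshold for illustration
    return ReviewedTopic(term, label, description, corroborated,
                         weight if label == "relevant" else 0.0)
```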

In Facebook’s editorial guidelines, curators were also told to “blacklist,” or push aside, junk topics that appeared in their queue; a blacklisted topic was suppressed for eight to 24 hours before it could potentially appear again, according to current and former employees. When duplicate or confusing topics arose, curators were told to “inject” a more accurate topic term. Copy editors and team leads would also oversee and approve the curators’ choices.
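Those guidelines are described only in prose, but the two operations they name, “blacklist” with an eight-to-24-hour window and “inject,” can be rendered as a small queue interface. The sketch below is a hypothetical rendering of that description; the class, method names and timing logic are assumptions made for the example.

```python
import time

class CuratorQueue:
    """Hypothetical rendering of the 'blacklist' and 'inject' operations."""

    def __init__(self):
        self._blacklist = {}  # term -> timestamp when suppression expires
        self._pending = []    # terms awaiting curator review

    def blacklist(self, term, hours=8):
        """Push a junk topic aside for 8 to 24 hours before it may resurface."""
        hours = min(max(hours, 8), 24)
        self._blacklist[term] = time.time() + hours * 3600

    def inject(self, term):
        """Add a more accurate topic term in place of a duplicate or confusing one."""
        if term not in self._pending and not self.is_suppressed(term):
            self._pending.append(term)

    def is_suppressed(self, term):
        expiry = self._blacklist.get(term)
        return expiry is not None and time.time() < expiry
```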

The work was monotonous — and not entirely gratifying. Workers were incentivized to compete against one another to clear the most trends from their queue, former employees said. Top performers were given “points” that could be spent on Facebook paraphernalia like T-shirts.