As it has become more influential, Facebook has taken pains to say that it is not an echo chamber of similar opinions. In a peer-reviewed study published last year, Facebook’s data scientists analyzed how 10.1 million of the most partisan American users on the social network navigated the site over a six-month period. They found that people’s networks of friends and the articles they saw were skewed toward their ideological preferences — but that the effect was more limited than the worst case some theorists had predicted, in which people would see almost no information from the other side.

Yet Gizmodo’s report raises questions about the effects that Facebook’s staff members and their biases — even unconscious ones — have on the social network.

While Facebook has pledged to sponsor both the Democratic and Republican national conventions, the company’s top executives have not been shy about expressing where their political sympathies lie.

At a Facebook conference in April, Mark Zuckerberg, the company’s chief executive, warned of “fearful voices building walls,” in reference to Donald Trump, the probable Republican presidential candidate.

The allegations against Facebook also put a spotlight on how it chooses which news articles to show users under the trending feature — on desktop computers, "trending" appears on the right side of the screen; on cellphones, it appears when users search.

Facebook has long described its trending feature as largely automatic. “The topics you see are based on a number of factors including engagement, timeliness, pages you’ve liked and your location,” according to a description on Facebook’s site.

The trending feature is curated by a team of contract employees, according to two former Facebook employees who worked on it and who spoke on the condition of anonymity because of nondisclosure agreements. They said they considered themselves part of a newsroom-like operation, in which editorial discretion was an integral part of the process.