Facebook declined to comment on Ms. Pelosi’s remarks, but it has defended its decision not to remove the videos. The company does not require posts to be true, though it sometimes slows the spread of posts that may be false, such as the altered video of Ms. Pelosi.

“Once the video was fact-checked as false, we dramatically reduced its distribution,” the company has said about the videos. Facebook has tools that allow it to limit how widely certain content appears in its News Feed, the central stream of scrolling information on a user’s page.

“Speed is critical to this system, and we continue to improve our response,” the company has said since shortly after the videos surfaced. “People who see the video in feed, try to share it from feed, or already shared it are alerted that it’s false.”

Facebook also said the altered videos of Ms. Pelosi now run with a fact-check box that labels the content as false.

Ms. Pelosi’s office declined to comment beyond the interview on KQED.

Social networks have struggled to devise consistent and clear rules about the content they allow on their sites. Facebook, Twitter and YouTube have grappled with hate speech, for instance, and were slow to take down the accounts of Alex Jones of Infowars, who spreads conspiracy theories. Facebook’s chief executive, Mark Zuckerberg, has acknowledged that the tech industry cannot solve its problems alone and has called for new regulations.

On Wednesday, Robert S. Mueller III, the special counsel, said his two-year investigation had found that foreign actors used disinformation on social media, among other tactics, to damage Hillary Clinton’s presidential campaign.