A naive chat-bot: The penultimate section of The Current contains two art pieces that seem only tangentially related to disinformation. The first, Baby Faith, is described as a “young and naive web-based chat-bot struggling to learn how to identify human emotion,” and consists of a chat window into which a reader can type answers to questions and get responses from (poorly designed) automated chat software. Baby Faith’s “difficulties in identifying emotion online mirror the challenges of emotionally-driven disinformation campaigns,” the description says. The second is a monologue from The Picture of Dorian Gray that has been adapted as a musical sonnet to “evoke the exhaustion in dealing with social media disinformation.” Clicking on it takes you to a separate website run by the Rhizome art collective, where a crowdsourced chorus sings the piece.

Manipulation: At the same time as it launched its new magazine, Jigsaw also released a tool called Assembler, designed to help journalists and fact-checkers determine whether images have been manipulated to create disinformation. The tool is a collection of several existing techniques for detecting common manipulation methods, such as changing image brightness and using copied pixels to cover something up. It also includes a detector that spots “deepfakes” that use an algorithm called StyleGAN to generate realistic imaginary faces. These detection techniques feed into a master model that tells users how likely it is that an image has been manipulated.
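The article does not describe how Assembler's master model actually fuses the individual detectors' outputs. As a minimal sketch, assuming the per-detector scores are combined with a simple weighted logistic model (an assumption for illustration, not Jigsaw's published design), the ensemble idea looks like this:

```python
import math

def combine_scores(scores, weights, bias):
    """Fuse per-detector manipulation scores (each in [0, 1]) into a
    single probability via a weighted logistic combination."""
    z = bias + sum(w * s for w, s in zip(weights, scores))
    return 1.0 / (1.0 + math.exp(-z))

# Three hypothetical detectors: brightness edits, copy-move (cloned
# pixels), and StyleGAN-generated faces. Scores and weights are made up;
# in a real system the weights would be learned from labeled examples.
detector_scores = [0.9, 0.2, 0.05]
weights = [2.0, 1.5, 3.0]
probability = combine_scores(detector_scores, weights, bias=-2.0)
print(f"Estimated probability of manipulation: {probability:.2f}")
```

The design point is that no single detector covers every manipulation method, so a combining model can report one overall likelihood even when the individual signals disagree.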

Labels: Twitter is experimenting with adding colored labels directly beneath lies and misinformation posted by politicians and public figures, according to a leak obtained by NBC News. Under this model, which Twitter said is just one possible iteration of a new policy aimed at fighting disinformation, misleading information posted by public figures would be corrected directly beneath the tweet by fact-checkers and journalists who are “verified,” and possibly other users who will participate in a new “community reports” feature.

Sources tell Bloomberg News that Donald Trump’s re-election campaign has bought the coveted advertising space at the top of the YouTube homepage for early November. The space can cost as much as $1 million per day.

The father of Alison Parker, a reporter who was shot to death on camera in 2015, has filed a complaint against YouTube with the Federal Trade Commission, alleging that the site puts the onus on users to notify it of violent content, and then often doesn’t remove it. The filing calls these practices “deceptively burdensome” and says Google “utterly fails” to follow through on promises to take down content. Parker was murdered on live television, along with her cameraman, Adam Ward, and, according to the complaint, video of the killing remains on YouTube.

Political ads are flooding into Hulu, Roku and other digital streaming services, which aren’t subject to the same regulations that cover television networks and cable broadcasters, the Washington Post reports. “Nothing requires these fast-growing digital providers to disclose whom these ads targeted and who viewed them,” the Post finds. “The absence of federal transparency rules stands in stark contrast with traditional TV broadcasters, such as ABC, CBS, Fox and NBC, which for decades have been required to maintain limited public files about political ads.”

Maria Bustillos, CJR’s public editor for MSNBC, describes the comic-book world of political journalism as portrayed on network TV. “Heroes and villains make for entertaining and digestible television; they simplify a complicated world, and make it less frightening,” she writes. “The reduction of political actors to stick figures in a story of Good vs. Evil is a key part of what makes cable news tick.” Author Neil Postman’s contention that television has turned our culture “into one vast arena for show business” is truer than ever, Bustillos argues.

Peter Winter, a long-time media industry executive, writes about the lessons that can be learned from the recent bankruptcy filing of McClatchy, the newspaper chain. “In business, every threat masks an opportunity that should have been obvious. The Internet offered newspaper companies the rare prospect of product reinvention and economic revival,” Winter says. “But all they could see was a chance to throw away the ink and paper and sell the trucks, the same product delivered at less cost and sold as if the age of targeted and measurable advertising had never arrived.”

In the weeks after the 2016 presidential election, Facebook found dozens of pages that had peddled false news reports ahead of Donald Trump’s surprise victory, according to a report from the Washington Post. Nearly all were based overseas, had financial motives and displayed a clear rightward bent. But in a meeting to decide what to do about it, Joel Kaplan — a former official in the George W. Bush White House who was the head of Facebook’s Washington office — argued that Facebook couldn’t take all the pages down because doing so would “disproportionately affect conservatives,” who didn’t see the material as fake news.

Ross Barkan writes for CJR about what it was like dealing with Mike Bloomberg’s media-relations staff when Bloomberg was mayor of New York City. “Bloomberg’s press office knew that befriending reporters, or creating the appearance of camaraderie, was crucial to the mission. Off-the-record chats were frequent. So were after-work beers,” he recalls. “Emails were always answered. As a young reporter at the bottom of the pecking order, I couldn’t claim to belong to the inner ring of these reporter-staff relationships. But I could still feel, in some way, that I knew Bloomberg’s team.”

And the New York Times writes about how the new owners of the Big Bend Sentinel in Marfa, Texas, bought the building next door and turned it into a bar and event space attached to the newsroom. Revenues from the space help subsidize the Sentinel’s journalism, according to Maisie Crow and Max Kabat, two transplants from New York who bought the paper last year.

Jigsaw, a unit of Google previously known as Google Ideas, recently launched a digital magazine called The Current, which aims to “explore today’s digital threats and solutions.” There isn’t much exploring to be found in the inaugural edition, however. It’s mostly a cursory overview of disinformation, alongside brief descriptions of some tools that Google has used to combat the problem, gussied up with a coat of digital paint. There are two contemporary art pieces that seem only loosely relevant. Plus, an interactive map. It’s a magazine best not read too closely. But I did anyway.

It’s unclear why Jigsaw decided to publish The Current now, but it’s probably not a coincidence that Google—and its parent company, Alphabet—is under pressure from legislators in the US and Europe to take action against misinformation. Founded in 2010 and run by Jared Cohen, a former adviser to Hillary Clinton, Jigsaw says its mandate is to “forecast and confront emerging threats, creating future-defining research and technology to keep our world safer.” The reality is not as bright. Last summer, Vice described Jigsaw as “a toxic mess”; a dozen current and former staffers complained of an environment of mismanagement and poor leadership in an organization that, “despite the breathless headlines it has garnered, has done little to actually make the internet any better.” In one case in 2018, Jigsaw set up a fake political activism site—putting political misinformation out into the world—and then hired a Russian troll factory to attack it.

The Current looks nice, at least. With a monochromatic color scheme, it resembles a high-end-furniture catalog. It’s also interactive: when a user hovers over text, the mouse arrow turns into a Magic Marker icon and a pop-up window encourages readers to send in comments.
But if you click on “send a message,” you see a small box with three choices: “Agree,” “Disagree,” and “Want to know more.” Your ability to weigh in, it turns out, is limited to one of three pre-programmed responses. The text of The Current’s “articles” is organized into snippets not much longer than Netflix promotional descriptions, with links inviting you to “Dive Deeper.” Click a first link, and you go to a page called “The Problem,” which explains, for instance, that disinformation campaigns are “professional and coordinated—not unlike marketing campaigns.” You don’t say.

A section called Tactics has a series of graphics representing different approaches to spreading disinformation, including “brigading” (an online harassment tactic in which a group launches a coordinated attack on an individual), botnets (coordinated groups of automated accounts), and hacking. Hovering over them brings up explanations so brief as to almost be wrong—for “sock puppets,” the entry says only “online accounts run by someone masquerading as someone else.” The “Channels” section has four subsections, including manipulated images, memes, and viral messages. The “Meme” section says that Russian trolls tried to use memes to influence energy markets by protesting a pipeline, a single (bad) example from a vast category of behavior. The audience for this information would presumably be someone who has literally never heard the term “meme.”

To show that at least some of the magazine is based on original Jigsaw research, a section called “Outcomes” includes a quote from a “pseudonymous white nationalist Twitter and Gab user banned from both platforms multiple times for disinformation and trolling, whom Jigsaw interviewed.” His message? That engagement is an important indicator of whether your campaign is working. Hardly an earth-shattering revelation.
The final section of The Current is a “disinformation visualizer” that links to campaigns identified by the Atlantic Council’s Digital Forensic Research Lab, a nonprofit that studies disinformation. The fine print notes that Alphabet “does not endorse these research findings or their characterization of disinformation campaigns.” But if Google doesn’t endorse them, why are they featured in its magazine? Each is a couple of paragraphs at most—outlining a disinformation campaign that was run in Ukraine by Russian trolls, for example—with text boxes noting that the information comes largely from public news reports.

In all, The Current has the feeling of something Google’s marketing department cooked up in a hurry. If it were a presentation for ninth-grade civics class, it would get high marks. But for something produced by a $900 billion company that purports to have high-minded goals, it’s pretty weak tea. It deserves a C+ at best. Here’s more on Google and disinformation:


Mathew Ingram is CJR’s chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.