Statement: All accounts found to have or be soliciting child sex exploitation material during the course of this investigation were immediately reported to Twitter.

A new investigation by The Post Millennial reveals that the distribution of child pornography is happening in plain sight. While many believe the distribution of these illicit materials is relegated to the seedy underbelly of the internet, this belief is far from true.

On Twitter, pedophiles seeking to exchange child pornography and other child sex exploitation images are utilizing secret hashtags to signal their presence and interest to each other. The hashtags #megalinks and #megadump are littered with tweets from otherwise empty accounts offering or soliciting illegal material.

One account offered a “megalink” for “£2 to my cashapp,” with “underage” among the accompanying hashtags. The tweet included a screenshot showing a number of folders, one of which was titled “CP,” short for “child pornography,” and another titled “500+teenie videos.”

These accounts are numerous, and Twitter does little to crack down on them in spite of user reports.

Other users were more direct, asking specifically for “links” or “trades” of specific ages that interested them. One user requested 14-17 year olds.

Another sought to “trade links” for ages 12-17. The trade would take place on Snapchat, an otherwise innocuous platform.

Multiple users are even seeking child pornography that includes rape and necrophilia themes, outright asking for the grotesque content while using the “#young” hashtag.

The Post Millennial’s findings come amid increasing scrutiny of Twitter for politicizing its platform by routinely censoring conservative, gender-critical feminist, and even some anti-mainstream leftist accounts. Despite this crackdown on political speech, the social media platform has altered its terms of service to accommodate “minor-attracted persons.”

According to the most recent version of Twitter’s Child Sexual Exploitation Policy, “Discussions related to child sexual exploitation as a phenomenon or attraction towards minors are permitted.”

Twitter has experienced a surge in accounts attempting to normalize and gain acceptance for “minor-attracted people” (MAPs) in recent years, many of them operating openly and with impunity under Twitter’s Terms of Service.

Completely acceptable under the new guidelines are “artistic representations” of child pornography, many of which were found on accounts using the “minor-attracted persons” moniker in their usernames or bios. One such drawing depicted a small child being raped by an adult man.

Others posted “soft-core” images of actual children, dressed up and posed inappropriately. These images, primarily posted by foreign-language accounts, were immediately reported by The Post Millennial staff, and have been pixelated here to protect the identity of the victims.

Despite many “pro-MAP” accounts claiming they are “anti-contact” and simply striving for acceptance for immutable desires, MAPs are unabashed in expressing their disgusting desires towards children—all with the protection of Twitter’s terms of service.

Many of these accounts have gone unchecked and remain active, some for years. As a social media platform that claims to protect its users, and one that enjoys legal protection under Section 230 of the US Communications Decency Act, Twitter needs to answer for why it continues to allow accounts dedicated to child sexual predation to proliferate in its space.

The Post Millennial has contacted Twitter for comment on its tolerance of pedophiles on the platform, and on whether concrete steps will be taken to stop the free trade of child sex exploitation material via covert hashtags.

According to a Twitter spokesperson, “Twitter has zero-tolerance for any material that features or promotes child sexual exploitation. We partner with organizations around the globe in this area, including the National Center for Missing and Exploited Children. Our dedicated teams work to stay ahead of bad-faith actors and to ensure we’re doing everything we can to remove content, facilitate investigations, and protect minors from harm—both on and offline.”