There is a prevailing yet false narrative circulating the internet regarding both the recent ruling against Google and the actions YouTube is currently taking with respect to children’s channels and the deplatforming of the commercially nonviable.

The latter is the product of YouTube themselves attempting to draw attention away from the substance of the ruling by playing off people’s ignorance of what is a nightmare of a law to read. To the average person, COPPA just involves children and content made for children. It is a misunderstanding YouTube themselves, in their support pages, are not looking to clarify.

They are portraying the issue using the safe harbor status and definitions proposed in the FTC’s 2019 review of the law, which includes amendments to the previously established rule. Question 25 is the relevant section applying to YouTube, and it is what many outlets are using to obfuscate the issue.

25. In some circumstances, operators of general audience platforms do not have COPPA liability for their collection of personal information from users of child-directed content on their platform uploaded by third parties, absent the platforms’ actual knowledge that the content is directed to children. Operators of such platforms therefore may have an incentive to avoid gaining actual knowledge of the presence of child-directed content on their platform. To encourage such platforms to take steps to identify and police child-directed content uploaded by others, should the Commission make modifications to the COPPA Rule? For example, should such platforms that identify and police child-directed content be able to rebut the presumption that all users of the child-directed third-party content are children thereby allowing the platform to treat under and over age 13 users differently? (11) Given that most users of a general audience platform are adults, there may be a greater likelihood that adults are viewing or interacting with child-directed content than on traditional child-directed sites. In considering this issue, the Commission specifically requests comment on the following:

a. Would allowing these types of general audience platforms to treat over and under age 13 users differently encourage them to take affirmative steps to identify child-directed content generated by third parties and treat it in accordance with COPPA?

b. Would allowing such a rebuttal of the presumption that all users are children in this context require a Rule change? If so, would such a Rule change be consistent with the Act?

c. If the Commission were to allow such a rebuttal of the presumption that all users of this content are children, what factors should it consider in determining whether the presumption has been rebutted? What methods could a general audience platform use to effectively rebut the presumption that all users of the third-party child-directed content are children?

d. Could a general audience platform hosting third-party, child-directed content effectively rebut this presumption by doing the following:

i. Taking measures reasonably calculated to identify child-directed content generated by third parties for commercial purposes;

ii. Permitting users that identify themselves through a neutral age gate to create an account on the platform;

iii. Taking measures reasonably calculated, in light of available technology, to ensure that if personal information is to be collected from a user accessing child-directed content, the user is the person who created an account and identified as being 13 or older, and not a child in the household, such as through periodic authentication; and

iv. Providing clear and conspicuous notice at the time the user is interacting with child-directed content of its information collection practices, and separately communicating those information practices through out-of-band notices, such as through online contact information provided as part of the account creation process?

The Commission seeks comment on whether these measures, or any others, could effectively rebut the presumption that all users of this child-directed content are children, and also on the ways in which an operator could implement these measures.

e. What, if any, risk is presented by permitting general audience sites to rebut the presumption that all users of child-directed content are children? Would it prove challenging to reliably distinguish between a parent and a child who accesses content while logged in to a parent’s account? In considering whether to permit general audience sites to rebut the presumption, should the Commission consider costs and benefits unrelated to privacy, such as whether children may be exposed to age-inappropriate content if they are treated as an adult?

There are two problems with the above in relation to the recent ruling. The first is that it was filed on July 24, 2019, and had no bearing on the trial, as it had not taken effect at the time of the proceedings, which concluded on September 4, 2019. The second and more important issue relates to the context of the case. To represent this document as somehow pivotal to the ruling and YouTube’s actions is disingenuous.

YouTube was not fined for running personalized advertisements on children’s content. They were fined, uncontested, for illegally collecting data on children.

Keep in mind YouTube is still in violation of COPPA through its user data collection methods. Simply blacklisting children’s content does not eliminate the children who may be watching other channels like PewDiePie or their favorite e-celebrity. To say nothing of the various adult animation channels that draw a sizable following among people under the age of 13, or of any information collected from adult accounts that are in fact being used by a child, which the incoming rules will also cover once they take effect.

Even with a zero-tolerance dragnet, YouTube would be unable to neutralize the problem of free-roaming under-13 accounts that do not follow the general behavioral patterns of their age group. This means YouTube will continue collecting personalized data on these individuals.

Some will call this a conspiracy theory, but even the mainstream media has acknowledged that, short of abandoning the internet entirely, you will be unable to avoid having your information collected by Google. Thus, if you are using YouTube, they are collecting data about you. An FTC document detailing Google’s collection methods also confirms this, so we are beyond the realm of dismissing whoever points it out as Alex Jones.

The fact of the matter is the ruling itself was about how they collected data and handled it in regards to privacy concerns under COPPA. Something they do regardless of what channel is being viewed.

2. The Complaint charges that Defendants violated the COPPA Rule and Section 5 of the FTC Act, 15 U.S.C. § 45, by failing to post a privacy policy on their online service providing clear, understandable, and complete notice of their information practices with respect to the Collection of Personal Information from Children, failing to provide direct notice to Parents of such information practices, and failing to Obtain Verifiable Parental Consent prior to Collecting, using, or Disclosing Personal Information from Children.

Further along, under its definitions, the ruling defines collection of information just as I have previously illustrated it.

D. “Collects” or “Collection” means the gathering of any Personal Information from a Child by any means, including but not limited to: 3. Passive tracking of a Child online.

The ruling then goes on to discuss the issue at hand. Their violation is in the handling: the data is not collected in accordance with the law, it is not handled in accordance with the law, and neither parents nor those from whom the information is being collected are presented with the notices the law requires them to be presented with.

K. “Obtaining Verifiable Parental Consent” means making any reasonable effort (taking into consideration available technology) to ensure that before Personal Information is Collected from a Child, a Parent of the Child: 1. Receives notice of the Operator’s Personal Information Collection, use, and Disclosure practices; and

2. Authorizes any Collection, use, and/or Disclosure of the Personal Information, using a method reasonably calculated, in light of available technology, to ensure that the Person providing consent is the Child’s Parent.

As you can probably surmise yourself, the issue Google faces has nothing to do with children’s channels, but everything to do with them illegally collecting data on and about children without the notification and consent of their parents. Getting rid of channels that appeal to children will do nothing to resolve this issue, beyond attempting to mitigate the volume of data collected on children.

Rather than simply adjust their data gathering methodology and systems to be in compliance with the law, the company is deflecting by making it appear as if it were another issue entirely.

The previous issue of pedophiles using the platform was brought to YouTube’s attention for years with no response or action taken on their part. It was only when a user made a video documenting the scope of the problem, and the methodology said pedophile networks were using to browse what amounted to softcore child porn, that the company took action.

Now there is a conflation between that incident and its blowback and the ruling the company just received. In my opinion, it is a conflation Google should be rather happy about, as it deflects attention from the reality that they are illegally collecting information on your children without your consent and are using that information for more than just advertisement.

YouTube’s current actions against children’s channels are merely a distraction from their true intention: the ability to remove any channels critical of the narrative by claiming they are not commercially viable, and then using their million-dollar legal teams to ensure you cannot fight it in court without bankrupting yourself in the process.

One cannot accuse Google directly of triggering the upcoming ad revenue issue on purpose, but one can aptly point out that they benefit from it in a way that incentivizes them not to course correct. If revenue generation is hard on YouTube, it strengthens the argument for deleting channels for being unviable.

This is theoretical, but based on their actions strangling channels’ views and unsubscribing individuals, it would then be an easy claim to make: because you are receiving no views and your subscriber counts are going down, you are no longer commercially viable. Whether they take that route or not, understand it is not hyperbole that they have an agenda to push forward with these actions.

Thanks to leaks from Veritas, we know Google’s goal is to prevent the reelection of Donald Trump, a report Google themselves was quick to censor.

All the steps they are taking are irrelevant to upholding the law, as safe harbor status already does not apply to the company because of the nature of their data collection. A nature that, according to the ruling, they do not contest is in violation of the law. This isn’t government overreach, as many are claiming. What this is is a greedy and frankly evil company getting exposed for harvesting data on your children and hoping you don’t look too deep. Too bad for them One Angry Gamer is already hated by the establishment and has nothing to lose by looking.