When the Russian and Saudi teams squared off in a World Cup match on June 14, many fans were treated to an enthralling game of football; but for a minority of fans with a visual disability, the match was more confusing than exciting.

You see, the Russian team wears red jerseys and the Saudi team wears green jerseys, and red/green color-blindness (clinically, the protan and deutan color vision deficiencies) is the most common form of color-blindness, a hereditary condition that affects millions. For these people, the Saudi-Russia match was the red/green team versus the red/green team in a fight to the finish.

The good news is that color-blindness is no match for digital video analysis. Simple apps like DanKam can shift the colors in any video on your device, replacing the colors you can't see with the colors you can. For people with color-blindness, it's a profound and moving experience.
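The color-shifting these apps perform can be sketched with a standard technique called daltonization: simulate what a red/green color-blind viewer loses, then redistribute that lost signal into channels they can still distinguish. The sketch below is a minimal, hypothetical illustration of the idea, not DanKam's actual algorithm; the transform matrices are common published approximations.

```python
import numpy as np

# Linear-RGB -> LMS cone-response transform (a common approximation).
RGB_TO_LMS = np.array([
    [0.31399022, 0.63951294, 0.04649755],
    [0.15537241, 0.75789446, 0.08670142],
    [0.01775239, 0.10944209, 0.87256922],
])
LMS_TO_RGB = np.linalg.inv(RGB_TO_LMS)

# Deuteranopia simulation: the missing M-cone response is
# reconstructed from the L and S responses.
DEUTAN_SIM = np.array([
    [1.0,       0.0, 0.0],
    [0.9513092, 0.0, 0.04866992],
    [0.0,       0.0, 1.0],
])

# Push the lost red/green signal into blue/yellow contrast.
ERROR_TO_MOD = np.array([
    [0.0, 0.0, 0.0],
    [0.7, 1.0, 0.0],
    [0.7, 0.0, 1.0],
])

def daltonize(rgb):
    """Shift colors so red/green contrast survives for deutan viewers.

    rgb: float array of shape (..., 3), values in [0, 1].
    """
    lms = rgb @ RGB_TO_LMS.T
    sim = (lms @ DEUTAN_SIM.T) @ LMS_TO_RGB.T  # what a deutan viewer sees
    error = rgb - sim                          # the information they lose
    return np.clip(rgb + error @ ERROR_TO_MOD.T, 0.0, 1.0)

# Example: pure red and pure green jerseys, the confusable pair.
red = np.array([1.0, 0.0, 0.0])
green = np.array([0.0, 1.0, 0.0])
shifted = daltonize(np.stack([red, green]))
```

Run per-frame over a decoded video stream, this is the whole trick: the app needs nothing more than read access to the pixels, which is exactly what EME-style restrictions deny it.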

The bad news is that technologies designed to prevent you from making unauthorized uses of videos can't discriminate between uses that break the law (like copyright infringement) and ones that accomplish socially beneficial and legitimate ends like compensating for color-blindness.

Less than a year ago, the World Wide Web Consortium (W3C) published its controversial "Encrypted Media Extensions" (EME) for video, which indiscriminately block any unauthorized alterations to videos, including color-shifting. During the long and often acrimonious fight over EME, EFF proposed a covenant for W3C members: a promise not to pursue legal action against people who bypassed EME to adapt videos for people with disabilities. The major rightsholder and technology companies rejected the proposal, insisting that they and they alone should be the arbiters of how people with disabilities could use their products.

We (genuinely) hate to say we told them so. Seriously. Because this is just the start of the ways that EME -- which affects about 3 billion web users -- will interfere with accessibility. Existing technologies that protect people with photosensitive epilepsy from strobe effects in videos are already blocked by EME. As machine learning advances, EME will also block adaptive technologies such as automated captioning and descriptive tracks.

We are suing the US government to overturn Section 1201 of the Digital Millennium Copyright Act, the law that bans bypassing technologies like EME, as part of an overall strategy to end the practice of designing computers to control their owners (rather than the other way around).

Technologies like EME, designed to stop users from adapting technologies to their needs, have found their way into everything from automobiles to voting machines to medical implants. We've asked the Copyright Office to protect the public from the abuse of these technologies, and we continue to back state Right to Repair bills that protect your right to choose independent service centers to fix your stuff, even if it means bypassing a manufacturer's locks.

But while these efforts are making their slow progress through the courts and regulators, it's on the shoulders of technologists to learn the lesson of EME: contributing to technologies that stop the public from adapting or auditing their tools is a profoundly unethical act, one that widens the gap between people with disabilities and the (temporarily) abled people who don't (yet) need to make those adaptations.