About six months ago, in the process of going all Linux, I was selling an old laptop of mine with an SSD and Windows 10 to a friend. I wanted to securely remove all my data without removing the Windows install, since I had lost the OEM key. I recalled that Windows 10 had a secure reset feature, described as being designed for easily removing all personal data from your computer before recycling or selling it. That sounded great, so I pressed the button to start it, and the ridiculousness of the last six months of my life began.

The reset went most of the way through (around 60%, if I recall correctly) and then gave a strange error about not being able to delete a file that wasn't there anymore. So I opened up Windows Explorer to poke around and found that a hidden folder had been created: C:\$SysReset. After reviewing system log files and experimenting on other computers, what appears to have happened is that the reset feature copied all program installs (including x64 and UWP programs, system logs, program logs, and what appeared to be encryption keys) to that folder, and then deleted them from there. More testing showed that the error consistently occurred on SSD-based Windows 10 installations (at least as far as I could test). While this may just seem like a bug, as Microsoft said, in practice it is a genuine security vulnerability:

1) Copying before deleting is comically bad for a tremendous number of reasons, and makes sensitive information easier to find and recover.

2) Most people don't know how to reset their computer without using this feature, and in some cases (e.g. Surface tablets) doing so any other way is physically impossible. Lacking resources or alternatives, people will often be forced to sell or dispose of their computers even in the presence of the error.

3) Recovery tools like Recuva showed no sign of a bit/block-level overwrite, and the system log files showed no indication that one was even attempted.
Given this, and the fact that Windows didn't complain about running the erase on an SSD, Windows 10 almost certainly doesn't even attempt a bit/block-level overwrite in this feature, and that's really bad. Also, given that some wonky error occurred with write permissions on an SSD, I wondered if it would fail without error on an HDD. Sure, it was a shot in the dark, but it definitely needed testing.

So I emailed the MSRC (Microsoft Security Response Center), told them all of this, and asked them to test it on an HDD (I only have SSD boot disks in my hardware). They provided a weird, grammatically awkward response indicating they had read nothing I said: since there was an error, it was only a bug, and I could check, via a link, whether Microsoft was accepting bug report suggestions for this product at this time. They declined to respond to further emails or to say anything about testing on an HDD.

So I managed to find a friend with an old, sacrificial Windows 10 PC with an HDD boot disk to test it on; lo and behold, it failed without error. Now we had something serious, and I emailed the MSRC again. They replied, again with very interesting grammar and sentence structure, simply saying they were confused: in my first contact with them I had said the reset was failing with error messages, and now I was saying it was failing without error messages, so what the hell kind of wizardry was I talking about? They again declined to respond to further emails.

I reached out to a few people I know who are affiliated with Microsoft, and none of them had any idea what to do. So, like any great cybersecurity problem, I posted about it on Reddit (r/microsoft), threatening to publish the vulnerability as a 0-day exploit if Microsoft wouldn't respond to my inquiries. Through this, I got in contact with an engineer in the Windows and Devices Group (who repeatedly requested to remain anonymous).
He sent it off internally, and I heard nothing meaningful back for a while. A little over two weeks later, I finally heard back from the Windows and Devices Group engineer (and no one else). Apparently, after discussion between the MSRC and the Windows and Devices Group's internal security team, they had decided to call it a UX bug and not a security vulnerability. They were planning to make it a very high priority bug fix, told me I was still eligible for consideration for a security vulnerability bounty for responsibly disclosing it, and informed me that I was supposed to have them review this before publicly releasing it. However, it of course wasn't a security vulnerability. The engineer also assured me that the copy-before-delete was "due to a good reason", though he didn't know what that reason was.

As of writing this article, the bug persists in the wild, to the best of my knowledge. I also find it genuinely bizarre and disturbing how hesitant the engineer I first contacted was to disclose any involvement with this, and how nervous he seemed about so many things. This "UX bug" is a truly ideal example of how not to handle security vulnerabilities, and I hope it can serve as a case study for future cybersecurity students on the exact opposite of best practices in dealing with vulnerability reports, and on what can happen when those practices aren't followed.
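For anyone wondering what the missing piece actually is: the core of a secure file wipe is a bit/block-level overwrite, where the old bytes are replaced on disk before the file is deleted. As a rough illustration of the concept (this is my own sketch, not anything Windows actually does), here's a minimal Python example that zeroes a file's contents in place before unlinking it. Note that even this is only meaningful on an HDD; on an SSD, wear-leveling means the old data can survive in flash cells the filesystem can no longer address, which is why proper SSD sanitization relies on the drive's built-in secure-erase command instead.

```python
import os

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file's bytes in place, then unlink it.

    Illustrative only: on SSDs, wear-leveling can leave stale copies
    of the data in flash cells this loop never touches.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(b"\x00" * size)       # replace every byte with zeros
            f.flush()
            os.fsync(f.fileno())          # force the zeros out to the device
    os.unlink(path)
```

Contrast that with what the reset feature appears to do: copy the data somewhere else, then delete both copies without overwriting either, leaving everything trivially recoverable.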

Microsoft, this is why I switched to Linux.