In the Apple v. FBI standoff, techies are clearly in Apple's corner. But instead of taking sides, the technology community should work to change the entire debate.

Apple got itself into this mess by building insecure products. That's right, the iPhone isn't actually secure. If it were, Apple wouldn't be able to write any code that could help the FBI unlock the San Bernardino, Calif., shooter's iPhone 5c. The government's request would simply be a nonstarter.

In order to call a product or consumer device secure, even its maker shouldn't be able to break into it. And that means Apple, too.

By waging a legal battle against the FBI, Apple is trying to patch a technical security flaw with a legal defense. And if Apple loses, the FBI will score a victory in its war on encryption. Yet even if Apple wins, the public may ultimately lose.

FBI Director James Comey and the Justice Department are using the San Bernardino terrorist attack in their misguided quest to create some kind of legal access – or backdoor – into encrypted consumer technologies. If the court or Congress eventually goes along with the FBI or other national security officials calling for greater ability to decrypt consumer communications (and a court loss for the FBI may well spur Congress to act), it'll be a bad day for everyone's digital security.

But if you're against backdoors, that doesn't mean you should necessarily support Apple.

In fact, Apple has designed products so that backdoors are possible. On the iPhone, for instance, the software that safeguards the passcode input process can be modified via an authorized update from Apple – and that's a critical flaw.

The FBI wants Apple to write an update so that the iPhone won't erase data after 10 unsuccessful passcode guesses, there's no delay between guesses, and guesses can be entered through an input port rather than typed on the screen. That modification would allow the FBI to connect the San Bernardino iPhone to a computer that tries passcodes until it finds one that works. And given the limits of human memory, the possibilities aren't that numerous: a typical four-digit numeric passcode has only 10,000 combinations.
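To see why removing those software protections matters, consider a rough back-of-the-envelope sketch. The figure of roughly 80 milliseconds per guess is an assumption here, drawn from the hardware-entangled key-derivation delay Apple has described in its iOS security documentation; the exact rate on any given device may differ.

```python
# Illustrative sketch: how long an automated brute force takes once
# the erase-after-10-failures rule and inter-guess delays are removed,
# leaving only the (assumed) ~80 ms hardware-imposed delay per attempt.

SECONDS_PER_GUESS = 0.08  # assumed per-attempt key-derivation delay

def worst_case_hours(digits: int) -> float:
    """Worst-case time to try every numeric passcode of the given length."""
    return (10 ** digits) * SECONDS_PER_GUESS / 3600

for n in (4, 6, 10):
    print(f"{n}-digit passcode: up to {worst_case_hours(n):.1f} hours")
```

Under that assumption, every four-digit passcode can be tried in well under an hour, and even six digits fall within about a day; only a long, random passcode pushes the search into impractical territory.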

The courts should not force Apple's engineers to write this kind of program.

Even so, Apple should have designed its products so that its engineers would never be put in this position – and it should quickly make design changes so they won't be in the future.

In touting the security features of iOS 8, Apple claimed that it could not break into an iPhone even if it wanted to. We now know that this claim was untrue. Even if Apple wins its current legal fight over the San Bernardino iPhone, that won't stop other governments – with more oppressive methods than those available to the FBI – from forcing Apple to help them unlock other iPhones.

If there's any good news coming out of this standoff over consumer encryption, it's that Apple appears to be moving swiftly to correct its mistake. The next version of the iPhone will make the passcode protection mechanism impossible to change. Apple should roll out the iPhone 7 as soon as possible.

Anna Lysyanskaya is a professor of computer science at Brown University. Her research area is cryptography, especially privacy-preserving cryptographic protocols. Follow her on Twitter @AnnaLysyanskaya.