The U.S. Department of Defence has turned to well-intentioned hackers and independent security researchers to help the department find software bugs and vulnerabilities in its computer systems.

But in Canada, the government appears to still have no formal policy or public guidelines, which makes it difficult for those who do find flaws to know what to do, or how the government might respond.

"There's no formal process," says Imran Ahmad, a partner at the law firm Miller Thomson who works with clients on cybersecurity-related issues. In the absence of such a process, he says, those who find flaws "just don't know how the government's going to react, and they just want to protect themselves."

"My advice to anyone who finds a flaw in a government website at this time would be to forget they ever saw it," wrote web developer and security researcher Kevin McArthur in an email.

In the past, companies and governments often threatened litigation against security researchers and coders who found and published details about software vulnerabilities, prompting the adoption of an informal process called "responsible disclosure."

In practice, it means researchers will typically notify a company in private about a bug and give that company time to fix the flaw before disclosing its existence, in part or in full, to the wider public.

In recent years, U.S. technology companies such as Facebook, Google and Microsoft have offered financial incentives and bragging rights for the responsible disclosure of bugs through programs called bug bounties — welcoming, rather than fighting, the efforts of outside researchers who manage to penetrate their systems with the goal of making them safer.

The companies typically publish guidelines for how far researchers can go in their search for flaws, outlining specifically what is allowed and providing assurance that a researcher won't be labelled a criminal for finding a serious flaw.

U.S. Secretary of Defence Ash Carter wrote in a Medium post that his department's new vulnerability disclosure policy would allow it to 'find and fix issues faster.' (Jonathan Ernst/Reuters)

Last Monday, the U.S. DoD announced a vulnerability disclosure policy, following a pilot project called "Hack the Pentagon" it organized earlier this year.

"For the first time, anyone who identifies a security issue on a DoD website will have clear guidance on how to disclose that vulnerability in a safe, secure and legal way," Secretary of Defence Ash Carter wrote.

CBC News asked seven federal government departments and agencies — Finance; National Defence; Justice; Innovation, Science and Economic Development; Public Safety; Shared Services Canada; and the Canada Revenue Agency — questions about their policies, but did not receive a response.

"There are a lot of good people who are better at coding, who are better at hacking, who are better at a lot of things than government resources allow them to be," said Ahmad of the U.S., where the DoD's attitude has changed. "So, where in the past, you may want to jail someone for something like this, I think they're now moving towards … bringing them into the fold, and even bringing them onto the team, instead of coming down hard on them."

Chilling effects

In 2014, a serious bug was discovered in OpenSSL, the encryption software that underpinned hundreds of thousands of trusted websites. It was called Heartbleed, and researchers wasted little time determining which sites and services were vulnerable so that the hole could be patched.

Justin Bull — a software engineer with the financial tech startup Wealthsimple and a security enthusiast in his spare time — noticed that software used by the Canada Revenue Agency was also vulnerable. He says he spent about two hours attempting to contact various federal government agencies, finally leaving both an encrypted email and a voice mail with the Canadian Cyber Incident Response Centre, or CCIRC.

The CCIRC operates within Public Safety Canada, and is the government's primary point of contact for tips on cyberattacks and other computer-related incidents, but the Centre's website does not specifically address the discovery and disclosure of software flaws.

"I never ever ever got a response," Bull said. "So when I was hearing that the RCMP was investigating … I'm like 'Holy shit, is my door going to get kicked down?' Because I'm trying to tell them here, and I don't know what's going to happen."

"You don't want to anger the beast," he added. "I understand that chilling effect. The messenger always gets shot."

Similarly, McArthur cited the fallout from the RCMP's case against Stephen Solis-Reyes as one of the reasons he no longer discloses bugs to the Canadian government.

Like Bull, Solis-Reyes discovered that the CRA's website was vulnerable to Heartbleed, but he accessed 900 social insurance numbers in the process. Solis-Reyes — who was 19 and a student at Western University — was investigated by the RCMP for the breach, accused of being a terrorist by his interrogators, and pleaded guilty in May to four charges.

Solis-Reyes told the court that he had no malicious intent, and only intended to test the bug's severity. His lawyer argued that he did the country a service by exposing the flaw.

"If you don't provide proof of actual vulnerability exploitation, then the problem is simply ignored and downplayed as theoretical," McArthur wrote of his own experiences bringing bugs to the government's attention, experiences that have discouraged him from disclosing more.

"On the other hand, if you provide the government the required proof, like Solis-Reyes did by demonstrating the Heartbleed vulnerability at CRA, then they classify you as an evil hacker and you're heading for the police cells."