It's the "ultra-secure smartphone" claim that Turing chief executive Steve Chao desperately tried to claw back.

"We're a fashion technology company," said Chao on the phone a few weeks ago. "Seldom do we get people talking about security. I wouldn't brand Turing Phone as a 'secure' phone... it's more a fashion tech phone," he said.

It was a swift, unexpected turnaround from what the company touts as hacker-resistant and "ultra secure". Chao didn't deny the phone has "groundbreaking security", but his backtrack raised more questions than he had answers for. The long-awaited Turing Phone was first pitched as an unbreakable, security-heavy smartphone able to withstand the worst of malware, hackers, and nation-state attackers.

But that illusion quickly unraveled.

We got our hands on the long-awaited smartphone, dogged by delays and setbacks, in part because of a switch from Android to the lesser-known Sailfish OS. Yet after a close examination, it is just another device in a long line of "secure" smartphones from a company nobody has heard of, touting theoretical security and unproven privacy.

The phone's flagship feature is a hardware encryption chip, dubbed the Turing Imitation Key, which encrypts data on the phone and lets the owner communicate securely through end-to-end encryption, said Chao.

"When you initiate a communication, the other user's private key is generated by the chip," he said. That means every email, text message, and VoIP call to another Turing Phone will be encrypted, without having to rely on a third-party key server. If you want to communicate with someone who doesn't have a Turing Phone, you have to rely on a third-party app.
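Turing hasn't published its algorithm, so the details are anyone's guess. But the property Chao describes -- two phones agreeing on encryption keys without a third-party key server -- is what standard, public key-agreement schemes like Diffie-Hellman already provide. A minimal sketch, using toy-sized parameters purely for illustration (real systems use vetted 2048-bit+ groups or elliptic curves, and this is emphatically not Turing's scheme):

```python
import hashlib
import secrets

# Toy-sized Diffie-Hellman parameters, for illustration only.
P = 0xFFFFFFFB  # a small prime modulus (2**32 - 5); real groups are far larger
G = 5           # generator

def make_keypair():
    priv = secrets.randbelow(P - 2) + 1  # private key never leaves the device
    pub = pow(G, priv, P)                # public value can be sent in the clear
    return priv, pub

def shared_key(my_priv, their_pub):
    # Both sides compute the same value, g^(ab) mod p, from what they hold;
    # no server ever sees or stores a key.
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

alice_priv, alice_pub = make_keypair()
bob_priv, bob_pub = make_keypair()

# Each phone derives the identical session key from the other's public value.
assert shared_key(alice_priv, bob_pub) == shared_key(bob_priv, alice_pub)
```

The point of schemes like this is that their security rests on well-studied math, not on keeping the algorithm itself secret -- which is exactly the property a proprietary, closed design gives up.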

Security going south?

There are a few things about this "secure" smartphone that don't add up.

Chao said the cryptography used in the phone's end-to-end encryption is semi-proprietary. "It's our own algorithm," said Chao. Worse, the encryption is closed-source, so it can't be inspected -- though Chao said that would change down the line. He said the cryptography had been "inspected by experts", but he declined to name them or say what conclusions they reached, making it impossible to verify the integrity of the encryption.

Ask anyone in security about "proprietary encryption", and they'll tell you it's an immediate red flag. The most trusted algorithms, such as AES and RSA, have survived decades of public scrutiny; a new, secret algorithm has had none.

And "closed-source" is another red flag: it makes it impossible to know how good the code is, or whether any backdoors were added along the way. Without the code open to scrutiny by the community, we have no basis for trusting it.

Justin Troutman, an independent cryptographer, told me he had concerns about the company's security approach.

"I remember taking a look at their former QSAlpha Quasar device, and while I generally like the software and hardware approach of securing mobile devices, three fundamental problems remain, just as they did back then," he said.

"Firstly, they're using something proprietary," he said, describing the cryptography. "We can't independently and openly inspect [the crypto]," and, "we have no knowledge of who [the company is] and their ability to design cryptographic primitives".

But it gets worse.

Five million keys

Chao said that the private key, which is the basis for scrambling data on the phone, is created by a master private key. That master key, Chao said, generated five million keys -- far more than the company expects to ever need. The company has shipped over 1,000 devices as of July, out of 10,000 manufactured in the first batch. Once the keys were created, the company "made the decision to destroy" the master key, Chao said.
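The scheme Chao describes sounds like key derivation: each device key is computed from the master key plus something unique to the device. A minimal sketch of how that could work in principle, using standard HMAC-based derivation (Turing's actual construction is unpublished, and the names here are illustrative):

```python
import hmac
import hashlib

def derive_device_key(master_key: bytes, device_id: int) -> bytes:
    # Each device key is a deterministic function of the master key and a
    # unique device identifier. That determinism is exactly why destroying
    # the master key matters: anyone who still holds it can quietly
    # re-derive every one of the five million device keys.
    return hmac.new(master_key, f"device-{device_id}".encode(),
                    hashlib.sha256).digest()

master = b"illustrative master secret"
keys = [derive_device_key(master, i) for i in range(5)]  # in practice: 5,000,000

assert len(set(keys)) == len(keys)  # every device gets a distinct key
```

Which is the crux of the trust problem: destruction of the master key is unobservable from the outside, so the whole scheme reduces to taking the company's word for it.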

I asked if the company kept the key. "We don't have access to the master private key," he said. "Not even we have access to the user's data," which is stored in its datacenter in Finland, where the company is now headquartered.

"How do we know you destroyed the key?" I asked.

"Well, there's no way to guarantee that," he said. "Although, we say so. But knowing that we're a private business, even if we go public one day, we're still a business -- not a government agency," said Chao.

"That we know of," I said, half-joking.

Troutman also expressed concerns that users have to take "their word that this master key is being destroyed".

It turns out these aren't even new complaints. Cast your mind back three years: the Turing Phone began life as the futuristic Quasar IV. The phone showed promise and seemed a sound concept, drawing comparisons to BlackBerry devices. But after a detailed analysis, Ars Technica dismissed it as "snake oil" in a 2013 review.

The phone itself has promise. But the core of the device is built on sketchy security and poorly thought-out principles. The company didn't learn from its mistakes the first time, and that's troubling if the phone is effectively a repackaged and rebranded device with "ultra secure" slapped on the side.

It's tough to reserve judgment when a company promises state-of-the-art and custom security at such a high price. But for anyone looking for an all-in-one security solution, there are far better alternatives that are tried and tested -- and a lot cheaper.