Google and health care provider organization Ascension have publicly confirmed a recent report that the two companies have embarked on a massive initiative to aggregate the data of roughly 50 million patients and store it on the cloud.

The companies say it will improve patient care and administration, but the strategy has also sparked concern among some consumer advocates, cybersecurity experts and, reportedly, some Ascension employees, especially because neither patients nor doctors had previously been notified of the data-sharing arrangement.

Dubbed Project Nightingale, the partnership allows Google to collect troves of sensitive information from Ascension’s roughly 2,600 hospitals, doctors’ offices and care facilities, the Wall Street Journal reported on Monday (with a follow-up article today). Shared data included personally identifying information, as well as lab results, diagnoses and hospitalization records. Reportedly, over 150 Google employees have access to this data, including members of the Google Brain deep learning artificial intelligence research team.

Headquartered in St. Louis, Ascension is a faith-based organization that runs the largest non-profit health system in the U.S. and the second-largest U.S. health system overall.

In a blog post on Nov. 11, Tariq Shaukat, president of industry products and solutions at Google Cloud, listed three key facets of the partnership: migrating Ascension’s infrastructure to the cloud, providing G Suite productivity tools to Ascension employees and providing tools to enhance clinical quality. Google had previously alluded to a partnership with Ascension in a July earnings call, when it noted that health care organizations including Ascension are leveraging Google Cloud’s artificial intelligence and machine learning solution to improve health care. Both Google and Ascension also made similar points in a joint press release.

There is historical precedent for the notion that big data and AI can improve the practice of medicine. For instance, IBM trained its AI engine “Watson” in medicine on the premise that it could one day parse millions of documents that medical professionals would never have time to review themselves, in order to accurately diagnose patients and recommend customized treatments based on their medical profiles and symptoms. According to the WSJ, Google officials believe their AI can achieve similar objectives, including using a patient’s history to predict and map the outcome of certain procedures or medications that might be used to treat a condition.

Reportedly, under the terms of HIPAA, Google and Ascension were not required to disclose the third-party data-sharing arrangement to patients because the purpose of collecting the data was to help the health care provider better execute its health care functions. Still, some experts and advocates worry about the privacy implications.

“GDPR states that ‘the processing of personal data should be designed to serve mankind…’ Project Nightingale would seem to be in keeping with that lofty directive,” said Dov Goldman, director of risk and compliance at Panorays. “The armies of regulators, legislators and public interest groups scrutinizing Nightingale have thus far reported nothing illegal about the project. Nevertheless, we should be concerned… Only airtight privacy and information security controls will ensure that Nightingale data is truly safe within Google Cloud and used only for the stated purposes.

“It’s good business for Ascension Health to treat Google as a classic third party, and to rigorously assess their privacy and cybersecurity policies and procedures,” Goldman continued. “Ascension must monitor any public-facing Nightingale websites and periodically retest Google’s internal controls for this project. With these foundational best practices in place, Ascension will protect their patients’ privacy and safeguard their reputation as a responsible steward of consumers’ most sensitive data.”

Tim Erlin, VP of product management and strategy at Tripwire, sees both sides of the issue. “Google is a company that’s fundamentally built on data, and healthcare is big business, so it’s hard not to see how this project makes sense.” On the other hand, “There’s no doubt that bigger repositories of sensitive data make bigger targets for attackers, so consumers have every right to be concerned about this move. As with all data-driven efforts that require personal data to work, consumers have to weigh the benefits against the risks.”

And at the far end of the spectrum is Colin Bastable, CEO of Lucy Security, who has dubbed the initiative Project Nightmare. “How can Ascension ensure that people employed by a third party that is built on exploiting personal data will adhere to Ascension’s data policies?”

Last January, Google was fined 50 million euros by a French regulator under the terms of GDPR for inadequately disclosing how the data it collects is used for targeted advertising. And in September, the Federal Trade Commission hit Google and its subsidiary YouTube with $170 million in fines for allegedly using cookies to harvest personal data from minors without parental consent and then serving behavioral ads based on that information.

Google is also under scrutiny for its announcement last week that it would acquire Fitbit for $2.1 billion, in the process getting its hands on even more consumer data it could potentially monetize.

To address privacy concerns, the companies stated in their joint release that Project Nightingale would be “HIPAA compliant and underpinned by a robust data security and protection effort and adherence to Ascension’s strict requirements for data handling.”