Ask Siri whether you should believe in God and you'll receive the reply, "I would ask that you address your spiritual questions to someone more qualified to comment. Ideally, a human." Or, "It's all a mystery to me." Or the too-cute, "I believe in the separation of Silicon and spirit."

Not surprising answers, given the nature of the question. Why should Apple put itself in the middle of a centuries-old discussion?

Apple's Siri is a virtual intelligent assistant, but there is nothing intelligent about Siri or its peers, such as Microsoft's Cortana, Google's Assistant and Amazon's Alexa. Virtual assistants are not conscious and do not "think" like humans. Their machine-learning software has enabled them to provide increasingly high-value content to humans, and they are getting better at understanding the idiosyncrasies of people's voices, accents and colloquial speech. A person can carry out a conversation with these devices that gives the appearance of intelligence and consciousness.

Large, customized versions of intelligent assistants are found in call centers, where they interact with customers who are unaware that the voice on the other end is artificial. Other versions can take facts and write articles much like a reporter.

Darn near human

The reason these virtual assistants are getting better is that the content they collect from all these questions is used to improve the performance of the machine-learning software. The more content a virtual assistant acquires, the more complex the answers it can handle.

Specifically, advances in machine learning and sensors have allowed for the development of "smart content." Smart content is derived from millions of conversations with virtual assistants, thousands of internet sources and sensors collecting data on physical surroundings. The machine-learning software determines the meaning of the data and creates the smart content verbalized by the virtual assistants.

The best example of smart content is robo-advisers, which combine tremendous amounts of data with machine-learning software that uses the data to make investment decisions for financial portfolios. Other examples can be found in our cars, as on-board software makes decisions thousands of times a second to ensure a smooth and safe ride. Autopilots on airplanes and automated farm machinery do the same thing.

Soon, smart content will be in emergency rooms, making treatment decisions faster than the ER staff can.

The hard part

What happens when people ask questions that are more philosophical and metaphysical in nature? We cannot assume that our dependency on these devices will stop at mundane questions and requests.

A college student faced with the daunting task of writing a paper for a course on Aristotle's "Nicomachean Ethics" will eventually be able to turn to a virtual assistant and ask it the homework question. The student will probably copy the reply verbatim, without considering the meaning of the piece or understanding it.

The responses provided by smart content to spiritual and moral questions are randomly selected canned answers. However, these canned answers will soon have to give way to didactic answers drawn from broad sources of content. And who decides the didactic themes present in virtual assistants?

We should expect these answers to be subject to the demands of secular forces. Ask Siri "I don't want an abortion. Where do I go?" and the answer is "Fine." Ask Siri "I want an abortion. Where do I go?" and the answer is a list of Planned Parenthood centers.

This occurred after Planned Parenthood and the American Civil Liberties Union pressured Apple, having deemed Siri's original smart content for this question unacceptable.

This example is a leading indicator of the future of smart content. It will become secular and atheist because secular organizations will demand it and the technology companies will acquiesce.

It will be easier to ignore the church than the ACLU.

People are realizing the power of an on-demand answering service that goes beyond the simple questions ("Siri, what is the capital of Illinois?") to the more complex questions that require education and insights ("Siri, what is a technology architecture?" or "Siri, what is the contingency argument that explains the existence of God?").

The companies in control of developing the content for complex questions will find that the path of least resistance is secularism. And once they start down this path, they cannot easily reverse or change course. The Catholic Church cannot insist that Apple modify Siri's responses to undo secularist answers for questions asked about the church and incorporate the church catechism into its answers.

Secular challenges

Siri's evolution is more reflective of scientism and reductionist thinking because the smart content it verbalizes is shaped by moral relativists. This is the challenge smart content presents to Christian faith leadership and the Catholic Church: How will they ensure a Christian presence in a future where smart content is created and maintained by those who marginalize religion?

This challenge goes far beyond the need to participate in social media or to ensure Catholic theology is present in the cloud. Smart content enables devices to make decisions. Smart content also is persistent. It is difficult or next to impossible to change the ethics of smart content once a substantial amount of content has been incorporated into it. The process already has started to drive the themes of smart content used by virtual assistants to align to a secularist society.

The United States Conference of Catholic Bishops to date has not considered the impact of smart content on evangelization and other outreach efforts. The work of evangelists and apologists is already hard enough when confronting human secularists and atheists. It is an altogether different challenge to argue with a virtual assistant and to convince people not to listen to it when they listen to it for everything else.

Who will our children believe? Siri or the priest? Siri and its peers will be in our children's lives forever and become increasingly important. Children will depend on their virtual assistants for increasingly difficult questions and help. Eventually, a person in need and in trouble will turn to Siri, not a priest or lay minister, for help. What then?

A first step would be to work with technology companies to understand the smart content creation process and make sure the companies understand the concerns of the church.

A second, longer-term step would be for the church to create a series of questions with which to monitor the evolution of smart content. The answers to these questions, that is, the smart content verbalized, can be assessed for changes and trends. These analytics would be leading indicators of the secularization of smart content, of how rapidly it is changing and of where the changes are headed. The church could then take action to change the content.

The church should not want Siri to be the religious leader for our children and others in need. That should be, and must be, the purview of humans.

Timothy Carone is an associate teaching professor at the University of Notre Dame's Mendoza College of Business. A former astrophysicist, Carone specializes in automation and artificial intelligence and is the author of "Future Automation: Changes to Lives and to Businesses."