Speaking at open networking event ONUG in New York City, Oct. 23, 2018, from left to right: Neal Secher, senior vice president and head of networks and data center modernization at State Street Corp.; Viraj Parekh, director, product and new business initiatives, managed and hybrid solutions at Verizon; Bob Friday, co-founder and chief technology officer at AI-driven wireless network company Mist Systems; Prakash Seshadri, software engineering director, Juniper Networks; Mike Albano, Wi-Fi engineer, Google. Photo: Sara Castellanos / The Wall Street Journal

NEW YORK — Artificial intelligence, among its many roles in the enterprise, has the potential to wholly transform a large company’s information technology infrastructure, from automating mundane and repetitive IT tasks to predicting capacity needs and outages to identifying security threats.

But for those AI algorithms to work properly, there needs to be a standardized way of formatting the data that makes up an enterprise’s back-end infrastructure, IT leaders said Tuesday at a conference hosted by open networking group ONUG.

“AI starts with data, and if the data is lousy, you’re not going to make any great AI,” said Bob Friday, co-founder and chief technology officer at AI-driven wireless network company Mist Systems, speaking on a panel about AI in IT.

The data coming out of the firewalls, routers, load balancers and other devices that applications depend on varies by equipment vendor. But enterprises can get the most out of AI if the volumes of data extracted from the various elements of a network are formatted the same way regardless of origin, IT leaders said.
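To make the idea concrete, here is a minimal sketch of what that kind of standardization might look like in practice: mapping vendor-specific telemetry records into one common schema before they reach an AI model. The vendor names, field names and record formats below are illustrative assumptions, not any real vendor’s output.

```python
# Hypothetical sketch: normalize telemetry records from two fictional
# vendors into a single shared schema so downstream AI models see
# uniform fields regardless of which device produced the data.

def normalize(record: dict, vendor: str) -> dict:
    """Map a vendor-specific record to a common schema."""
    if vendor == "vendor_a":
        return {
            "device": record["hostname"],
            "metric": record["counter_name"],
            "value": float(record["val"]),
            "timestamp": record["ts"],
        }
    if vendor == "vendor_b":
        return {
            "device": record["device_id"],
            "metric": record["metric"],
            "value": float(record["reading"]),
            "timestamp": record["time"],
        }
    raise ValueError(f"unknown vendor: {vendor}")

# Two records describing the same kind of measurement, shaped differently.
a = {"hostname": "fw-01", "counter_name": "cpu_util",
     "val": "87.5", "ts": "2018-10-23T14:00:00Z"}
b = {"device_id": "rtr-07", "metric": "cpu_util",
     "reading": 42.0, "time": "2018-10-23T14:00:00Z"}

records = [normalize(a, "vendor_a"), normalize(b, "vendor_b")]
# Every record now carries the same keys, whatever its origin.
print(records[0].keys() == records[1].keys())
```

In real deployments this role is played by standardized data models rather than per-vendor glue code, which is the gap the panelists were describing.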

Data standardization could be a precursor to large AI-based projects in IT infrastructure, a conversation that has been going on in the IT community for some time.

“Two years ago we were asking ourselves will we ever get to a standardized place … it sounds like that’s still not settled,” said Neal Secher, senior vice president and head of networks and data center modernization at State Street Corp., at the panel event.

Still, AI already is creeping into back-end IT systems. Mist, for example, is working to build an AI system capable of answering questions usually asked of network domain experts. Its AI assistant can currently answer many of the company’s own IT support calls, Mr. Friday said.

Bank of New York Mellon Corp. last year said it was building a voice-controlled artificial intelligence platform that could help the firm’s IT staff manage enterprise storage.

Eventually, AI could be useful in identifying security threats and prioritizing the ones that need to be addressed quickly, said Nick Lippis, co-founder and co-chairman of ONUG.

“Machine learning and AI is systemic in everybody’s thinking. It’s real,” he said.