How CIOs set the pace of generative AI adoption


When it comes to enterprise generative AI plans, timing is everything.   

Jason Strle joined Discover Financial as its CIO a year ago, bringing nearly two decades of tech leadership experience in banking institutions. As Discover pilots generative AI, Strle characterizes the company’s strategy as aiming to strike the right cruising speed on the highway to implementation. 

“You’ve got to have a fundamental belief, as far as: are you going to be leaning towards parking tickets or speeding tickets?” Strle asked. “That’s got to be in conjunction with the rest of the executive committee and stakeholders that you’re accountable to.”

Discover is aiming for an appropriate balance supported by responsible practices and an understanding of risk appetite.

Moving too slowly can give competitors the chance to gain an advantage, and rapid deployment brings its own set of challenges. Most CIOs want to land somewhere in the middle, but the optimal adoption speed is one that keeps pace with organizational priorities.

When organizations adopt AI incorrectly, misuse can have a domino effect, from exacerbating security issues to introducing bias and harming brand reputation, Chris Novak, senior director of cybersecurity consulting at Verizon and advisory board member of the Cybersecurity and Infrastructure Security Agency, told CIO Dive. 

Enterprise technology vendors have spent billions of dollars building out technical infrastructure to support compute-intensive AI workloads, and they need enterprise buy-in for those investments to pay off. 

CIOs are well positioned to perform organizational readiness assessments, weighing infrastructure needs and workforce skill gaps. To adjust the pace of adoption, tech leaders are focusing on modernization, strengthening risk mitigation and slicing through buzzy hype.

Clearing the path to adoption

While most organizations are bullish on what AI can do for their operations and workflows long-term, they aren’t quite ready to move at full throttle today. 

More than 3 in 5 organizations say outdated infrastructure and data ecosystems stand in the way of generative AI plans, according to a Hitachi Vantara report published last week. Nearly three-quarters of IT decision-makers said their infrastructure required modernization before they could hit the gas on generative AI projects.

Enterprises that have already started their transformation have a leg up, even if AI isn’t the driving factor. Norwegian Cruise Line Holdings found itself aboard that boat as it wrapped up a 15-month AWS migration of its shoreside technology earlier this year, leading to improved performance and streamlined operational infrastructure. 

“We were trying to modernize infrastructure and provide an underlying foundation for the application development teams to take advantage of such modernization,” Georgios Mortakis, CIO and CISO at Norwegian Cruise Line Holdings, told CIO Dive.

Cloud migrations often clear the path for accelerating emerging technology adoption.

“Any proof of concepts and any ideation around looking into expanding further with use case areas around AI was facilitated by having an infrastructure that was friendly towards SaaS initiatives,” Mortakis said. “We’re taking a responsible approach, but that doesn’t necessarily mean cautious and ‘Let’s go slow.’” 

Finding a strategy and speed that complements an organization’s tech maturity, risk appetite and growth goals is a delicate dance. CIOs want to bring innovative solutions to the business while hedging against threats. 

“We’ve got a laundry list of ways that we think AI is going to be an incredible difference-maker within our organizations in very positive and impactful ways,” Jeff Caplan, SVP and CIO at Hooters of America, told CIO Dive. “But part of the question is, when do you jump on the train?”

The company is undergoing a digital transformation journey, focused on strengthening the foundational elements and capabilities of its tech stack. Earlier this year, HOA simplified its operations with a single provider for point-of-sale, restaurant back office and loyalty systems. The franchise operates in 36 states and 18 countries around the world.

Other factors can play a role in adoption speed as well. Cybersecurity concerns, compliance guidance and consumer sentiments can all influence an organization’s pace down the road of adoption. 

“To the extent that [generative AI] impacts a consumer, we’re going to move very slowly to make sure we understand the impact of that,” Wells Fargo CEO Charlie Scharf said during the company’s Q2 2024 earnings call Friday. 

The bank considers traditional AI “business as usual,” Scharf said, but it currently is focused on shorter-term generative AI opportunities to drive efficiency. 

Need for speed

Sometimes a time-sensitive use case puts organizations in the fast lane for generative AI implementation.

The countdown was on for Feroz Merchhiya, CIO and CISO at the City of Glendale, Arizona, as the city was set to approve a major renovation to its City Hall. The process would relocate around 500 employees to satellite hubs as the building was gutted and rebuilt over a two-year period.

Providing elbow-to-elbow IT support for business employees was no longer a viable option. Merchhiya was tasked with avoiding higher service desk costs, reducing travel and freeing up time for agents to manage higher-tier issues. 

“I thought generative AI would be the perfect use case because I can build capacity within the existing team,” Merchhiya said. The ticking clock coincided with Merchhiya running into a Moveworks spokesperson at a conference in early 2023, resulting in a partnership with the software provider. 

“If I were to build this on my own, it would require compute power, which I didn’t have at the time in our data center or even in our cloud partners,” Merchhiya said of the company’s enterprise-focused copilot. “A closed system made sense because time-to-market becomes so much faster.”

The system went live in September. Vendor estimates suggest the solution will save city employees more than 3,500 collective hours per year. 

“There is real business justification to why I used it; there was a need,” Merchhiya said. “Not because this is the new, nice and shiny thing that we should all pursue. There has to be a purpose behind it.”