You shouldn’t hire a chief AI officer: here’s why

Posted: 30 March 2017 | By Darcie Thompson-Fields

As AI technologies advance, we are starting to hear more advice about AI strategies and hiring the perfect AI team.

In the same way the big data trend led to a data scientist frenzy, the argument is now that every organisation should be hiring a c-level officer who will lead the company’s AI strategy.

But Narrative Science chief scientist Kristian J Hammond is here to ask you not to.

Writing for Harvard Business Review, the chief scientist and Northwestern computer science professor is urging companies not to employ a chief AI officer.

Solving business problems

“The very nature of the role aims at bringing the hammer of AI to the nails of whatever problems are lying around. This well-educated, well-paid, and highly motivated individual will comb your organisation looking for places to apply AI technologies, effectively making the goal to use AI rather than to solve real problems,” Hammond wrote.

Hammond doesn’t doubt AI’s usefulness; after an entire professional life devoted to the field, he brands himself a “rabid true believer”. But to deploy AI effectively in a business, he encourages focusing on business goals rather than rushing towards an AI strategy.

“Hiring someone with technical skills in AI to lead the charge might seem in tune with the current trends, but it ignores the reality that innovation initiatives only succeed when there is a solid understanding of actual business problems and goals. For AI to work in the enterprise, the goals of the enterprise must be the driving force.”

Whilst you need people who understand AI technologies, effective communication between the technical and strategic sides of your business is key. Hammond suggests that the alternative to hiring a chief AI officer is to start with the problems. Rather than bringing someone in to manage an AI strategy, AI solutions should be placed in the hands of the people who are addressing those problems directly.

“AI isn’t magic”

“If these people are equipped with a framework for thinking about when AI solutions might be applicable, they can suggest where those solutions are actually applicable. Fortunately, the framework for this flows directly from the nature of the technologies themselves. We have already seen where AI works and where its application might be premature.”

Hammond argues that the question comes down to the data and the task: understanding what data you have and which cognitive technologies are most applicable is what will give organisations the advantage.

Writing that “AI isn’t magic”, he stressed that the technology is straightforward and doesn’t require a chief AI officer to understand it. Specific technologies provide specific functions and have specific data requirements. Applying them to solve business problems requires communication between the teams that understand those problems and those who understand the details of the technical solutions.

“The AI technologies of today are astoundingly powerful. As they enter the enterprise, they will change everything. If we focus on applying them to solve real, pervasive problems, we will build a new kind of man-machine partnership that empowers us all to work at the top of our game and realise our greatest potential,” Hammond wrote.

You can read Hammond’s full article on Harvard Business Review here.
