
  • By Jeff Winter
  • Executive Corner

If you pay attention, even remotely, to tech news, you cannot avoid hearing about artificial intelligence (AI). It’s everywhere! It dominates articles, events, podcasts, discussion boards, videos . . . you name it. I believe this comes down to a combination of two factors:

  • AI is extremely versatile in its application; it spans every industry and impacts nearly every job function—from entry level all the way up to the CEO.
  • AI has a mysterious science-fiction allure that captivates people’s imaginations and curiosity.

At its core, AI is a combination of techniques, practices, and technologies designed to mimic the function of the human mind. The result is a system capable of self-learning, self-evolving, and self-improving.

The idea of artificial intelligence isn’t new. In fact, it has been around since ancient times, when people imagined “beings” capable of intelligent decision making and controlled by a “master craftsman.” It is only in the past decade or so that the fantasy of AI has become palpable, understandable, and capable of truly being utilized. In 1997, IBM’s Deep Blue supercomputer famously beat chess grandmaster Garry Kasparov using early forms of AI.

Nearly 15 years later, in 2011, IBM used the Watson computer system on Jeopardy! to showcase its improvements. By 2020, we lived in a world where visual and auditory “deep fakes” were shockingly easy to produce. No matter how you look at it, technology and ingenuity have led this charge and brought us to 2021, where AI is on everyone’s mind. And why wouldn’t it be? The promises of AI are powerful, inspiring, and ever growing.

According to the Center for Data Innovation, in a 2016 article called “The Promise of Artificial Intelligence,” the technology will fundamentally change our lives through seven primary applications (figure 1). Any one of these applications would have tremendous positive impacts on reducing administrative managerial work, improving decision making, reducing operational costs, or reducing waste.

With all these universal applications and clearly understood benefits, the writing appears to be on the wall: AI is the wave of the future, and if you are not using or planning to use AI soon, you will be history! Software, platforms, and technologies are already out there, yet adoption appears to be slow. Financial justification and benefits analysis seem to be no-brainers, yet few are rushing to make improvements. Why is that?

State of the AI industry

Surprisingly, market research on AI back in 2015 was few and far between, so unfortunately, it is hard to make useful prediction comparisons over time. One 2016 study by UBS pegged the global AI industry at roughly $5 billion in 2015, projecting growth to around $120–$180 billion by 2020. Today, there are many more market studies related to AI, but none of them put the industry close to even the low end of the $120 billion that UBS predicted. Grand View Research indicates AI was less than $40 billion in 2019, but projects it to grow to around $740 billion by 2027, representing a 42 percent compound annual growth rate.

So, although AI has been steadily growing, it seems we keep predicting massive growth that does not happen. According to a 2018 study by McKinsey, only 21 percent of companies say their organizations have embedded AI in several parts of the business, and barely 3 percent of large firms have integrated AI across their full enterprise workflows. For such an exciting technological advancement, those numbers are not very impressive.

Electricity and the Internet: A similar story

In 1893, the world was mesmerized when President Grover Cleveland illuminated the Chicago World’s Fair simply by pushing a button. The entire world was inspired and captivated, fully believing electricity would change everything. The problem was that no one really knew how. According to The New York Times, it took nearly 30 years for 50 percent of U.S. homes to adopt the technology, and roughly another 30 years to achieve close to 100 percent adoption. In 1989, the World Wide Web was invented, with a similar story, albeit with faster adoption. According to Pew Research, it took until 1995 for 14 percent of U.S. adults to use the Internet, and the same study showed that by 2014, adoption had reached only 87 percent. Living in 2021, it is hard to imagine life without the Internet or electricity—especially when trying to work from home during a pandemic!

Today, both of these technologies are considered utilities—something necessary to allow other things to function. When companies grew rapidly as a result of these utilities, they did not attribute their success to electricity or the Internet. Even companies like General Electric, with electricity in the name, did not promote their products under the slogan “look what electricity can do for you.” Rather, GE and other successful companies took advantage of the utility of electricity to create products that solved people’s needs, such as the light bulb, the washer and dryer, the refrigerator, and the TV.

The point is: Electricity by itself was too vague and intangible. It was the products that took advantage of electricity that changed the world and drove recognizable benefits to the way we live our lives. More than 100 years later, it is impossible to imagine life without electricity. It supports nearly everything we do and (from a manufacturing perspective) everything we create. The same can also be said for the Internet.

Figure 1. Any one of these applications could have tremendous positive impacts on reducing administrative managerial work, improving decision making, reducing operational costs, or reducing waste.

The inevitable boom

The same can also be said for artificial intelligence. AI will inevitably be used to assist, offset, and even replace tasks we do today, just like the Internet and electricity did many years ago. AI will permeate every part of our personal and professional lives. It will be used to help us grow our businesses and eventually be used in nearly every electronic product companies make. So, what can you do today to take advantage of AI?

The answer is surprisingly simple in concept, yet requires a huge, concerted effort to pull off effectively. You do not need to go out and do tons of research on AI technology, understand the details of how it works, investigate all the existing platforms, or even launch a bunch of pilot projects. Instead, you, along with everyone in your entire company, need to change the way you think.

You need to start every decision process with the idea that there is this “thing” (artificial intelligence) already out there that is smarter, faster, more accurate, and more reliable than you in making a decision. By letting that guide your thought process, you will automatically start to include the idea of AI into everything you do, everything you decide, and every action you take. You will end up identifying and capitalizing on the “appliances” (to tie back to the electricity analogy) that AI allows for, rather than thinking about AI itself. And going one step further, if your company makes electronics of any kind, your product development cycle will naturally start to include AI into the very things you create!

You will notice that this is much different from the traditional technology adoption model, where you either look for a new product on the market that can solve an existing problem or discover a new product that can make an incremental improvement on something you already have or do. No, AI is far more fundamental and pervasive. If it is treated like a “product,” it will be siloed, underutilized, and haphazardly applied. It will never be used fully enough to transform the entire foundation of a company.

Let AI channel your right brain’s creative tendencies and get you to think differently. AI is here to stay. Don’t be a laggard!


About the Author

Jeff Winter is an industry executive for manufacturing at Microsoft. Winter is also part of the leadership committee of the Smart Manufacturing & IIoT Division of ISA, a contributor to IEC as a member of TC 65, a member of the board of directors of the Manufacturing Enterprise Solutions Association (MESA), and benchmarking chair with the Control System Integrators Association.