While today’s widespread use of Artificial Intelligence (AI) is new, the field has more history than many realize. Back in 1950, pioneering British mathematician and computer theorist Alan Turing published “Computing Machinery and Intelligence.” The paper proposed what became known as the Turing Test, a way to determine if machines could, in fact, think. The notion of “machine intelligence” Turing explored would, in time, become “Artificial Intelligence.”
Renowned for his pioneering work in computer theory, Turing quipped a few years before his death in 1954, “No, I’m not interested in developing a powerful brain. All I’m after is just a mediocre brain, something like the President of the American Telephone and Telegraph Company.” Had he lived longer, it would be fascinating to hear Turing’s thoughts on just how advanced and prevalent AI has become.
By the 1970s, some manufacturers were already adopting computers and rudimentary AI in their processes. In the decades since, common tasks once performed only by humans, such as telephone directory assistance, online tech support, and banking services, have become dominated by early AI. Today, systems are so realistic we may not even be aware the ‘person’ we are talking to online or on the phone is actually a machine. Developing at breakneck speed, AI is now part of our daily lives.
What is AI, exactly, and how is it—for better or worse—affecting our lives? One of the most succinct definitions comes from multinational technology giant and research organization IBM. “Artificial intelligence (AI) is technology that enables computers and machines to simulate human learning, comprehension, problem solving, decision making, creativity and autonomy.”
Virtually all disciplines, from education to medicine and mining to manufacturing, are undergoing dramatic transformations thanks to AI. In finance, AI analyzes market trends, manages investor portfolios, and even crafts personalized wealth management strategies. AI in the medical field is helping doctors diagnose illnesses faster and more accurately. And in manufacturing, AI continues playing a crucial role in how companies make products faster and with greater profitability.
In manufacturing, AI pairs particularly well with robots working alongside humans. It can identify parts, orientation, potential flaws, or safety hazards. It can make decisions on the fly to correct a mistake, improve efficiency, or increase precision, and it can notify humans when intervention is required or stop a process before an accident occurs. And, unlike human beings, AI doesn’t tire or get distracted on the job.
In the recent book AI for Everyone: How to Use Artificial Intelligence to Your Advantage, Kai L. Thornton writes: “Artificial intelligence (AI) is no longer a concept confined to the realms of science fiction and academic discussions. It has become an integral part of our daily lives, shaping how we communicate, work, learn, and even entertain ourselves. For entrepreneurs, professionals, students, and everyday users, AI offers unprecedented opportunities to enhance productivity, automate tasks, and stay competitive in a rapidly evolving world.”
For human workers, the ever-increasing deployment of AI is not all sunshine and roses. Automation and robotics had already impacted the workforce before the implementation of modern AIs, and that trend is only accelerating. Lower-skilled, repetitive jobs like assembling manufactured parts and data entry have already been replaced in many cases, allowing one human worker to do the job of 10, and now AI is quickly expanding to replace even that one remaining human in the loop.
According to recent data from the Organisation for Economic Co-operation and Development (OECD), almost 27 percent of jobs across OECD countries are at high risk from automation, creating the potential for mass unemployment. In his recent book Artificial Intelligence – Friend or Enemy? Robert D. Little writes: “As machines and algorithms replace human labor in these sectors, economic inequality could widen dramatically, as those without advanced skills may find it difficult to secure alternative employment in a rapidly evolving job market.”
Although some costs are coming down, AI, like all technological innovations, is still costly to implement. For massive, multi-billion-dollar corporations, affordability is not an issue; however, for small and medium-sized enterprises (SMEs), the price of implementation and maintenance remains a barrier. Faced with these financial constraints, some SMEs will find it difficult to compete in the years ahead. The burden will fall especially hard on low-wage workers in developing countries.
Despite challenges, AI continues to advance modern manufacturing for many reasons. In just a few years, AI has surpassed expectations for many through machine learning (ML), a subset of AI. Instead of technology being used solely to program machines, algorithms analyze and learn from data, which is then used to make decisions and predictions. In some fields, such as finance, ML is reducing errors and increasing efficiencies. In other sectors, such as cybersecurity, ML algorithms are being used to monitor for and detect cyberattacks before they happen. And in manufacturing, ML helps optimize processes, automate tasks, and detect imperfections that could otherwise delay production, costing companies time and money. One of the greatest advantages of ML in manufacturing is its ability to monitor the production machines themselves and proactively flag predictive maintenance needs before they become an issue.
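To make the predictive maintenance idea concrete, here is a minimal sketch in Python. It is an illustrative assumption, not any vendor’s actual system: a simple statistical check that flags a sensor reading as anomalous when it strays far from its recent baseline, the kind of early-warning signal a maintenance team could act on before a machine fails. The function name, thresholds, and simulated vibration data are all hypothetical.

```python
# Hypothetical predictive-maintenance sketch: flag sensor readings that
# deviate sharply from the recent baseline, hinting at machine wear.
# Thresholds and data are illustrative assumptions, not production values.
from statistics import mean, stdev

def flag_anomalies(readings, window=5, threshold=3.0):
    """Return indices of readings that deviate more than `threshold`
    standard deviations from the mean of the trailing `window` values."""
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# Simulated vibration readings: steady operation, then a spike at the end.
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 0.95, 1.1, 1.0, 4.2]
print(flag_anomalies(vibration))  # → [9]: the spike is flagged
```

Real deployments replace this rolling z-score with trained models over many sensors, but the principle is the same: learn what “normal” looks like from data, then alert on deviations before they become failures.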
Last year, the Washington D.C.-based advocacy group National Association of Manufacturers (NAM)— America’s largest manufacturing trade association—released Working Smarter: How Manufacturers Are Using Artificial Intelligence. Containing key findings and insights, the report discusses how AI can be a force for good.
“Manufacturers have been at the forefront of developing and implementing intelligent systems and AI technologies, including machine learning, deep learning, natural language processing, machine vision, digital twins and robotics,” writes Kathryn Wengel, Executive Vice President and Chief Technical Operations & Risk Officer, Johnson & Johnson, and Chair of the Board, National Association of Manufacturers. “This has positioned manufacturers uniquely as both developers and deployers of AI innovations, providing invaluable insights into the effective and responsible use of these technologies.”
Wengel cites how Johnson & Johnson is using AI as a “force multiplier” in key areas, including drug development, restocking hospitals, sorting masses of data, and “yielding insights for the improved health and wellness of people around the world. It aids us in creating targeted treatments and getting them to the right patients at the right time.” Among the many other benefits of AI are safety and effectiveness guardrails during clinical trials and greater control over supply chains. These factors, and others, see manufacturers working smarter, safer, and more effectively than ever thanks to AI.
In the near future, AI will play an even more prominent role in all sectors, especially manufacturing, from streamlining processes in the office to machine learning and greater use of robotics on the shop floor.
In 2023, NAM’s Manufacturing Leadership Council surveyed America’s manufacturers on why they are investing in transformative Manufacturing 4.0 (M4.0) technologies and AI. Among the many reasons cited were reduced costs, improved operational efficiency, greater operational visibility and responsiveness, improved quality, and greater speed to market. For manufacturers of all types, one of the biggest factors behind increased use of AI remains the need to “compensate for labor shortages.”
Seventy-five years after proposing a test to determine if a machine can exhibit intelligent behavior, it would be fascinating to hear Alan Turing’s thoughts on today’s complex AI systems and how they learn and perform. Perhaps he would find them “terrific” in all senses of the word.