Artificial Intelligence: What is it? Where did it start? What is its future? More importantly, what are its limits?
AI owes its life to the work of two men: Jack Kilby and Bob Noyce. In 1958, Kilby of Texas Instruments built the first miniaturized electronic circuit, for which he later received a patent. Shortly thereafter, Noyce of Fairchild Semiconductor Corporation received a patent for a silicon-based integrated circuit. This is the device that underlies modern computers.
I had the pleasure of working for one year with Bob on a community project. He was a delight in every way. Intelligent, personable, positive, and stimulating are the words that come to me out of memory.
The integrated circuit gave computers the ability to acquire, organize, process, and report data in an entirely new way. Prior to “the chip,” data signals were transmitted by the analog method. Without going into details, digital signals offer distinct advantages that opened up a new world of data management.
In the 1960s, Thomas Watson Jr. bet the company on the development of the 360 series of computers. It was a huge risk that nearly killed him and the company. But obviously it worked. This was the first product line that integrated software and hardware, leading eventually to today’s applications such as diagnosis, logistics, supply chain management, and prediction. Most interesting are the programs that teach themselves as they are applied.
Concurrently, integrated circuit technology grew rapidly. Its development pace was described by Gordon Moore, co-founder of Intel. Moore’s observation that the number of transistors in an integrated circuit doubles about every two years—while simultaneously reducing their cost by half—has become the yardstick by which chip manufacturers plan and forecast their business. I witnessed this in the six years I was head of human resources for a Silicon Valley computer company.
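The compounding Moore described is easy to underestimate. As a minimal sketch, the arithmetic of his observation can be expressed in a few lines of Python; the starting figures below are illustrative assumptions, not historical data.

```python
def moores_law(start_transistors, start_cost, years, doubling_period=2):
    """Project transistor count and cost per transistor after `years`,
    assuming a doubling every `doubling_period` years (Moore's observation)."""
    doublings = years / doubling_period
    transistors = start_transistors * 2 ** doublings
    cost_per_transistor = start_cost / 2 ** doublings
    return transistors, cost_per_transistor

# Illustration: a hypothetical chip with 2,300 transistors projected 20
# years out -- ten doublings, so roughly a 1,024-fold change each way.
t, c = moores_law(2300, 1.0, 20)
print(f"{t:,.0f} transistors, cost per transistor down to {c:.4f} of original")
```

Ten doublings multiply the transistor count by 1,024 and divide the unit cost by the same factor, which is why two decades of Moore's law transformed the industry rather than merely improving it.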
The implications of Moore’s law are visible in the advent of cloud computing and social media systems. These require increased computing capabilities, provided by more components on a single chip. When I came into the industry, chip designs were hand drawn. Eventually, CAD/CAM software made that tiresome process obsolete and significantly reduced chip design costs. CAD/CAM’s efficiency has driven advances and transitions from microelectronics to nanoelectronics. It all adds up to a rapidly changing world of human communications.
While data management was leaping ahead thanks to digital technology, the idea that quantitative analysis could be applied to the human side of an organization was still missing.
Concurrently, I had been working on a methodology to fill that gap. I published my first book, How to Measure Human Resources Management, in 1985 and followed it with Human Value Management in 2000. Together, they provided a system for bringing intelligence into an arena that historically had been neglected. HVM was named Book of the Year by the Society for Human Resource Management. The measurement book went through three editions over the next eight years. Now, we can manage the human element with the same accountability as we do physical assets.