Background on A.I.

History of A.I.

The term artificial intelligence was coined in the mid-1950s.  Today it is defined as the "subfield of computer science concerned with the concepts and methods of symbolic inference by computer."  The first project utilizing A.I. is traced back to 1955, when researchers at Carnegie Mellon University developed a computer program that worked through logical proofs to arrive at theorems.  In the late 1950s, McCarthy developed a computer language known as LISt Processing (LISP), which became the language used for most artificial intelligence projects.  Decades later, artificial intelligence has progressed only a small step from its birth.  Perhaps, in the early surge of A.I., optimism was at its peak and human intelligence was oversimplified and underestimated.

  "In from three to eight years, we'll have a machine with the general intelligence of an average human being... a machine that will be able to read Shakespeare [or] grease a car." -Marvin Minsky to Life magazine, 1970

Current A.I. Work

Today, the future of artificial intelligence is not so clear-cut.  Many researchers are investigating the possibilities of artificially intelligent robots in the environment.  Companies like Microsoft already use A.I. technology as part of their help systems.  Other companies, like IBM and Dragon Systems, are researching A.I. in speech recognition.  This could allow an individual to speak to a program that will, in turn, "type" what is spoken.  For current technical advances in A.I., refer to the Journal of Artificial Intelligence Research (JAIR).

Man vs. Machine

Also in the 1950s, Alan Turing designed a criterion for determining the intelligence of a computer.  This became known as the "Turing Test," and today it is still one of the few standards of A.I. in use.

  "The goal itself is less important than the quest." -Shieber
Obviously, many differences exist between man and machine.  A human brain uses "massive parallelism" to think: some 100 billion neurons are interconnected and function as one.  Human intelligence is also the product of millions of years of evolution.

Even though the brain's processes are slow, they have one advantage over computers: they can filter out irrelevancies.  Computers, on the other hand, are better number crunchers.  They never tire and, unless programmed incorrectly (by humans), will produce the correct answer time and time again.

  "Computers find it easy to remember a 25-digit number but find it hard to summarize the gist of Little Red Riding Hood." -Steven Pinker


This document was created April 19, 1998 and last modified on Tuesday, March 11, 2014 at 17:34:10.