Sixth Generation of Computers Brings Us Closer to HAL

It is amazing to look at all that has transpired in technology in what is, relatively speaking, the blink of an eye. In a span of 71 years, about as long as many baby boomers have been alive, computers have gone from huge devices that filled an entire room to machines that can sit on your lap while you update your Facebook page. In more recent years, we have seen the birth and awkward infancy of artificial intelligence, also known as the sixth generation of computers.

A computer generation is a term used to describe a stage in the evolution of computing, as each technological breakthrough has reshaped the industry around smaller, faster, and less expensive machines built on ever more powerful processors. In the 1980s it was rare for even one home to have a computer; now most homes have several. Accessing the Internet was once reserved for universities; now we can download music from the web onto our cell phones. And things are still changing, becoming even smaller and able to perform more complex functions, such as typing what we say and beating us at chess.

Artificial intelligence is the realm of programming in which devices are enabled with the ability to think and react to the environment around them. The fields of gaming, robotics, voice recognition, and real-life simulation all center on perfecting the science of artificial intelligence. The sixth generation of computers differs from previous generations in size, processing speed, and the complexity of the tasks computers can now perform.

Back in the earliest stages of computing, computers relied on vacuum tubes and magnetic drums. They were large, expensive, and could only perform one task at a time. They were also prone to malfunctions and had a self-destructive inclination to overheat due to the vast amount of electricity they used and the heat they generated.

In the mid-1950s, transistors replaced the vacuum tubes. This generation of computers also saw the dawn of more complex programming languages, such as COBOL. Though the size and cost of computers decreased significantly, the problems of overheating and malfunctioning were still prevalent. The mid-1960s brought with them more than just peace and love; they also brought integrated circuits. Silicon chips housed the transistors developed a decade earlier and delivered a significant leap in computing power and speed. These chips were also used for both processing and memory, replacing the magnetic drums and cores. The CPU (central processing unit) and memory chips allowed computers to perform even more complex functions and introduced keyboards and monitors as input and output devices. Up to that point, punch cards had been used to give the computer its instructions, and its answers were supplied via a paper printout.

The ability to perform many complex tasks at once was expanded and revolutionized with the introduction of the microprocessor in the early 1970s. Now what had taken up a whole room could rest gently on a fingertip. Microprocessors were the beginning of a flurry of technological advancements that includes computerized cars, appliances, and smart phones. Everything has become smarter, faster, and smaller. It has also become integrated: with the advent of the microprocessor came the ability to link computers together in a network. The birth of the Internet and all of its wonders is attributed to the birth of the microchip.


All of this has led to the ability to program computers to imitate thinking. While no computer or device can truly think on its own (sorry, HAL), it can simulate many decision-making functions that have helped improve human lives, and human fun.

Gaming has seen great advances through artificial intelligence. The Sims, Halo, and World of Warcraft wouldn't be half as much fun without the game's ability to react to the player's decisions and provide scenarios based on them. Nor would it be as fun to play Age of Empires III against the computer if it didn't have the ability to beat you once in a while.

In the field of medicine, programs can help doctors diagnose diseases from a list of symptoms, and in the military, real-life simulators have made it possible to train pilots and soldiers safely. Artificial intelligence has also assisted in the development of voice recognition software (VAR).

While unable to truly understand the words, VAR has allowed people with disabilities to speak into a microphone and see the words appear on the screen. Although people need to speak slowly and clearly for it to work properly, and it is not 100% accurate, VAR is now also found on many handheld devices. Surfing the web or sending a text has become easier than ever, and more advancements are on the way.


by Ali Gheli