A Short History of the Computer

Now a part of our everyday lives, the computer is mostly taken for granted. It is still worth reviewing its history, though, from the early development of adding machines through to microprocessors and the success of the personal computer. With computer owners now able to build networks using parts from suppliers such as Comms-Express, and more people opting for portable computers and tablets, a brief sense of where computers came from puts today's machines in context.


The theoretical basis for the computer was laid by Charles Babbage in the 19th century, whose designs for the Difference Engine and the programmable Analytical Engine described mechanical calculating machines that were never fully built in his lifetime. By the early 20th century, investment was going into electronic machines capable of speeding up industrial processes and mathematical and scientific calculations. Dr. John V. Atanasoff and Clifford Berry built the pioneering Atanasoff-Berry Computer in the late 1930s and early 1940s, while wartime military funding produced the Electronic Numerical Integrator and Computer (ENIAC), completed by John Mauchly and J. Presper Eckert at the end of the Second World War.

Computers after the Second World War were built around a succession of electronic technologies, with a notable jump in processing power coming from the switch from vacuum tubes to transistors. Even so, computers remained large and cumbersome machines, mostly used as costly number crunchers for government and university research. New programming languages such as FORTRAN made programs easier to write and to move between machines, while notable systems of the era included the Livermore Atomic Research Computer (LARC) and the IBM 7030 Stretch.

More powerful computers in the 1960s and early 1970s turned to semiconductor technology and parallel processing to increase their speeds, with integrated circuits enabling rapid gains in performance. Seymour Cray's CDC 6600, the fastest machine in the world on its release in 1964, gave an early sign of what computers could be capable of, while researchers developed the Combined Programming Language (CPL).

The major leap forward in computing of the 1970s was the microprocessor, which shrank the processor onto a single chip and made machines cheap and small enough for homes and offices. Minicomputers from firms such as DEC had already brought computing out of the machine room, and some early designs even imagined computers built into household appliances. By the early 1980s, the Altair 8800, the Apple I, and the IBM PC had demonstrated how personal computers could become a part of people's everyday lives.

The boom in home computing of the 1980s made companies like Microsoft and Apple hugely successful, with Microsoft partnering with IBM to supply MS-DOS as the operating system for the IBM PC. By the 1990s, the home computer was boosted further by the commercialisation of the internet, opening up the potential for worldwide networks and making PCs and laptops ubiquitous in business.

Computer technology has now reached the point where supercomputers can handle extreme logic problems and forms of artificial intelligence. At the same time, the processing power and memory that would have filled a room with hardware in previous decades can now be carried around in a pocket. Given how rapidly tablet computers and processing speeds have advanced in just the last ten years, the state of computing by 2023 is hard to predict.

Glyn Stevenson writes about computer technology and gadgets. He recommends getting your network equipment from Comms-Express. He also blogs about the latest video games.
