History of Computers

The first mechanical computer was designed by Charles Babbage (1791 – 1871). It was called the Difference Engine. Its successor, the Analytical Engine, is especially noteworthy. Babbage’s collaborator Ada Lovelace, who wrote a program for the Analytical Engine, is widely regarded as the first programmer in history.

In the 20th century, Konrad Zuse was the first to construct a working programmable computer. Wikipedia writes: “The Z3 was a German electromechanical computer designed by Konrad Zuse in 1938, and completed in 1941. It was the world's first working programmable, fully automatic digital computer.”


(The Z3 – Source: Wikipedia)

Alan Turing and John von Neumann are to be credited with the theoretical models of how a computer works.

Then the USA became the leading nation in computer building, with the ENIAC completed in 1945. In the late 1970s, home computers finally became available for ordinary people to buy in specialized shops. One of the most popular home computers was the Commodore 64, which was later succeeded by the Commodore Amiga 500.

With home computers, amateurs began to develop their own programs. There was also a commercial game development scene, and amateurs started cracking the copy protection of these games and adding their own audiovisual animations (“intros”) to them. This was how the computer arts community known as the demoscene started.

The very best programmers, graphic artists and music composers were active in the demoscene in the late 1980s and early 1990s. Demos like State of the Art, Second Reality or Crystal Dream 2 cast their spell over home computer owners, who were not used to such amazing audiovisual presentations from commercial software.


(Screenshot from the demo Second Reality by Future Crew)

Then computer networks emerged. At first there were privately owned bulletin board systems (BBSs) which you could connect to via dial-up modem. They were connected to each other using early networks such as FidoNet. Eventually the Internet, which actually dates back to the 1960s, became popular and more or less replaced the BBSs. With social media such as Facebook, much of human communication finally moved online.

Artificial intelligence was founded as a field of research in the 1950s. Many of the algorithms that are used nowadays date back to the 1980s. However, there is still progress in the field. The headlines AI is making these days are due not only to superior hardware but also to improvements in the algorithms.

With AI, using computers is going to become much easier in the future. Instead of starting up presentation software and drawing a presentation manually, you will be able to speak to the computer and tell it what to do.

Computer programming underwent a dramatic change from typing machine code in a human-readable form (assembly language) to high-level languages such as Java, C++, C# and PHP. In the future, it might be possible to use natural language to describe the basic workings of a program, but it will probably remain necessary to write code in order to specify the details.
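
To give a rough idea of the difference, here is a minimal illustrative sketch in C (a language chosen only for the example); the comments hint at the kind of instruction-by-instruction code an assembly programmer would have had to write by hand for the same task, using simplified x86-style mnemonics for illustration.

    #include <stdio.h>

    int main(void) {
        /* A high-level language lets you state the intent directly: */
        int sum = 2 + 3;    /* in x86-style assembly this would roughly be:
                                 mov eax, 2
                                 add eax, 3   */
        printf("2 + 3 = %d\n", sum);
        return 0;
    }
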

All in all, computers are the greatest invention man has ever made, in my humble opinion!

Claus D. Volko
