History Of Computers


The early computers

The history of computers dates back a lot longer than the 1900s; in fact, computing devices have been around for over 5,000 years. In ancient times a “computer” (or “computor”) was a person who performed numerical calculations under the direction of a mathematician.

Some of the better-known devices from antiquity are the abacus and the Antikythera mechanism.

Around 1725 Basile Bouchon used perforated paper in a loom to establish the pattern to be reproduced on cloth. This ensured that the pattern was always the same and virtually eliminated human error.

Later, in 1801, Joseph Jacquard (1752-1834) used the punch-card idea to automate more devices, with great success.

The First computers?

Charles Babbage (1791-1871) was ahead of his time: using the punch-card idea, he developed the first computing devices intended for scientific purposes. He invented the Difference Engine, which he began in 1823 but never completed. Later he started work on the Analytical Engine, designed by 1842. Babbage is also credited with inventing computing concepts such as conditional branches, iterative loops and index variables.

Ada Lovelace (1815-1852), a colleague of Babbage, is regarded as a founder of scientific computing and is often described as the first computer programmer.

Many people improved on Babbage's inventions. George Scheutz, along with his son Edvard Scheutz, began work on a smaller version, and by 1853 they had constructed a machine that could process 15-digit numbers and calculate fourth-order differences.

One of the first notable commercial uses (and successes) of computing was at the US Census Bureau, which used punch-card equipment designed by Herman Hollerith to tabulate data for the 1890 census.

To compensate for the cyclical nature of the Census Bureau's demand for his machines, Hollerith founded the Tabulating Machine Company in 1896; in 1911 it merged with other firms to form the company that was later renamed IBM.

Later, Claude Shannon (1916-2001) first suggested the use of digital electronics in computers in his 1937 thesis, and J. V. Atanasoff built an early electronic computer that could solve 29 simultaneous equations with 29 unknowns. This device, however, was not programmable.

During those troubled times computers evolved at a rapid rate, but because of wartime secrecy many projects remained unknown until much later. A notable example is the British military “Colossus”, developed in 1943 by Tommy Flowers and his team for the codebreakers at Bletchley Park.

In 1943, during World War II, the US Army commissioned John W. Mauchly and J. Presper Eckert to develop a device to compute ballistics tables. As it turned out the machine was only ready in 1945, but the Electronic Numerical Integrator and Computer, or ENIAC, proved to be a turning point in computer history.

ENIAC proved to be a very efficient machine but not an easy one to operate. Changing its program could require rewiring the machine itself. The engineers were all too aware of this obvious problem, and they developed the “stored-program architecture”.

John von Neumann (a consultant to the ENIAC project), Mauchly and his team then designed EDVAC, the first project to use the stored-program concept.

Eckert and Mauchly later developed what was arguably the first commercially successful computer, the UNIVAC.

Software technology during this period was very primitive. The first programs were written out in machine code. By the 1950s programmers were using a symbolic notation, known as assembly language, then hand-translating it into machine code. Later, programs known as assemblers performed the translation task.
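What an assembler does can be sketched in a few lines. The snippet below is a toy illustration in modern Python, with an invented four-instruction encoding (not any real machine's): it mechanically replaces symbolic mnemonics with numeric opcodes, the chore that early programmers first did by hand.

```python
# Toy assembler: translate symbolic assembly into numeric "machine code".
# The four-instruction set here is invented purely for illustration.

OPCODES = {"HALT": 0x0, "LOAD": 0x1, "ADD": 0x2, "STORE": 0x3}

def assemble(lines):
    """Encode lines like 'LOAD 5' as 16-bit words:
    a 4-bit opcode in the high bits, a 12-bit operand in the low bits."""
    words = []
    for line in lines:
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        words.append((OPCODES[mnemonic] << 12) | (operand & 0xFFF))
    return words

program = ["LOAD 5", "ADD 7", "STORE 12", "HALT"]
print([hex(w) for w in assemble(program)])  # ['0x1005', '0x2007', '0x300c', '0x0']
```

Hand-translating a program this way was slow and error-prone; once the translation rules were themselves written down as a program, the machine could do the job itself, which is exactly the step described above.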

The transistor era: the end of the inventor

The late 1950s saw the end of valve-driven computers. Transistor-based computers took over because they were smaller, cheaper, faster and far more reliable. Corporations, rather than inventors, were now producing the new computers.

Some of the better known ones are:

  • TRADIC at Bell Laboratories in 1954,
  • TX-0 at MIT’s Lincoln Laboratory
  • IBM 704 and its successors, the 709 and 7094. The latter introduced I/O processors for better throughput between I/O devices and main memory
  • The first supercomputers, the Livermore Atomic Research Computer (LARC) and the IBM 7030 (aka Stretch)
  • The Texas Instruments Advanced Scientific Computer (TI-ASC)

The basis of modern computing was now in place: with transistors, computers were faster, and with the stored-program architecture you could use a computer for almost anything.
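The stored-program idea itself can be illustrated with a minimal sketch (again in modern Python, with an invented instruction encoding): instructions and data sit in the same memory, and a fetch-decode-execute loop reads both from it. Changing the program means changing memory contents, not rewiring the machine.

```python
# Minimal sketch of a stored-program machine: instructions and data share
# one memory. Each word holds a 4-bit opcode and a 12-bit address, or data.
# The encoding is invented for illustration, not any real machine's.

HALT, LOAD, ADD, STORE = 0, 1, 2, 3

def run(memory):
    acc, pc = 0, 0                 # accumulator and program counter
    while True:
        word = memory[pc]          # fetch: the instruction comes from memory
        pc += 1
        opcode, addr = word >> 12, word & 0xFFF   # decode
        if opcode == LOAD:         # execute: data comes from the same memory
            acc = memory[addr]
        elif opcode == ADD:
            acc += memory[addr]
        elif opcode == STORE:
            memory[addr] = acc
        elif opcode == HALT:
            return memory

# Program at addresses 0-3, data at 8-10: memory[10] = memory[8] + memory[9]
mem = [(LOAD << 12) | 8, (ADD << 12) | 9, (STORE << 12) | 10, HALT << 12,
       0, 0, 0, 0, 20, 22, 0]
run(mem)
print(mem[10])  # 42
```

Because the program is just data in memory, a new task needs only new memory contents, which is why this architecture made computers general-purpose.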

New high-level languages soon arrived: FORTRAN (1956), ALGOL (1958) and COBOL (1959). Cambridge and the University of London cooperated in the development of CPL (Combined Programming Language, 1963), and Martin Richards of Cambridge developed a subset of CPL called BCPL (Basic Combined Programming Language, 1967).

In 1969 the CDC 7600 was released; it could perform 10 million floating-point operations per second (10 Mflops).

The network years.

From 1985 onward the race was on to put as many transistors as possible on one chip, each capable of a simple operation. But apart from being faster and able to perform more operations, the computer itself had not evolved much. The concept of parallel processing became more widely used from the 1990s.

In the area of computer networking, both wide area network (WAN) and local area network (LAN) technology developed at a rapid pace.

Uninformed researchers of computer history would probably place the first computer in the mid-1930s. In reality, this history dates back thousands of years to the invention of the abacus, on which the user “programmed” beads using formulated constructs.

Although many historians caution against using the word computer for anything except 20th-century machines, a broader understanding includes the calculator designed in the 1640s by the Frenchman Blaise Pascal for his tax-collector father. Improvements to this calculator continued through the 19th century.

Similar work was underway in England, where with government support Charles Babbage designed a steam-powered mechanical calculator that ran a fixed program. This design went through many changes until an automatic calculator was conceived. Following this flurry of discovery and invention, little changed until the early 1900s, when detailed mechanical and transportation work required complex mathematical calculations (especially calculus).

Two Census Bureau workers began to look for a means of calculating information accurately. They conceived the idea of a punch card which would be inserted into the machine, read, and stored. The greatest advantage of these still slow-moving machines was the ability to store large amounts of information with ease and accuracy.

The early 1940s and the imminent World War brought the military into the computer era. New weapons requiring computer technology for effectiveness were needed, designed and produced. These were large floor-model machines that took up the floor space of an average one-family home (about 2,000 square feet). One independent computer was not adequate, so a means was found to link computers, which produced a more accurate and clearer channel of information. These devices were not only cumbersome, they required rewiring and rechanneling for each program. Greater inventions were in progress: new computers equipped with memory capacity that would work faster than any in existence at the time.

In 1947 the first modern programmable computers were designed. They contained RAM (random-access memory), which made it possible to access information in seconds. This technology continued to be tested and improved into the 1950s, when magnetic-core memory and the transistor circuit element were developed. These increased the memory capacity and functionality of computers. On the down side, the cost of operating these machines was astronomical. By nearly sheer determination alone, these devices evolved into amazing machines able to work with a number of programs simultaneously while giving the impression that only one program was in use.

As recently as the 1960s, computers were more available and the price had become nearly reasonable for businesses. Their use, however, was confined mostly to mathematically based operations such as billing, accounting and payroll. Among the major purchasers of these devices were hospitals, which stored data on patients, inventory, billing, treatments and the like.

By the 1980s smaller individual computers were being produced. Technology continued to astound the general public as the microchip came into existence, permitting personal computers to be sold with accompanying program disks for loading software. A glance around most medium to large companies would reveal many desktop computers in use.

It would be impossible to trace the history of computers without acknowledging Apple Computer and IBM for their leading-edge and evolving technology. Radio Shack, along with Apple Computer, brought video games to the home computer (a move from the arcade).

The ability of businesses and individuals to access the World Wide Web gave birth to new and innovative marketing and communication with inquirers and clients. Today it is inconceivable to research something online and not find multiple references. The momentum has only continued to mount, and new upgrades become available almost daily.

Computing hardware is, at its core, a platform for information processing: it transforms input into output. The history of computing hardware covers the hardware itself, its architecture, and its impact on software. Computing hardware has become the basis of computation in fields such as automation, communication, control, entertainment and education. Machines have been used to aid computation for thousands of years; the very first basic computing device, the abacus, originated in Babylonia around 2400 BC.

Some analog computers were invented in ancient times to perform astronomical calculations. The first digital calculator was invented in 1623 by the German mathematician Wilhelm Schickard. By the 1900s earlier mechanical calculators, cash registers, accounting machines and so on had been redesigned. During the 1950s and 1960s several different brands of mechanical calculators were introduced on the market. Analog computers were also introduced; these systems work by creating electrical analogs of other systems.

After analog computing, electronic digital computing was introduced in the 1940s. Charles Babbage is often called the father of the computer. The commercial development of computers took place during the 1940s and 1950s. J. Lyons and Co., a British catering company famous for its teashops but with strong interests in new office-management techniques, decided to take an active role in promoting the commercial development of computers. At this time only a small but growing number of people used computers in their jobs.

Computers were once very big, but these machines have since shrunk dramatically. In 1954 IBM introduced a smaller, more affordable computer (the IBM 650) that proved to be very popular.

The History of Computers and What Lies Ahead

Since the world's first electronic digital computers were created in the UK and US during the 1940s, times have changed massively, and the same can be said for the technology that powers them.

Unlike the 1940s, when computers were the size of a small room (and sometimes a large one), with gigantic internal components and fans that would shame a NASA spacecraft cooling system, computers now are extremely ergonomic, with some examples quite literally fitting into the palm of your hand and desktop computers the size of a hardback book.

Way back when computers were first designed, these machines were available only to governments and the scientists who created them: a far cry from where we are today, with billions of people connected online through a desktop computer and even more owning a portable internet-connected device such as a smartphone, tablet PC or portable gaming console.

But how did we get to the stage of computing we are at today, and at what turning point in computing history did computing become a consumer indulgence?

A brief computer history timeline
– In 1936 Konrad Zuse created the world's first programmable computer, the Z1.

– In 1962 the first video game, ‘Spacewar!’, was invented by Steve Russell at MIT.

– In 1969 ARPAnet, the forerunner of the internet, went online.

– In 1974 the world's first consumer-focused computers were born: the Scelbi, the Mark-8, the Altair and the IBM 5100.

– In 1981 IBM released its first consumer computer, the IBM PC home computer.

– In 1983-1984 Apple created the first home computers with a GUI (graphical user interface), the Lisa and the Macintosh.

– In 1985 Microsoft released Windows, which would soon become the dominant software on consumer computers.

The brief timeline above misses out a fair few developments in the computing industry, but to me the most important entries by far are the invention of the Z1 by Konrad Zuse and the first computer with a graphical user interface by Apple. Without Zuse the computer might never have developed so quickly, if at all, and without Apple the first solid consumer computers would have been awful for the average consumer to use.

Comparing computing in 1985, when the first version of Windows was introduced, to computing now brings up some key night-and-day differences.

For starters, computers of the past were not as well designed as they are today, they were nowhere near as powerful, and they did not sell nearly as well. Everybody in the computer history timeline had their own part to play in the success of the computing industry.

It is the advancement of software, hardware and design which has brought our computing needs to what they are today, and without all three of these contributing factors the history of computing may have taken a very different shape to what it looks like today.

