
Computer generations: origin, history and evolution of computers

Computer generations: origin

The computer world began almost a century ago. Concretely, everything starts in 1939, although, as is always the case, the ideas behind computing come from much earlier: the desire to digitize the functions of devices and thereby carry out operations more simply. This is how what is known as the «computer generations» began.

This is a concept that encompasses, as we have said, practically a century of study, invention and development of all kinds of computing ideas (and, with them, machines): first analog, then hybrid and finally digital.

It covers, so to speak, a total of six chapters into which the history of computing is divided to date, each clearly differentiated by one or several elements that defined that generation. We will review all of them to learn the history of the computer generations.

Origin and history of the computer generations

We will begin by pointing out that when we talk about generations of computers, we are not referring only to what we now understand as a desktop or laptop computer; we will go much further back, to before they appeared.

First generation (1940-1958)


It runs from 1940 to 1958, always counting from launches or major events and not from the earlier periods of study, which were often deliberately hidden so that competitors would not know what was being worked on.

Obviously, the milestone that lets us speak of a first generation of computers is the step from making calculations manually or semi-manually to doing so in a digitized, automatic way.

In addition, we have other characteristic aspects such as:

  • The immense size of these computing machines, which occupied several square meters (or even cubic meters, as they were considerably tall too).
  • Their construction around vacuum tubes, which replaced moving mechanical parts.
  • The use of the so-called «machine language».
  • These machines had precise objectives, essentially demanded by military academies and centers.

Von Neumann, Mauchly and Aiken, among others, left us gems such as the Z1, considered the first computer; the Colossus, created to decipher communications during World War II; the ENIAC, which included the CPU as we know it; and the MARK I. These were the main computing machines of those years.

Second generation (1958-1964)


This one, quite short, extends from 1958 to 1964. What defines it is the use of transistors in computers, which ended up replacing the vacuum tubes, which took up about 200 times more space.

  • As you can imagine, this meant a large reduction in machine size. Transistors also consume less power and produce less heat.
  • All of this allows us to speak of the birth of the minicomputer.
  • These devices begin to be used in more sectors, especially banking, accounting and warehouse logistics in general.
  • Microprogramming is developed in 1959.
  • More complex languages, called “high level”, begin to be used.
  • The commercialization of COBOL was particularly famous.

Unforgettable personalities of this stage include the multi-award-winning Amdahl, who was in charge of the design of the IBM 360 series, a set of machines sharing the same software but with different specifications, which allowed each user to choose the one best suited to their needs; Bardeen, Brattain and Shockley, who invented the transistor; and M. Wilkes, who developed microprogramming.

As for the computers that stand out in these years, we have the IBM 1401, considered the most successful machine in history (selling some 12,000 units); the PDP-1, designed to be used by ordinary operators and not only by mathematicians and engineers; the IBM Stretch, the first supercomputer built with a complete transistor system; and the aforementioned IBM 360 series.

Third generation (1965-1971)


In this case, what marks the third generation of computers, which covers the years 1965 to 1971, is the use of silicon integrated circuits to process information, packing together the earlier transistors and other elements.

  • More capable processing is thus achieved in a smaller space and with fewer independent elements, which helps reduce failures.
  • It is remarkable that computers begin to be used routinely for commercial purposes. We can say that during these years devices become more “accessible”, becoming known throughout the world and gaining features that make them useful to a larger audience.
  • Reliability improves and machines become more flexible.
  • Teleprocessing and multiprogramming become common.
  • In addition, people begin to talk about the computer at a personal level.

The IBM 360 series, already important in the previous stage, would be the first to include integrated circuits. Also notable is the PDP-8, a mass-market minicomputer that sold half a million units and worked with three programming languages. The PDP-11, profitable for a whole decade, was the first to use a single asynchronous bus to which all its parts were connected.

Personalities who helped make all of the above possible include Kilby, inventor and developer of the integrated circuit; Noyce, who solved its initial problems; Ted Hoff, who invented the microprocessor, especially important years later; and Kurtz and Kemeny, who developed the universal BASIC language.

Fourth generation (1971-1980)


We advance to 1980 in a context where the big step is the replacement of conventional processors with microprocessors.

  • This means a new miniaturization of many parts of the computer.
  • There is also a multiplication in power, capacity and flexibility.
  • To the point that personal computers appear and go on sale, which occurs in 1977.

Other highlights of the decade are the emergence of the graphical interface, the coining of the term “microcomputer”, the interconnection of machines in networks to share resources, and the growing capacity of supercomputers.

Experts who worked hard to achieve this include Ted Hoff who, as we have already mentioned, was the brain behind the microprocessor concept; Kemeny and Kurtz, who remained references thanks to the growing and ever more successful use of their programming language; the Intel firm, which launched the first microprocessor in history; Bill Gates, the face of Altair BASIC, the famous and revolutionary BASIC interpreter; and Wozniak, a prodigy capable of inventing all kinds of devices and improving them until they became practically new ones.

During the 70s, unforgettable machines appear. The first is the Cray-1, the first supercomputer to use a microprocessor. The PDP-11, already revolutionary before, kept proving so good that, instead of designing another machine to offer the latest advances, its makers kept working on it to incorporate them, keeping it on the market with great success.

The Altair 8800 was the best-selling microprocessor-based computer (due, in part, to being sold as a kit for the user to assemble), built, in this case, around the 8-bit Intel 8080. The Apple II family revolutionized the market by positioning itself as a useful device for the home user, including features such as the spreadsheet; interestingly, the brand started out being very affordable.

Fifth generation (1981-2000)


This is undoubtedly a strange computer generation, because it is described in two very different ways. On the one hand, there is the moment when Japan undertook a project, ultimately a total failure, on computer improvements related to artificial intelligence.

It ran from 1982 and lasted more than a decade, until it was considered unfeasible to continue, given the enormous cost in resources and results that were, to say the least, negative.

However, the project deserves applause because, as you may be thinking right now, the Japanese were looking at exactly the point on which technology seems set to be based over the coming centuries.

On the other hand, we can say that, apart from this project, what defines this long period, or rather what makes us consider it a distinct generation, is the development of the laptop.

Of course, over so many years and at such an advanced stage, much more was achieved in computing and information technology in general.

Next, we present the milestones achieved in these wonderful years:

  • The speed and amount of memory available in computers increase exponentially.
  • Programming languages are translated immediately.
  • A number of ports begin to be added to computers, multiplying the possibilities and allowing greater customization. The most popular and important peripherals are, without a doubt, storage devices.
  • Computers can once again be designed to be even smaller.
  • Software multiplies, appearing in all kinds and at all levels of complexity.
  • All of this brings back the fashion of cloning famous machines.
  • Multimedia content stands out from the rest.

The inventions of the moment were the Osborne 1 portable microcomputer, the first to be presented at a trade fair; the Epson HX-20, another much more functional laptop, with a dual processor and microcassette storage; the floppy disk, flexible, thin and removable, ready to store information and let us carry it around comfortably; and Windows 95, without a doubt the best-known Windows operating system on the entire planet.

Sixth generation (2000 – present)


We are now in what is known as the sixth generation of computers, a stage with no single defining feature, but rather a bit of everything in terms of both quantity and quality.

What we do find as the turning point that opens the stage is wireless connectivity, which allows us to connect to networks and other devices without using cables.

We will highlight points such as:

  • The development of other smart devices: first phones and then many others such as televisions, watches and even home appliances.
  • An enormous range of devices for all tastes and needs.
  • The Internet as a regular and, in fact, necessary element of everyone’s daily life.
  • Cloud services made available to all users.
  • The popularization of streaming content.
  • Online commerce also develops considerably, becoming, in fact, a standard.
  • A dizzying leap in artificial intelligence.
  • The use of vector and parallel architectures in computers.
  • The storage capacity of both internal and external memory grows and becomes ever more important.

The inventions and events of this millennium include WiFi, fiber optics, ever-larger storage capacities, SSD drives, smartphones, mobile operating systems, much smaller laptops and what are already known as “desktop laptops” for their incredible specifications, identical to those of PCs.

What will the future of computing bring us?

The era of digital transformation has begun, and the first cobblestones are being laid on what will be a fairly long road, but one that takes us squarely into a future where computing will take first place and this market will be the most powerful in the world.

Surely you are wondering how these changes will affect each of us. Well, let’s analyze each of these trends and the implications they will have on our lives.

Augmented analytics (Big Data)

Supercomputers work with enormous amounts of data, and it is not always possible to explore every existing possibility. As a result, data analysts do not always test every hypothesis.

It is believed that many options are being lost, along with a great deal of information of interest to us. This is why augmented analytics arises, which aims to reach a new level of understanding of the data. Moreover, since a machine takes care of it, personal bias is removed when searching for the most hidden patterns.

It has been estimated that by 2022 more than 40% of data analysis tasks will be automated, so that human involvement will move to another level.
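As a rough, hypothetical illustration of the idea (not any specific product), the short Python sketch below automates one tiny slice of that exploration: it scans every pair of columns in a dataset and surfaces the correlations a human analyst might never get around to checking. The column names and the data are invented for the example.

```python
import numpy as np

# Toy dataset: rows are days, columns are invented business metrics.
rng = np.random.default_rng(42)
columns = ["visits", "sales", "returns", "ad_spend"]
data = rng.normal(size=(100, len(columns)))
data[:, 1] += 0.8 * data[:, 0]   # make "sales" loosely follow "visits"

# Automated scan: compute the correlation of every pair of columns
# and rank the relationships, with no human pre-selection of hypotheses.
corr = np.corrcoef(data, rowvar=False)
pairs = []
for i in range(len(columns)):
    for j in range(i + 1, len(columns)):
        pairs.append((abs(corr[i, j]), columns[i], columns[j], corr[i, j]))

for strength, a, b, value in sorted(pairs, reverse=True):
    print(f"{a} ~ {b}: correlation {value:+.2f}")
```

Real augmented-analytics tools go far beyond pairwise correlations, but the principle is the same: the machine sweeps the whole space of combinations and brings the interesting ones to the analyst.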

Artificial intelligence

The new solutions created for companies and industry will be built on artificial intelligence. Data analysis, testing, code generation and the program development process will be automated. In this way, it will no longer be so necessary to be a programming specialist to work in this field.

Solutions generated with artificial intelligence will be ever faster, with fewer errors and less need for testing. In fact, time will be a far more precious asset than it is now and, paradoxically, easier to gain at the same time.

Autonomy of things

Artificial intelligence will play a very important role in the future, not only at the company level but also at the individual level. Robots will be programmed to perform simple tasks, imitating the actions performed by humans. This technology will be increasingly sophisticated, and it will require applications, IoT objects and services that help automate human processes.


Changes to the Facebook network

The network will rely less on ads to finance itself and will start promoting its own cryptocurrency. To keep attracting the attention of its subscribers, it will continue to develop its own content, buy broadcasting rights and deploy projects based on the blockchain; the power of this social platform, and its ability to pivot wherever it suits, is a little frightening.

Blockchain

Blockchain technology allows companies to track transactions, so that they can work even with untrusted parties without requiring the intervention of a bank.

Blockchain reduces transaction costs and management times, improving the transfer of money. It is estimated that blockchain will move approximately 3.1 trillion dollars within a decade.
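To make the idea of tracking transactions without a central intermediary more concrete, here is a minimal Python sketch of the underlying data structure: each block stores a transaction together with the hash of the previous block, so altering any earlier record breaks the whole chain. It is only an illustration of the principle under simplified assumptions, not a real blockchain or any particular platform.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain: list, transaction: str) -> None:
    """Append a new block that points at the hash of the previous one."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"transaction": transaction, "prev_hash": prev})

def chain_is_valid(chain: list) -> bool:
    """Every block must reference the correct hash of its predecessor."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

ledger: list = []
add_block(ledger, "Alice pays Bob 10")
add_block(ledger, "Bob pays Carol 4")
print(chain_is_valid(ledger))                       # True

ledger[0]["transaction"] = "Alice pays Bob 1000"    # tampering attempt
print(chain_is_valid(ledger))                       # False: the chain exposes the change
```

Real systems add consensus, digital signatures and distribution across many machines on top of this basic linking idea; the sketch only shows why tampering is detectable.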

Emergence of digital twins

A digital twin is a digital representation of an object that exists or will exist in real life. Digital twins can therefore be used to represent certain designs at another scale. The idea goes back to computer-aided design.

Digital twins are much more robust and are used, above all, to answer questions like “what would happen if…?”. Only in this way can they help avoid the costly mistakes that occur today in the real world. Twins will be the official testers.
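As a sketch of the “what would happen if…?” idea, the toy Python model below plays the role of a digital twin of a purely hypothetical water tank: before touching the real pump, we change its virtual copy and see what the level would do. All names and numbers are invented for the illustration.

```python
class TankTwin:
    """Very small digital twin of a hypothetical water tank."""

    def __init__(self, level_litres: float, inflow_lps: float, outflow_lps: float):
        self.level = level_litres      # current water level, in litres
        self.inflow = inflow_lps       # pump inflow, litres per second
        self.outflow = outflow_lps     # consumption, litres per second

    def simulate(self, seconds: int) -> float:
        """Return the projected level after `seconds`, never below zero."""
        projected = self.level + (self.inflow - self.outflow) * seconds
        return max(projected, 0.0)

# Twin mirroring the (imaginary) real tank's current state.
twin = TankTwin(level_litres=500, inflow_lps=2.0, outflow_lps=2.5)

# "What would happen if we slowed the pump to 1.5 L/s for an hour?"
twin.inflow = 1.5
print(twin.simulate(seconds=3600))     # projected level; here the tank would run dry (0.0)
```

Industrial twins are fed by live sensor data and far richer physics, but the workflow is the same: try the change on the virtual copy first, and only apply it to the real asset if the outcome is acceptable.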