Computers have become a huge part of our daily lives. Whether in office work, education, medicine, gaming, or entertainment, we are surrounded by them everywhere. The earliest definition of a computer described it as a device that could be programmed to perform arithmetic or logical operations automatically; in other words, computers had just one job: performing calculations. But with time and constant improvements, they have been programmed to cater to nearly all our needs. The growing popularity of computers makes us wonder how exactly they became so widespread. Were computers popular right from the start, or was their rise the result of a ‘computer revolution’?
ENIAC – The First General Purpose Computing System
The Electronic Numerical Integrator and Computer, more famously known as the ENIAC, was the first programmable general-purpose electronic digital computer; through reprogramming, it could solve a large class of numerical problems. It was built during the Second World War, between 1943 and 1945.
The inventors of the ENIAC, physicist John Mauchly and engineer J. Presper Eckert, had a much simpler machine in mind when they first proposed its development in 1942: an all-electronic calculating machine. But the U.S. Army needed a machine capable of computing complex wartime ballistics tables, so with major changes to the initial design, Mauchly and Eckert produced the ENIAC.
When the ENIAC was finally completed and unveiled in February 1946, the war it was designed for was already over. Its design was unique, and its cost exceeded $400,000. The final machine was huge, with around 40 panels occupying a space of roughly 50 by 30 feet. The system required its own air-conditioning unit to dissipate the enormous heat it produced, as it consumed nearly 174 kW of power. It supported conditional branching, i.e., it could choose between different instructions based on intermediate results. In its first decade, the ENIAC is believed to have performed more calculations than all of humanity had ever done before it. To feed instructions and data into the machine, the designers used plugboards, which allowed it to run at lightning speed once a problem was set up. The downside was that reprogramming meant rewiring, which could take engineers days; it was a ‘programmable machine’ only in a loose sense. Nonetheless, the ENIAC stands as one of the most powerful calculating machines ever built.
Even though the ENIAC was the most powerful machine of its time, its huge size and heavy cost put it far beyond the reach of everyday users. People were fascinated by its capabilities, but it was simply unaffordable. Scientists and engineers wanted computing devices to touch every individual and reach the masses, which was not possible with giant machines. Thus, to bring computers into every household, the computer revolution began.
The beginning of the Computer Revolution
The earliest computer systems were bulky and enormous, and therefore far from portable. They were also very complex and required specially trained engineers to operate. This is why computers could not penetrate the lives of everyday users. It was not until the 1980s that computers became widespread, thanks largely to the computer revolution. Also hailed as the digital revolution, it refers to the shift from the old mechanical and analog devices to the fast digital technology available today.
The revolution gathered pace in the late 1970s, building on the concept of time-sharing introduced in the 1960s. Earlier, a computer could execute only the single program it had been set up for; with time-sharing, this limitation was removed, and multiple users could interact with the Central Processing Unit (CPU) and run tasks seemingly simultaneously. The 1970s also saw the rise of video game consoles and the early stages of the internet: the ARPANET and Telnet networking technologies attracted wide attention even before being made available for public use. As the 1980s began, portable computing systems emerged in the form of early laptops the size of briefcases. These machines could perform many of the jobs of a desktop computer while being lighter and easier to carry.
As the 1980s ended, the use of computers was no longer limited to government or the military. More and more offices, in both the public and private sectors in the West, started using computers to replace manual work. In 1983, the ARPANET's switch to the TCP/IP protocol marked the birth of the modern internet, and by 1989 Tim Berners-Lee had invented the World Wide Web (WWW). This gave computers a new role and paved the way for far more than storing data and performing operations: computers became a medium for seeking information, communicating, and learning new things. As the internet's popularity grew over the following decades, computers reached more and more households. The digital revolution that took off in the 1980s continues to this day, with new inventions and discoveries being made every year. With the growth of smartphone-based computing and smartphones' ability to use the internet, another revolution started in the 2010s; companies are now focused on bringing everything to smaller, more portable screens. By some estimates, more than 70% of people worldwide have already been touched by computing technology in one way or another, and before the 21st century is over it is expected to reach virtually all of the Earth's population.
Aayush Pathak is pursuing a B.A. in Journalism. His interests include writing short stories and poems, watching cricket, gaming, and sitcoms. He is currently researching the latest trends in technology.