Computers have been around a lot longer than many people might imagine. The word "computer" has changed meaning over decades, but the electronic computer that we think of in modern times developed throughout the second half of the 20th century. Its popularity as a household item surged in the 1980s following the arrival of operating systems by Apple and Microsoft that mixed graphics and text, replacing the text-only systems of the 1970s. By the 1990s, computers incorporated enhanced communication and multimedia applications and became an indispensable part of daily life for millions of people.
The original definition of the word "computer" was a person who made calculations. This usage dates back to the 1600s and persisted into the mid-20th century, when the term "computer" began to refer to a machine. The computer is based on the same concept as the abacus, which goes back many centuries. Technology made a giant leap with punched cards, introduced by Joseph-Marie Jacquard in 1801 to automate weaving looms. Interestingly, an early use of this system involved music: perforated piano rolls assigned actions to notes, leading to the "player piano" in the 1870s. In the 1830s, Charles Babbage designed a steam-powered, punched-card-driven machine he called the "analytical engine."
Mechanical Information Processing
The company IBM grew out of the invention of the tabulator, crafted by Herman Hollerith in the late 1880s. This was the first use of punched cards to represent data, as opposed to punched cards automating a mechanical function, as in a player piano. The information-processing world through the 1950s was based on a combination of punched cards, tabulators and keypunch machines. Electromechanical calculators appeared in the 1930s, and analog machines began to be replaced by digital designs based on zeroes and ones during the World War II era. The first commercially produced computer was UNIVAC, made by Remington Rand in 1951. IBM introduced its own mainframe computer the following year.
Early Remington Rand computers sold for over a million dollars per machine, but IBM made smaller, more affordable machines that became popular. Beginning in 1954, IBM developed Fortran, one of the first computer programming languages, oriented heavily toward mathematics. During the same decade, the development of the transistor, the integrated circuit and microprogramming paved the way for reducing computer size, while faster CPUs increased processing speed and improved memory expanded data storage. Microprocessors, introduced by Texas Instruments and Intel in the early 1970s, made miniaturized yet more powerful computers possible.
Rise of the PC
Until the 1970s, computers were mainly used by businesses, government agencies and universities. Personal computers first appeared on the market in the late 1970s. Apple introduced the Apple I in 1976 and the Apple II the following year, ushering in the era of home computing. From this point on, the software industry began to develop, with Microsoft and Apple as the primary companies. Microsoft became a software giant by supplying its DOS operating system for IBM PCs beginning in 1981. Apple introduced the Macintosh in 1984, popularizing the graphical interface that mixed graphics and text and replaced displays that showed only text. Ever since, Apple has branded its computers "Mac" to differentiate them from the rest of the PC market.
During the 1990s, the computer surged in popularity and became a common household item. Microsoft's Windows 95 operating system accelerated the mass adoption of computers, while the growth of the World Wide Web throughout the decade also helped attract interest in them. Soon, nearly every profession needed software to improve its product or service. By the first decade of the 2000s, Microsoft had introduced the XP and Vista operating systems, while Apple offered the OS X series through Leopard. These developments, along with other popular software applications, meant that the average person now had access to robust multimedia tools.
Personal computing became truly portable in the late 1990s and 2000s with the development of advanced PDAs, the touchscreen smartphone and tablet PCs. Apple changed the game with the launch of the iPhone in June 2007, but other manufacturers, including Samsung and Nokia, soon developed their own touchscreen smartphones and mobile devices. This new generation of devices took advantage of several technological breakthroughs (processor miniaturization, flash memory, high-speed Wi-Fi and 3G mobile data networks) to put the power of the personal computer in the purse and pocket.