Unlike analog voltage or current signals, digital signals do not vary smoothly and continuously, but rather in steps or increments. Digital technology generates, stores and processes data as a string of binary digits, or “bits”--ones and zeroes--and its role in society since the early 1990s has been compared to that of the railroad during the Industrial Revolution.
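The stepwise nature of digital signals can be illustrated with a short sketch (the sine-wave source, sample count, and 3-bit resolution here are illustrative choices, not drawn from the text): a smooth "analog" waveform is sampled, each sample is quantized to one of a small number of discrete levels, and each level is encoded as a string of bits.

```python
# Illustrative sketch: quantizing a smooth "analog" signal into
# discrete digital steps, then encoding each step as binary digits.
# The wave, sample rate, and 3-bit depth are hypothetical examples.
import math

def quantize(value, levels=8):
    """Map a sample in [0.0, 1.0) to one of `levels` discrete steps."""
    return min(int(value * levels), levels - 1)

# Sample a smooth sine wave at ten points, scaled into [0, 1).
samples = [(math.sin(2 * math.pi * t / 10) + 1) / 2 for t in range(10)]

# Digitize: the continuous values collapse into stepped integer levels.
digital = [quantize(s) for s in samples]

# Encode each level as a 3-bit string of ones and zeroes.
bits = [format(d, "03b") for d in digital]

print(digital)
print(bits)
```

Where the analog waveform passes smoothly through every intermediate value, the digitized version jumps between a handful of fixed levels; those levels, not the original voltages, are what a digital system stores and processes.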
The history of modern digital technology can be traced back to Michael Faraday's work in the early nineteenth century on semiconductors--the materials of choice for computer chips. It was not until 1981, however, that IBM sowed the seeds for a revolution in personal computing with its model 5150 Personal Computer. By 2000, the semiconductor industry worldwide was worth $200 billion.
There has been something of a “digital revolution” since the late twentieth century. Nowadays, digital technology can be found not only in computers, but in a wide range of consumer electronic devices, including cell phones, digital cameras, camcorders and MP3 players.
Digital Data Sources
Digital technology also allows books and manuscripts to be scanned or transcribed into digital form and accessed and searched online or via an electronic device.