When you go computer shopping nowadays, you're inundated with specs like processor speed, RAM and storage capacity – and if you're smartphone shopping, it seems like every manufacturer has some sort of jargon-heavy screen gimmick, like a Quad HD Super AMOLED display. But let's not neglect the monitors on our desktops and laptops. Though they don't often get the same amount of screen time as handheld devices, a whole spectrum of monitor types has been serving as our windows into the digital world for more than 40 years.
In the Beginning
While some early home computers, like the Sinclair Spectrum and those from Commodore, used televisions as their visual displays, the Apple II and IBM-compatible personal computers made the monitor a standard home computing accessory in the 1970s and 1980s.
This early type of monitor, like old-school tube TVs, used a cathode ray tube (CRT) to generate an image. A CRT fires beams of electrons down a vacuum tube toward a phosphor-coated glass screen, which glows where the electrons strike it (hence the terms "tube TV" and "watching the tube"). In those days, computer monitors were monochrome, typically showing green, orange or white text on a black background, and their dot-matrix-style output was limited to text or crude imagery. It wasn't until the late 1980s that CRT monitors capable of displaying full-color pictures at resolutions up to 1,024 by 768 pixels arrived.
The LCD Era
If you're looking at a desktop or laptop monitor right now, chances are you're looking at an LCD (liquid-crystal display) screen. Sporting a much thinner profile and a flat screen compared to the slight curve of CRTs, LCD monitors use a grid of cells filled with liquid-crystal molecules that respond to an electrical charge, blocking or passing light to form an image.
Though LCD technology traces its history all the way back to the watches and calculators of the 1970s, it didn't gain mainstream adoption as computer monitor tech until the early 2000s, when consumers began to opt for the low-power profile and space-saving designs of LCD over CRT en masse. The rise of LCD also gave rise to high-definition (HD) displays, defined as a display of 1,280 by 720 pixels. Full HD monitors reach 1,920 by 1,080 pixels, and 4K displays feature either 3,840 by 2,160 or 4,096 by 2,160 pixels. And, yes, 8K is a thing – these monitors display at 7,680 by 4,320 pixels.
Advances in LCD tech have seen the arrival of LED (light-emitting diode) monitors. These are LCD displays that use LED backlights rather than fluorescent lights, which makes LED monitors more energy-efficient, slimmer and more likely to have a higher refresh rate – the speed at which the image on screen updates itself – than older LCD screens.
Touchscreen-equipped displays on desktop, laptop and convertible computers became common after the proliferation of smartphones and tablets, though touch technology itself has been around since the mid-1960s.
Some modern manufacturers offer curved PC monitors, which aim to heighten immersion, reduce eyestrain and increase the user's field of view. This has made them especially attractive to gamers, who also often opt for monitors with high refresh rates – the more often the screen redraws each second, the less delay between a button press and the corresponding on-screen action.
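The link between refresh rate and on-screen delay is simple arithmetic: a display refreshing at N hertz draws a new frame every 1000/N milliseconds. A brief sketch using refresh rates commonly marketed for gaming monitors (the specific rates are illustrative, not from the article):

```python
# Frame interval in milliseconds = 1000 / refresh rate in hertz.
for hz in (60, 120, 144, 240):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
```

Doubling the refresh rate halves the frame interval, which is why a 240 Hz panel can feel noticeably snappier than a standard 60 Hz one.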
Other monitor options include IPS (in-plane switching) LCD monitors, which offer better viewing angles and color reproduction compared to common LCD monitors (which use TN, or twisted nematic, panel technology). A VA (vertical alignment) LCD monitor offers a middle ground between the two, both in terms of price and color reproduction.