Media Notes / Display Technology
A display is an electronic device that presents the output of another electronic device in a form humans can naturally perceive, the most common example being the television.

History of Television and Early Displays

Ways of converting images into electrical signals that could be viewed back got their start in the late 19th century. One early attempt at a display technology was mechanical television, though the experience left a lot to be desired due to the poor image quality and the loudness of the mechanical bits, among other things. The first purely electronic display technology, the cathode ray tube (CRT), made headway in the 1920s and went on to be the primary display technology until the 1990s.

As television was the primary use for displays, ways of capturing and broadcasting the image kept evolving. Eventually this settled into two primary formats: regions using 60 Hz AC electricity (North America and some parts of Asia) adopted a 525-line standard with a 30 Hz image refresh rate, while the rest of the world eventually settled on a 625-line standard with a 25 Hz image refresh rate. However, to save on bandwidth while satisfying the CRT's need for a reasonably high refresh rate to avoid flickering, each transmitted field contains only every other line, with successive fields alternating between the odd and even lines to fill in the entire screen; this is called interlacing. Conversion between the two was straightforward: 60 Hz broadcasts repeat every fifth frame of 50 Hz material, while 50 Hz broadcasts drop every sixth frame of 60 Hz material.note 
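
The field-splitting described above can be sketched in a few lines (a toy model, ignoring broadcast timing details):

```python
# Toy model of interlacing: each transmitted field carries only the odd
# or only the even lines, and two consecutive fields interleave into one
# full frame.
frame = ["line%d" % n for n in range(6)]  # a tiny 6-line "image"
odd_field = frame[0::2]                   # lines 0, 2, 4
even_field = frame[1::2]                  # lines 1, 3, 5

rebuilt = [None] * len(frame)
rebuilt[0::2] = odd_field                 # first field fills the odd set
rebuilt[1::2] = even_field                # second field fills in the rest
assert rebuilt == frame                   # two half-rate fields make a full frame
```

Each field needs only half the bandwidth of a full frame, yet the screen is still refreshed at the full field rate.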

In the television boom years that followed World War II, several different color TV standards were developed and promoted. For a few years, the U.S. saw a battle between CBS's "field-sequential" color system, which was not compatible with monochrome television broadcasts, and a competing standard promoted by RCA (then the parent company of NBC), which sent the same video signal with an embedded color signal to remain compatible with monochrome TVs, of which RCA was a major manufacturer. RCA prevailed in 1953 when the National Television System Committee (NTSC) endorsed its color TV standard, which was later called NTSC. In Europe the situation was even more complex, with two competing standards being adopted in different countries — Germany promoted the Telefunken-developed Phase Alternating Line system (PAL), while France promoted the "Séquentiel couleur à mémoire" (SECAM) standard created by Thomson, which was later also adopted by the Soviet Union.note  note 

Quality-wise, NTSC is often derided as inferior, even jokingly called Never The Same Color. This stems from a quirk of the NTSC standard, which assumed an absolute phase for the color signal note ; due to the imperfect nature of the universe, the phase could sometimes be misaligned. All NTSC color televisions include tint and hue controls to correct this, but the correction has to be done manually. However, as electronics technology improved on both the television and the broadcasting side, the need for manual color correction was virtually eliminated.

Because of the issues with NTSC, Telefunken designed PAL to invert the phase of the color signal on every other line, hence its name. While this resulted in lower color resolution, it eliminated the need for any sort of tint correction. SECAM used an entirely different method of encoding color signals, avoiding the need for a correct phase and eliminating dot crawl issues at the expense of color resolution.

Another quirk of NTSC's move to color was that the transmission rate was slightly adjusted from 30 Hz to 29.97 Hz. There's a whole lot of math that went into choosing that, but the short of it is that because the bandwidth couldn't be changed, something else had to give to neatly fit the color signal into the existing black-and-white one. (This math also resulted in the NTSC color subcarrier having a frequency of approximately 3.579545 MHz. The popularity of NTSC color television sets led to a high demand for crystal oscillators tuned to this particular frequency or a convenient multiple thereof, which in turn led to many circuits having nothing to do with NTSC video output, including sound generators and digital alarm clocks, adopting this as their standard operating frequency.) PAL and SECAM by design didn't have this problem- in fact, Brazil and Laos were able to keep a 30 Hz-based broadcast by merit of using PAL and SECAM color respectively on top of System M, the transmission system used with NTSC color in the countries that adopted it. This video by Stand-up Maths explains this in more detail.
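
The headline numbers can be checked with a few lines of arithmetic (using the standard relationships the paragraph above alludes to):

```python
# The NTSC color adjustment: the 15750 Hz monochrome line rate was
# scaled by 1000/1001 so the color subcarrier (at 227.5x the line rate)
# would mesh cleanly with the existing signal.
line_rate = 15750 * 1000 / 1001    # ~15734.27 Hz
frame_rate = line_rate / 525       # 525 lines per frame
subcarrier = line_rate * 455 / 2   # 227.5 subcarrier cycles per line

print(round(frame_rate, 2))        # 29.97
print(round(subcarrier))           # 3579545, i.e. ~3.579545 MHz
```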

Computers Enter the Arena

When color-capable computers started entering the home market, a lot of them used TVs as monitors, though their video display circuits were normally configured for progressive scan rather than interlaced output. The same is true of video game consoles of the era. These systems made use of a "progressive scan" hack that doubled the frame rate while halving the vertical resolution, by forcing a full refresh after just one non-interlaced field is drawn. This is what gives many older video game consoles their distinct scanline effect. However, it tends to confuse modern displays- many modern TVs do not understand the signal at all, and those that do often reformat it for modern panels less than perfectly, resulting in a muddy image, noticeable delays, and/or shimmering. This has gotten worse as many TV manufacturers cut costs on legacy TV signal support (notice that portable LCD TVs from the period could actually reproduce signals from old consoles much more reliably and with less latency than their modern counterparts). The use of the hack died down early on computers, which moved on to dedicated displays, while consoles only stopped using it at the start of the 6th generation, when TVs were also starting to accept progressive scan input. The other devices of the era to benefit from progressive scan input on TVs were DVD players- while most DVDs are internally encoded as interlaced, the more expensive players advertised image enhancement capabilities that produced a better quality image on progressive scan-capable TV sets.

Another quirk of using a TV as a monitor for computers such as the Atari 8-bit line, the Apple ][, the Color Computer and the IBM 5150 PC was that they used composite video signaling, which induced color artifacting. In NTSC, placing pixels of certain colors next to each other would produce a different color. In the Apple II's case, this caused graphics and text intended to be purely white to be fringed with green and magenta. (Apple introduced a "color killer" circuit to suppress this effect, but it didn't work perfectly.) In the IBM 5150's case, this allowed the CGA graphics card, ordinarily capable of displaying only 4 colors simultaneously, to produce up to 16 colors, and a few games took advantage of this. note  This trick doesn't work with PAL; PAL instead blends the colors together, meaning a completely different technique has to be used, and in old computers this meant additional circuitry. You can learn more about this at the Other Wiki. And let's not get started with SECAM, where there is indeed no real exploit available, meaning an even more complex set of additional circuitry is needed for producing colors on SECAM TVs.

Game developers who targeted game consoles also took advantage of this, especially in the 4th and 5th generations. Famously, the Sega Genesis used this color blending quirk: while it couldn't do color blending in hardware to produce transparency like the SNES could, if the artist applied dithering to certain elements, running the console through composite would output a blended image, resulting in either a pleasing blend of colors or transparency. Here's a video about this by Displaced Gamers. This also means that in certain cases, remembering older games looking better isn't a case of Nostalgia Filter; they actually did look better, as the color blending produced a pleasing look rather than the harsh discontinuities of the raw dithering effect.
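
A toy model of why this works (a crude horizontal average standing in for the real composite low-pass filtering):

```python
# Composite video blurs horizontally-adjacent pixels together, so a
# 1-pixel-wide dither of two colors reads as roughly their average.
def composite_blend(row):
    # crude stand-in for the filter: average each pixel with its right neighbor
    return [tuple((a + b) // 2 for a, b in zip(row[i], row[min(i + 1, len(row) - 1)]))
            for i in range(len(row))]

red, blue = (255, 0, 0), (0, 0, 255)
dithered = [red, blue] * 4           # alternating-pixel dither pattern
print(composite_blend(dithered)[0])  # (127, 0, 127): roughly purple
```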

Another quirk of computers that produced color graphics was how they handled colors. Because of the limited amount of memory and processing power, only a small subset of the colors the hardware was capable of producing could be displayed at once. For example, the NES could only display 16 colors at a time, but had 64 colors to choose from. (This could be worked around on some systems by reassigning colors mid-frame, a trick many Amiga games used to produce background gradients.) On computers that offered multiple graphics modes, limits on both the size and the bandwidth of video memory typically meant the number of colors available per pixel was inversely proportional to the number of pixels on the screen. The highest-resolution graphics modes were therefore generally monochrome, and also not guaranteed to display correctly on TVs. Video games generally favored low-resolution modes, which allowed the most colors to be used at once.
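
The scheme described above boils down to a small lookup table (the values here are made up for illustration, not the NES's actual palette):

```python
# Indexed color: the framebuffer stores small palette indices instead of
# full RGB values, and a palette maps each index to a displayable color.
master_palette = [(i, i * 2 % 256, i * 4 % 256) for i in range(64)]  # 64 hypothetical candidates
active_palette = master_palette[:16]   # only 16 may be selected at once
framebuffer = [0, 3, 15, 3]            # each pixel is a 4-bit index
pixels = [active_palette[i] for i in framebuffer]
```

Storing a 4-bit index per pixel instead of 24 bits of RGB is what let these modes fit into a few kilobytes of video memory.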

A further technical complication is that many video game consoles and home computers were designed, for reasons having to do with cost efficiency and memory bus synchronization, to operate the CPU, display and sound circuits off divisions of a single master timing chain. Since this timing chain had to be able to generate TV-compatible sync frequencies in addition to appropriate clocks for all other components, this often caused CPU speeds to differ between NTSC and PAL versions of the same systems (the PAL versions usually ran slightly slower), leading to further software incompatibilities.
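
As an example of such a timing chain, the commonly documented NES clocks (figures from general references, not from the text above) all derive from one crystal:

```python
# One master crystal divided down for every component. The NTSC NES
# master clock is 6x the NTSC color subcarrier; the CPU runs at master/12.
NTSC_SUBCARRIER = 315e6 / 88   # exactly 3.579545... MHz
master = 6 * NTSC_SUBCARRIER   # ~21.477 MHz master clock
cpu = master / 12              # ~1.79 MHz CPU clock
print(round(cpu))              # 1789773
```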

Eventually, due to the need to read fine text and a push for resolutions higher than TV transmission standards could carry, computers moved to dedicated monitors rather than TVs.

The Transition from Digital Signals to Analog Signals and Back Again

Computers, specifically IBM PCs, were the first to adopt some form of digital display signaling. A huge driver for this was initially reliability and image quality compared to connecting a computer to a TV, although the advent of the liquid crystal display (LCD), a digital display, ahead of an effort to slim down the portable PCs of the era, was undoubtedly a secondary driving force, as until then portables basically had CRTs in them and were bulky burdens to carry around. The first digital signal standard for computers was RGBI, which used a primitive form of digital signaling called TTL (transistor-transistor logic). It was a very crude method and only allowed 16 colors including black, but early IBM PCs and their numerous clones, as well as some non-PC systems like the Commodore 128, used it. Early Apple Macintosh machines also used TTL, with an analog board converting the TTL signal into the signals the internal CRT expected. This also greatly simplified the graphics chipsets of PC portables that used LCD panels, as the signal was easy to deal with. Outside this realm, however, most other computers output analog signals, as they were designed to be used with a TV.

In the late 80s, the introduction of VGA (and on Apple's end, the Apple Display Port) marked the transition to analog. For reasons of cost-cutting and simplicity, it was decided that the output of the VGA port would be analog instead of digital. This made connecting to LCDs a little harder, as LCDs still expected a digital signal. To rectify this, a dedicated digital protocol for talking to LCD panels, LVDS, was invented. Early portables converted their digital output to analog, which was then sent to an LVDS converter circuit that converted the signal back to digital. Later portables had video chipsets that output natively in LVDS, greatly improving image quality, but external monitors from that period remained analog.

In the late 90s, computers were quickly hitting the limits of analog- it was found that there is a limit to how much information an analog link can carry. At the same time, digital TV systems were being developed, and television broadcasts were moving to digital transmission as a more efficient use of airwaves thanks to digital compression- televisions would soon be transitioning from CRTs to plasma and LCD panels, both digital display technologies. And to top it all off, the MPAA was calling for Copy Protection to be implemented at the hardware level for DRM, as computers started gaining the ability to play back (and rip) video, as well as record TV signals. This resulted in computers migrating back to digital outputs, and in the creation of DVI. DVI itself is a transitional medium: the digital part of the signal is a data stream, but it also provisions some pins for carrying analog signals for backward compatibility purposes. This is why it was possible to obtain passive DVI-to-VGA adapters. Later, when coming up with a cable to transmit the new HD digital signals for TV use, the computer world's DVI standard was used as the basis of HDMI. HDMI omitted the analog pins that were used for backwards compatibility, but added audio channels, audio return channels, control channels and even Ethernet communication. Today, the DisplayPort standard is used for computer monitors. DisplayPort carries over ideas from DVI and HDMI, but its data stream is packet-encapsulated (though dual-mode ports can still emit a DVI-compatible stream, letting a DVI or HDMI display be driven with a matching passive adapter), with added capabilities like dynamic refresh rates and higher resolution support. Digital signals have since negated any quality difference between regions; the only remaining difference is the 50 Hz/60 Hz refresh rates.

Today, color standards use up to 24 bits, or 8 bits per red, green, and blue channel, producing about 16 million colors. There's a push in modern displays to bring this up to 30 bits (10 bits per channel). Despite 16 million colors being touted as the limit of human perception, there are only 256 pure shades of each primary color, which can lead to color banding in certain conditions. Image formats can also store more than 24 bits of color, but this is mainly to ensure that errors accumulate less noticeably when editing the image.
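
The arithmetic behind those figures:

```python
# 24-bit "truecolor": 8 bits per channel.
shades_per_channel = 2 ** 8             # 256 shades of each primary
total_colors = shades_per_channel ** 3  # all RGB combinations
print(total_colors)                     # 16777216, the "16 million colors"

# A pure gray ramp only gets 256 distinct steps, which is where
# banding becomes visible on smooth gradients.
gray_ramp = [(v, v, v) for v in range(shades_per_channel)]
print(len(gray_ramp))                   # 256
```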

Video transmission types

If not being broadcast from the air, there are a plethora of video connectors to choose from to hook up a device, such as a game console, VCR, or DVD player, to a display like a TV. Note that, except for RF and in some digital signal formats, audio is handled on a separate cable.

The video transmission types, from worst quality to best, are:

  • Analog RF: light + color + audio in one signal over a standard television broadcast standard. Maximum resolution is typically 480i or 576i.note  In this case the video and audio are transmitted across VHF or UHF not unlike the signal from a broadcast tower, and come with all the same baggage, including being susceptible to signal interference.note  Not to mention, the main reason analog RF was used on consoles and early home computers is that it offered a convenient single-cable solution, and many TVs from before the 90s had no composite video input, only offering a Type F (or in Old World countries, a Belling-Lee) connector for an outdoor antenna. HDMI existed well before the mass digitization of over-the-air broadcasts.
    • For digital TV, it is possible to transmit 1080i. However, no one outside of TV broadcasting uses this method, largely because the entire digital TV platform is a huge patent-encumbered minefield that makes it impractical to implement on a small, personal scale. Digital TV is also much more resilient to interference, but still not completely immune to it.
  • Composite: light + color combined into one signal. Maximum resolution is typically 480i or 576i. By merit of not using the VHF or UHF spectrum, it is also far less susceptible to interference.note 
  • S-Video: light and color separated, using two signals. Maximum resolution is typically 480i or 576i.note 
  • Component: Color is further separated. Divided into 3-cable and 5-cable solutions. Maximum resolution can be up to 1080p 30Hz or 1080i 60Hz for 3-cable solutions (although the Japanese eventually developed a method to transmit analog 1080p60 over a 3-cable solution with the D-Terminal cable, the format was not adopted outside of Japan), and up to 1080p60 for 5-cable solutions, albeit with a strict cable length limitation.note 
    • The more common one for televisions uses two differential signals (red subtracted by light, blue subtracted by light) along with light and syncing. This is semi-backwards compatible with composite since the cable carrying light is decoded as a black and white image in a composite input as long as the format is 480i 60Hz or 576i 50Hz.
    • Another method is to separate the red, blue, and green color channels, with syncing on the green cable (usually called Sync-on-Green RGB). This is commonly used on European TV sets over the SCART cable system, and in Japan over the JP21 cable system.note  In this case, Red and Blue are not subtracted from light but contain whole frames in themselves. This system is not backwards compatible with black and white TVs without a premixer (although the TV can decode a partial picture if hooked up to only the green line, if the signal format is 480i at 60Hz or 576i at 50Hz)- this is why SCART cables carry a composite signal for black and white TV compatibility alongside component RGB.
    • For 5-cable solutions (which is usually used by computer displays of the 90s), this is broken up further into the red, green, blue, vertical and horizontal synchronization signals.
    • Red, green, blue and composite sync signals were normally present on the edge connectors of vintage Arcade Game boards, including those following the JAMMA standard.
  • Digital signals: Pixel information is either sent as a digital stream of values or as packets of data.
    • The dominant formats are HDMI and DisplayPort, the former primarily used on TVs while the latter is used on computer monitors. Another format in current use, though less commonly encountered, is SDI. Its primary use is in professional cameras and numerous other pieces of studio-grade equipment, although certain companies like Blackmagic Design and Avid Technology are pushing for prosumer-level adoption of the connection standard as well.
    • Historic and deprecated digital formats are RGBI (or similar) and DVI. The former was used in early computers, which controlled the red, green, and blue signals, as well as an "intensity", for a total of 16 colors, along with separate pins for vertical and horizontal synchronization. DVI was used from the very late 90s to the mid 2010s, only falling out of use due to DisplayPort taking over.
      • It should be noted that DVI is a strange case, being born in the era where computer monitors were transitioning back to digital from analog.note  As such, DVI is actually designed to carry both analog and digital signals. Although there are digital-only and analog-only cables, a generic off-the-shelf cable is usually designed to carry both. Many later graphics cards with DVI outputs also do not output analog signals on the DVI port.
      • It should also be noted that both HDMI and DisplayPort support DVI signaling at a hardware level, so it is possible to build DisplayPort-to-DVI and HDMI-to-DVI passive adapters if support for these old displays is needed. However, in the case of DisplayPort, support is optional and the GPU's video encoder has to explicitly support the feature- usually cards where the feature is present are marketed as DisplayPort++ compatible. Cards that do not support DisplayPort++ require an active DisplayPort-to-DVI adapter.
    • Note that maximum resolution and refresh rate are limited by the format's bandwidth. Typically, though, HDMI, DisplayPort, and SDI can go up to 2160p 120Hz.
    • For internal connectors, LVDS has, as of 2020, been displaced by DSI (Display Serial Interface) and eDisplayPort (embedded DisplayPort). The former is more commonly found in cellphones, tablets and on SoC board computers like the Raspberry Pi, while the latter is more commonly found in laptops.

For a list of display connectors, see the page on That Other Wiki.

Issues common to all displays

Screen burn-in/Image Persistence

When an image being displayed doesn't change significantly, the pixels may wear out faster than usual. A faint ghost of that image may appear if the display tries to show something else. This is called burn-in. You can commonly see this in digital signs (like the timetables at airports) or on video arcade cabinets. Only displays that emit light are susceptible to this permanent defect.

Displays that block or reflect light may have an issue called image persistence, where if the pixels are constantly showing the same image, they have a tendency to stick that way. Unlike burn-in, image persistence usually goes away on its own.

Dead/stuck pixels

Dead pixels are pixels which will not respond to any input and usually remain either black, white, or a single color. These are defective and cannot be repaired. Stuck pixels, on the other hand, are defective but may resolve themselves over time. A popular method for fixing a stuck pixel is to flash it several colors rapidly for a few minutes.

Input Lag

Input lag is the time it takes for the screen to show a response from an input, such as moving a cursor. Input lag is most relevant to games and applications where tight timing is important. However, significant input lag is unpleasant regardless.

Contrast Ratio

This is more of a problem with marketing than the display itself. Contrast ratio is the ratio between the brightness of the display's pure white and its pure black. The rating is taken under ideal conditions (i.e. a pitch black room). Displays that need another source of light to work (LCD, for instance) cannot completely block light, so the manufacturer may fudge the environment to get the ideal number. The worst case is the term "dynamic contrast" on LCD-type displays, which is marketing fluff: the display adjusts its backlight and amplifies the color input, giving the illusion of a higher contrast.

Note that some display types, like OLED, are advertised as having infinite contrast ratio. For all practical purposes, this is true because the lighting element can be completely turned off.
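
The measurement itself is simple division (illustrative numbers, not real measurements of any panel):

```python
# Contrast ratio: luminance of full white over luminance of full black.
def contrast_ratio(white_nits, black_nits):
    return white_nits / black_nits

# e.g. an LCD whose backlight leaks 0.3 nits through "black" pixels:
print(round(contrast_ratio(300, 0.3)))  # 1000, advertised as "1000:1"
```

This is also why OLED's contrast is quoted as infinite: with the black level at exactly zero, the ratio is unbounded.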

Motion Blur

Displays can exhibit an issue with motion blur at even relatively high refresh rates. This is due to displays using the "sample and hold" technique where the image is displayed on the screen and held there until the next refresh cycle. Because of the sudden jump between frames, our eyes try to blend the images together which results in the motion blur effect. Ways to combat this are strobing the backlight really fast or inserting black frames.

CRTs, plasma displays, and other impulse-type displays, whose pixels light up only briefly on each refresh rather than holding the image, don't exhibit this problem.

Image scaling on displays with discrete pixels

Displays with discrete pixels, i.e. a defined, fixed number of pixels, have a problem showing images cleanly if the image is not at the resolution the display was meant for, the display's native resolution. If an image that isn't at native resolution is given to the display, the display scales it to fit the screen. This usually happens even if the native resolution is cleanly divisible by the image's resolution, such as when showing 1080p on a 4K display. Scaling often produces a blurrier image and may also introduce a non-trivial amount of input lag.
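
Integer ("nearest-neighbor") scaling is the one case that can stay sharp, since each source pixel maps to a whole block of display pixels; a toy sketch:

```python
# Nearest-neighbor integer upscale: every source pixel becomes an n x n
# block, so a clean multiple like 1080p -> 4K (2x) needs no blurring.
def upscale(image, n):
    return [[px for px in row for _ in range(n)]
            for row in image for _ in range(n)]

img = [[1, 2],
       [3, 4]]
print(upscale(img, 2))  # [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Many displays nonetheless apply an interpolating scaler even at clean multiples, which is the source of the blur complained about above.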

Display Technologies


Obsoletednote  and abandoned display technologies

    Cathode Ray Tube (CRT) 
Technology
  • CRTs shoot an electron beam at phosphors coated on a screen at a rate of 50 full images (or more) per second, topping out at 160 or 180 Hz for high-end VGA monitors. When an electron beam hits a phosphor, it glows for a moment then fades, typically in microseconds. Because of the way our eyes work, what we see is a whole image rather than a stream of light racing across the screen. This is also the only display type primarily used for analog signals.
  • To produce color, red, green, and blue phosphors are laid out on the screen, typically requiring an electron gun for each. In order to prevent stray electrons from hitting the wrong phosphor, a metallic "mask" is typically used. Later in the CRT's life, a method known as Beam Indexing came about, which used two different phosphors to send feedback to a light detector. This feedback meant that the CRT could use one electron gun, using the feedback to switch over to a different color. But since the quality wasn't as good as masked CRTs, its applications were limited.
  • CRTs do not have discrete pixels. While the masking pattern of a CRT suggests there may be pixels, they're arranged in a way that either doesn't make sense as pixels (such as in slot masks), can be interpreted in different ways (such as a traditional shadow mask), or the phosphors are just lines running from top to bottom (such as with the aperture grille in Sony's Trinitron displays). Plus, the electron beam scanning across the mask varies continuously, which may cause parts of a phosphor to glow differently than others.
  • Devices that used CRTs either filled the entire screen with an image or drew a line as a vector, often seen in analog oscilloscopes.
  • A common effect when filming a CRT display is black bands that slowly scroll down the screen. This is because the camera's frame rate (usually 24 FPS) only captures part of what the CRT is drawing at any instant. Human eyes don't see these bands because of persistence of vision, which lets us retain the image of something for a brief period of time.
  • There were some applications that took advantage of some of the quirks with CRTs
    • Light Gun Games from the mid 80s to about the mid 2000s figured out where the player was aiming the gun by taking the difference between the start of a frame and the moment the light gun detected light. A noticeable side effect was that the screen had to flash white, or at least a bright color. Because LCDs dominated the market later, with an always-on backlight on top of changing the image all at once, this technique no longer worked. Note that NES games tended to use a method that blacked the screen for one frame, then showed a white box around each target in the following frames, noting which box the gun "saw" if it detected light. However, to avoid players cheating by shooting a light bulb, the NES Zapper was tuned so that it could only see light coming from a CRT.
    • The brief electrical persistence of points drawn on a CRT is also what allowed modified CRTs known as Williams tubes to be used as a a primitive type of dynamic memory by several first-generation mainframe computers in the early 1950s.
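
The timing trick above can be sketched as pure arithmetic (made-up round-number timings, not any particular console's real values):

```python
# CRT light gun idea: the beam's position is a function of the time
# elapsed since the frame began, so the instant the gun's photodiode
# sees the beam pass reveals where on screen it was pointed.
LINES, PIXELS = 240, 320
LINE_TIME_US = 63.5                 # time to draw one scanline, in microseconds

def beam_position(us_since_vsync):
    line = int(us_since_vsync // LINE_TIME_US)
    x = int((us_since_vsync % LINE_TIME_US) / LINE_TIME_US * PIXELS)
    return (x, line)

print(beam_position(63.5 * 100 + 31.75))  # (160, 100): mid-line on line 100
```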
Cons
  • Most famously, CRT screens are subject to burn-in.
  • CRT displays are bulky, heavy, power-hungry, and they run hot, especially the tube itself. This stems from the fact that a CRT needs a very high voltage, anywhere from thousands to tens of thousands of volts, in order to accelerate the beams from its electron guns. This causes some issues of its own:
    • CRTs are hazardous to work with or around, as the voltage is generated via a bank of capacitors. Without a bleeder resistor, it can take a long time for the capacitors to discharge. The high voltage from the supporting components carries enough punch to knock a careless engineer across the factory floor and can cause cardiac arrest in those already suffering heart or respiratory problems. They're definitely not to be reckoned with if you don't know what you're doing (and sometimes, even if you do - a second's worth of distraction is all it takes to make that tiny mistake and be jolted with thousands of volts of electricity). We think it's worth repeating - do not leave a CRT display running without its back cover screwed on, heck, do not remove the back cover of a CRT TV if you don't have a reason to, and leave the repairs to the professionals. And even then, keep a safe distance from engineers who're repairing a CRT display.
    • The high voltage going through the coils in the flyback transformer more often than not results in a distinctive high-pitched whine. While effort was made to reduce the loudness of the whine over the years, even by the end of the CRT era the problem had not been completely eliminated.
  • Contrast isn't as good, as maximum brightness is limited.
    • This limitation is meant to protect the tube. Running a tube on very high brightness is possible, as seen in CRT-based projectors, but the lifespan of the tube would in turn be relatively short. The tube would also become very hot, necessitating a noisy active cooling system. CRT displays for consumer use were designed to only need passive cooling, and part of the trade-off needed to achieve this requirement is to limit the power of the electron gun, and thus the brightness and contrast levels.
  • Subject to flickering if the refresh rate is too low. Interlacing makes this worse, as it effectively halves the refresh rate to double the vertical resolution.
    • This is why most people who claim to get headaches or eye problems from CRTs have them, especially if the room is lit with fluorescent lights, which have a 120 Hz flicker (in the United States) from the AC power. The CRT's 60 Hz refresh rate could interact with the lights' flickering in a way that exaggerated the flicker of either. Setting the refresh rate to 72 Hz or more made the display clear and stable. It's even worse in Europe, Asia and the Pacific, where the AC cycle is 50 Hz, making the flickering more pronounced.
  • Image geometry can be distorted as the phosphor screen is curved (even on so called "flat" displays), often in a way that cannot be corrected by the On-Screen Display (OSD) menu.
  • Color convergence can be thrown off, resulting in color fringing.
  • Since CRTs contain what is essentially a linear particle accelerator, the screen can generate a non-trivial amount of X-rays, especially in color sets. CRT glass is leaded to protect against this, but the lead makes disposal a problem and adds to the already-unwieldy bulk of large tubes. Also, the lead protection cannot eliminate the emission completely, only reduce it to minuscule amounts.
  • Holding a strong magnet against the screen can permanently damage it by twisting or distorting the shadow mask (a piece of metal that keeps the colors separated) behind the screen. Light magnetic distortions are normal due to the Earth's own magnetic field, and are fixable via degaussing (most CRTs have a built-in degaussing coil, and will automatically use it when powered up). However, if a TV is sent across long distances, chances are it will face gaussing problems too great to be fixed by the built-in coil and will need to be fixed by a trained engineer with a degaussing wand.
    • Old monochrome CRTs don't use a shadow mask, making it safer (though still not a good idea) to use strong magnets on their faces.

YMMV

  • While CRTs don't have native resolutions, they do have maximum resolutions, though these are artificially imposed to protect the tube. Going to higher resolutions than recommended may cause the image to shimmer and appear blurry, makes the tube run hotter, and in the long run shortens the life of the tube. This is why smaller CRTs have a lower resolution limit than bigger ones (the last CRT displays on the market could handle a maximum of 1024x768 at 15 inches, while tubes that could handle a full 1920x1080 were at least 25 inches in size. Comparatively, a typical modern LCD or OLED panel can handle 2560x1440 at a little over 5 inches). The use of a moving beam also means there is a limit to how fast the screen can refresh, with higher resolutions negatively impacting the refresh rate. Old computers and video game consoles designed for use with NTSC and/or PAL TV sets, or dedicated CRT monitors with similar operating frequencies, tended to have fairly strict upper limits on vertical resolution, partly because of video memory constraints but also because the standards specify a total number of lines in each frame and at least some of those lines must be reserved for vertical blanking periods.
  • It took CRTs many years to get to the point where they had great color quality with true black. Early CRTs had a silver emulsion on the glass surface, so black looked silver on those sets under normal lighting conditions; the room had to be darkened to a certain degree, or a sheet of tinted glass placed in front of the screen, to get good black levels. Tinted-glass CRTs were experimented with in the late 50s and went into production in the 60s, but were considered too expensive to put into consumer television sets and thus were only found on computer equipment. The price of tinted CRTs only fell low enough in the early 80s, which saw the first television sets with tinted CRTs roll off the assembly line (and even then the glass is not tinted as heavily as on CRTs made for computers: a TV's CRT is still tinted lightly enough that it can be inspected with an inspection microscope without being switched on, while a CRT made for computers can't). The 80s also saw a switch from the silver emulsion to a dark grey one, further improving black levels. Color calibration then came in the late 80s, spearheaded by Apple, who advertised their machines as the perfect desktop publishing computers.
  • The dot pitch of the mask or grille also plays a part in the clarity of the image on screen, especially at high resolutions. The finer the dot pitch, the better as it means the image is sharper.
  • The blurriness can be seen advantageously as an antialiasing effect. Combined with the scanlines and phosphor pattern, it can give the illusion of looking at a higher-resolution image through a fine fabric or grille of sorts as your brain fills in the gaps. And because the image is continuous horizontally and discrete vertically, more sharpness is preserved compared to the typical bilinear filter, which blurs the image in both dimensions. Retro gamers hence often claim old pixel art looks better on CRTs.
  • Before the advent of the intelligent self-degaussing circuit, CRTs had to be specially degaussed in a “gravity room” or “magnetic booth” at the TV factory, where the magnetic characteristics of the booth could be adjusted based on which area the TV would be shipped to, and the TV then degaussed to meet that area's requirements. This is why older CRT-based TVs start having magnetization problems and have to be degaussed again when they are shipped from one region to another. Thankfully, modern CRTs no longer need this, and most sets from the late 90s on will automatically do a degauss pass on power-up and let the user trigger another degauss when needed.
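The vertical-resolution limit mentioned above comes straight from the line-count arithmetic of the analog standards. A rough sketch (the blanking-line counts below are typical ballpark figures, not exact values quoted from the standards):

```python
# Illustrative arithmetic: why analog-era hardware capped vertical resolution.
# The standards fix the total line count per frame, and some of those lines
# are lost to the vertical blanking interval (VBI).

def scan_stats(total_lines, frame_rate, vbi_lines):
    """Return (horizontal scan rate in Hz, visible lines per frame)."""
    return total_lines * frame_rate, total_lines - vbi_lines

# NTSC: 525 lines at ~29.97 Hz, roughly 45 lines of blanking -> ~480 visible
ntsc_hfreq, ntsc_visible = scan_stats(525, 30000 / 1001, 45)

# PAL/SECAM: 625 lines at 25 Hz, roughly 49 lines of blanking -> ~576 visible
pal_hfreq, pal_visible = scan_stats(625, 25, 49)

print(round(ntsc_hfreq), ntsc_visible)  # ~15734 lines/second, 480 visible
print(round(pal_hfreq), pal_visible)    # ~15625 lines/second, 576 visible
```

Both systems end up with nearly the same horizontal scan rate (around 15.7 kHz), which is why consoles could often support both regions with minor timing tweaks.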

    Plasma Display 
Technology
  • Made of individual plasma cells. Like a CRT, the screen is coated with phosphors that glow, but instead of being excited by an electron beam, they are excited by the plasma generated in each cell.
  • Based on the same principle as the neon light; early plasma screens were monochrome and glowed with the bright red-orange color of neon.
Pros
  • Usually very bright with the highest contrast available.
  • Excellent color quality.
  • Flickering is practically non-existent, as the panels refresh at very high rates, up to 600 Hz in some models.
  • A relatively thin profile, comparable to most LCDs of the same size at the time.
  • Very good black levels (effectively zero); the Pioneer Kuro line even has its main selling point right in the name!
Cons
  • Uses discrete pixels.
  • While marketers would advertise sets as having a refresh rate of 600 Hz, this only applies to the refresh rate of the plasma cells themselves. The input signal most of the time was limited to 50/60 Hz.
  • The front panel is glass (subject to breakage) which makes plasma displays deceptively heavy.
  • Plasma displays run hot and consume lots of energy, though later models approached the power consumption of early LCD sets.
  • Subject to image burn-in, though modern displays aren't as bad as early ones.
  • As of 2014, pixel density appears limited. One needs to buy a 50" TV before reaching 1080p.

    Vacuum Fluorescent Display (VFD) 
Technology
  • Similar to plasma, but built on vacuum tube technology rather than the neon light. Phosphors painted in patterns on the back of the tube are excited by electrons emitted from a filament just behind the screen. A set of grids between the back and the filaments allow control over various areas of the screen; this allows for LCD/LED-style multiplexing.
  • Usually monochrome, specifically a pleasing turquoise color (though Sony has sometimes used a more whitish phosphor on their VFDs).
    • However, since it's based on phosphor technology, it is possible to have a VFD with multiple, albeit segmented, colors, as seen in various electronics from the 80s (e.g., the spectrum analyzer on some stereos, or the record icon on some VCRs). Read: a VFD can show multiple colors, but the color of a segment cannot be changed once it's manufactured.
Pros
  • Very bright and readable, even in bright light.
  • Will run well in cold temperatures, making them popular for automotive use (digital dashboards, radios, and such).
Cons
  • Uses more power than LCD or LED-matrix displays, but not quite as much as plasma.
  • Limited range of colors available; almost all VFDs use the same few shades of turquoise, white, red and orange. Blue VFDs are usually achieved by passing white through a color filter; true blue VFDs are rare and expensive.
  • Not as thin as LCD or LED-matrix displays.
  • Like all vacuum tubes, they're fragile and the filaments have a limited life. However, since the filaments don't need to be run hot, their lifetimes can be very long (decades).
  • Due to limitations of the manufacturing process of VFDs, it's not possible to have a full color dot matrix VFD, let alone one capable of going beyond low-resolution graphics.

    Proto-plasma — Nixie and Panaplex tubes 
Technology
  • They work just like plasma screens, but instead of individual pixels, fully-formed numbers (Nixie) or seven-segmented digits (Panaplex) are used.
  • Very common in older digital equipment from the 1960s and early 1970s.
Pros
  • Largely the same as modern plasma screens; these were the brightest small displays around until VFDs and LED-matrix screens became popular in the late 1970s.
Cons
  • Require high voltages to work, making them impractical for battery-powered equipment (though some people have done it anyway).

    Interferometric Modulation (IMOD) Display 
Technology
  • Similar to DLP, it uses a microelectromechanical-system (MEMS) chip to reflect light. Unlike DLP, it works just fine in ambient light and doesn't require a color filter, as each MEMS element is designed to reflect a specific wavelength of light (which is where that Techno Babble name comes from). While it was used in a smart watch (Qualcomm's Toq), the technology's demise appears to have been due to Executive Meddling: the company and patents were purchased wholesale by Apple, who proceeded to sit on the technology, killing it.

Pros

  • Very good visibility in bright light.
  • Can retain state without power, or at least using very little power.
  • Small enough that the final major manufacturer of this technology used it for a smart watch.
  • Decent refresh rate; at the time the technology was abandoned, the displays had a 60 Hz refresh rate.
Cons
  • Poor color reproduction; most press coverage reported that the colors on all the displays looked washed out.
  • Extremely power hungry in constant-motion applications (e.g., video, or gaming with lots of onscreen action) when compared to alternative technologies like LCD panels and OLED displays.
  • Useless without a light source, and due to design, cannot be backlit, meaning low-light applications are constrained to front-edge lighting which has poor brightness uniformity.

     Other display types 
Field Emission Display & Surface-conduction Electron-emitter Display
  • Field Emission Display (FED) and Surface-conduction Electron-emitter Display (SED) both use a miniaturization of the CRT: instead of one or three electron beams hitting a phosphor coating, have every sub-pixel contain its own electron beam.
  • Development happened during the 2000s, but by 2009-2010, the writing was on the wall that LCDs were the best bang-for-the-buck, plasma was still viable, and the then-emerging OLED was looking promising. The companies that were working on FED and SED announced they were dropping further development.

Scanning LED

  • A scanning LED display consists of a single row of LEDs that changes rapidly, while either the row itself moves rapidly or something else, such as a mirror or a moving object, sweeps the "image" the LEDs are displaying across space.
  • A famous example is the Virtual Boy, which uses a single row of LEDs and a mirror that moves back and forth to provide the illusion of an image.

Current display technologies

    Liquid Crystal Display (LCD) 
Technology
  • The display has a material called liquid crystal, which twists light into a certain direction depending on the voltage applied to it. By sandwiching the liquid crystal element with polarizers to only allow light of a certain direction to pass, liquid crystal can either block light or let it through.
    • Reflective LCDs: ambient light is reflected back to the user while the liquid crystal blocks or allows some of that light back. Used in very low power consuming devices but has the problem of requiring a good light source to see well. Front lights can either be embedded or bought separately. The Game Boy and Game Boy Advance used reflective LCDs.
    • Backlit LCDs: an always-on backlight shines behind the liquid crystal. While lower power consuming than other display types, it works poorly in bright light. This is the most common type of LCD you'll find.
    • Transflective LCDs allow the display to be readable in direct sunlight like a reflective display while also having a backlight, but compromise on image quality.
  • Depending on how the polarizers are arranged, "on" can either be black or white. The most common configuration makes black the "on" state, which means that showing mostly black content actually consumes slightly more energy, the opposite of what one might expect.
  • Backlit LCDs used to be lit by cold-cathode fluorescent lamps; nowadays, LEDs provide the backlight. These come in two arrangements:
    • Edge lit displays have the LEDs on one or two edges and the light is as uniformly distributed as possible across the display. Certain sections of the edge can be turned on or off for cheap zone lighting, but this often results in a distracting column of light when viewing certain scenes.
    • Array-based displays have the LEDs behind the panel itself in a uniform array. Variations of this exist, from arrays that simply light the screen uniformly so there's less backlight unevenness, to arrays that can locally dim sections of the LCD for higher contrast. Thanks to shrinking LED technology, the number of local dimming zones has gone up to around 1,000 for higher-end TVs and monitors, with claims of up to 10,000.

Pros

  • Currently the cheapest, most versatile display technology available.
  • Thin and light profile. Suitable for portable displays.
  • On average, consumes the least amount of energy.
  • Excellent refresh rate and still improving. Active 3D is possible on 120Hz or higher panels. The fastest panel as of 2023 has a 500Hz native refresh rate.
  • Using a specialized polarizer, it's possible to have Passive 3Dnote  or even Autostereoscopic 3D note .

General Cons

  • Uses discrete pixels.
  • Pixel response time is higher than most other technologies. Average response time is 8ms-16ms using the ISO standard test of full black to full white, with 2ms-6ms from gray-to-gray. Longer response times cause a smearing effect known as ghosting.
    • This is worse with passive matrix displays. However, encountering a passive matrix display in a modern electronic device is rare and mostly relegated to really low power devices or devices that only need simple outputs like a pocket calculator.
  • Some LCD panel types have narrow viewing angles which cause color shifting if not viewed head on. Though if the screen is too large, color shifting can still be perceived regardless of viewing angle.
  • Old or cheaply made LCD panels tend to have a problem with image persistence.
  • All LCDs use a light polarizer in order to guide the light in a way that makes it easy for the LCD to block light. This presents two problems:
    • If a linear polarizer is used, this will cause the screen to look blank with polarized sunglasses unless the display or sunglasses are rotated to a certain angle. Though some people have taken advantage of this by removing one of the polarizers and making glasses out of them as a privacy filter.
    • It may become loose or degrade over time. On reflective LCDs this results in brown spots where nothing can be displayed; on backlit LCDs the screen looks normal when unpowered, but shows white spots with nothing appearing inside them when powered on. This is known in hobbyist circles as "screen burn", as it tends to happen to LCD devices that have been stored in high-humidity, high-temperature areas (for example, a non-air-conditioned store room in a house in an equatorial climate). In extreme cases, the degradation may present itself as bubbling and/or cracking on the screen.
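The ghosting complaint above is easy to quantify: when the pixel response time approaches the frame time, a pixel is still mid-transition when the next frame arrives. A rough, illustrative comparison (the pairings below are examples, not measurements of any specific panel):

```python
# Illustrative arithmetic: how much of a frame a pixel spends transitioning,
# for a few (assumed) response-time / refresh-rate combinations.

def frame_time_ms(refresh_hz):
    """Duration of one frame in milliseconds."""
    return 1000 / refresh_hz

for hz, response_ms in [(60, 16), (60, 4), (144, 4)]:
    share = response_ms / frame_time_ms(hz)
    print(f"{hz} Hz with {response_ms} ms response: "
          f"transition spans {share:.0%} of a frame")
```

A 16 ms black-to-white transition at 60 Hz eats nearly the entire frame, which is exactly the smear seen on early panels; the same 4 ms gray-to-gray figure that looks fine at 60 Hz becomes significant again at 144 Hz.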

Cons specific to backlit LCDs

  • It's impossible to show "true black" due to the backlight being always on and the liquid crystal itself being unable to perfectly bend light to be blocked by the front polarizer.
    • Array LED lit displays may present a haloing effect in images with high contrast.
  • Backlights in older or cheaper displays employ pulse-width modulation (PWM) for brightness control. This can be perceived as uncomfortable flickering by some people. However, higher-end displays may offer PWM strobing as a user-set option to minimize motion blur at the expense of maximum brightness.
  • As backlit LCDs use a light diffuser to evenly distribute the light, this may over time become damaged or loosened. This can cause bright spots on the LCD to show up.
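A minimal sketch of how the PWM dimming mentioned above works (the brightness and frequency figures below are assumed for illustration): the LED is switched fully on and off, average brightness equals the duty cycle, and the length of the dark gap in each cycle is what some people perceive as flicker.

```python
# PWM backlight dimming, back-of-the-envelope version.

def pwm_average_brightness(duty_cycle, max_nits):
    """Average luminance of a PWM-dimmed backlight."""
    return duty_cycle * max_nits

def off_time_ms(pwm_freq_hz, duty_cycle):
    """How long the backlight stays dark in each PWM cycle."""
    period_ms = 1000 / pwm_freq_hz
    return period_ms * (1 - duty_cycle)

print(pwm_average_brightness(0.25, 400))  # 100.0 nits at 25% duty cycle
print(round(off_time_ms(240, 0.25), 2))   # 3.12 ms dark per cycle at 240 Hz
print(round(off_time_ms(2000, 0.25), 2))  # 0.38 ms at 2 kHz: far less visible
```

This is why raising the PWM frequency (or using DC dimming) reduces perceived flicker: the dark gap shrinks well below what the eye can register.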

YMMV

  • LCDs may come in glossy or matte finish.
    • Glossy finishes offer a clearer picture, but there is bad glare in bright light.
    • Matte finishes have excellent glare reduction in all lighting conditions, but may appear fuzzy.
  • LCDs may have a glass covering over them for protection, which makes them subject to cracking if the display takes a hard hit. Glass is normally used in high-end applications; otherwise, plastic is used.
  • Color accuracy actually depends on how the panel was built, not on how many different shades of color it supports (though that definitely helps). Color accuracy is determined by how closely the panel can hit certain points in a given color space. So even a TN panel, considered the worst of the panel types, can have more accurate colors than an IPS panel, which is considered one of the best.

Panel Types

  • Twisted Nematic (TN)
    • Comes in two varieties: Passive Matrix (which is very prone to ghosting and has very poor refresh rates) and Active Matrix (eliminates blurring but costs more).
    • Pros: Offers the best response time and is the cheapest to make
    • Cons: Limited to 18-bit color (using tricks to fake 24-bit), has the narrowest viewing angles, and the poorest contrast ratios. It also can't be used as a touchscreen without a hard covering, due to distortions at even modest pressure.
  • Vertical Alignment (VA)
    • Pros: High-quality color and the best contrast ratios, topping out at 8000:1 in some of the highest-end TVs.
    • Cons: Viewing angles are only slightly better than TN, making it unsuitable for off-axis viewing.
  • In-Plane Switching (IPS), also known as PLS in Samsung models and AHVA in AU Optronics panels
    • Pros: High quality color and offers the best viewing angles.
    • Cons: Contrast ratio is barely better than TN, with the best models topping out at about 1000:1. It also has the dreaded "IPS Glow" where parts of the screen have a significant amount of backlight bleed when showing dark content (though this is only noticeable in a dark environment).

    LED Array 
Technology
  • Exactly What It Says on the Tin: A huge array of addressable RGB LEDs arranged in a matrix that, if viewed from a distance, produces an image. These are commonly used for video billboards (both outdoor and indoors) and video scoreboards in stadiums, but Samsung is experimenting with using them as an alternative to projectors in participating movie cinemas. Oftentimes the LEDs would be grouped into panels, and multiple panels in turn grouped to form a display.
Pros
  • Very good color reproduction, true blacks and very bright whites.
  • Reasonably high refresh rates, fast enough that Active 3D is possible.
  • Burn in, while possible, takes a very long time to show up.
Cons
  • Uses discrete pixels.
  • Very expensive, as it requires a large number of addressable RGB LEDs, which themselves aren't cheap. A 4K Samsung Cinema panel has about 8.85 million individual aRGB LEDs, making it very costly to install or replace compared to conventional projection systems. One source estimates a screen to cost between US$500,000 and US$800,000 depending on features, two to three times the cost of an equivalent digital cinema projector.
  • If repairs are performed, the new LEDs may shine brighter than the older ones, resulting in an uneven image. In a cinema application, the entire set of panels has to be replaced to maintain color uniformity.
  • Viewing from close up destroys the illusion. The panels have to be seen from at least a minimum distance to produce the illusion of a coherent image. Using a diffuser (a sheet of white, mostly opaque plastic) in front of the panel can alleviate the problem, but not eliminate it completely.
  • Size. These panels are built on existing aRGB LED technology and thus are quite huge in relation to their resolution.
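The 8.85 million figure quoted above is easy to sanity-check; the screen dimensions below are assumed DCI 4K cinema values, with one addressable RGB LED package per pixel:

```python
# Back-of-the-envelope check of the LED count and cost-per-pixel figures.

width, height = 4096, 2160        # assumed DCI 4K canvas
pixels = width * height
print(pixels)                     # 8847360, i.e. about 8.85 million

# At the estimated US$500,000-$800,000 screen price, cost per pixel:
low, high = 500_000, 800_000
print(round(low / pixels, 3))     # roughly $0.057 per pixel
print(round(high / pixels, 3))    # roughly $0.09 per pixel
```

Spending several cents per pixel, times nearly nine million pixels, is what makes these screens so much more expensive than a single projector lamp and lens.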

    Projectors 
Technology
  • Projectors shoot bright light through a lens to project an image onto something, typically a special screen or a flat wall. This makes them flexible as to where you can put them, as long as the surface is flat and of a neutral color, ideally white.
  • The light sources commonly used in projectors range from LEDs, lasers, and metal-halide lamps in smaller projectors to arc lamps in massive theater-sized ones.
  • There are several ways projectors create an image:
    • LCD type projectors shine the light through what's essentially a miniature LCD panel
    • DLP (Digital Light Processing) types have one or more chips covered in microscopic mirrors that adjust how much light gets put out. In single-chip types, a color wheel spins while the chip rapidly changes to effectively create multiple single-color images, and our eyes blend the result together; this causes the so-called rainbow effect, where you may be able to see each individual color image. Multi-chip types give each chip its own color filter.
    • Laser types use red, green, and blue lasers that scan across the display area using a fast moving mirror (similar to DLP). Because the lasers themselves produce the light and color, they can be made more compact to the point where smartphones had them either built-in or as a reasonably portable accessory.
    • There are CRT-based projectors as well, using one tube for each color, with a lens to enlarge the image onto the screen. The tubes needed to show a much brighter image than a CRT-based TV normally would, which led to them not lasting as long. In addition, CRT projectors were bulkier due to the size of the tubes themselves, the lenses needed to enlarge the image, and the focal distance needed to keep the image clear. They also required both an onboard active cooling system and air conditioning, unlike other projector technologies. As a result, CRT-based projectors were rarely used outside of small to medium commercial establishments (e.g., karaoke bars) and early in-flight entertainment systems.
  • Occasionally short-throw projectors (those that work best close to the screen) were built into standalone units, resembling a bulky TV. These are sometimes referred to as rear-projection TVs.
  • As a home project, you can turn your cell phone into a projector by enclosing it in a box and putting a magnifying lens at a certain distance away from the phone. It won't be very effective (it'll be much darker than just viewing the phone's screen), but it also is a good way to show how projectors work in general.
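The single-chip DLP scheme described above can be put into numbers. The wheel speed and segment count below are assumed, illustrative values, not any manufacturer's specs:

```python
# Illustrative timing for a single-chip DLP projector: one chip shows each
# primary color in sequence while the color wheel spins, and the eye fuses
# the single-color fields into a full-color image.

def color_field_ms(frame_rate_hz, wheel_speed_multiplier, segments):
    """Duration of one single-color field in milliseconds."""
    fields_per_second = frame_rate_hz * wheel_speed_multiplier * segments
    return 1000 / fields_per_second

# 60 Hz input, a "2x" wheel, 3 segments (R, G, B): 360 fields per second
print(round(color_field_ms(60, 2, 3), 2))  # ~2.78 ms per color field
# A faster "4x" wheel halves each field, reducing the visible rainbow effect
print(round(color_field_ms(60, 4, 3), 2))  # ~1.39 ms
```

The shorter each single-color field is, the harder it becomes for a quick eye movement to catch an isolated color flash, which is why higher-speed wheels reduce the rainbow effect.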

Pros

  • The screen is anything you want it to be, which means you can project images onto, say, the front of your house if you wanted to.
  • The size of the image can be as big as you need it to be.
  • Consumer-grade projectors are small enough to be carried around to most places. As mentioned, some can be made small enough to fit inside a phone, or into an accessory not much bigger than one.

Cons

  • Projectors that use lamps run hot enough that they need a fan to prevent the lamp from overheating. This fan can get loud enough to be distracting. LED and laser based ones may not need them. If the projector has a fan for this purpose, it also needs a cool-down period after being turned off. Disconnecting power suddenly may cause the lamp to fail.
  • The room used for a front projector must be as dark as possible or the contrast of the projected image will severely suffer. Basically, the "black level" of the projected image is going to be the darkest part of the surface you're using for the screen. Additionally, the area that is to be used for the projection screen must be matte bright white and as smooth as possible, but not reflective, to achieve the best possible image quality.
  • Setting up a projector is more hassle than other technologies. The projector's distance, angle and focal point must be carefully adjusted or the picture will be distorted. While higher-end projectors offer automatic keystone and focus adjustments using various technologies, the measurements tend to be inaccurate and still need manual fine-tuning. Compare this to the "plug and play" setup of a CRT or LCD display. In fact, this is arguably the main barrier that puts people off buying a projector as of 2023: while projectors are now so cheap that they can compete with the price of an LCD TV, not everyone has the patience needed to set one up. In addition, you need a projector screen of some sort, which adds to the cost and may limit where it can be set up.
  • CRT projectors were bulky, heavy, consumed a lot of power and generated a lot of heat, because three extra-bright tubes, one for each primary color, projected the image onto the screen. They were also expensive to maintain, as the tubes had to be replaced frequently due to the high brightness they were usually run at; they were noisy due to active cooling, and even then needed a sufficiently air-conditioned room to run optimally; color fringing was a problem if one or more of the lenses got misaligned; and to top it all off, they shared some of the design issues of CRT TVs (namely, flickering and the maximum resolution limit).
  • Halide-lamp based projectors need a warm up period before the lamp reaches its maximum brightness.
  • Arc lamp based projectors aren't consumer friendly at all, requiring high-voltage power supplies, careful handling note , and older mercury based arc lamps generating enough UV to burn skin and eyes with brief exposure (most are xenon based now).

    Electronic Paper 
Technology
  • The screen is composed of electrically charged ink particles suspended in a fluid. When a voltage is applied, the ink moves from one side of the cell to the other, which changes how much light is reflected back. The principle itself is old; Magna Doodles used a similar (magnetic rather than electric) technique.
  • Another name for this technology is electronic ink or e-ink.
Pros
  • Lowest power consumption. Electronic paper retains its state even without power and it only needs power to change the image.
    • Some models are shipped with a welcome image set from the factory ready to be seen on unboxing, while expected battery life for dedicated readers is measured in days or even weeks.
    • Stores have switched to using e-ink based price tags to not only make updating the price easy, but tags no longer have to be moved around based on the product, since the tag can be updated to show the product's name as well.
  • Thin profile. Typically used in portable e-book readers.
  • Modern black and white electronic paper screens are easily capable of the equivalent of 300 DPI resolution, which is the standard printing resolution.
Cons
  • Only useful for showing static images, as response time is very poor. Android-powered devices require some vendor-specific customization and apps to sidestep the response time limitation.
  • Electronic paper has low contrast ratio, achieving only as high as 15:1. To be fair, consumer ink jet printers are only capable of 50:1 at best on glossy, photographic paper and electronic paper is intended to mimic print media.
  • While color electronic paper is possible, color quality is not quite as good as other display types and total resolution of the display suffers since most forms of it use a black-and-white screen with color filters.
  • Has issues with image persistence, requiring the display to reset itself by going full black and full white every so often.
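The "300 DPI" figure above translates into pixels straightforwardly. A quick sketch, assuming a 6-inch, 4:3 panel (typical e-reader proportions, not a specific model's spec):

```python
# What a DPI rating means in pixels for a given screen size.

def panel_resolution(diagonal_in, aspect_w, aspect_h, dpi):
    """Width and height in pixels for a diagonal, aspect ratio, and DPI."""
    diag_units = (aspect_w ** 2 + aspect_h ** 2) ** 0.5
    width_in = diagonal_in * aspect_w / diag_units
    height_in = diagonal_in * aspect_h / diag_units
    return round(width_in * dpi), round(height_in * dpi)

# A 6" 4:3 panel at 300 DPI works out to roughly 1440 x 1080 pixels,
# close to the 1448 x 1072 panels used in real e-readers.
print(panel_resolution(6, 4, 3, 300))  # (1440, 1080)
```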

    Organic Light Emitting Diode (OLED) Display 
Technology
  • This display consists of organic compounds (which in this instance means the compounds contain a carbon-hydrogen bond somewhere) that produce bright light when given a voltage. These can be deposited over a screen in such a way that each pixel or subpixel gets its own light source, rather than relying on a single light source diffused across the panel.
Pros
  • Very thin profile, down to millimeters. This allows flexible and semi-transparent displays to be built.
  • The best contrast ratio, along with plasma and CRTs.
  • It's possible to print an OLED display using inkjet printing, which when perfected, can lead to drastically lower costs.
  • Compared to LCDs, it has the best viewing angle and theoretically the best lighting uniformity.
Cons
  • Uses discrete pixels.
  • Subject to burn-in
    • Modern OLED displays combat this by limiting brightness, shifting static elements around, "conditioning" the panel once in a while, and/or adding elements that improve light-transmission efficiency.
  • Can't sustain the same maximum brightness as LCDs, though this is being worked on.
  • Energy efficiency isn't as good as LCDs, but it depends on the content being shown. Some OS and apps offer dark mode to take advantage of this.
  • Blue OLEDs have an issue with both lifespan (around 16,000 hours to half brightness, vs. around 60,000 for the other colors) and efficiency at high brightness.
  • Will be damaged upon contact with water, including the humidity in the air, thus the display has to be sealed.
  • Flexible displays are more fragile than rigid ones, at least in the applications they're currently being used in (i.e., folding cell phones)
YMMV
  • A lot of consumer OLEDs use a PenTile matrix to arrange the subpixels, with more (but smaller) green subpixels and fewer (but larger) blue ones. This may give the display a subtly odd hue compared to other displays. The non-uniform subpixel arrangement makes the colors that wear out faster (like blue) larger, so the subpixels dim more or less evenly compared to red and green.
  • Another sub-pixel arrangement, mostly used by Samsung in their QD-OLED panels, arranges the RGB elements in a triangle pattern. This can lead to color fringing and a decrease in text clarity, but it's mostly a problem with UI-type content.

    Quantum Dot 
Technology
  • A particle called a quantum dot, when exposed to UV light, starts to glow and give off light itself. However, the color of the light depends on the size of the quantum dot. The principle is similar to how neon and fluorescent lamps work, but colors are managed based on the size of the particle, rather than the gas being used.
  • Typically a monochrome LCD or OLED panel drives the quantum dot sub-pixels.
    • If used with an LCD, since the quantum dot is producing the color and provides a pure shade of it, the colors are richer compared to a typical LCD arrangement where the color filter blocks a lot of the light.
    • If used with an OLED, each sub-pixel can be uniform unlike the non-uniform arrangement in RGB OLED displays, creating an image that's sharper for the same resolution.

Emerging display technologies

    microLED 
Technology
  • Individual LEDs are built on a microscopic scale, each LED being a pixel itself. It's similar to the abovementioned large LED Array-based displays found on video billboards and video scoreboards at stadiums, but at a more practical size.
Pros
  • Would have very good contrast ratios comparable to OLED, higher brightness than OLED, and darker blacks than plasma.
  • Reasonably high refresh rates, fast enough that Active 3D is possible.
  • Burn-in, while possible, could take an impractically long time to show up.
Cons
  • Uses discrete pixels.
  • Irreparable: due to the manufacturing process, if one of the MicroLEDs on the panel develops a fault or defect, the whole panel has to be replaced.
  • As with many newly miniaturized technologies, the launch price is expected to be extremely unaffordable, many times higher than the large aRGB LED Array solutions available today, although the price will come down in time. Most predict that for the first few years the technology will only be used in commercial applications like smaller video billboards, professional displays and maybe arcade cabinets.
