A computer monitor is an output device that displays information in pictorial form. A monitor usually comprises the visual display, circuitry, casing, and power supply. The display device in modern monitors is typically a thin-film-transistor liquid crystal display (TFT-LCD), with LED backlighting having replaced cold-cathode fluorescent lamp (CCFL) backlighting. Earlier monitors used cathode ray tubes (CRTs), and some used plasma (also called gas-plasma) displays. Monitors are connected to the computer via VGA, Digital Visual Interface (DVI), HDMI, DisplayPort, USB-C, low-voltage differential signaling (LVDS), or other proprietary connectors and signals.
Originally, computer monitors were used for data processing while television sets were used for entertainment. From the 1980s onwards, computers (and their monitors) have been used for both data processing and entertainment, while televisions have implemented some computer functionality. The common aspect ratio of both televisions and computer monitors has changed from 4:3 to 16:10 and then to 16:9.
Modern computer monitors are easily interchangeable with conventional television sets and vice versa. However, as computer monitors do not necessarily include integrated speakers or TV tuners (such as digital television adapters), it may not be possible to use a computer monitor as a TV set without external components.[1][2]
Early electronic computers were fitted with a panel of light bulbs where the state of each particular bulb would indicate the on/off state of a particular register bit inside the computer. This allowed the engineers operating the computer to monitor the internal state of the machine, so this panel of lights came to be known as the 'monitor'. As early monitors were only capable of displaying a very limited amount of information and were very transient, they were rarely considered for program output. Instead, a line printer was the primary output device, while the monitor was limited to keeping track of the program's operation.[3]
Computer monitors were formerly known as visual display units (VDU), but this term had mostly fallen out of use by the 1990s.
Multiple technologies have been used for computer monitors. Until the 21st century most used cathode ray tubes, but these have largely been superseded by LCD monitors.
The first computer monitors used cathode ray tubes (CRTs). Prior to the advent of home computers in the late 1970s, it was common for a video display terminal (VDT) using a CRT to be physically integrated with a keyboard and other components of the system in a single large chassis. The display was monochrome and far less sharp and detailed than on a modern flat-panel monitor, necessitating the use of relatively large text and severely limiting the amount of information that could be displayed at one time. High-resolution CRT displays were developed for specialized military, industrial, and scientific applications, but they were far too costly for general use.
Some of the earliest home computers (such as the TRS-80 and Commodore PET) were limited to monochrome CRT displays, but color display capability was already a standard feature of the pioneering Apple II, introduced in 1977, and a speciality of the more graphically sophisticated Atari 800, introduced in 1979. Either computer could be connected to the antenna terminals of an ordinary color TV set or used with a purpose-made CRT color monitor for optimum resolution and color quality. Lagging several years behind, IBM introduced the Color Graphics Adapter in 1981, which could display four colors at a resolution of 320 × 200 pixels, or two colors at 640 × 200 pixels. In 1984 IBM introduced the Enhanced Graphics Adapter, which was capable of producing 16 colors at a resolution of 640 × 350.[4]
By the end of the 1980s, color CRT monitors that could clearly display 1024 × 768 pixels were widely available and increasingly affordable. During the following decade, maximum display resolutions gradually increased and prices continued to fall. CRT technology remained dominant in the PC monitor market into the new millennium, partly because it was cheaper to produce and offered viewing angles close to 180°.[5] CRTs still offer some image quality advantages over LCDs, but improvements to the latter have made them much less obvious. The dynamic range of early LCD panels was very poor, and although text and other motionless graphics were sharper than on a CRT, an LCD characteristic known as pixel lag caused moving graphics to appear noticeably smeared and blurry.
There are multiple technologies that have been used to implement liquid crystal displays (LCD). Throughout the 1990s, the primary use of LCD technology as computer monitors was in laptops where the lower power consumption, lighter weight, and smaller physical size of LCDs justified the higher price versus a CRT. Commonly, the same laptop would be offered with an assortment of display options at increasing price points: (active or passive) monochrome, passive color, or active matrix color (TFT). As volume and manufacturing capability have improved, the monochrome and passive color technologies were dropped from most product lines.
TFT-LCD is a variant of LCD which is now the dominant technology used for computer monitors.[6]
The first standalone LCDs appeared in the mid-1990s, selling at high prices. As prices declined over a period of years they became more popular, and by 1997 were competing with CRT monitors. Among the first desktop LCD computer monitors were the Eizo FlexScan L66 in the mid-1990s and the Apple Studio Display and ViewSonic VP140[7] in 1998. In 2003, TFT-LCDs outsold CRTs for the first time, becoming the primary technology used for computer monitors.[5] The main advantages of LCDs over CRT displays are that LCDs consume less power, take up much less space, and are considerably lighter. The now-common active matrix TFT-LCD technology also has less flickering than CRTs, which reduces eye strain.[8] On the other hand, CRT monitors have superior contrast and response time, can use multiple screen resolutions natively, and show no discernible flicker if the refresh rate[9] is set to a sufficiently high value. LCD monitors now have very high temporal accuracy and can be used for vision research.[10]
High dynamic range (HDR)[9] has been implemented in high-end LCD monitors to improve color accuracy. Since around the late 2000s, widescreen LCD monitors have become popular, in part due to television series, motion pictures and video games transitioning to high-definition (HD), which standard-width monitors cannot display correctly without stretching or cropping the content. Standard-width monitors may instead show HD content at its proper proportions by filling the extra space at the top and bottom of the image with a solid color ("letterboxing"). Other advantages of widescreen monitors over standard-width monitors are that they make work more productive by displaying more of a user's documents and images, and allow toolbars to be displayed alongside documents. They also have a larger viewing area, with a typical widescreen monitor having a 16:9 aspect ratio, compared to the 4:3 aspect ratio of a typical standard-width monitor.
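Letterboxing itself is simple arithmetic: the content is scaled to the screen's full width, and the leftover vertical space is split between two bars. A minimal Python sketch (the function name and example figures are illustrative):

```python
def letterbox(content_aspect, screen_w, screen_h):
    """Scale content of a given aspect ratio to fill the screen's width,
    returning the displayed image height and the height of each bar."""
    image_h = screen_w / content_aspect          # height after scaling to full width
    bar_h = (screen_h - image_h) / 2             # solid-color bar above and below
    return image_h, bar_h

# 16:9 content letterboxed on a 1024 x 768 (4:3) screen:
print(letterbox(16 / 9, 1024, 768))   # image 576 px tall, with 96 px bars
```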
Organic light-emitting diode (OLED) monitors provide higher contrast, better color reproduction and viewing angles than LCDs but they require more power when displaying documents with white or bright backgrounds and have a severe problem known as burn-in, just like CRTs. They are less common than LCD monitors and are often more expensive.
The performance of a monitor is measured by parameters including display size, aspect ratio, resolution, and color gamut.
On two-dimensional display devices such as computer monitors, the display size or viewable image size is the actual amount of screen space available to display a picture, video or working space, without obstruction from the case or other aspects of the unit's design. The main measurements for display devices are width, height, total area, and the diagonal.
The size of a display is usually given by monitor manufacturers as the diagonal, i.e., the distance between two opposite screen corners. This method of measurement is inherited from the first generation of CRT televisions, when picture tubes with circular faces were in common use. Being circular, their size was described by the external diameter of the glass envelope. Since these circular tubes were used to display rectangular images, the diagonal measurement of the rectangular image was smaller than the diameter of the tube's face (due to the thickness of the glass). This method continued even when cathode ray tubes were manufactured as rounded rectangles; it had the advantage of being a single number specifying the size, and was not confusing when the aspect ratio was universally 4:3.
With the introduction of flat panel technology, the diagonal measurement became the actual diagonal of the visible display. This meant that an eighteen-inch LCD had a larger visible area than an eighteen-inch cathode ray tube.
Measuring monitor size by the distance between opposite corners does not take the display aspect ratio into account, so, for example, a 16:9 21-inch (53 cm) widescreen display has less area than a 21-inch (53 cm) 4:3 screen. The 4:3 screen has dimensions of 16.8 in × 12.6 in (43 cm × 32 cm) and an area of 211 sq in (1,360 cm2), while the widescreen is 18.3 in × 10.3 in (46 cm × 26 cm), or 188 sq in (1,210 cm2).
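The dimensions above follow from the diagonal and the aspect ratio via the Pythagorean theorem. A minimal Python sketch of the calculation (the function name is illustrative):

```python
import math

def screen_dimensions(diagonal, aspect_w, aspect_h):
    """Return (width, height, area) of a screen, given its diagonal
    and aspect ratio, using the Pythagorean theorem."""
    # The diagonal of a w:h rectangle scales as sqrt(w^2 + h^2).
    unit_diagonal = math.hypot(aspect_w, aspect_h)
    width = diagonal * aspect_w / unit_diagonal
    height = diagonal * aspect_h / unit_diagonal
    return width, height, width * height

# A 21-inch 4:3 screen versus a 21-inch 16:9 widescreen:
print(screen_dimensions(21, 4, 3))    # ~(16.8, 12.6, 211.7)  in / sq in
print(screen_dimensions(21, 16, 9))   # ~(18.3, 10.3, 188.4)
```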
Until about 2003, most computer monitors had a 4:3 aspect ratio and some had 5:4. Between 2003 and 2006, monitors with 16:9 and mostly 16:10 (8:5) aspect ratios became commonly available, first in laptops and later also in standalone monitors. Reasons for this transition included productive uses for such monitors beyond widescreen computer game play and movie viewing, such as displaying two standard letter pages side by side in a word processor, or showing large CAD drawings and CAD application menus at the same time.[12][13] In 2008, 16:10 became the most commonly sold aspect ratio for LCD monitors, and in the same year 16:10 was the mainstream standard for laptops and notebook computers.[14]
In 2010, the computer industry started to move from 16:10 to 16:9, because 16:9 had been chosen as the standard aspect ratio for high-definition television and because 16:9 panels were cheaper to manufacture.
In 2011, non-widescreen displays with 4:3 aspect ratios were only being manufactured in small quantities. According to Samsung, this was because "Demand for the old 'Square monitors' has decreased rapidly over the last couple of years," and "I predict that by the end of 2011, production on all 4:3 or similar panels will be halted due to a lack of demand."[15]
The resolution of computer monitors has increased over time, from 320 × 200 in the early 1980s to 1024 × 768 by the late 1990s. Since 2009, the most commonly sold resolution for computer monitors has been 1920 × 1080.[16] Before 2013, top-end consumer LCD monitors were limited to 2560 × 1600 at 30 in (76 cm), excluding Apple products and CRT monitors. Apple introduced 2880 × 1800 with the Retina MacBook Pro at 15.4 in (39 cm) on June 12, 2012, and a 5120 × 2880 Retina iMac at 27 in (69 cm) on October 16, 2014. By 2015, most major display manufacturers had released 3840 × 2160 resolution displays.
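Resolution and diagonal together determine pixel density. A short, illustrative Python sketch (the helper name is ours):

```python
import math

def pixels_per_inch(h_pixels, v_pixels, diagonal_inches):
    """Pixel density: the length of the pixel-grid diagonal divided by
    the physical diagonal of the screen."""
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

print(pixels_per_inch(1920, 1080, 24))   # ~92 PPI, a common desktop monitor
print(pixels_per_inch(5120, 2880, 27))   # ~218 PPI, the 27 in Retina iMac
```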
Every RGB monitor has its own color gamut, bounded in chromaticity by a color triangle. Some of these triangles are smaller than the sRGB triangle, some are larger. Colors are typically encoded with 8 bits per primary color. The RGB value [255, 0, 0] represents red, but slightly different colors in different color spaces such as Adobe RGB and sRGB. Displaying sRGB-encoded data on wide-gamut devices can give an unrealistic result.[17] The gamut is a property of the monitor; the image color space can be forwarded as Exif metadata in the picture. As long as the monitor gamut is wider than the color space gamut, correct display is possible if the monitor is calibrated. A picture that uses colors outside the sRGB color space will display on an sRGB monitor with limitations.[18] Even today, many monitors that can display the sRGB color space are not factory-adjusted to display it correctly. Color management is needed both in electronic publishing (via the Internet for display in browsers) and in desktop publishing targeted to print.
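To illustrate why the same triple denotes different colors in different color spaces: the encoded channel values must first be linearized through the space's transfer function and then mapped through a matrix specific to that space's primaries. The Python sketch below uses the published sRGB transfer function and sRGB-to-XYZ (D65) matrix; a wide-gamut space such as Adobe RGB uses a different matrix, so the same [255, 0, 0] lands at a different chromaticity:

```python
def srgb_to_linear(c8):
    """Undo the sRGB transfer function for one 8-bit channel (0-255)."""
    c = c8 / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def srgb_to_xyz(r8, g8, b8):
    """Map an 8-bit sRGB triple to CIE XYZ (D65 white point)."""
    r, g, b = (srgb_to_linear(v) for v in (r8, g8, b8))
    # Standard sRGB-to-XYZ matrix (IEC 61966-2-1).
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    return x, y, z

# sRGB "pure red": its chromaticity is fixed by the sRGB primaries, not by
# the [255, 0, 0] triple itself; Adobe RGB's red primary lies elsewhere.
print(srgb_to_xyz(255, 0, 0))   # ~(0.4124, 0.2126, 0.0193)
```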
Most modern monitors will switch to a power-saving mode if no video-input signal is received. This allows modern operating systems to turn off a monitor after a specified period of inactivity. This also extends the monitor's service life. Some monitors will also switch themselves off after a time period on standby.
Most modern laptops provide a method of screen dimming after periods of inactivity or when the battery is in use. This extends battery life and reduces wear.
Most modern monitors use two different indicator light colors: when a video-input signal is detected, the indicator light is green; when the monitor is in power-saving mode, the screen is black and the indicator light is orange. Some monitors use different indicator colors, and some blink the indicator light when in power-saving mode.
Many monitors have other accessories (or connections for them) integrated. This places standard ports within easy reach and eliminates the need for a separate hub, camera, microphone, or set of speakers. These monitors have microprocessors which contain codec information, Windows interface drivers, and other small software that help these integrated features function properly.
Ultrawide monitors feature an aspect ratio of 21:9 or 32:9, as opposed to the more common 16:9; 32:9 monitors are marketed as super ultrawide monitors.
Touch screen monitors use touching of the screen as an input method. Items can be selected or moved with a finger, and finger gestures may be used to convey commands. The screen needs frequent cleaning due to image degradation from fingerprints.
Some displays, especially newer LCD monitors, replace the traditional anti-glare matte finish with a glossy one. This increases color saturation and sharpness but reflections from lights and windows are very visible. Anti-reflective coatings are sometimes applied to help reduce reflections, although this only mitigates the effect.
In about 2009, NEC/Alienware, together with Ostendo Technologies, Inc. (based in Carlsbad, CA), offered a curved (concave) 43-inch (110 cm) monitor that allowed better viewing angles near the edges, covering 75% of peripheral vision in the horizontal direction. This monitor had a 2880 × 900 resolution and four DLP rear-projection systems with LED light sources, and was marketed as suitable for both gaming and office work, though at $6,499 it was rather expensive.[19] While this particular monitor is no longer in production, most PC manufacturers now offer some sort of curved desktop display.
Newer monitors are able to display a different image for each eye, often with the help of special glasses, giving the perception of depth. An autostereoscopic screen can generate 3D images without headgear.
Some monitors offer features for medical use or for outdoor placement.
Narrow-viewing-angle screens are used in some security-conscious applications.
Other features include integrated screen calibration tools, screen hoods, signal transmitters, and protective screens.
Some devices combine a monitor with a graphics tablet. Such devices are typically unresponsive to ordinary touch and register only the pressure of one or more special styluses. Newer models, however, are able to detect touch from any pressure and often have the ability to detect tilt and rotation as well.
Touch and tablet screens are used on LCDs as a substitute for the light pen, which can only work on CRTs.
Some displays offer the option of being used as a reference monitor; their calibration features can give advanced color-management control for producing a near-perfect image.
This is offered as an option on professional LCD monitors and is a basic feature of OLED screens; it is a professional feature that is becoming mainstream.
A near-mainstream professional feature is an advanced hardware driver for backlight modules with local zones of uniformity correction.
Computer monitors are provided with a variety of methods for mounting them depending on the application and environment.
A desktop monitor is typically provided with a stand from the manufacturer which lifts the monitor up to a more ergonomic viewing height. The stand may be attached to the monitor using a proprietary method, or may use, or be adaptable to, a Video Electronics Standards Association (VESA) standard mount. Using a VESA standard mount allows the monitor to be used with an after-market stand once the original stand is removed. Stands may be fixed or offer a variety of features such as height adjustment, horizontal swivel, and landscape or portrait screen orientation.
The Flat Display Mounting Interface (FDMI), also known as VESA Mounting Interface Standard (MIS) or colloquially as a VESA mount, is a family of standards defined by the Video Electronics Standards Association for mounting flat panel monitors, TVs, and other displays to stands or wall mounts.[20] It is implemented on most modern flat-panel monitors and TVs.
For computer monitors, the VESA mount typically consists of four threaded holes on the rear of the display that mate with an adapter bracket.
Rack mount computer monitors are available in two styles and are intended to be mounted into a 19-inch rack:
A fixed rack mount monitor is mounted directly to the rack with the LCD visible at all times. The height of the unit is measured in rack units (U), and 8U or 9U units are most common, fitting 17-inch or 19-inch LCDs. The front sides of the unit are provided with flanges to mount to the rack, providing appropriately spaced holes or slots for the rack-mounting screws. A 19-inch diagonal LCD is the largest size that will fit within the rails of a 19-inch rack; larger LCDs may be accommodated, but they are 'mount-on-rack' and extend forward of the rack. There are smaller display units, typically used in broadcast environments, which fit multiple smaller LCDs side by side into one rack mount.
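Whether a panel fits a given enclosure height is simple arithmetic, since one rack unit is 1.75 inches (44.45 mm). A rough Python sketch, assuming the common 5:4 aspect ratio of 17-inch and 19-inch LCDs (figures illustrative; a real enclosure also needs room for the bezel and flanges):

```python
import math

RACK_UNIT_IN = 1.75  # one rack unit (1U) of panel height, in inches

def panel_height(diagonal, aspect_w=5, aspect_h=4):
    """Physical height of the LCD glass for a given diagonal and aspect ratio."""
    return diagonal * aspect_h / math.hypot(aspect_w, aspect_h)

for diagonal, units in ((17, 8), (19, 9)):
    h = panel_height(diagonal)
    print(f"{diagonal} in LCD glass is {h:.1f} in tall; "
          f"{units}U offers {units * RACK_UNIT_IN:.2f} in of panel height")
# 17 in glass ~10.6 in tall vs 14.00 in for 8U; 19 in ~11.9 in vs 15.75 in for 9U
```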
A stowable rack mount monitor is 1U, 2U or 3U high and is mounted on rack slides allowing the display to be folded down and the unit slid into the rack for storage. The display is visible only when the display is pulled out of the rack and deployed. These units may include only a display or may be equipped with a keyboard creating a KVM (Keyboard Video Monitor). Most common are systems with a single LCD but there are systems providing two or three displays in a single rack mount system.
A panel mount computer monitor is intended for mounting into a flat surface with the front of the display unit protruding just slightly. It may also be mounted to the rear of the panel. A flange is provided around the LCD, on the sides, top, and bottom, to allow mounting. This contrasts with a rack mount display, where the flanges are only on the sides. The flanges have holes for through-bolts, or studs welded to the rear surface, to secure the unit in the panel cutout. A gasket is often included to form a water-tight seal with the panel, and the front of the LCD is sealed to the back of the front panel to prevent water and dirt contamination.
An open frame monitor provides the LCD panel and just enough supporting structure to hold the associated electronics and minimally support the LCD. Provision is made for attaching the unit to an external structure for support and protection. Open frame LCDs are intended to be built into some other piece of equipment; an arcade video game cabinet, with the display mounted inside, is a good example. There is usually an open frame display inside every end-use display, with the end-use enclosure simply providing an attractive protective shell. Some rack mount LCD manufacturers purchase desktop displays, take them apart, and discard the outer plastic parts, keeping the inner open-frame LCD for inclusion into their product.
According to an NSA document leaked to Der Spiegel, the NSA sometimes swaps the monitor cables on targeted computers with a bugged monitor cable in order to allow the NSA to remotely see what is being displayed on the targeted computer monitor.[21]
Van Eck phreaking is the process of remotely displaying the contents of a CRT or LCD by detecting its electromagnetic emissions. It is named after Dutch computer researcher Wim van Eck, who in 1985 published the first paper on it, including proof of concept. Phreaking more generally is the process of exploiting telephone networks.[22]
(Image caption: The rendering intent determines how colors that are present in the source but out of gamut in the destination are handled.)