by Kevin Pedersen, marintellect.com
Table of Contents
The display is also known as the screen or monitor. There are many different types of displays. The two most common types are CRT and LCD.
CRT stands for Cathode Ray Tube. CRT displays are made of a glass picture tube with a "gun" at the back of the tube that "shoots" the image onto the screen at the front of the tube. The gun shoots a beam of electrons, called a cathode ray. When the electrons hit the screen on the front of the picture tube, they make chemicals called phosphors on the inside of the screen glow red, green and/or blue. Most televisions and computers use CRT displays, which have been around since the late 1920's. They are large and bulky.
LCD stands for Liquid Crystal Display. It is a relatively new technology and is getting better and more affordable all the time. Liquid crystal displays are just what you see in your digital watch, except that most of the ones used in computers can display many colors instead of just black & grey like the average digital watch. Liquid crystal is a material that can either let white light from a backlight behind the screen shine through or block it; tiny red, green and blue filters over the crystals give each pixel its color. All laptop computers use liquid crystal displays. There also are many liquid crystal displays available for desktop computers and televisions, if you want to spend more money to get a much thinner screen than you can get with a CRT display. Liquid crystal displays are thin and lightweight.
Images are created on television and computer displays by combining thousands of little dots on the screen. Each dot is called a pixel. Each set of pixels that fills the entire screen is called a frame. Many individual still frames are shown every second, so quickly that they blend together and we see continuous motion. The number of different frames that are shown on the screen every second relates to the refresh rate (see section V).
Each pixel in a color display is made up of three colors: red, green and blue. Red, green and blue light combines in varying intensities to allow each pixel to display all the colors of the rainbow and thousands or millions of shades in between. The number of shades of colors your computer can display relates to the bit depth (see section VI).
Each frame is drawn on the screen in rows of pixels starting in the upper-left. Each row of pixels goes across the screen in a line from left to right, the next row starting one line down, going from left to right, over and over again. When the last row of pixels is drawn across the bottom of the screen, the next frame begins again in the upper left.
To summarize: each frame is drawn one pixel at a time, line by line, from left to right, top to bottom, just as you would type a paper. This all happens too quickly for you to see.
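The left-to-right, top-to-bottom drawing order described above can be sketched as a pair of nested loops. This is only an illustration in Python (real displays do this in hardware, not software):

```python
def raster_scan_order(width, height):
    """Yield (x, y) pixel coordinates in the order a display draws them:
    row by row from the top, each row left to right."""
    for y in range(height):          # top row first, then down the screen
        for x in range(width):       # left to right within each row
            yield (x, y)

# On a tiny 3 x 2 "screen", the drawing order is:
order = list(raster_scan_order(3, 2))
# first pixel is the upper-left corner, last is the lower-right
```

The first coordinate yielded is (0, 0), the upper-left corner, and the last is the lower-right, exactly the "type a paper" order described above.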
The number of pixels on a screen relates to the display's resolution. The more pixels there are on a screen, the more detail can be seen in an image on the screen. Higher resolution means smaller pixels.
A screen with many pixels is said to be high resolution. A screen with very few pixels, such as a standard television, is said to be low resolution. Nearly all computer monitors and televisions can display different resolutions. There are many situations where you might want to change the resolution of your display, depending on what you're viewing. For example, many computer games ask you to switch your computer screen to a lower resolution when you play them. A high resolution display may slow the game down because it requires more computing power to draw all the extra detail in each frame.
If you have a High-Definition Television, or HDTV, then the display has a higher resolution than a standard television. Most TV programs still broadcast at low resolution (roughly 460 pixels across by 360 pixels up and down), while DVD movies can take advantage of the high-resolution capability of your HDTV and display a sharper, more detailed picture (up to 720 pixels across by 480 pixels vertically).
Below is a comparison of different common resolutions for televisions and computers. As noted above, higher resolution means smaller pixels, but this example shows all pixels the same size to demonstrate the number of pixels relative to each resolution. The numbers shown (for example, 1024 x 768) represent the number of pixels going across the screen horizontally by the number of pixels going up and down the screen vertically. This image is NOT TO SCALE.
A standard television in North America displays a maximum of about 640 by 480 pixels. If you connect a TV screen to a computer, it will be blurry and difficult to read. High-Definition Televisions are becoming more popular, but nearly all TV broadcasts still only carry about 460 by 360 lines of resolution (not an exact number, because broadcast TV is based on an analog signal that is continuous vs. a discrete pixel-by-pixel digital signal). Television stations are all supposed to be able to broadcast digital high-definition programming by 2006, but that remains to be seen. For now, you can only get HBO, Showtime and some pay-per-view programming in high-definition, and then only if your carrier (e.g. Dish Network) provides a digital tuner and you have an HDTV, which very few people do. For now, the biggest advantage of HDTV is better picture quality for DVD movies, and the ability to connect your TV to your computer without sacrificing detail, although not all HDTV's have connections for computers.
The three most common resolutions for computer displays are 640 x 480 (known as VGA), 800 x 600 (known as Super VGA) and 1024 x 768 (known as XGA). Most CRT displays can handle all of these resolutions, and many more. Liquid crystal displays have a fixed number of pixels (usually at least 1024 x 768) and can use software to display resolutions below their fixed resolution with a process called interpolation, but cannot go above their fixed resolution. Interpolation results in a blurred-looking LCD monitor, so it's best to stick with the native (or maximum) resolution your LCD will support.
Consider that your screen size stays the same, whatever resolution you choose. When you choose a higher resolution, the display must fit more pixels onto the screen, so the pixels must be smaller. In addition to the fact that the display will show more detail, another advantage of higher resolution is that you will be able to fit more on your screen, such as toolbars and windows. The image below shows how a large font will shrink on your display when you switch to a higher resolution. Toolbars, menu bars and other objects on the screen will also shrink proportionally.
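To see why the pixels shrink, divide the fixed physical width of the screen by the number of pixels that must fit across it. This is a rough illustration (monitor specifications quote a "dot pitch" that is measured slightly differently):

```python
def pixel_width_mm(screen_width_mm, horizontal_pixels):
    """Approximate width of one pixel: the screen's physical width
    divided by how many pixels must fit across it."""
    return screen_width_mm / horizontal_pixels

# The same 320 mm wide screen at two different resolutions:
at_800 = pixel_width_mm(320, 800)    # 0.40 mm per pixel
at_1024 = pixel_width_mm(320, 1024)  # 0.3125 mm per pixel
```

Switching from 800 x 600 to 1024 x 768 on the same screen makes each pixel (and everything drawn with pixels) about 20% narrower.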
When you are viewing a web page and you have to scroll left and right to see the whole page, that means that your display's resolution is set too low. Sometimes you don't have the choice to switch to a higher resolution (see section VII. Video Memory), but if your video card will support a higher resolution then you can switch, for example, from 800 x 600 to 1024 x 768 to fit more pixels on the screen and you'll be able to view the entire page. The display settings are usually found in your computer's "Control Panels."
As discussed in section II, pixels are drawn in rows across the screen from upper left to lower right and the entire display image, or frame, is updated numerous times per second. The term for "times per second" is Hertz, abbreviated Hz, pronounced "hurts." The frequency at which the frames are displayed, or refreshed, is called the Refresh Rate (given in Hz). There are two ways to refresh the display: interlaced and progressive scan.
Interlaced means that every other line is drawn each time the display refreshes. For example, lines 1, 3, 5, ... 457 and 459 are drawn in one pass, and lines 2, 4, 6, ... 458 and 460 are drawn in the next pass. The passes are drawn so quickly that they are blended together, or interlaced. North American televisions display 30 full frames per second; since each refresh only draws half a frame (called a field), the display refreshes twice as fast, at 60 Hz, alternating odd and even lines. When displaying fast motion, such as sports or video games, this method of interlacing, or refreshing half a frame at a time, can flicker and produce jagged edges. It is inferior to progressive scan, but it was the best solution available when television was invented in the 1930's.
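The odd/even split described above can be sketched like this (a hypothetical illustration, using a 460-line frame as in the example):

```python
def interlaced_fields(total_lines):
    """Split a frame's line numbers (1-based) into the two fields an
    interlaced display draws: odd-numbered lines first, then even."""
    odd_field = list(range(1, total_lines + 1, 2))   # 1, 3, 5, ...
    even_field = list(range(2, total_lines + 1, 2))  # 2, 4, 6, ...
    return odd_field, even_field

odd, even = interlaced_fields(460)
# each field holds exactly half of the frame's 460 lines
```

Each field carries 230 of the 460 lines, which is why the fields must be refreshed at twice the full-frame rate.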
Progressive scan displays refresh every line of the frame each time the display refreshes. Computer monitors generally have variable refresh rates, with 60 Hz being the lowest. Displays can only refresh so fast, and our eyes do not notice any difference above about 85 Hz. A refresh rate of 85 Hz will give a good image without noticeable flicker to most people's eyes. Most displays support a refresh rate of 85 Hz, although older video cards and displays may only go up to 60 Hz. Lower refresh rates cause the screen to flicker, which can be unpleasant to look at.
When we talk about movies or animation, we refer to the frame rate. The frame rate is the number of frames of the movie that are shown each second, given in units of fps (frames per second). For example, movies in the theater are shown at a frame rate of 24 fps. The refresh rate is always greater than or equal to the frame rate.
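Because the refresh rate is at least the frame rate, each movie frame is simply held on screen for one or more refreshes. A minimal sketch, assuming the refresh rate is an exact multiple of the frame rate (a simplification; real video players also handle uneven ratios):

```python
def refreshes_per_frame(refresh_hz, frame_rate_fps):
    """How many refreshes each movie frame is held for, assuming the
    refresh rate divides evenly by the frame rate."""
    return refresh_hz // frame_rate_fps

# 24 fps film on a monitor refreshing at 72 Hz:
repeats = refreshes_per_frame(72, 24)   # each frame is shown 3 times
```

At 72 Hz the display redraws each 24 fps film frame three times before moving on to the next one.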
Bit depth relates to how many different colors each pixel can display. Computer monitors were originally monochrome, meaning that each pixel could only display two colors, such as black and white. Monochrome monitors have now been replaced with RGB monitors. Each pixel on an RGB monitor contains three dots, one for each primary color. RGB stands for the primary additive colors Red, Green and Blue, which combine to form every possible color. One-bit color is what monochrome monitors use. A bit is a binary digit, which has two possible values: zero and one. A bit can represent colors, for example, black (0) and white (1). The more bits used per pixel, the more colors that pixel can display.
Possible colors per pixel:
1 bit = 2^1 colors = 2 colors (monochrome)
8 bit = 2^8 colors = 256 colors or grays
16 bit = 2^16 colors = 65,536 colors (thousands of colors)
24 bit = 2^24 colors = 16,777,216 colors (millions of colors)
32 bit = 2^32 colors = 4,294,967,296 colors (billions of colors)
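The table above follows directly from counting binary combinations: n bits can hold 2^n distinct values. A quick check in Python:

```python
def colors_for_bit_depth(bits):
    """Number of distinct colors a pixel with the given number of bits
    can represent: 2 raised to the number of bits."""
    return 2 ** bits

table = {bits: colors_for_bit_depth(bits) for bits in (1, 8, 16, 24, 32)}
# 1 -> 2, 8 -> 256, 16 -> 65,536, 24 -> 16,777,216, 32 -> 4,294,967,296
```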
It used to be very simple to figure out how much video memory you needed. You just took the total number of pixels on the screen and multiplied that by the amount of memory that was used by each pixel. Nowadays, with 3-D graphics, texture rendering and other fancy stuff like that, computers use the video memory for a lot more than just displaying colored pixels. However, we can still use the following formula for computers that do not have fancy video cards designed to handle 3-D graphics.
Everyone who has used a computer has heard the terms bits and bytes thrown about all the time. These terms are very simple to understand. They represent a quantity of information, whether it be data flowing across a network or stored on a hard drive. Think of them like you would think of pints and gallons. Gallons can be used to measure how much water runs out of your shower head per minute (flowing), or how much gas you put in your car (stored). In case you forgot, there are eight pints in every gallon. Similarly, there are eight bits in every byte. A bit is the tiniest unit of data. It can only have two values, zero or one. These values are easily represented in computers with electricity either off (0) or on (1).
How do bits and bytes relate to computer graphics?
Computer graphics are nothing more than data, composed of little bits. Take eight bits and you've got a byte of data. Your computer needs to have enough storage in its video memory to be able to hold the bits for every pixel on the screen.
Q: How do you calculate the amount of video memory a computer needs to display a given resolution at a given bit depth?
A: The bit depth represents how many bits of memory each pixel uses. So, you multiply the number of pixels on your screen by the bit depth. To get the memory in bytes, just divide by eight (remember: 8 bits per byte).
Example:
How much memory is required for an 800 by 600 display resolution with thousands of colors per pixel (16 bit)?
800 * 600 * (16 bits) / (8 bits / byte) = 960,000 bytes, or about 1 MB
You would need 1 MB of video memory to run your monitor at 800 by 600 with thousands of colors.
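The calculation above can be wrapped in a small function: total pixels times bits per pixel, divided by 8 bits per byte.

```python
def video_memory_bytes(width, height, bit_depth):
    """Video memory needed to hold one frame, in bytes:
    (pixels across * pixels down * bits per pixel) / (8 bits per byte)."""
    return width * height * bit_depth // 8

svga = video_memory_bytes(800, 600, 16)   # 960,000 bytes, about 1 MB
xga = video_memory_bytes(1024, 768, 24)   # 2,359,296 bytes, about 2.4 MB
```

The second call shows why stepping up to XGA resolution with millions of colors needs a video card with more memory.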
Modern television and computer monitors create images with three different-colored phosphors: Red, Green and Blue (RGB). Computer graphics programs therefore work best with images in RGB mode.
Grayscale images are usually 4, 6 or 8 bit. 8 bit grayscale images display 256 shades of gray. This is different from monochrome, which only displays solid light or dark, but nothing in between.
Indexed color images are single-channel images (8 bits per pixel) that use a color lookup table, or palette, containing no more than 256 colors. Limited editing is available in this mode.
The CMYK model is based on the light-absorbing quality of ink printed on paper. As white light strikes translucent inks, part of the spectrum is absorbed and part is reflected back to your eyes.
In theory, pure cyan (C), magenta (M), and yellow (Y) pigments should combine to absorb all color and produce black. For this reason these colors are called subtractive colors. Because all printing inks contain some impurities, these three inks actually produce a muddy brown and must be combined with black (K) ink to produce a true black. (K is used instead of B to avoid confusion with blue.) Combining these inks to reproduce color is called four-color process printing.
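The idea of pulling the shared black component (K) out of the three inks can be sketched with the simplified textbook RGB-to-CMYK conversion. This is only an illustration; professional prepress software uses color profiles, not this formula:

```python
def rgb_to_cmyk(r, g, b):
    """Naive textbook conversion from RGB (each 0.0 to 1.0) to CMYK.
    Shows why K (black) is separated out of the other three inks."""
    k = 1 - max(r, g, b)        # black: the part all three inks would share
    if k == 1:                  # pure black: no colored ink needed at all
        return 0.0, 0.0, 0.0, 1.0
    c = (1 - r - k) / (1 - k)
    m = (1 - g - k) / (1 - k)
    y = (1 - b - k) / (1 - k)
    return c, m, y, k

# pure red on screen prints as full magenta + yellow, no cyan, no black
ink = rgb_to_cmyk(1.0, 0.0, 0.0)
```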
When you scan a photo into a computer or print something out on paper, resolution is referred to in dots per inch (dpi), as opposed to screen resolution, which just describes the number of pixels the screen displays horizontally by the number of pixels the screen displays vertically.
Your choice of image resolution should be based on where the image will be output.
File size is calculated by multiplying the number of pixels by the bit depth. The number of pixels in an image is determined by the resolution:
A 4 inch by 5 inch photo scanned at 600 dpi would yield an image that was 2400 by 3000 pixels (multiply inches by dots per inch). This gives us
2400 * 3000 = 7,200,000 pixels
For good color reproduction, the image is scanned with millions of colors, or 24 bit color. The memory that this image file will take up, in uncompressed form (we'll discuss that next), would be:
7,200,000 pixels * (24 bits / pixel) / (8 bits/byte) = 21,600,000 bytes
= 21.6 MB
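The scan-size arithmetic above generalizes into one small formula: inches times dpi gives pixels on each side; total pixels times bits per pixel, divided by 8 bits per byte, gives bytes.

```python
def scanned_file_bytes(width_in, height_in, dpi, bit_depth):
    """Uncompressed file size of a scanned image, in bytes."""
    width_px = width_in * dpi        # inches * dots per inch
    height_px = height_in * dpi
    return width_px * height_px * bit_depth // 8

# The 4" x 5" photo scanned at 600 dpi with 24-bit color:
size = scanned_file_bytes(4, 5, 600, 24)   # 21,600,000 bytes = 21.6 MB
```

Halving the scan resolution to 300 dpi cuts the pixel count, and the file size, by a factor of four.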
The file size in the previous example is what would be required for a computer to specify the color of each and every pixel in the image. There are ways of writing image files that can dramatically reduce the file size by using mathematical equations to represent the image's colors. File size can also be reduced by limiting the number of colors the picture uses (lower bit depth means less memory) or by using notation that groups together all pixels of the same color (e.g. a blue sky) as one block of code instead of specifying nearby pixels individually (GIF images use both of these techniques).
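Grouping together runs of same-colored pixels, as described above, is the idea behind run-length encoding. A minimal sketch of the technique (note that GIF actually uses the more elaborate LZW scheme built on the same principle):

```python
def run_length_encode(pixels):
    """Collapse consecutive identical pixels into (count, color) pairs.
    A long run of sky-blue pixels becomes one short pair instead of
    many repeated values."""
    runs = []
    for color in pixels:
        if runs and runs[-1][1] == color:
            runs[-1] = (runs[-1][0] + 1, color)   # extend the current run
        else:
            runs.append((1, color))               # start a new run
    return runs

row = ["blue"] * 6 + ["white"] * 2 + ["blue"] * 4
encoded = run_length_encode(row)
# [(6, 'blue'), (2, 'white'), (4, 'blue')] -- 3 pairs instead of 12 pixels
```

Twelve pixels shrink to three pairs; on a real image with a large area of flat color, the savings can be dramatic.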
The term Lossy means that some information about the original image is lost when the file is compressed. Lossy image compression formats will yield smaller file sizes, but too much compression will yield noticeably poorer image quality (JPEG images are an example of this).
TIFF images are non-lossy, but still compressed. That is, compression makes the files somewhat smaller without any loss in image quality. TIFF files are commonly used by graphics professionals.
Image File Formats
Last updated on the twenty-eighth of March, 2003.