*** From the Archives ***

This article is from March 27, 2003, and is no longer current.

Under the Desktop: Reconsidering Display Resolutions

Resolution is something digital content creators deal with every day. We see its result on the screen and in hardcopy. But how much do we really understand about resolution?

The terminology of resolution often changes, depending upon the digital workflow, input or output device, and graphics application. And the usual discussion of resolution often gets bogged down in terminology, engineering, and mathematics. Worse, the reasoning is often lengthy, overwhelming even those willing to give it a try.

Even I glaze over sometimes.

So, let’s look at a simple question, and find perhaps the not-so-simple answer: Can your display ever have too much resolution?

This seems like a ridiculous question, since more is always better, right? Still, this issue comes up in various professional circles as higher-resolution flat-panel displays come into the market.

A Pixel is a Pixel is a Pixel?
When we think of display resolution, we mostly think of the settings in the operating system controls: the Mac OS Displays preference pane (see Figure 1) or the Windows Display Properties dialog box. Here we can determine the resolution of the screen, or more accurately, the resolution we want to apply to the screen.

Figure 1: The Display pane in Mac OS X shows the recommended and optional resolutions for my iBook’s LCD screen. Clicking on the Color tab flips the pane and offers a list of color profiles as well as a button that launches the Display Calibrator application.

The resolution setting we finally choose, as well as the choice of resolutions we’re offered, will be determined by the type of display (flat panel LCD or traditional CRT), the resolutions and color depth supported by the video hardware in our computer, and at times the density of pixels on the screen itself.

Pixel density is the number of pixels packed into a given space. The most familiar measure is pixels per inch, or ppi. (Note that this isn’t dpi, or dots per inch, even though many people use the terms interchangeably. A pixel is a unit of image and hardware resolution, while dots are what some printing devices put on paper. In some circles the “D word” is a social and professional faux pas.)
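For the curious, pixel density is easy to compute: divide the diagonal pixel count of the screen by its diagonal size in inches. Here is a minimal sketch in Python (the function name is mine, not from any vendor tool); the sample figures are the 22-inch, 2,048-by-1,536-pixel CRT discussed later in this article.

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_inches):
    """Pixel density: diagonal pixel count divided by diagonal size in inches."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_inches

# A 2,048-by-1,536-pixel image area on a 22-inch diagonal:
print(round(pixels_per_inch(2048, 1536, 22)))  # → 116
```

The same arithmetic reproduces, roughly, the other densities quoted in this article, give or take the difference between a screen's nominal and viewable diagonal.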

Pixel Gridlock
Flat-panel displays have a fixed, or native, resolution. It’s a grid of pixels, each one made up of a set of red, green, and blue cells. The smaller the cells, the higher the ppi the screen can offer. A panel can also be physically large on the desktop yet offer the same or even lower resolution than a smaller one.

In the past, most LCD screens were small compared with CRTs and offered a low pixel density, carrying a ppi figure between 80 and 95. Even Apple’s Cinema and Cinema HD displays have pixel densities of only 86 and 98 ppi, respectively. This situation is changing: a recent Dell white paper considers anything over 100 ppi to be high resolution.

Last year, however, we saw that number rise to more than 200 ppi. Several vendors offered models with more than 9 million pixels with high but “reasonable” prices (when compared with the pioneering models I reported on a couple of years ago). For example, ViewSonic offers its 22-inch VP2290b with a 204-ppi density for $7,199 (see Figure 2).

Figure 2: This image of ViewSonic’s high-resolution VP2290b flat-panel display looks to be an illustration rather than a photograph. Perhaps this is for the benefit of the birds?

At January’s CES show in Las Vegas companies showed screens in the 40- and 50-inch range. Samsung showed a 54-inch model and Philips presented a 52-inch screen, and each claimed to be the world’s largest. Of course, these larger screens don’t offer anywhere near the resolution of ViewSonic’s 22-inch panel. In fact these mega-displays were consumer HDTVs with “only” 2.07 million pixels and a 1,920-by-1,080-pixel resolution.

At the same time, displays based on the cathode ray tube (CRT) are a mixed bag for high resolution. The usual maximum resolution of a 22-inch CRT monitor is 2,048-by-1,536 pixels, for a density of about 116 ppi, depending on the pitch of the monitor’s aperture grille or shadow mask. Still, unlike an LCD, the image will be sharper in the middle and will have trouble resolving at the edges, according to display consultant Joel Ingulsrud. “This mush on the edge is the nature of the analog beast,” he says. Because the topic of resolution can be confusing, Ingulsrud offers a pixels-per-inch calculator on his Web site.

However, this tube-based technology offers a number of resolution benefits, since CRT resolution isn’t fixed as with LCDs. In a sense, the CRT emulates a certain pixel density, allowing users to quickly and easily shift between resolutions.

Conversely, a flat panel must perform a lot of math, interpolating the pixels of the screen, to display a resolution that isn’t an even fraction of its native grid. These interpolated resolutions introduce artifacts in color, image quality, and refresh rate. In other words, they don’t look so good.
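The arithmetic behind this is simple to see. When the requested width divides the native width evenly, each requested pixel maps cleanly onto a whole number of native pixels; otherwise the panel must blend neighboring pixels, and that blending is the interpolation. A small sketch, using a hypothetical 3,840-pixel-wide native grid:

```python
def scale_factor(native_px, requested_px):
    """How many native pixels each requested pixel must cover."""
    return native_px / requested_px

# Integer factors map cleanly; fractional factors force blending.
for requested in (1920, 1600, 1280):
    f = scale_factor(3840, requested)
    print(requested, f, "clean" if f.is_integer() else "interpolated")
```

Running this shows 1,920 and 1,280 scale by whole factors (2 and 3), while 1,600 forces a factor of 2.4 and, with it, interpolation.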

So what’s the problem with ever-higher resolutions for displays? Why would we even care?

When I’m 72 or 96
The trouble with increasing pixel density and higher resolutions comes with our wish to use the monitor as a computer screen. The windows of the operating system, the interface, our applications and images all assume a fixed resolution: 72 ppi for the Macintosh and 96 ppi for Windows, and that holds true even as the OSes have transitioned to OS X and XP, respectively.

As the resolution of the display increases and the pixels get smaller, all the elements on the screen also get proportionally smaller. The rulers get smaller along with the images. An inch is no longer an inch. We understand this and live with it, and work around the relative shrinkage. Still, it’s really not how things should be.

Each operating system assumes a fixed and surprisingly low resolution. And you may be interested to know that the operating system doesn’t really consider the screen’s actual resolution or even care about the setting. Your images, desktop, windows, and icons will shrink or grow according to the resolution. As far as the OS is concerned, however, their size always stays the same.
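This shrinkage is easy to quantify. An element the OS draws with a fixed pixel count, say the 72 pixels the Mac OS treats as one inch, occupies fewer and fewer physical inches as the screen density climbs. A quick sketch (the function is mine, purely illustrative):

```python
def physical_inches(logical_px, screen_ppi):
    """Physical size of an element the OS draws with a fixed pixel count."""
    return logical_px / screen_ppi

# A 72-pixel Mac "inch" shown at several pixel densities:
for ppi in (72, 96, 204):
    print(ppi, round(physical_inches(72, ppi), 2))
```

At 72 ppi the "inch" really is an inch; at 204 ppi it has shrunk to about a third of one.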

The Mac OS chose 72 ppi because of its relationship with existing print standards, such as the point, of which there are 72 to an inch. The original 512-by-342 monochrome display (and even the later 640-by-480 color display) corresponded closely to print reality: the concept of WYSIWYG (What You See Is What You Get).

Reportedly, Microsoft chose 96 ppi for compatibility with part of the VGA standard. There was also concern over the legibility of small text on the monitors of that time; small-point text would appear bigger on the screen. This difference remains a continuing problem for text and layout between the platforms.

Magnifying Glass Not Included
This pixel shrinkage creates a usability problem, especially for the larger LCDs — so much so that they become almost impossible to use at their native resolution. On that 9.2-million-pixel screen, you can see an entire large Photoshop file perfectly. Great. However, the palettes and menus are so small you can’t accurately click in them, or even see the menu items.

Some vendors suggest using the super-high-resolution as a second display only for evaluating an image. This was the approach of Totoku North America last year.

The ViewSonic VP2290b monitor has a button that toggles between its native 3,840-by-2,400-pixel resolution and half that figure in each dimension. This extremely handy feature makes the display much more usable. However, even at 1,920-by-1,200 pixels the resolution is still very high, on the borderline of usability.

The answer, in the future, will come when the OS is resolution independent instead of fixed. With this approach, the OS and applications would talk to the display hardware and automatically enlarge certain interface elements as resolution increases. Microsoft has a white paper on the subject for Windows developers; adoption by those developers, at least, is the hope of content creation pros.

Just making everything bigger won’t be adequate. Doubling the resolution in the OS and in applications would be the simplest approach, but content pros will want something more flexible, since our desktops, images, and application elements such as toolbars and palettes often span multiple monitors, each with its own resolution.
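The core of any such resolution-independent scheme is a per-monitor scale factor: the actual density of the screen divided by the density the OS assumes. The sketch below is my own illustration of the idea, not any shipping OS mechanism; it assumes the Mac's 72-ppi base.

```python
BASE_PPI = 72  # the fixed density the OS assumes (96 on Windows)

def scaled_size(logical_px, monitor_ppi, base_ppi=BASE_PPI):
    """Scale a UI element so it keeps its physical size on any monitor."""
    return round(logical_px * monitor_ppi / base_ppi)

# A hypothetical 72-pixel-wide palette, drawn on two different monitors:
print(scaled_size(72, 72))   # → 72 pixels on a classic 72-ppi screen
print(scaled_size(72, 204))  # → 204 pixels on a 204-ppi panel
```

The multi-monitor wrinkle is exactly what this exposes: a palette dragged between screens would need to be re-scaled on the fly, monitor by monitor, which is why the transition asks real work of developers.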

Such a transition will be difficult, whatever the scheme, requiring extra work from developers. Still, content creators can hope.

As the rabbinic saying advises: “Even one ear of corn is not exactly like another.” Nor is the resolution of an image on one monitor exactly like that on another.
