Moving towards Better Images

The idea that a picture is often worth more than a textual description is central to the need for image databases, particularly photographic image databases. By offering high-quality digital copies, an image database lets users examine items that would otherwise be inaccessible because of distance or security and preservation concerns. Replications can never be perfect, but the goal is to make them detailed enough that even the most demanding users can obtain meaningful information.

The development of image databases as a viable tool involves several key issues: scanning, storage, networking and transfer, and search engines, to name a few. Most of these issues are interrelated to one degree or another, and this is particularly true of image display. All the other concerns must address the challenge of giving the end user as high-quality an image as possible without subjecting them to long waits or compatibility problems. The result is a series of compromises which should become less problematic as technology advances.

The display of good quality images is most strongly restricted by three elements: the quality lost when the image is captured, the size of the resulting files (and the time needed to store and transmit them), and the limitations of the user's hardware.

Currently this means that image display is far from ideal. The usual compromise is images that are not as good as they could be, transferred at annoyingly slow rates, and compounded by a wait for decompression. This satisfies no one. Fortunately, several innovations (and ordinary technological advances) promise to improve the situation. Photographic-quality images, requested, received, and displayed in scant seconds, may not be an immediate prospect, but there is every indication that we are headed in that direction and that a solution is not out of reach.

Image Capture

The particular scanning technique is not tremendously important. What is important is that an analog image (whether slide, negative, print, or flat item) is digitized and inevitably reduced in quality. This reduction tends to be large: a photograph has an effective resolution measured in the thousands of dots per inch, and scanning for an image database is not likely to come anywhere near that. Consider the figures below for a 3"x5" print scanned in 24-bit color.

Resolution (dpi)    File Size (kilobytes)
            2000                  180,000
            1000                   45,000
             500                   11,250
             250                    2,800
             125                      700

Because file size quadruples every time resolution is doubled (and file sizes are big enough to begin with), it is not economical to go much above 500 dpi - well short of an analog photograph's resolution - with 300 dpi being common. Image display for the end user is thus noticeably degraded.
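As a rough sketch of the arithmetic behind the table, the short Python fragment below computes the uncompressed size of a scan from its print dimensions, resolution, and bit depth. It assumes 1 kilobyte = 1,000 bytes and ignores any file-format overhead, so the figures are approximations.

    # Uncompressed size of a scan: pixel count times bytes per pixel.
    def scan_size_kb(width_in, height_in, dpi, bits_per_pixel=24):
        pixels = (width_in * dpi) * (height_in * dpi)
        return pixels * (bits_per_pixel / 8) / 1000   # kilobytes

    # Reproduce the table for a 3"x5" print in 24-bit color.
    for dpi in (2000, 1000, 500, 250, 125):
        print(f"{dpi:>5} dpi -> {scan_size_kb(3, 5, dpi):>9,.0f} KB")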

File Size

The reason no one scans at 2000 dpi (or higher) is that hard drive space is expensive, and transferring large files over networks - even fiber optic - takes too long. Additionally, even if this were no difficulty for the owner of an image database, the end users would likely not have such powerful systems. Compression helps, but many of the schemes currently available are far from perfect. JPEG compression, the de facto standard for photographs, can reduce images to 5-10% of their original size without appreciable loss.

As an example, the same 3"x5", 24-bit photograph mentioned above, scanned at 300 dpi and then reduced using moderate JPEG compression, would yield a file of approximately 250k (at minimum). A database of ten thousand such images would require 2,500 megabytes. Over a modem connection (14,400 bps or better), a single image transfer would take about 2 minutes under ideal conditions. Factor in real-world line noise, decompression time, and incidentals such as hard drive access and resource sharing, and the actual wait would likely be at least 5 minutes. A better connection, such as direct ethernet access, would reduce the base transmission time to almost nothing, but decompression and incidentals would still have an appreciable effect - from request to actual viewing could still take from 30 seconds to a minute.
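The figures above can be checked with a back-of-the-envelope calculation. In the sketch below the line speeds come from the example, while the overhead multiplier standing in for line noise, decompression, and incidentals is only an illustrative guess, not a measurement.

    # Rough transfer time for a file of the given size and line speed.
    def transfer_seconds(size_kb, bits_per_second, overhead=1.0):
        return (size_kb * 1000 * 8) / bits_per_second * overhead

    print(f"14.4k modem, ideal:         {transfer_seconds(250, 14_400):6.0f} s")
    print(f"14.4k modem, with overhead: {transfer_seconds(250, 14_400, 2.5):6.0f} s")
    print(f"10 Mbit ethernet, ideal:    {transfer_seconds(250, 10_000_000):6.2f} s")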

User Hardware

The standard viewing screen for computer users has a resolution of only 72 dpi and measures less than 9 by 7 inches. At 300 dpi the example scan is 900 by 1,500 pixels, so it would overflow the screen extensively if viewed at its best resolution. Even at higher screen resolutions, such as the 112 dpi common with SVGA display cards, there is overflow.

In order to get the entire image onto the screen, it would have to be reduced, which means discarding information. The result is image degradation for the viewer. For larger images the problem is, of course, worse, and it is further compounded by other hardware issues, namely RAM. The 250k image used in the example is actually about 4,000k once decompressed. To view it on screen, roughly 4 megabytes of RAM are needed so that the user can pan around, accessing parts of the image not currently on the screen. Many systems do not have this much RAM to spare and are forced to use "virtual" (hard drive) memory, which is much slower than RAM and makes for very jerky, unsatisfactory viewing.

Standards

What complicates image display even further is that the provider of an image database has no way of knowing the specifics of a user's display, and the user generally has no way of knowing what software will show an image to its best advantage, what compression scheme to expect, or which image qualities are due to hardware differences rather than scanning procedures, the appearance of the original, or compression artifacts. Image display clearly suffers from a lack of standards, and unfortunately many of these problems are not easily solved.

Developments and Solutions

Basically it all comes down to file size. In the short term there will be little improvement in modem transmission rates because of the inherent limits of telephone lines, and if an image database is to be accessible to as many people as possible, this is the most significant bottleneck. Provider storage needs matter less, though they are a related concern, because the cost of fast storage is dropping rapidly. CD-ROMs may not yet be fast enough to keep up with transmission, but multi-gigabyte hard drives are becoming quite affordable, and much faster optical storage is on the horizon.

There are three ways to decrease file size: lowering resolution, lowering bit-depth, and compression.

Lowering Resolution: This is a last resort unless detail is unimportant to a particular image database. Since this essay concerns photographic image databases specifically, resolution is assumed to be very important. True, high resolutions make for images that do not fit within the bounds of most monitors, but cheaper, flatter, wider screens are in the near future. Coupled with increasingly inexpensive display cards offering 24-bit (or even 32-bit) color at high resolution, images will continue to look better and fit better. Considering the time and effort that goes into scanning photographs for an image database, it is best to plan for the years ahead as well as for the immediate capabilities of technology.

Additionally, even though most computer screens have relatively low dpi (due to their display cards), this is not the only way to view an image. Laser printer technology is already capable of printing color images at several hundred dpi for fairly low cost. This will continue to improve over time.

An unusual approach which is being tested is to allow for variable resolution within an image. For example, if a photograph were taken of a man against a backdrop, the man could retain a fairly high resolution (especially in critical areas such as the face, hands etc.) while the background could be dropped to a low resolution. The trick here would be to establish either a median resolution for display, or a lowest common denominator. The "extra" resolution would not display unless an area was zoomed in on. Variable resolution, especially in images which have a good deal of "unimportant" areas, could offer considerable savings in file size.
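A rough illustration of the idea, using the Pillow imaging library for Python: the file names, the region-of-interest box, and the downsampling factor are placeholders, and the sketch only simulates variable resolution inside an ordinary raster image rather than implementing a true variable-resolution format.

    from PIL import Image

    def variable_resolution(path, roi_box, background_factor=4):
        # Keep roi_box sharp; let the rest of the image go coarse.
        img = Image.open(path)
        roi = img.crop(roi_box)                        # area to preserve
        coarse = img.resize((img.width // background_factor,
                             img.height // background_factor))
        coarse = coarse.resize(img.size)               # background loses detail
        coarse.paste(roi, roi_box[:2])                 # restore the sharp region
        return coarse

    result = variable_resolution("portrait.jpg", roi_box=(400, 200, 900, 800))
    result.save("portrait_variable.jpg", quality=75)

Because the coarse background compresses far better than detailed areas, the saved file ends up considerably smaller even though the pixel dimensions are unchanged.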

Lowering Bit-Depth: This is not much better an option than lowering resolution, for the same reasons outlined above, but it is a little more flexible. Black and white photography works quite well in 16-bit or even 8-bit greyscale. Color photography of such things as old manuscripts and other textual material can also get by with lower bit-depths. Of course, the future promises another dilemma - whether to go with 24-bit or 32-bit color. Considering that the human eye cannot differentiate that many colors, this might appear to be a false problem. However, even the slightest measurable difference in color can have important implications in many disciplines - chromatography in chemistry, art restoration, oceanography, and so on - and software can examine those differences even where the human eye cannot.
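As an illustration of the trade-off, the following sketch (again Pillow, with a placeholder file name) converts a 24-bit scan to 8-bit greyscale or to an 8-bit adaptive palette, cutting the uncompressed size to one third.

    from PIL import Image

    img = Image.open("manuscript.tif")           # 24-bit RGB scan
    grey = img.convert("L")                      # 8-bit greyscale
    palette = img.quantize(colors=256)           # 8-bit adaptive palette

    grey.save("manuscript_grey.tif")
    palette.save("manuscript_256.tif")

    bytes_rgb = img.width * img.height * 3
    bytes_8bit = img.width * img.height
    print(f"uncompressed: {bytes_rgb:,} bytes (24-bit) vs {bytes_8bit:,} bytes (8-bit)")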

Compression: JPEG compression continues to gain ground as the standard for digitized photographs. It is superior to LZW, and in JFIF it has found a file format that almost all widespread graphics viewers now support. It is lossy, and that can be problematic where tonal information must be retained, but the degree of compression can be controlled to minimize unwanted degradation. JPEG also retains 24-bit and 32-bit color. Its main drawbacks are that it does not work well with 8-bit color and that it takes longer to decompress than other compression schemes. The second drawback is mitigated somewhat by the option some viewers offer of skipping steps in decompression and extrapolating from other data to speed up the process.
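A brief sketch of how the degree of compression can be controlled, once more using Pillow; the source file and the quality settings are illustrative only. Lower quality values give smaller files and more visible loss.

    import os
    from PIL import Image

    img = Image.open("scan_300dpi.tif")          # uncompressed 24-bit scan
    for quality in (95, 75, 50, 25):
        out = f"scan_q{quality}.jpg"
        img.save(out, "JPEG", quality=quality)   # lossy save at chosen quality
        print(f"quality {quality}: {os.path.getsize(out):,} bytes")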

There is, however, a promising new possibility for reducing file sizes even further. Fractal compression operates on the principle that basic formulas can be found to represent complex colors, patterns and shapes. This is particularly true of digitized natural images (i.e. not computer generated or drawn). Whereas JPEG compression gives good results at ratios of 10:1 to 20:1, fractal compression has similar success at ratios as high as 50:1 - a 4 meg image could be reduced to a mere 80k. There are other benefits: images compressed this way decompress more quickly than JPEG, they become scalable independent of resolution, and very large images show an even more dramatic decrease in size, since JPEG file sizes grow linearly with image size while fractal file sizes grow more slowly.

The problem with fractal compression is twofold. First, it is proprietary and offered by only one company (Iterated Systems). This is problematic for adoption in an image database, since all users must buy special software from that company before they can view the images. Second, compression is extremely slow. For long-term projects with ample computer resources this may pose little difficulty, but it is not well suited to fast-paced initiatives or quick updates.

In Conclusion: Image databases do not appear to face any insurmountable difficulties as far as graphics are concerned. Graphics as a communications medium is fairly new, and there are many issues of hardware and software that need to be worked out, but these are mainly not technological barriers - they are technology choices that people already have the ability to make, a matter of deciding on and adopting standards. In addition, compression and computer displays are improving rapidly, and 24-bit depths are becoming commonplace.

Drin Gyuk : ILS 603 : October 15, 1995

