Sunday, December 30, 2007

The Z axis

I have to remind myself that writing a blog does NOT mean that every post requires a specific word count; brief entries can be as useful and interesting as long ones.

Ah, hell, I'll remember that for the next post. This one has more than a few words, 'cause I have more than a couple of thoughts about, well, color.

The last couple of times I appeared on the Tech Night Owl Live, I brought up a topic I've been thinking about a lot lately - the unused "Z Axis" of current computer displays.

Back in the dark ages of the beginning of multimedia, 8-bit color palettes kept artists and programmers awake at night - the art of making color images look good with a constrained set of up to 256 colors was the difference between the average designer and the one who got the sweet gigs. Around 1989 or so, I plunked down $10,000 on a Howtek flatbed scanner which sported an optical resolution of 300 dpi and needed a GPIB interface plugged into an original Macintosh II. I used this boat anchor to scan a series of 8x10 chromes of cars for the Oldsmobile Consumer Computer, an ambitious touchscreen kiosk project that was one of the highlights of my NYC multimedia career.

In order to make the images look as good as they could with the "Mac II system palette", we tested a variety of color imaging software, including Color Studio, PixelPaint and some other obscure dithering applications. In the end, the software that provided the absolute best color reduction was a very early Photoshop, 0.36 if memory serves me correctly. It didn't have a tool palette yet, just some rudimentary menu items, but it did an amazing job of reducing the 24-bit scans to 8-bit images, which we then loaded into Hypercard using the VideoWorks II player software. Full color Hypercard, yesiree, which even fooled John Sculley (not a huge achievement in and of itself) and Jerry Pournelle (of Byte fame) into thinking that we somehow magically endowed Hypercard with glorious color.

I'll devote a future blog entry to the trials and tribulations of the "Netscape palette", "Mac & Windows system palettes" and the "web-safe 216 colors" - the bane of web designers until recent years, but now largely forgotten, and for good reason.

The main point is this: every computer that has shipped in the last, what, six years or so has at least 16-bit color, but more likely 24-bit color graphics. That means that everyone has a full range of deep, rich colors for displaying images and video. Good stuff.

Meanwhile, the often-neglected side of this situation is that every computer has lots of color available for interface design and implementation. But you would never know it by looking at the interfaces of most application software, which mostly look like they're stuck in the mid-'90s.

24-bit color is now the lowest common denominator. 24 bits of memory for each pixel work out to 16,777,216 colors, usually rounded off to "16.7 million". What's 77,216 colors when you've got millions, right? What are we doing with all these colors? Anything? Nothing?
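For the record, the "16.7 million" figure is just three 8-bit channels multiplied out - easy to verify:

```python
# 24-bit "true color": 8 bits each for red, green, and blue.
bits_per_channel = 8
channels = 3
total_bits = bits_per_channel * channels   # 24 bits per pixel
total_colors = 2 ** total_bits             # 16,777,216 distinct colors

print(f"{total_colors:,} colors")          # the "16.7 million" of marketing copy

# For comparison, the old constraints that kept designers up at night:
palette_8bit = 2 ** 8     # 256 colors in an indexed palette
colors_16bit = 2 ** 16    # 65,536 colors in "thousands of colors" mode
```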

Look at the desktop of your computer. It's got icons for applications, documents and folders. It probably has a cool - or dorky - image loaded as a backdrop. It's a flat, two-dimensional construct, with X and Y dimensions. If it's anything like my desktop, it's cluttered with all sorts of stuff, not unlike many real desktops.

Apple releases a new version of OS X, and one of the major new features is that the Dock can now contain "stacks" of icons, which look like a software rendition of the Leaning Tower of Pisa. This is the best that Apple can do for expanding the amount of virtual real estate for icons?

It's time that we turned on the Z Axis of the screen. While the X and Y dimensions of the screen are fixed, the Z dimension - depth - is wide open. You can move into the screen endlessly, without ever hitting the back of it, if software were designed to let you explore the virtual terrain. Imagine using the 24-bit color display in a way that takes advantage of the fact that files are time/date stamped by the OS. You've got documents on your desktop, and as time goes by and the documents remain unused, they start to recede into the background, getting smaller and dimmer until finally, they either vanish or are automatically placed in an "inactive file" folder, which softly blinks when it's starting to get too full.
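At bottom, the receding-documents idea is just an aging function: take a file's last-used timestamp and map elapsed time to a scale and opacity that shrink toward the back of the screen. A minimal sketch - the function name, the 14-day half-life, and the 5% "inactive" threshold are all placeholder choices of mine, not anything shipping in an OS:

```python
import time

def desktop_item_depth(last_used, now=None, half_life_days=14.0):
    """Map time since last use to a scale and opacity for a desktop item.

    Items recede exponentially along the virtual Z axis: after one
    half-life they render at half size and half opacity; once they fade
    below 5% they're flagged as candidates for the "inactive file" folder.
    """
    now = time.time() if now is None else now
    age_days = max(0.0, (now - last_used) / 86400.0)
    factor = 0.5 ** (age_days / half_life_days)   # exponential recession
    return {
        "scale": factor,            # 1.0 = sitting at the front of the screen
        "opacity": factor,
        "inactive": factor < 0.05,  # ready to be auto-filed
    }

# A freshly touched document sits at full size and brightness;
# one untouched for three months has all but vanished into the depth.
fresh = desktop_item_depth(last_used=time.time())
stale = desktop_item_depth(last_used=time.time() - 90 * 86400)
```

The exponential curve means nothing ever jumps abruptly - a document fades a little every day, which is exactly the gentle, glanceable cue the desktop metaphor is missing.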

Imagine editing a video clip by moving around it in 3D, and creating looped video by wrapping a clip back onto itself, Moebius-style. How about storing your files in a virtual representation of the world's deepest file cabinet, organized chronologically? Look at the Time Machine component of Apple's Leopard to get a good idea of what I'm talking about; it's a somewhat crude - but workable - version of what I've got in mind, but instead of being implemented in a modal fashion, I want it to always be there. I want to be able to store groups of related digital photos in clusters that I can push back "into" the screen, and where they can be retrieved by holding down a modifier key while scrolling on my Mighty Mouse ball. It works for Google Earth, why not my desktop? With the proliferation of virtual worlds such as Second Life, the essential metaphors are in place. Let's see how we can use the virtual depth of the screen as an organizational tool, as the equivalent of the Hole in the Pocket that Jeremy gives to Ringo in Yellow Submarine.
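If you wanted to prototype that modifier-key-plus-scroll gesture, the heart of it is a simple perspective mapping from a Z depth to an on-screen scale. A toy sketch, assuming nothing about any real windowing API - the class, the focal constant, and the gesture wiring are all hypothetical:

```python
class PhotoCluster:
    """A group of photos that can be pushed 'into' the screen.

    Scroll deltas (say, modifier key + scroll wheel, Google Earth-style)
    adjust the cluster's Z depth; apparent on-screen size follows simple
    perspective: scale = focal / (focal + z).
    """

    def __init__(self, name, focal=1.0):
        self.name = name
        self.focal = focal
        self.z = 0.0   # 0 = the frontmost plane of the desktop

    def scroll(self, delta):
        # Positive delta pushes the cluster deeper into the screen.
        # Z is clamped at 0 so nothing ever pops "out" of the display.
        self.z = max(0.0, self.z + delta)

    @property
    def apparent_scale(self):
        return self.focal / (self.focal + self.z)

cluster = PhotoCluster("Summer chromes")
cluster.scroll(3.0)          # push the cluster back into the depth
pushed = cluster.apparent_scale   # a quarter of its front-plane size
cluster.scroll(-10.0)        # scroll it all the way forward again
front = cluster.apparent_scale    # back to full size
```

The nice property of the perspective formula is that depth is unbounded - you can keep scrolling "in" forever, which is exactly the endless back-of-the-screen the Z Axis promises.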

24-bit color means that every shade of every color is there for the taking. We've yet to see true color coding at the OS level - a handful of color labels is nice, but we're talking transparency, subtle shading and true dimming, all used to clean up our acts.

And don't get me started on all that CPU power sitting there unused most of the time, and how it could make itself useful by keeping a watchful eye on you and how you interact with those files crowding that desktop.

We'll delve into that topic shortly.

3 comments:

Anonymous said...

Sounds cool, let's see a mockup!

John Worthington said...

There were actually some very interesting interfaces for video editing that got prototyped as part of the original QuickTime development. They got killed when SuperMac threatened to drop all of their Mac development if Apple released a video editor that would compete with their forthcoming app - Premiere.

Gene said...

It appears that you are virtually alone talking about this topic, which is highly unfortunate. This is the sort of dialogue that ought to be going on in Apple's development labs, but I'm not betting on it.