Monday, December 31, 2007

And on a completely different note...



James Brown didn't have a drinking problem - no, it was more like an alcohol solution.

Sunday, December 30, 2007

An Innovative New Interface




This is a fascinating, original approach to taking audio synthesis in a new direction, utilizing a set of physical instrumentation devices that interact with their host surface. Learn more about the project and team.

A True American Hero

Some of us long for leaders who sound like this man, a true hero, a thoughtful, intelligent human being with absolute integrity and honesty.

The Z axis

I have to remind myself that writing a blog does NOT mean that every post requires a specific word count; brief entries can be as useful and interesting as long ones.

Ah, hell, I'll remember that for the next post. This one has more than a few words, 'cause I have more than a couple of thoughts about, well, color.

The last couple of times I appeared on the Tech Night Owl Live, I brought up a topic I've been thinking about a lot lately - the unused "Z Axis" of current computer displays.

Back in the dark ages of the beginning of multimedia, 8-bit color palettes kept artists and programmers awake at night - the art of making color images look good with a constrained set of up to 256 colors was the difference between the average designer and the one who got the sweet gigs. Around 1989 or so, I plunked down $10,000 on a Howtek flatbed scanner which sported an optical resolution of 300 dpi and needed a GPIB interface plugged into an original Macintosh II. I used this boat anchor to scan a series of 8x10 chromes of cars for the Oldsmobile Consumer Computer, an ambitious touchscreen kiosk project that was one of the highlights of my NYC multimedia career. In order to make the images look as good as they could with the "Mac II system palette", we tested a variety of color imaging software, including Color Studio, PixelPaint and some other obscure dithering applications. In the end, the software that provided the absolute best color reduction was a very early Photoshop, 0.36 if memory serves me correctly. It didn't have a tool palette yet, just some rudimentary menu items, but it did an amazing job of reducing the 24-bit scans to 8-bit images, which we then loaded into HyperCard using the VideoWorks II player software. Full color HyperCard, yesiree, which even fooled John Sculley (not a huge achievement in and of itself) and Jerry Pournelle (of Byte fame) into thinking that we somehow magically endowed HyperCard with glorious color.
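The error-diffusion trick those dithering tools leaned on is simple enough to sketch. Here's a toy version of classic Floyd-Steinberg dithering - reduced to its grayscale, black-and-white case, and certainly not what Photoshop 0.36 actually did, just the general idea those era's tools ran per channel against a 256-color palette:

```python
def floyd_steinberg_1bit(gray):
    """Dither a grayscale image (rows of 0.0-1.0 floats) down to pure
    black (0) and white (1) using Floyd-Steinberg error diffusion."""
    h, w = len(gray), len(gray[0])
    img = [row[:] for row in gray]          # work on a copy
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            old = img[y][x]
            new = 1 if old >= 0.5 else 0
            out[y][x] = new
            err = old - new
            # Push the quantization error onto unvisited neighbors
            if x + 1 < w:
                img[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1][x - 1] += err * 3 / 16
                img[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1][x + 1] += err * 1 / 16
    return out
```

A flat 50% gray comes out as an alternating checkerboard of black and white - the eye averages it back to gray, which is the whole point.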

I'll devote a future blog entry to the trials and tribulations of the "Netscape palette", "Mac & Windows system palettes" and the "web-safe 216 colors", the bane of web designers up to recent years, but now largely forgotten, and for good reason.
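For the record, there was nothing mystical about those 216 colors: six evenly spaced levels per channel, 6 × 6 × 6 = 216. A quick sketch of the snapping, as I understand it:

```python
def to_web_safe(r, g, b):
    """Snap each 0-255 channel to the nearest multiple of 51
    (0, 51, 102, 153, 204, 255), landing on the 6 x 6 x 6 = 216-color
    "web-safe" cube."""
    snap = lambda v: round(v / 51) * 51
    return (snap(r), snap(g), snap(b))
```

So a color like (200, 100, 30) lands on (204, 102, 51) - close enough for 1996, and mercifully irrelevant today.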

The main point is this: every computer that has shipped in the last, what, 6 years or so, has at least 16-bit color, but more likely 24-bit color graphics. That means that everyone has a full range of deep, rich colors for displaying images and video. Good stuff.

Meanwhile, the often neglected side of this situation is that every computer has lots of color available for interface design and implementation. But you would never know this by looking at the interfaces for most applications software, which mostly look like they're stuck in the mid-90s.

24-bit color is now the lowest common denominator. 24 bits of memory for each pixel works out to 16,777,216 colors, and is usually rounded off to "16.7 million". What's 77,216 colors when you've got millions, right? What are we doing with all these colors? Anything? Nothing?
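The arithmetic, for anyone counting along:

```python
# "24-bit color" is 8 bits each for red, green and blue.
bits_per_channel = 8
channels = 3
total_colors = 2 ** (bits_per_channel * channels)
print(total_colors)  # 16777216 - the "16.7 million" of the spec sheets
```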

Look at the desktop of your computer. It's got icons for applications, documents and folders. It probably has a cool - or dorky - image loaded as a backdrop. It's a flat, two-dimensional construct, with X and Y dimensions. If it's anything like my desktop, it's cluttered with all sorts of stuff, not unlike many real desktops.

Apple releases a new version of OS X and one of the major new features is that the Dock can now contain "stacks" of icons, which look like a software rendition of the Leaning Tower of Pisa. This is the best that Apple can do for expanding the amount of virtual real estate for icons?

It's time that we turned on the Z Axis of the screen. While the X and Y dimensions of the screen are fixed, the Z dimension - depth - is wide open. You could move into the screen endlessly, without ever hitting the back of it, if software were designed to let you explore the virtual terrain. Imagine using the 24-bit color display in a way that takes advantage of the fact that files are time/date stamped by the OS. You've got documents on your desktop, and as time goes by and the documents remain unused, they start to recede into the background, getting smaller and dimmer until finally, they either vanish or are automatically placed in an "inactive file" folder, which softly blinks when it's starting to get too full.
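To make the idea concrete, here's a hypothetical sketch - every name in it is my own invention, not any shipping API - of how a desktop could derive an icon's Z depth from how long a file has sat idle:

```python
import time

def icon_depth(last_used_epoch, now=None, horizon_days=30.0):
    """Map a file's last-used timestamp to a depth: 0.0 = front of the
    screen, 1.0 = fully receded after `horizon_days` of neglect."""
    now = time.time() if now is None else now
    idle_days = max(0.0, (now - last_used_epoch) / 86400.0)
    return min(1.0, idle_days / horizon_days)

def icon_scale_and_alpha(depth):
    # Shrink and dim the icon as it recedes along the Z axis; at full
    # depth it's drawn at 40% size and 20% opacity, ripe for auto-filing.
    return 1.0 - 0.6 * depth, 1.0 - 0.8 * depth
```

A real implementation would hang off the OS file metadata (last-access times and the like), but the mapping itself is this trivial - the hard part has always been getting an OS vendor to care.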

Imagine editing a video clip by moving around it in 3D, and creating looped video by wrapping a clip back onto itself, Moebius-style. How about storing your files in a virtual representation of the world's deepest file cabinet, organized chronologically? Look at the Time Machine component of Apple's Leopard to get a good idea of what I'm talking about; it's a somewhat crude - but workable - version of what I've got in mind, but instead of being implemented in a modal fashion, I want it to always be there. I want to be able to store groups of related digital photos in clusters that I can push back "into" the screen, and where they can be retrieved by holding down a modifier key while scrolling on my Mighty Mouse ball. It works for Google Earth, why not my desktop? With the proliferation of virtual worlds such as Second Life, the essential metaphors are in place. Let's see how we can use the virtual depth of the screen as an organizational tool, as the equivalent of the Hole in the Pocket that Jeremy gives to Ringo in Yellow Submarine.




24-bit color means that every shade of every color is there for the taking. We've yet to see true color coding at the OS level - a handful of color labels is nice, but we're talking transparency, subtle shading and true dimming, all used to clean up our acts.

And don't get me started on all that CPU power sitting there unused most of the time, and how it could make itself useful by keeping a watchful eye on you and how you interact with those files crowding that desktop.

We'll delve into that topic shortly.

Sunday, November 4, 2007

The Mozart of Creative Software

In a hundred years, when historians look back on the early development of computer technology, they will see things very differently than we do, and will likely appreciate things that don't appear on our current radars. When Van Gogh was doing some of his best work, it was the support of his brother that kept him from completely losing his sense of self. History has treated Van Gogh far more kindly than his contemporaries did, and it's quite possible that he would have been both deeply gratified and furious at the way that the value of his artwork skyrocketed - not that it had any impact on his own rather frugal and difficult life. Like so many great artists, Van Gogh was only truly appreciated after he left the planet - a sad, unfortunate way to treat creativity. My buddy Paul Mavrides suffers from the same reality - his visual work stands alone. He is the single most amazing artist I've ever had the pleasure of calling friend, and his impact and talent will be truly appreciated two hundred years from now. It makes me sad beyond words.

The world of software development is often a relatively anonymous one, where teams of people gather together and work with one another, while being managed by other teams of people who, in turn, must answer to managers, account executives and other assorted power players. Lots of software ends up being designed by committee, and let me tell you, it shows. But you already knew that, if you've ever launched Microsoft Word.

There are exceptions to this rule, of course, usually involving pairs of personalities, two humans bound together by serendipity and circumstance. As far as bitmapped graphics software goes, there are three pairs of deeply talented geniuses who, in essence, created the industry for image editing software on personal computers. Mark Zimmer and Tom Hedges produced The Realist, which later became ImageStudio and ultimately ColorStudio. They also cooked up the original version of Fractal Painter. Then there's Keith McGreggor and Jerry Harris, the dynamic duo behind the rather wonderful PixelPaint. It all culminated with the brothers who changed the world, Tom and John Knoll, and their historic garage project, Photoshop. I'll be writing about these folks in future blog posts, as I was an integral part of all three efforts. Lots of stories, some good, some great, and some which will leave you scratching your head and wondering why certain people made amazingly bad decisions.

But this entry will look at one single person, someone who has never really been recognized for his awesome (and I don't use that word lightly) contributions to the software industry. When future software engineers look to the past for inspiration and understanding, they will all talk about the one crazy Frenchman who could only march to the beat of his own drummer.

His name is Eric Wenger. He is the Mozart of the software world.

You might not know the name, you may not be familiar with the fruits of his coding prowess. If you're a friend of mine, you've heard me sing his praises. And for good reason - Wenger has consistently been the lead vision behind some of the most creative applications to ever see the light of day. He is a one-of-a-kind, simply unreal genius, an artist, musician and tool maker unlike any other. His creations have been a large part of the reason that I stay interested in the field of software design.

I first encountered Eric while I was at the original MacUser magazine, back in the days when Felix Dennis ruled the roost and Steven Bobker locked himself in his office for days at a time. A box arrived at the office, with a cryptic cover and the words "Art Mixer". As I was the resident graphics guru, it was dropped on my desk without any explanation. I opened that box, installed the software from the multiple floppy disks, and spent the next few hours clicking, grinning and gasping, absolutely and totally amazed by what I was seeing. Art Mixer could do things that remain unmatched by any software - 3D paintbrush strokes with depth prioritization, nested graphics documents with realtime, dynamic links, 3D image mapping (in 1986!), and lots of other totally unique and utterly wild stuff. I proceeded to spend some of Felix's money on a long distance call to Paris, and spoke to Eric at length. His English was better than my French - an easy thing to accomplish - and we spoke for a couple of hours. He was pleased that I understood what he had done with Art Mixer, and I was blown away by his creativity. We stayed in touch over the years - mind you, these were the days before the Internet, so email was not quite as universal as today, but I resigned myself to big phone bills, economic considerations thrown to the wind.

I first met Eric in person at a MacWorld Expo, and it was instant friendship. He is a charming, brilliant and irreverent individual, and that last attribute deeply endeared him to me. I think he appreciated my enthusiasm and interest in his work. He mentioned that he was dipping his toes in serious 3D for the first time, and mailed me a disk with an early version of his first 3D modeling and rendering package.

It was called Bryce. His first 3D program, and it was a planet builder. It was clear that Eric was on another planet, a better one than ours.

I fell into Bryce instantly and completely, and the thing that really hooked me was the Deep Texture Editor. I had never seen anything quite like it; the power behind it and the range of bizarre and luscious textures it could produce blew my mind wide open. I would place a camera inside of a semi-translucent sphere, spend hours tweaking the ABC components of the procedural noises, and stare at the rendering process with a child-like wonder that made me happy. I was hooked. This was fun in a way that software doesn't often feel like - I wanted to wrap my brain around it for hours, days if left to my own devices. And I'll tell you a secret - Eric has a version of Bryce that renders images way faster than the commercial releases, and lets him attach sounds to objects in a scene, and render animations with the surround sound audio tracks, complete with corresponding Doppler effects - as you zoom by objects, they increase and decrease in pitch. Very cool stuff.

A year or two later, I met a guy named Kai Krause at the TED 3 conference in Monterey (Adobe had invited me to do some demos in a room they had at the event). He introduced me to a longtime hero of mine, Roger Dean, and in exchange, I told Kai about this amazing guy I knew in Paris. That's how Kai got ahold of Eric, ironically. It was my fault.

Bryce was released by Kai's software company, and that provided Eric with an income, and the inspiration to head off in new directions. The thing about Wenger is that he is the kind of person who does not want to learn anyone else's way of working, he's too impatient to master an interface that does not work along the lines of his own thoughts and creative process. He'd rather make his own tools. He's an artist who makes his own brushes, mixes his own paints, finds canvases that are made of exotic materials, coming up with stuff that no one else has ever considered.

From this mind, we have MetaSynth. And ArtMatic. Then there's VTrack. And Videodelic. And others which may or may not come back to life. Only Wenger knows.

Eric does not make software that falls squarely into existing categories. He creates entirely new metaphors, original and innovative approaches that no one else seems to ever consider. In the world of sound design, MetaSynth is a secret weapon. No one will admit to using it, for fear that their competition will find out about it and add it to their own arsenals. At the most basic level, it converts images to sound, putting an original spin on the player piano roll and extending it into other dimensions. It's a synthesizer that looks at a 2D image and considers every pixel a discrete oscillator. Most synths have one, two, maybe three oscillators. MetaSynth has infinite oscillators. Just about anything can be an oscillator: complex wavetables, samples, granular audio generators, jelly sandwiches, anything. The Image Filter is an Ultimate Audio Monster, terrifying and gorgeous.
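The pixels-as-oscillators idea is simple enough to sketch. This toy version - my own reduction, emphatically not Wenger's actual engine - scans a grayscale image left to right, treating each row as a sine oscillator whose amplitude tracks pixel brightness:

```python
import math

def sonify(image, duration=1.0, rate=8000, f_lo=110.0, f_hi=880.0):
    """image: rows of 0.0-1.0 brightness values, top row = highest pitch.
    Returns mono float samples in -1.0..1.0, one sine oscillator per row,
    advancing across the image columns over `duration` seconds."""
    rows, cols = len(image), len(image[0])
    n = int(duration * rate)
    samples = []
    for i in range(n):
        t = i / rate
        col = min(cols - 1, cols * i // n)      # which column we're under
        s = 0.0
        for r, row in enumerate(image):
            frac = 1.0 - r / max(1, rows - 1)    # 1.0 at top, 0.0 at bottom
            freq = f_lo * (f_hi / f_lo) ** frac  # log-spaced pitches
            if row[col]:
                s += row[col] * math.sin(2 * math.pi * freq * t)
        samples.append(s / rows)
    return samples
```

Feed it the luminance channel of a real image and write the samples out with the standard `wave` module, and you have a crude player-piano synth. MetaSynth does unimaginably more, but that's the seed of it.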

MetaSynth is totally unique, capable of conjuring sounds like nothing else on this planet, and there is absolutely no equivalent, no competitor, nada. You've heard it in movies and music, but you'd never know it. It's an 8000 pound invisible gorilla. It stomps on the terra and emits the most outrageous aural entities that anyone could ever imagine.

Inspired, Wenger decides that he needs a source of wacky, complex images to feed to MetaSynth, something designed to make procedural textures that he can convert to sound. Forget Photoshop plugins, Eric wants something that fits his uniquely warped mind. He comes up with the single most terrifying, mind-bending, immersive visual synthesizer ever conceived, ArtMatic, and adds its insane, phase-modulated animation abilities almost as an afterthought. Ever since first touching this algorithmic beast, my life has never been the same - ArtMatic is the one program I can absolutely lose myself in at any moment; it's the ultimate high in software psychedelia, and there's nothing like it anywhere else in the galaxy of applications. It's become one of my own secret weapons, and I'm happy to admit it on this blog. I've rendered High Definition animation with it that looks like nothing else you've ever seen, and I've cooked up a luscious procedural texture that is 32,000 by 32,000 pixels in size. At 72 dpi, it's a whopping 37 feet on a side.

People buy Macintoshes in order to run Wenger's software. He's never bothered to port his code to Windows, and it seems like he never will, or let anyone else do the port. Apple barely knows who he is, and has never tried to support his coding efforts in any serious fashion. It's not like Wenger is concerned about Apple's lack of interest in his creative output - he doesn't seem to care much about commerce, he could give a damn about publicity, he's a terrible businessman. I know that his business partner will likely send me a scathing email if he reads these words, but it's the honest, unvarnished truth. Eric makes tools for himself, and if you want to go along for the ride, great. But don't expect software design by focus groups here, Eric knows what he wants and that's all he really cares about. If you can't figure out his software, it's not his fault. You'll need to look at the world through his eyes in order to get the best use of his software.

Eric is Van Gogh with a mouse, he's Mozart with a MIDI keyboard. He's the most talented software artist in the world, and the implications of his work will only truly be understood many years from now. Without his contributions, the world of creative software would be a much poorer place. You won't find much about this man on the Internet, but I suspect history will treat him better than we did. One day, I will visit him in Paris and be amazed at whatever he's made since the last time we laughed together. Eric, thank you for your creativity and your wonderful mind. You made the fields of graphics, animation and audio creation more interesting, and you've shown us what software can be when it's the product of a single, strong creative vision.

First Post - Long Live the PC!

My first post on Analog Digits will serve to set the stage and tone for this blog, as I have had a longtime fascination with the ineptitude of the "technology press", especially as it exists in mainstream media. The folks who are paid to cover the tech beat seem largely clueless when it comes to actually understanding any kind of objective reality regarding how and why people use specific forms of technology.

The Associated Press released a story about how PC sales are in decline in Japan, supposedly due to the continued proliferation of dedicated devices such as cell phones, video game consoles, camcorders and other vertical digital gear. The assumption is that the recent slowdown in the volume of new computer purchases is indicative of the upcoming demise of the desktop computer.

According to an IDC "analyst", Masahiro Katayama, it's game over for PCs.

"Consumers aren't impressed anymore with bigger hard drives or faster processors. That's not as exciting as a bigger TV," Katayama said. "And in Japan, kids now grow up using mobile phones, not PCs. The future of PCs isn't bright."

Here's my take on this: for the average consumer, the current generation of personal computers has reached a point where the speed is simply good enough for the kinds of things they'd like to do - cruising the web, word processing, digital photo editing, listening to music, and anything else that falls under the umbrella of consumer applications. Folks who enjoy video games have long known that dedicated hardware circuitry - such as chipsets devoted to 3D rendering - is always going to provide a better gaming experience than a general-purpose PC, and the best games are typically released for dedicated gaming systems.

The notion that a mobile phone is somehow going to displace or replace a PC is patent nonsense, and for one simple reason - applications. People don't use computers, they use the applications that run on computers. Is there a possibility that Photoshop will one day run on a handheld device, and deliver all of the power and flexibility that can be achieved on the desktop? I don't think so. Screen real estate is already an issue for folks trying to do image editing on smaller laptop screens, and we're not even going to discuss the concept of video editing on anything smaller than a 15" display. Anyone who has ever spent time in Adobe After Effects knows that the best single addition for getting the most from that software is a dedicated display for the timeline window (and folks involved in the days of creating interactive multimedia with VideoWorks/Director probably remember the joy they felt when they saw the Score window on a separate monitor for the very first time). The power that can be put into a handheld device is constrained by scale, and this means that for the foreseeable future, we're not going to have applications on mobile devices that can vaguely match their desktop counterparts.

All of these cool miniaturized gadgets are great, but let's remember that they are all designed using CAD software, on the fastest desktop machines money can buy. If you create music, spend time rendering complex 3D animated sequences, edit and create HD video or build websites from scratch, you already know that there will never be enough power in that CPU to truly make you feel satisfied and complete. You already need a faster machine, and it's quite likely that you'll tap that next machine out within a year, after laying down 15 tracks with Logic Studio and the Sculpture synthesizer. You freeze tracks all day long, and it makes you wonder when they'll be able to put a 20 core chip in your desktop without emptying out your bank account. Musicians want more tracks, more inserts, more instances of digital delays, reverbs and instruments. There is never enough processing power for making music.

The world needs producers as much as, if not more than, consumers. Someone has to write, orchestrate and record all that music that you download from iTunes. Someone is busy right now, as you read these words, coming up with the next 3D animated creature that will make you drop $10 on a movie ticket and another $10 on popcorn and a drink (and ultimately, another $20 on the DVD). The next great miracle drug that will save your life one day is going to be the result of great minds using powerful computers to model the molecular bonds and interactions that make up a specific compound that will heal the sick cells in your body. Medical researchers may use cell phones to talk to their peers, but they rely on their computers to sift through CAT scans and MRI images.

The great promise of technology is that we can all take charge of our destinies, and become active producers instead of passive consumers. Many years ago, Alvin Toffler came up with the idea of the prosumer, which has now come true to a good degree. The principal idea behind "Web 2.0" is that the content of a site is generated by the actual community, creating a productive, positive feedback loop. The loop is essential in a healthy technology ecosystem - the producers drive the innovations, and the consumers provide a venue in which the usefulness and potential success of said innovations can be judged by the marketplace.

The consumer world is indeed ruled by an internal set of assumptions and parameters that are different from the realm of the producer. Regardless, it's the producers who will guide the future, with the consumers falling into line based on the marketing manipulations that create the desire for products that will make them happy (or simply distract them for a few moments). I'm not really happy about this particular state of affairs, and deep down inside, I want people to realize that there's an amazing array of opportunities for them to express their creativity with the truly awesome power of the applications which exist on desktop computers. In future blog entries, I'll be writing about many of these applications, and hopefully, anyone reading about Groboto on their iPhone will run home and download it for use on their iMac. Should we set our sights on a future where we'll be able to use fully-loaded applications on handheld devices? Sure, but that will require an entirely new type of display technology that looks like nothing we have today, as well as input devices which tap directly into our minds. None of this is likely in the near future. Someday, possibly, but not soon. And by that time, the amount of sheer power that will live on a desktop, well, now there's a compelling thought: perhaps we'll have an operating system that watches us, learns about how we work, and anticipates what we'll be doing in the next minute, or next hour, and filters out everything that's in the way. Most of the power of your computer lies dormant at any given moment - it'll be interesting to see how that unused power is put to work, behind the scenes, to make your computer truly easier to use.

The desktop computer is here to stay. It's an infinitely configurable, highly flexible, insanely customizable symbolic manipulation device, capable of blending together all relevant forms of human communications media. A mobile phone might let you listen to music and watch television, but if you want to actually make the media being consumed by the masses, you'll need the real deal, the desktop computer. Accept no substitutes.