10/GUI is an attempt to rethink the GUI, particularly mouse operation, around a touch-based input device. Instead of a touch screen, it uses a separate touch-pad (like an artist’s tablet).
Their GUI makes me squirm a little: it’s not what I’m used to, but I can just about see it. The Amiga user in me wishes they used vertical panning instead of horizontal…
I just don’t think it goes far enough. The paradigm shift we need is a two-way, touch-based input device. “Touch” is a two-way interaction: it is both the application of pressure and the sensing of it.
What the tablet needs is tactile output: the ability to raise pixels on its surface — like a Haptic Reader.
What I’m talking about is a tablet surface capable of raising “pixels” slightly so that you can form simple lines, edges, surfaces, textures. Our fingertips are incredibly good at feeling that kinda detail.
Consider 10/GUI’s fader example. If my input tablet can produce subtle surface changes that outline the faders for me, I don’t need them displayed on-screen. The same device can now double as a keyboard for touch typists. Non-touch typists can use an on-screen keyboard display and be spared the awkward hunt-the-key routine that makes them look away from the screen. Blind users would be able to interact with a computer just as efficiently as sighted users.
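To make the fader idea concrete, here is a toy sketch of what “raised pixels” might mean in software: the outlines of three vertical faders rendered as raised cells on a grid, so a fingertip could trace their edges without looking at the screen. Everything here (the grid size, the fader positions, the `TactileGrid` class itself) is invented for illustration; no such hardware API exists.

```python
class TactileGrid:
    """A grid of tactile pixels that are either raised (True) or flat (False)."""

    def __init__(self, width, height):
        self.width, self.height = width, height
        self.raised = [[False] * width for _ in range(height)]

    def raise_rect_outline(self, x, y, w, h):
        """Raise only the border pixels of a rectangle, leaving the
        interior flat -- a traceable edge, not a solid bump."""
        for col in range(x, x + w):
            self.raised[y][col] = True
            self.raised[y + h - 1][col] = True
        for row in range(y, y + h):
            self.raised[row][x] = True
            self.raised[row][x + w - 1] = True

    def render(self):
        """ASCII preview: '#' marks a raised pixel, '.' a flat one."""
        return "\n".join(
            "".join("#" if cell else "." for cell in row)
            for row in self.raised
        )

# Three faders side by side, as in the 10/GUI demo.
grid = TactileGrid(24, 10)
for i in range(3):
    grid.raise_rect_outline(x=2 + i * 8, y=1, w=4, h=8)
print(grid.render())
```

The point of the outline-only rendering is that the interior stays flat: the finger finds the fader by its edges, and the edge itself is the affordance.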
And since it is a “soft” keyboard, no language issues! The user can use whatever layout suits them, can have the keys organized in any way they like.
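A quick sketch of why a soft keyboard sidesteps layout issues: the physical touch zones stay fixed while the zone-to-character mapping is just data that can be swapped out per user. The layouts and zone numbering below are invented for illustration (only four home-row keys are shown).

```python
# Hypothetical zone-to-character maps for two layouts; zone IDs are
# arbitrary indices into the tablet's fixed touch regions.
QWERTY_HOME = {0: "a", 1: "s", 2: "d", 3: "f"}
DVORAK_HOME = {0: "a", 1: "o", 2: "e", 3: "u"}

def type_zones(zones, layout):
    """Translate a sequence of touched zones into text under a layout."""
    return "".join(layout.get(z, "?") for z in zones)

taps = [0, 1, 2, 3]
print(type_zones(taps, QWERTY_HOME))  # the same touches...
print(type_zones(taps, DVORAK_HOME))  # ...produce different text per layout
```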
I think 10/GUI is off to a good start, but the real winning concept is going to be that last step: providing feedback through the input device.