Honestly... all I can think about is not that she found a good solution to her problem, but that she "achieved the same level of accuracy" (and, I suppose, speed).
This is not a story about her capabilities, but about the limitations of our current input devices with respect to our hands.
It is proof that using a touchpad with your nose is no worse than using it with your hand. Something is wrong here: try using any real-world interface with your hand and notice how the shape, stiffness, and flexibility of every handle, pen, button, or spring you interact with feed you information and let you operate with a superior kind of awareness.
Or you can look at it from an optimistic, almost narcissistic point of view and say that the interfaces we have are so well-designed, either by accident or on purpose, that the same level of accuracy can be achieved with whatever appendage.
Personally, I think we stumbled onto a design that works well after years of trials. When you look at children using tablets, for example, you quickly realize that they don't have the same dexterity as adults, yet they are able to use the device with almost the same capability as adults. With tablets in particular, I don't think this was an accident...
We could make input devices and interaction paradigms that are highly specific to one domain, usable only by certain people because of learning curves or physical limitations, or input devices that can be used by anyone at some sacrifice in speed and productivity.
That's a false dichotomy, saying that it's one or the other.
We could have much higher precision and add highly sensitive pressure detection. For most tasks, touch-sensitive surfaces would feel identical and be equally easy to get into, except that they would allow for much higher ceilings of mastery and skill than they do now.
>This is not a story about her capabilities, but about the limitations of our current input devices with respect to our hands.
I think it's more a story about our adaptability in general.
Many years ago, I worked with a developer who happened to be blind. While coding, he used an audio screen reader. If you've ever heard one, you know that they can be set to read incredibly fast, such that it sounds like absolute gibberish to the untrained ear.
So, whereas I couldn't even discern the words/characters being read, it made absolute sense to him. Much as I was amazed by it, I also know that he didn't start there. Years of practice, finding shortcuts, etc. made it as much second nature to him as reading code on the monitor was for me.
Of course, it also didn't hurt that he was brilliant.
A few years ago I took a tech support call from a guy who was visually impaired and had voice over turned on. It read incredibly fast, he had no problems following instructions, and was familiar enough with what we were doing to make the call easy for both of us.
I also used to work as a personal assistant for a guy in a wheelchair who used Dragon Speakeasy; that was back in 2005 or '06, I think, and I was surprised by how well he was able to use his PC. He also had a page-turning machine that reliably turned one page at a time.
This also reminds me of people who ride bikes using echolocation.[1]
I wonder if it's possible to generate a sound stream from video, and what sort of resolution and colour depth could be encoded into the sound.
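For what it's worth, image-to-sound mapping has been tried: one common scheme scans an image column by column over time, maps each row to a sine frequency (higher rows = higher pitch), and uses pixel brightness as that sine's amplitude. Here's a minimal, hypothetical sketch of that idea for a single grayscale frame; the function name, parameters, and frequency range are all my own choices, not any particular system's:

```python
import math

def image_to_audio(image, duration=1.0, sample_rate=8000,
                   f_min=200.0, f_max=2000.0):
    """Sonify a grayscale image (list of rows, brightness 0..1).

    Columns are scanned left to right over `duration` seconds.
    Each row gets its own sine frequency (top row = highest pitch),
    and a pixel's brightness sets that sine's amplitude.
    Returns a list of float samples roughly in [-1, 1].
    """
    n_rows = len(image)
    n_cols = len(image[0])
    samples_per_col = int(duration * sample_rate / n_cols)
    # One log-spaced frequency per row, top row highest.
    freqs = [f_min * (f_max / f_min) ** ((n_rows - 1 - r) / max(n_rows - 1, 1))
             for r in range(n_rows)]
    audio = []
    t = 0
    for c in range(n_cols):
        column = [image[r][c] for r in range(n_rows)]
        for _ in range(samples_per_col):
            s = sum(b * math.sin(2 * math.pi * f * t / sample_rate)
                    for b, f in zip(column, freqs))
            audio.append(s / n_rows)  # keep the mix within [-1, 1]
            t += 1
    return audio
```

The resolution question then becomes a bandwidth question: with this scheme you trade image height against how many distinguishable pitches fit in the audible range, and image width against scan time, so the achievable "resolution" is bounded long before colour depth even enters the picture.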