It’s rude not to point
Ok here I am typing on the onscreen keyboard of a Tablet PC*. A first for me.
The good folks at Tegatech Australia kindly sent me a Tablet on loan to have a bit of a play and see how someone without the use of their hands might go using a Tablet PC. As they put it, “All well and good to call it intuitive when everything (on us) works, but I reckon it’s important for you to have a say”. That expresses the kind of attitude I like and find helpful.
So the Tablet arrived: a Sahara Pen Slate PC with an Intel Pentium M 1.3 GHz and 512 MB RAM, running Windows XP Tablet PC Edition 2005, Version 2002, Service Pack 2.
[Fig. 1] – Sahara Pen Slate PC Specifications
As this Tablet had the type of screen requiring the use of a special pointer ‘pen’, Wacom style, the first task was to attach the pen to my ‘standard’ input method, a mouthstick. This was quickly done by a helper with a couple of rubber bands.
[Fig. 2] – Tablet PC pen [top], PDA Stylus [inset]
The resultant ‘duo-stick’ was a tad heavy, a lot more noticeable than the small stylus I have permanently affixed to my ‘stick’ at home for use on my PDA.
The next thing was to position the Tablet where I could reach it on my desk and at an angle, as I found I couldn’t reach the top in Portrait configuration. I managed to find how to configure the screen to turn it into Landscape mode, which proved better. Getting the angle right was a trick – too flat and I couldn’t make the pen work as it was on too much of an angle, too steep and it just didn’t match the ‘peck’ angle of my typing. Even in landscape, reaching the top of the Tablet proved a bit of a stretch. I always found myself stretching to grab a window and drag it down the screen where I could reach it and give the pen a better operating angle. Tapping and dragging came easily. I found it much simpler to drag windows around using the ‘pen’ than a trackball. There are fewer clicks involved.
[Fig. 3] – Angled tablet, need landscape
[Fig. 4] – View over the shoulder showing operation
I watched one of the Tutorials and discovered how the input bar operated. I went for the onscreen keyboard (osk) and opened Wordpad to test it out. I’m used to onscreen keyboards, using them periodically on my PDA and when operating my computer in bed. The experience is different, however, between using a ‘stick’ directly on a screen and using a stick to control a trackball that moves the cursor to point at an osk. I found myself straight away hunting for a ‘click’ sound on the keyboard. I found the option in the Control Panel under the Tablet Properties. That improved the experience of typing.
[Fig. 5] – Properties showing pen options
This issue of feedback is one I’ve never heard anyone talk about, and one that anyone with all their senses working would not necessarily think about; however, it’s one I come up against in many areas of my life, not just computers. Because I have no ‘feel’ sensory feedback, I find it far better to have some other form of feedback for operating buttons/keys. Usually this means auditory feedback, so I can actually hear I’ve actuated something. Sometimes a visual cue is okay too, such that I can see it move. While the Tablet pen did make a faint sound when ‘tapping’ the screen, having the ‘click’ sound was so much better.
More Input, Stephanie!
Next I turned my attention to using the handwriting recognition part of the input bar.
Again, I was used to this idea as I use the transcriber function often on my PDA. The big difference I noticed here was that on the Tablet you have to write on the input bar, whereas on my PDA I scribble on the screen – something I sometimes found myself trying to do on the Tablet. It would be good to see the Tablet and Windows Mobile platforms have a bit more consistency. This is the one killer area that Apple have always done well: consistency across applications.
I quickly got into the groove of what I’ll call ‘scrawling’. The weight of the pen made it a little more cumbersome than I would’ve liked, but the ability of the software to recognise my ‘scrawling’ was amazing. I found I could even write in a manner where I left the pen on the screen and joined the letters in a semi-cursive style.
[Fig. 6] – Good ‘scrawl’ recognition
I’m not sure I was any faster writing (penning?), ‘scrawling’, than typing on the screen initially, but I’ve been so used to not being able to write for so long that maybe typing has become more ‘normal’ to me. I was certainly getting quicker the more I used it. Even though I was using my mouth, it somehow felt more ‘natural’ to ‘write’, and I’d imagine anyone who can write would find it intuitive. I also think it would be a boon to those new to computers who break into a confusion of fear when faced with a computer keyboard.
[Fig. 7] – Recognition close-up
I can imagine taking notes with the writing part of the input bar would be an advantage over using a keyboard. It’s a quick way to get stuff input as long as you’re not concerned with correcting as you go. In that regard I found the experience a bit like voice recognition, where the idea is to use the method as a way to get data into a document quickly and correct later.
One interesting thing I discovered was that instead of hitting the “Insert” button to put the ‘scrawl’ into the document, I could just ‘tap’ on the document. Mind you, it also sometimes set the cursor in the middle of a sentence doing that, so maybe it wasn’t such a great discovery.
Just say the word
I noticed that this Tablet has voice recognition, and from the input bar you can switch between command mode and dictation mode. I gave it a go using the inbuilt mic. That was a bit of a disaster, as the noise of the computer ‘clicking’ would set off the recognition software and it started taking on a life of its own. Obviously the built-in mic would be ideal for note taking/voice recording, and with the speaker turned off it could possibly be used. However, it advises use of a headset microphone, and I’m sure it’d work then. I wasn’t in a position to attach a different microphone, and besides which I hate being ‘tethered’ to a computer, but I’ve used these systems before and have no reason to expect it would work any differently in a Tablet format PC. It would be good to see how this speech engine compares with Dragon, which I’ve used. It will be interesting to see what Vista brings with its built-in recognition.
For a laugh, here’s something that was ‘jointly’ dictated by me and the Tablet.
Until July, 1963 1721163 one and a little unusual in speech recognition system lucky people might find it in
The union speech dictation system is very minimum of 515
I also tried a bit of drawing on the Tablet; however, as time was limited and I was unfamiliar with the platform and applications used for ‘inking’, I can’t offer much in that department aside from a quickly drawn diagram which I posted on my blog “Lifekludger” (see Fig. 10). It certainly was a much easier and more ‘natural’ way of ‘drawing’ than using a trackball. It was quick and very easy and enjoyable.
He wondered where all the computer had gone
Something really hit me, and it is encapsulated in the cartoon-style drawing (see Fig. 10), as the overall biggest sensation I had while spending time with a Tablet. That sensation was that for once I had everything I needed in one compact place. That’s hard to notice as a big thing unless you realise just how dependent I am on only being able to ‘reach’ a certain, limited area with my mouthstick. Once the Tablet is set up in a position where I can get to the screen easily, I’m ready to go.
These pictures might better show it.
[Fig. 8] – Usual work setup
This is what I have set up at home and work. The issue is how to operate a pointing device and keyboard with what I can ‘reach’ while also being able to see the screen while ‘pointing’ or moving the cursor. I designed these devices that hold a standard keyboard and trackball in a relative position to each other while being in ‘reach’ of my mouthstick, and with the screen viewable when using the trackball. Now, I’ve tried conventional laptops, but there are big issues. The first is those touch pads – they’re skin sensitive. That means, short of taping someone’s finger to the end of my ‘stick’, they’re useless. So I need a joystick-style pointer, a la IBM. But then there’s the issue that to reach the keyboard I have problems with the mouse buttons being too close. Now, all that was to indicate that with a Tablet platform none of that matters anymore. If I can reach the screen I can operate it. No mouse, keyboard, or mouse-button issues… mostly because there aren’t any… any physical ones anyhow.
All the issues are virtualised. All I need is this.
[Fig. 9] – Tablet… it’s all just there
One of the drawbacks of this whole ‘virtualisation’ of physical input devices is the sacrificed screen space. As I said earlier, I found the Tablet just a little too big. For the first time since pre-release I found myself thinking an Origami/UMPC-sized device might be perfect. However, the screen space would be a problem.
With an onscreen keyboard I already lose usable space. The same issue applies on a Tablet with virtual keyboard as this picture shows.
[Fig. 10] – Some screen space used by onscreen keyboard (BTW, those are clouds on the screens)
“You can’t always get what you want”
Most things in life are a compromise. Add a disability into the mix and the compromises just get bigger, requiring bigger kludges.
A Tablet tryout definitely gave me more ideas and insights into future computing possibilities. There’s no substitute for ‘hands-on’ (stick-on) in this area of working around issues involving using technology to make lives work.
[Fig. 11] – Daddy, Baby, Mummy – Desktop, PDA, Tablet
Now, I can’t wait to see if a UMPC might fit that space between PDA and Tablet just right.
David N Wallace – Dave the Lifekludger.
* Not all this was written on the Tablet.