Dec 13

A Tribute To Microsoft Courier Tablet

Courier: First Details of Microsoft's Secret Tablet (gizmodo)

It’s a shame that Microsoft canceled its Courier concept in April 2010. Still, it remains my biggest source of inspiration while evolving Notes Plus. Someday, Notes Plus will be a Courier on the iPad. Hey, it takes courage to dream, right?

To remind myself of this inspiration, I have collected all the videos I could find about the Courier. If you know of any others, please let me know.

Site: Courier FAQ

Microsoft Courier interface demo HD

Microsoft Courier – How Microsoft Thinks You’ll Use It

Microsoft Courier User Interface Documentation

Viet Tran
Dec 8, 2010


Dec 13

Design for Toddlers (UI The Apple Way)

Ever wonder why there are so many hidden features on the iPhone? (After all, what is the point of a feature if it is hidden, right?) Ever wonder why a four-year-old toddler can figure out those hidden features faster than you?
When I got my first iPhone, I didn’t know about the swipe gesture to delete an email until many Google searches later. Not until weeks later did I find out that the basic calculator, when turned to landscape, becomes a full scientific calculator. Wonderful! Actually, I didn’t figure that one out by myself: I bought an iPhone tips-and-tricks book.
Some other features I couldn’t figure out by myself include:
  • Scroll to the top fast: tap the status bar at the top of the screen.
  • Type special characters: tap and hold certain keys.
  • Adjust the audio and video scrub rate from high-speed to fine: drag your scrubbing finger down. I bet you didn’t know about this one either.
If you are like me, make sure you check out an iPhone tips and tricks page; I’m sure it will surprise you.
In contrast to my struggles, my four-year-old son had no problem with the Apple UI. When I got my iPad, I just downloaded some games, set up Netflix, and handed it to him. He never came back asking for instructions, directions, tips, or tricks. And it is not just my son. Check out this TechCrunch video: Could There Be A Better Advertisement For The iPad?
Why is that? Why did Apple hide those wonderful features? Why did Apple (UI) make it hard for me but easy for my son?
This post records my observation of a new trend in UI design that focuses on being natural rather than intuitive, and how this trend affects me and my app, Notes Plus.
The Principle of Feature Exposure
Since the inception of the iPhone, with its touch-screen interface, Apple has broken a few design principles written in traditional UI books [MAYHEW91, COOPER95, GALITZ02]. The most noticeable of the principles Apple has broken, and the one this post will focus on, is the Principle of Feature Exposure [TALIN98].
The Principle of Feature Exposure (let the user see clearly what functions are available) is grounded in the Myers-Briggs personality classification of the Sensibles and the Intuitives [MYERS-BRIGGS]. While the Intuitives are comfortable with abstract models, the Sensibles prefer UIs with “up front” and “in their face” features. According to some psychological studies, the Sensibles outnumber the Intuitives in the general population by about three to one [TALIN98]. Thus, to please a general audience, software UIs should have clearly exposed features. In fact, this principle is one of the core UI design principles for Web pages.
While Apple is brilliant at other UI design principles (the principle of metaphor, the principle of aesthetics, etc.), it completely ignores the principle of feature exposure. Why?
Natural versus Intuitive
Let me say it up front: the Apple UI focuses on being natural, not intuitive. The dictionary defines “intuitive” as “obtained through intuition rather than from reasoning or observation”. Intuitiveness is good for a UI; it saves users time thinking. The problem is that intuition changes from person to person: your intuition might not be mine; what is intuitive to you may not be to me. There is a “training” factor that makes people’s intuition vary. Using a computer mouse is intuitive because people get plenty of training with it. For a person who has never used one, or for a toddler, it is not intuitive at all. A button with big text reading “Stop” may be intuitive for people who understand English. For people who don’t, it is a dead end, with no possible guess.
Being natural, on the other hand, is being universal to the human race. Touching, pushing, grabbing, or mentally associating signs (colors, drawings) with meanings is natural. A big “Don’t walk” sign is not natural for people who don’t read English. A red, flashing sign with a picture of a walking person is natural: people associate red with fire, flashing with thunder (and thus danger), and the walking person with themselves.
If you are still confused about the two, give the UI to a toddler to test. If he can figure it all out by himself, the UI is natural. Now, here is an important question: why did Apple focus on being natural rather than intuitive? And, even more important, why did it work so wonderfully?
A new kind of user training – There’s no training, just play with it
When Apple released the first iPhone, they didn’t pack it with a user manual. In fact, only recently did they provide a soft copy of the manual, hidden in the Safari bookmarks. With a brand-new product, the first of its kind, you would think the iPhone needed a user manual. Why didn’t Apple provide one? I think Apple did this intentionally. They wanted to send a message to their users: “the toy is natural to use; you don’t need a manual, just play with it”.
There is a paradigm shift in user training between traditional computer software and this new Apple trend. In traditional software, users are expected to understand and master all the steps before performing a task: “Go to the File menu, under Printer Setup, choose Landscape”; “Don’t push that button, it’ll blow everything up!”; and so on. Give this kind of software to a toddler and he’ll mess everything up. With the new Apple products, users don’t need to know everything up front. The more they play with it, the more interesting things they find out.
Combine this new kind of user training with a UI that focuses on being natural, and it actually makes sense. Users (or toddlers) will figure it out, even though it’s not intuitive up front. The point is: play with it, don’t just use it.
What’s in it for me (and my users)?
Enough mumbling about UI design principles; what’s in it for me and my users? Apple has the luxury to dictate and revolutionize its users’ behaviors. They have huge marketing budgets and a reputation. I don’t. I cannot just tell my users: “Hey, just play with it; look at how your son does it”. I have to please my users and bend their behaviors very gently.
The very first principle in UI design is the Principle of User Profiling: know who your users are [TALIN98]. My users (of my app Notes Plus) are tech-savvy early adopters (after all, they own iPads). My typical user is either a working professional or someone in academia (a student, a scholar, a professor, etc.).
These characteristics make it both easy and hard for me to try new UI concepts and behaviors. Being tech-savvy, my users have no problem trying new things, which makes it easy to introduce new UI concepts such as gestures to select, delete, scroll, and so on. On the other hand, being working professionals who use computers for five or more hours a day, my users are very accustomed to traditional UI components: menus, buttons, dialog boxes, etc. Consequently, they expect the UI to react the way they are accustomed to, and when they don’t see a familiar reaction, they easily get frustrated. That makes it hard to introduce new UI behaviors.
What do I do? I introduce new UI with a way to roll back to the traditional UI if users don’t like it. For example, in a traditional UI, my users are accustomed to changing “editing modes”. To draw, tap a “Draw” button to switch to drawing mode. To select and rearrange, tap a “Select” button to switch to selection mode. While this works fine on desktop and laptop computers, it is awkward on tablets, because you must move your hand away from the content area to reach the mode-switching buttons. Besides, after switching to selection mode, you still have to start dragging to select; that is an extra step.
So I introduced a gesture: while in drawing mode, just circle around an area to select the objects within it. Users also have the option to undo the selection if they meant to draw a circle rather than to select. While many users like this selection gesture, many others couldn’t figure out how to select and got frustrated. I had to offer an option to turn the selection gesture off. Once it is turned off, a “switch to selection mode” button appears on the toolbar so that users can tap it to switch modes.
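Notes Plus doesn’t document its gesture recognizer, so the following is only a minimal sketch of how a circle-to-select gesture could be detected, assuming strokes arrive as lists of (x, y) samples; the thresholds and function names are my own illustrative choices, not the app’s actual implementation:

```python
import math

def is_selection_circle(points, closure_ratio=0.2, min_area=400.0):
    """Heuristically decide whether a stroke is a selection loop.

    points: list of (x, y) touch samples for one stroke.
    A stroke counts as a loop when its endpoints nearly meet
    (gap small relative to path length) and it encloses a
    non-trivial area (shoelace formula).
    """
    if len(points) < 8:
        return False
    # Total path length of the stroke.
    path_len = sum(math.dist(points[i], points[i + 1])
                   for i in range(len(points) - 1))
    if path_len == 0:
        return False
    # The endpoints must nearly meet for the stroke to be a loop.
    gap = math.dist(points[0], points[-1])
    if gap / path_len > closure_ratio:
        return False
    # Shoelace formula for the area the loop encloses.
    area = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:] + [points[0]]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0 >= min_area

def point_inside(poly, p):
    """Ray-casting point-in-polygon test, used to collect the
    objects whose anchor points fall inside a detected loop."""
    x, y = p
    inside = False
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + [poly[0]]):
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside
```

With a scheme like this, a stroke that fails the loop test is simply kept as ink, which is exactly what makes the undo-the-selection fallback cheap to offer.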
It is really fun for me and for my four-year-old toddler to play with Apple products. Maybe for the last 50 years of computing, we’ve been trained wrongly about how to interact with a computer user interface. Maybe we should be more playful, adventurous and less serious when interacting with a computer. As for me, when I have another Apple product, I will try to poke, shake, grab, turn, or even throw and catch to explore their UI.
Viet Tran
October 9, 2010
[MAYHEW91]: Deborah J. Mayhew – Principles and guidelines in software user interface design (page 6-28) – Prentice-Hall, Inc. – 1991
[COOPER95]: Alan Cooper – About Face: The Essentials of User Interface Design (page 15) – John Wiley & Sons, Inc. – 1995
[GALITZ02]: Wilbert O. Galitz – The Essential Guide to User Interface Design: An Introduction to GUI Design Principles and Techniques – Second Edition (page 41-51) – John Wiley & Sons, Inc. – 2002
[TALIN98]: Talin – A Summary of Principles for User-Interface Design – Unpublished – 1998
[MYERS-BRIGGS]: Wikipedia – Myers-Briggs Type Indicator


Dec 13

4+1 Reasons Why Pen And Paper Are Still Better Than An iPad

Price aside, in this post I will discuss why a $.99 ballpoint pen and a paper notepad are still better than a $500.00+ iPad.

The iPad’s beautiful, enormous multi-touch screen shouts out for handwriting applications. That is why there are at least a dozen iPad apps today aimed at handwritten notes (smartNotes, Penultimate, TakeNotes, WritePad, uWrite, Scribble Notes, PaperPad, WriteNow – I just searched the App Store for “handwriting”). I recently added one more to the collection: Notes +. Many of these apps focus more on the cool factor (the hype) than on being practical. And they are not even close to pen and paper.


1. The SIZE

The average human fingertip is about 64 – 100 square millimeters (8 – 10 mm wide); the pad of the finger, which most people actually touch with, is 100 – 196 square millimeters (Ubuntu Designing for Finger UIs). Apple recommends a minimum touch target of 29 pixels wide by 44 pixels tall (iPhone Human Interface Guidelines). At the iPad’s 132 ppi (the iPhone is 163 ppi), that minimum target is 5.58 mm wide by 8.47 mm tall, or about 47 square millimeters.

The tip of a 1.0 mm ballpoint pen is, well, 1.0 mm wide. Ballpoint tips in general are about 0.7 – 1.2 mm in diameter (Wikipedia – Ballpoint Pen); approximating the tip as a square, that is 0.49 – 1.44 square millimeters. A finger touch is therefore roughly 70 to 400 times bigger than a ballpoint pen tip, depending on which ends of the two ranges you compare. That makes the 7.8” x 5.8” iPad screen less useful than a 3” x 3” Post-It note!
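To check the arithmetic, here is the same back-of-the-envelope calculation: converting Apple’s 29 x 44 pixel minimum target at 132 ppi, and comparing finger-pad area to pen-tip area (the pen tip approximated as a square, as above):

```python
# Reproducing the touch-target arithmetic from the text.
MM_PER_INCH = 25.4
IPAD_PPI = 132.0

def px_to_mm(px, ppi=IPAD_PPI):
    """Convert a pixel length to millimeters at a given screen density."""
    return px / ppi * MM_PER_INCH

# Apple's recommended minimum touch target: 29 x 44 pixels.
w_mm = px_to_mm(29)                # ~5.58 mm
h_mm = px_to_mm(44)                # ~8.47 mm
target_area = w_mm * h_mm          # ~47 mm^2

# Ballpoint tip, approximated as a square of side 0.7-1.2 mm.
tip_area_small = 0.7 ** 2          # 0.49 mm^2
tip_area_large = 1.2 ** 2          # 1.44 mm^2

# Finger-pad area (100-196 mm^2) versus pen-tip area.
ratio_low = 100 / tip_area_large   # ~69x
ratio_high = 196 / tip_area_small  # 400x
```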

Would you bring a pile of Post-It notes to a meeting or a class in place of a notepad? People brag about handwriting iPad apps nowadays. Cool – yes; practical – no! My Notes + app provides a Zoom-Write mode that lets you write small text with your finger, making the iPad screen feel like a normal notepad.


2. The PALM

Writing on an iPad means keeping your palm in the air, at least with most current handwriting apps. The iPhone/iPad SDK provides no inherent API to help detect a palm touch. A touch is a touch; there is no concept of touch size. It is all up to app developers to find a way to detect and reject palm touches. How do they do it currently? They don’t.

Differentiating between a palm touch and a finger touch is not an easy problem when touch size is unknown. That is why most, if not all, current handwriting iPad apps simply ignore it (I have solved this palm problem in my app, Notes +).

If you think it is OK to finger-write with your palm in the air, try writing with a stylus!
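The post doesn’t reveal how Notes + actually solves the palm problem, so the following is only one plausible position-based heuristic, with all names and rules being my own assumptions: since the SDK reports no touch size, classify simultaneous touches purely by geometry, reasoning that for a right-handed writer the palm rests below (and to the right of) the pen finger:

```python
def split_palm_touches(touches, right_handed=True):
    """Classify simultaneous touches into one writing touch and
    palm touches, using position alone (the SDK reports no size).

    touches: list of (x, y) points with y growing downward, as on
    screen. The top-most touch is taken as the writing touch; ties
    are broken toward the left for right-handed writers (whose palm
    lands below and to the right) and toward the right otherwise.
    Returns (writing_touch, palm_touches).
    """
    if not touches:
        return None, []
    if right_handed:
        pen = min(touches, key=lambda t: (t[1], t[0]))
    else:
        pen = min(touches, key=lambda t: (t[1], -t[0]))
    palms = [t for t in touches if t is not pen]
    return pen, palms
```

A real app would also need temporal rules (a palm lands before the pen finger and moves little), but even this crude split shows that palm rejection is solvable above the SDK.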

3. The LOOK & FEEL

Pen and paper is the natural way of writing. Whenever I need to write something, I pick up my pen, put my palm down on the paper, jot down the title, lift my pen a little, think, put my pen down again, squiggle it because I’m still thinking, then write and let my thoughts flow as the pen tip moves across the paper. Do I get this relaxed feeling when I write on an iPad? No.

I did a lot of calligraphy when I was a kid. I learned how to press a pen precisely to get different stroke widths, how to angle the pen to control the amount of ink flowing down, and how to pace my pen movement to give letters a different feel. Can I do any of that on an iPad? No.

In the art of calligraphy, pressure, angle, speed, and movement define strokes. The iPhone/iPad can only comprehend the latter two. Some iPhone/iPad handwriting apps go the extra mile, interpreting speed as a proxy for pressure and providing various stroke settings. Do they feel the same? No.
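One common way to interpret speed as pressure is to thin the stroke as the touch moves faster, mimicking a pen that deposits less ink when whipped across the page. A minimal sketch, with made-up tuning constants (none of this is from any particular app):

```python
import math

def stroke_width(p1, p2, dt, base_width=4.0, k=0.02, min_width=1.0):
    """Map touch speed to stroke width: faster movement yields a
    thinner stroke.

    p1, p2: consecutive touch points (x, y); dt: seconds between
    them. base_width, k, and min_width are illustrative constants
    an app would tune by hand.
    """
    speed = math.dist(p1, p2) / dt if dt > 0 else 0.0  # points/sec
    return max(min_width, base_width / (1.0 + k * speed))
```

The clamp at min_width matters: without it, a fast flick would produce a stroke too thin to render at all.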

How about the look? OpenGL is a powerful drawing foundation. To draw ink-like strokes, one can apply an appropriate texture to the strokes. In fact, Apple provides a very good piece of sample code, GLPaint (see Serg Koren’s excellent six-part GLPaint Dissected article if you want details), for doing exactly this, and many handwriting apps have used the technique. The result: it looks good enough to start a hype. How close to pen and paper is it? Not even close. And what about different textures, styles, and opacity with different pressures and pen-holding angles?
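The GLPaint technique boils down to stamping a brush texture at closely spaced points interpolated between consecutive touches, so a stroke is really a dense row of textured point sprites rather than a line. The geometry half of that idea can be sketched as follows (the spacing value is illustrative):

```python
import math

def stamp_positions(p1, p2, spacing=3.0):
    """Interpolate brush-stamp centers between two touch points.

    Returns evenly spaced points from p1 to p2 inclusive, at most
    `spacing` apart; each would receive one textured point sprite.
    """
    dist = math.dist(p1, p2)
    n = max(1, int(math.ceil(dist / spacing)))
    return [(p1[0] + (p2[0] - p1[0]) * i / n,
             p1[1] + (p2[1] - p1[1]) * i / n) for i in range(n + 1)]
```

Because the stamps overlap, the brush texture’s alpha falloff is what gives the stroke soft edges; that is also why the look is hard to push past “good enough for a demo”.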

4. There is still PAPER AROUND

Imagine you went to a doctor’s office and they handed you a form to fill out. Would you pull out your iPad and tell them, “put it on here and I’ll fill it out”? There is still too MUCH paper around: forms, class handouts, checkbooks, receipts, dollar bills, … If you could have only one tool, a pen or an iPad, to get the job done, which would you pick?

+ 1. The CRASH

I will show you a way to crash any handwriting iPad app (including mine, Notes +). Squiggle your finger up and down, left and right, to fill a writing area like you would in a boring meeting or class.

Do it for the entire iPad screen. Any app will either crash or respond very slowly. Why? An app has to store your touches; it needs them to repaint the screen, to save, to export, to re-arrange, and so on. A touch has many attributes: x coordinate, y coordinate, timestamp (for calculating speed), … Even with an aggressive encoding, it takes at least 5 bytes to store a touch; without compression, it can easily be 12 bytes.

Assume you fill the entire iPad screen with strokes. Of course, your touches will land on top of each other, but for simplicity, count one touch per pixel. The iPad screen is 1004 (1024 minus 20 for the status bar) by 768 pixels, which is 771,072 pixels. At 5 bytes each, that is 3,855,360 bytes, or about 3.68 megabytes, just to store the touches, and more than twice that at 12 bytes per touch, on a device that gives each app very limited memory. The app will crash. If you don’t believe me, try it now.
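To make the back-of-the-envelope numbers concrete, here is the same arithmetic in a short sketch; the three-float touch record is an assumption for illustration, not how any particular app actually stores touches:

```python
import struct

# One touch stored naively as three 4-byte floats: x, y, timestamp.
touch = struct.pack('<fff', 512.0, 384.0, 0.016)
bytes_per_touch = len(touch)              # 12 bytes, uncompressed

# Writable screen area: (1024 - 20 status bar) x 768 pixels,
# counting one touch per pixel as in the text.
pixels = (1024 - 20) * 768                # 771,072 touches

low_estimate = pixels * 5                 # 3,855,360 bytes (~3.68 MB)
high_estimate = pixels * bytes_per_touch  # ~9.3 MB uncompressed
```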

Would your pen and paper crash and all of your writing disappear?

It is amazing that something invented thousands of years ago is so hard to replace. I wrote this post not to trash the beloved iPad or any app; I wrote it to make developers realize the problems we have. After all, if we don’t know what the problem is, how are we going to fix it? I truly believe we will get there (a world where no tree is cut for paper). That is why I wrote my app, Notes +.

Viet Tran

May 26, 2010

Update – June 5th, 2010 – On Bill Gates’ “the iPad isn’t there yet”.

On Larry King Live (CNN, June 3rd), appearing with his father, he said (quoted via CrunchGear):

We’re all trying to get to something that you just love to take to a meeting and use and it is not quite there yet. You need to have input. You need to take notes and edit things. Microsoft and a bunch of other companies are working on getting that final, ultimate product. [...] It still isn’t the device that I’d take to a meeting because it has no input.

Despite the many jokes about this in the tech blogging community, I agree with Mr. Gates 100% (I still blame him for the terrible OS and all the malware, though). When I think of an iPad, I think about bringing it to a meeting too. That is why I started this project, Notes + (and hopefully many others later). The iPad’s input simply sucks! Just my 2 cents.