I have difficulty reading people; part of this may arise from incongruities between their facial expressions and what they say. Theoretically, facial expressions are one of the great universals shared by our species. So you'd think it'd be easy (unless you were trying to teach a computer to do it ;).
For starters, perspective matters. In Japanese Noh theater, the actors wear rigid masks; they convey emotion by changing the tilt of the mask relative to the audience (this tells you something about the layout of Noh theaters ;).
Since computer facial recognition interests governments and security companies, we have free training databases of faces such as FERET. I just sent off my email request for access so I can see what it has; I hope it displays a range of emotions throughout. The Japanese Female Facial Expression (JAFFE) database is smaller but yields six posed emotions (plus a neutral face) per person.
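If the JAFFE images arrive as advertised, bucketing them by expression should be straightforward. Here's a minimal sketch, assuming the filename convention documented with the dataset (e.g. KA.AN1.39.tiff, where the middle token names the posed expression):

```python
import os
from collections import defaultdict

# JAFFE expression codes, per the dataset's documentation (an assumption here)
EXPRESSIONS = {"AN": "angry", "DI": "disgust", "FE": "fear",
               "HA": "happy", "NE": "neutral", "SA": "sad", "SU": "surprise"}

def group_by_expression(image_dir):
    """Bucket JAFFE images by posed expression, keyed off the filename."""
    groups = defaultdict(list)
    for name in os.listdir(image_dir):
        if not name.endswith(".tiff"):
            continue
        # Filenames look like SUBJECT.EXPRESSION+POSE.INDEX.tiff, e.g. KA.AN1.39.tiff
        code = name.split(".")[1][:2]  # strip the pose digit
        if code in EXPRESSIONS:
            groups[EXPRESSIONS[code]].append(name)
    return groups

if __name__ == "__main__":
    for emotion, files in group_by_expression("jaffe").items():
        print(emotion, len(files), "images")
```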
What I'd like to do is create an emotional IQ reading test online like the color IQ test. Basically, you'd get asked to sort a bunch of images into happy/sad/neither, or angry/fearful/neither, or pleased/disgusted/neither ranges. You'd get back a comparison of your reads versus all previous reads.
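The scoring could be dead simple: compare each read against the running tallies of everyone who came before. A minimal sketch of that comparison, where the image names and vote counts are all hypothetical:

```python
from collections import Counter

# Hypothetical store: image id -> tally of all previous reads
previous_reads = {
    "face_001": Counter(happy=92, sad=3, neither=5),
    "face_002": Counter(happy=12, sad=70, neither=18),
}

def score_reads(user_reads):
    """Return the fraction of images where the user matched the crowd's
    majority read, plus a per-image breakdown for the results page."""
    breakdown = {}
    agreed = 0
    for image, choice in user_reads.items():
        votes = previous_reads[image]
        consensus, _ = votes.most_common(1)[0]
        breakdown[image] = {
            "your_read": choice,
            "consensus": consensus,
            "crowd_share": votes[choice] / sum(votes.values()),
        }
        agreed += (choice == consensus)
    return agreed / len(user_reads), breakdown

score, details = score_reads({"face_001": "happy", "face_002": "neither"})
print("You agreed with the majority on {:.0%} of images".format(score))
```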
Ultimately, I'd like to see online testing across the range of metacommunicative competence. CIO has a small training test for microexpressions (Paul Ekman's work = time for an Amazon order ;). Time to find some more online...
Update: 2008-10-04
Received my FERET username and password yesterday, for an application-processing lag of 18 days. NIST says they automated the process; did they automate the delay as well? ;)