Computer reading thoughts

Just a quick read here.

The participants viewed a series of houses and faces that appeared on a screen for 400 milliseconds at a time, and were told to look for an upside-down building. An algorithm tracked the brain waves from their temporal lobes, which deal in sensory input. By the end of each session, the program was able to pinpoint, in real time and with roughly 96 percent accuracy, which images the participants were looking at. It knew whether a participant was seeing a house, a face, or a gray screen within 20 milliseconds of actual perception.
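The article doesn't describe the actual algorithm, but the core idea — turning a short window of multichannel brain activity into a feature vector and assigning it to one of three classes (house, face, or gray screen) — can be sketched with a toy nearest-centroid classifier on synthetic data. Everything here (channel count, noise levels, the classifier itself) is an illustrative assumption, not the study's method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each short window of temporal-lobe activity is
# reduced to an 8-channel feature vector. The real study's features and
# classifier are not described in the article.
LABELS = ["house", "face", "gray"]
N_CHANNELS = 8  # made-up electrode count

# Synthetic training data: each class gets its own mean activity pattern,
# plus small per-trial noise.
centers = rng.normal(0, 1, size=(3, N_CHANNELS))
train_X = np.vstack([c + 0.1 * rng.normal(size=(50, N_CHANNELS)) for c in centers])
train_y = np.repeat(np.arange(3), 50)

# "Training" is just storing the per-class mean feature vector.
centroids = np.array([train_X[train_y == k].mean(axis=0) for k in range(3)])

def classify(window: np.ndarray) -> str:
    """Label a feature window by its nearest class centroid."""
    dists = np.linalg.norm(centroids - window, axis=1)
    return LABELS[int(np.argmin(dists))]

# A new window drawn near the "face" pattern gets labeled accordingly.
sample = centers[1] + 0.1 * rng.normal(size=N_CHANNELS)
print(classify(sample))  # prints "face"
```

The real system was doing something far harder — decoding noisy electrode signals continuously and in real time — but the classify-a-window-into-a-few-categories framing is the same.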

So in a nutshell: while it was only a few different images, the fact remains that the computer could tell, with high accuracy and in very little time, what the person was 'thinking'.

If this sort of stuff is making the press, what is going on in the labs with more money?