The Mediocre Multitasker - NYTimes.com

http://www.nytimes.com/2009/08/30/weekinreview/30pennebaker.html?_r=1&ref...

Read it and gloat. Last week, researchers at Stanford University published a study showing that the most persistent multitaskers perform badly in a variety of tasks. They don’t focus as well as non-multitaskers. They’re more distractible. They’re weaker at shifting from one task to another and at organizing information. They are, as a matter of fact, worse at multitasking than people who don’t ordinarily multitask.

Why Windows 7 isn’t competing with Mac OS X Snow Leopard — RoughlyDrafted Magazine

http://www.roughlydrafted.com/2009/08/26/why-windows-7-isnt-competing-with-ma...

Microsoft is not anything like Apple. Microsoft almost exclusively licenses its Windows software to PC makers, which are then pitted against each other to sell commodity hardware to consumers. Apple sells a unique, integrated product directly to consumers.

In many other market segments, however, Macs (and Linux) have both found comfortable niches where they are more fit for survival than Windows. Apple has specifically targeted home and education markets, mobile business users, music and video production, and sci/tech markets, all of which represent low-hanging fruit that is also highly profitable.

Apple’s specialization and unique differentiation from generic Windows PCs, including the Mac’s advantages as a highly integrated product with centralized support resources, distinctive hardware design, and attractive OS software, all combine to make it better suited for certain markets than the run-of-the-mill PC. No features in Windows 7 can compete against those core strengths of Apple’s integrated platform.

The only way Microsoft can take on the Mac is by creating its own PC hardware.

Technology Review: Blogs: arXiv blog: New Measure of Human Brain Processing Speed

http://www.technologyreview.com/blog/arxiv/24030/

Today, Fermín Moscoso Del Prado Martín from the Université de Provence in France proposes a new way to study reaction times by analysing the entropy of their distribution, rather in the manner of thermodynamics. The entropy is an estimate of the amount of information needed to specify the state of the system. Martín says the entropy of the distribution of reaction times is independent of the type of experiment and so provides a better measure of the cognitive processes involved. That’s important, not least because it provides a way to more easily compare the results from different types of experiment.

Martín uses his method to determine how much information the brain can process during lexical decision tasks. The answer? No more than about 60 bits per second. Of course, this is not the information processing capacity of the entire brain but one measure of the input/output capacity during a specific task.
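For a concrete sense of what “the entropy of the distribution of reaction times” means, here is a minimal Python sketch. It simulates log-normal reaction times (an illustrative assumption, not a detail from the article) and computes the Shannon entropy of their histogram at 1 ms resolution. This shows only the general idea of measuring a distribution in bits; it is not Martín’s actual method, and the numbers it prints are toy values rather than a reproduction of the 60-bits-per-second figure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reaction times in seconds for a lexical decision task,
# drawn from a log-normal distribution (a common modeling assumption,
# not taken from the article).
rts = rng.lognormal(mean=np.log(0.6), sigma=0.25, size=5000)

# Discretize the reaction times into 1 ms bins and compute the Shannon
# entropy of the histogram, H = -sum p_i * log2(p_i). The bin width
# fixes the precision at which a "state" (one reaction time) is specified.
bin_width = 0.001  # 1 ms
bins = np.arange(rts.min(), rts.max() + bin_width, bin_width)
counts, _ = np.histogram(rts, bins=bins)
probs = counts[counts > 0] / counts.sum()
entropy_bits = -np.sum(probs * np.log2(probs))

print(f"Entropy of the RT distribution (1 ms bins): {entropy_bits:.1f} bits")

# A back-of-the-envelope rate: entropy per mean reaction time. This only
# gestures at "bits per second"; Martín's actual analysis differs.
print(f"Crude rate: {entropy_bits / rts.mean():.0f} bits/second")
```

Note that the result depends on the chosen bin width: halving it adds roughly one bit, since specifying a reaction time more precisely takes more information. That is part of why a principled, experiment-independent entropy measure is harder to construct than this toy suggests.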