Sunday, January 30, 2011

Economics and the Rational Person



Being in the cognitive sciences, I hear a lot about how economics tends to ignore irrationality in people. Maybe it's the availability heuristic*, but I thought biases were so well known that most economists had gotten past assuming rationality.

I've been listening to the Freakonomics podcast, and I generally like it, but I can't believe how much they buy into this "rational man" hypothesis.

I know that economics is its own science, with its own problems and methodologies, so I don't want to be too harsh, but sometimes they sound like psychologists who have only one theory: that people respond rationally to incentives. Even the way they pose problems reveals this: "How can we change incentives to make politicians act for the public good, rather than just their own?" This irritates me, because it presupposes what form the answer will take.

The field of economics endeavours to study how people deal with scarce resources. I think this is a great subject matter. I also think that economics has a lot to teach cognitive science, and I wish the two fields communicated more. However, any field based on a theory that does not hold in all cases is doing itself a disservice. It's like saying "we study human behaviour when resources are scarce. Oh, and by the way, the only thing we will consider that could affect behaviour is incentives."

I just listened to a story about how competition has been successfully used to spur innovation. Well, okay, but they don't mention that competition often stifles creativity. Fear reduces divergent thinking.** This is important, because it limits the usefulness of competition as a tool when innovation is the overall goal.

* People use the ease with which something is brought to mind as a measure of how common or probable it is. This is problematic because lurid and sexual things are easier to recall, and because the mass media doesn't give us a representative picture of the world.

** I can't seem to find the reference for this. Anyone?

Pictured: I don't know, but what a great picture!



Monday, January 24, 2011

Who Invented the iPhone?



I know it's a little silly to criticize superhero comics for being unrealistic, but what Reed Richards (Mr. Fantastic) and Tony Stark (Iron Man) couldn't do in real life is come up with tech that is light years ahead of the rest of the science community. It's the genius myth played out in fiction.

Even when there appear to be breakthroughs, such as the iPhone, other companies quickly make copies. It's not like they're in the dark for 20 years trying in vain to figure out how Apple pulled it off. 

Who made the iPhone? All of science did. It took hundreds of years of hard work.

Pictured: A man dressed as Mr. Fantastic. His left hand appears large because it is closer to the camera. Trick photography!

ps: I write this blog on blogger, which is owned by Google. Blogger's spell-checker thinks "iPhone" is misspelled. :)



Thursday, January 20, 2011

Citations, Endnotes, References, Footnotes. Pick your poison.


I'm reading a wonderful book right now, but reading it is a bit of a pain, because of the way the notes and references work.

Popular books try to avoid footnotes, equations, and in-line references. I think publishers think, rightly or wrongly, that potential readers will see these things and perceive the book as being difficult, or stuffy, or something. Perhaps they are right. Stephen Hawking's publisher told him something like "every equation you put in the book cuts your readership in half."

So in this wonderful book, Dennett uses endnotes. What this means is that where he would normally put in a footnote or a reference (he's a philosopher, so references would typically be in the footnotes), he puts a number in superscript, like this 1. Then, at the end of the book, he has all of the notes from the whole book. After that, he has the references section. This is a bit annoying, and he knows it. He even says in the beginning that books written like this require the scholarly reader to keep two bookmarks. In this case, the scholarly reader, yours truly, really needs three bookmarks: one for where I'm reading, one for my place in the notes pages, and one for the references section, because sometimes a note cites something I have to look up there.

I'm a big fan of e-books, but I'm very thankful I'm reading this one on paper. This is one problem e-books have not yet solved. Page turning on an e-paper device, such as a Kindle, Nook, or Kobo, is quite slow. It's impractical to flip back and forth to the notes and references; it would take forever. So you're stuck reading the book straight through, then getting to the notes section and hoping you remember the context. Good luck.

But, luckily, I am reading this book on paper, and I can keep little stickers and fingers in multiple places and turn to them quickly.

E-books will need to find a good solution to this, and soon. 'Cause paper books are going bye-bye.

I would love for cognitive science to be the first field to use scientific principles to come up with the best way to write books, papers, and presentations. But that's a story for another time.


Pictured: a one-man band. He plays notes with his foot. Get it?



Saturday, January 01, 2011

New Year's Resolution 2011: Eat A Blueberry Every Day

Every year my man Lou Fasulo and I do a new year's resolution. It is some restrictive idea that we only do for a single year. The resolution for 2010 was to eat no cold cereal. So after midnight last night, while everyone toasted to the new year with wine, I toasted with a bowl of Honey Bunches of Oats.

This year we have a positive one: to eat a blueberry every day. No fewer than one.

If you want to know why I do these resolutions, and what my past resolutions have been, to the best of my memory, see the Jim Davies FAQ:
http://jimdavies.org/personal/faq.html

Pictured: January 1, 2011, 1:00am. Eating a bowl of Corn Flakes with my first daily blueberry. Lubricated with soy "milk." Photo credit Vanessa Davies. Overexposure credit iPhone 4.
