July 08, 2008

Economics meets social psychology on incentive theory

In another June 2008 American Economic Review article, Ellingsen and Johannesson introduce a standard concept from social psychology into a standard economic model of incentives, and find that it helps explain some well-known empirical puzzles.

This is not at all the first article in the economics literature that explores the role of social motivations, and the authors provide a good discussion of prior work.

"In Pride and Prejudice: The Human Side of Incentive Theory", Ellingsen and Johannesson add two motivational premises to the standard principal-aget model: people value social esteem, and the value they experience depends symmetrically on who provides the esteem: they value esteem more from those who they themselves esteem.
Their main result is to show how an incentive that otherwise would have a positive effect on behavior can have a negative effect for some people because of what the incentive tells the agent about the principal. For example, they suggest this as an explanation for "the incentive intensity puzzle that stronger material incentives and closer control sometimes induce worse performance" (p. 990).
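To convey the flavor in stripped-down notation (my own sketch, not the authors' actual model), suppose the agent's payoff is

```latex
U_A = w(e) - c(e) + \sigma \,\mu\, r(e)
```

where w(e) is material pay for effort e, c(e) is the cost of effort, r(e) is the esteem the principal accords that effort, and \mu is the agent's belief that the principal is the kind of person whose esteem is worth having. The twist is that the contract offer itself moves \mu: a high-powered incentive can signal that the principal expects shirking, lowering \mu, and when the lost esteem term outweighs the extra material pay, effort falls.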


Posted by jmm at 09:06 AM | Comments (0) | Permalink »

ICD introductory readings from on high

Students often ask me what they can read to learn about ICD. I've not had a terribly good answer. On the one hand, the foundations -- especially mechanism design in economics, game theory, engineering design theory, and social psychology -- are ancient (well, a few decades old) and have very rich literatures. But I haven't seen (haven't really searched for) good intros. And while these are the building blocks of ICD, the particular area on which we focus -- incentive-centered design for information systems -- and the particular multi-disciplinary approach we take are rather new. I don't know that anyone has written a good overview yet.

However, three quite nice articles just appeared in the American Economic Review that are a step in the right direction. They are focused on mechanism design and microeconomics (not social psychology, computation theory, or applications to information system design specifically). But they are accessible, short, and written by giants in the field; in fact, they are revised versions of the Nobel lectures given by the three laureates recently cited for creating the foundations of mechanism design theory: Leonid Hurwicz, Eric Maskin, and Roger Myerson.

Maskin's overview, "Mechanism Design: How to Implement Social Goals", doesn't require any math. He introduces implementation theory, "which, given a social goal, characterizes when we can design a mechanism whose predicted outcomes (i.e., the set of equilibrium outcomes) coincide with the desirable outcomes" (p. 567).

Myerson's article, "Perspectives on Mechanism Design in Economic Theory", begins to introduce some of the basic modeling elements of the theory, so it has a bit more math, but it's not heavy going for anyone who has had an intermediate microeconomics class. He introduces some of the classic applications from economics: bilateral trade with adverse selection (hidden information), and project management with moral hazard (hidden action).
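For readers who want to see the hidden-information problem concretely, here is a toy simulation (my own construction, not anything from Myerson's lecture) of a naive "split-the-difference" mechanism for bilateral trade: trade occurs whenever the buyer's reported value exceeds the seller's reported cost, at the midpoint price. Truth-telling turns out not to be in the buyer's interest:

```python
# Toy bilateral-trade example (my construction, not Myerson's): under a
# naive split-the-difference mechanism, a buyer does better misreporting
# his value than telling the truth -- adverse selection in action.
import random

random.seed(0)

def buyer_profit(true_value, report, n=200_000):
    """Buyer's expected profit against a truthful seller with cost ~ U[0,1]."""
    total = 0.0
    for _ in range(n):
        cost = random.random()          # seller's (truthful) report
        if report >= cost:              # trade occurs, at the midpoint price
            total += true_value - (report + cost) / 2
    return total / n

for report in [0.7, 0.6, 0.5, 0.4]:    # buyer's true value is 0.7
    print(f"report {report:.1f}: expected profit {buyer_profit(0.7, report):.3f}")
# Shading the report down (toward 2/3 of the true value) beats honesty,
# so the naive mechanism is not incentive compatible.
```

Designing mechanisms that perform well despite this kind of strategic misreporting is exactly what the theory is about.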

Posted by jmm at 08:49 AM | Comments (1) | Permalink »

March 29, 2008

Presentation at Yahoo! Research on user-contributed content

Yahoo! Research invited me to speak in their "Big Thinkers" series at the Santa Clara campus on 12 March 2008. My talk was "Incentive-centered design for user-contributed content: Getting the good stuff in, Keeping the bad stuff out."

My hosts wrote a summary of the talk (a bit inaccurate in places, and it skips some of the main points, but reasonably good), and posted a video of the talk. The video, unfortunately, focuses mostly on me without my visual presentation, panning only occasionally to show a handful of the 140 or so illustrations I used. The talk is, I think, much more effective with the visual component. (In particular, the visuals draw attention away from how much time I spend glancing down to check my speaker notes!)

In the talk I presented a three-part story: UCC problems are unavoidably ICD problems; ICD offers a principled approach to design; and ICD works in practical settings. I described three main incentive challenges for UCC design: getting people to contribute; motivating quality and variety of contributions; and discouraging "polluters" from using the UCC platform as an opportunity to publish off-topic content (such as commercial ads or spam). I illustrated with a number of examples in the wild, and with several emerging research projects on which my students and I are working.

Posted by jmm at 10:02 AM | Comments (0) | Permalink »

March 03, 2008

UCC incentives the old-fashioned way

Ben Kaufman announced Kluster at TED 2008. Kluster is a platform through which businesses can solicit user-contributed content: innovative technology or product ideas, business solutions, etc. Why would anyone give a for-profit company good innovation ideas? For a cash incentive... Businesses post challenges with a cash bonus, and Kluster has a scheme for paying out the bonus to people whose ideas are successful. (It also runs a prediction market on the side for wagers on which of the proposed ideas will succeed.) No volunteers here: this UCC is compensated in the traditional form of tournament prizes.

Two similar businesses, at least, are already operating: InnoCentive and Cambrian House.

Think you're smart, but don't have time or capital to turn your ideas into businesses? Go sell your ideas online!

(Based on reporting in Putting Innovation in the Hands of a Crowd - New York Times)

Posted by jmm at 01:27 AM | Permalink »

January 22, 2008

Tying Odysseus to the mast

There is a well-known, difficult-to-solve motivation problem: keeping a commitment to yourself. Nearly everyone experiences this in one form or another: "I'm going to diet until I lose 25 pounds." "I'm going to get more sleep" (honest, right after I finish typing this post). "I'm never going to smoke another cigarette."

[Image: John William Waterhouse's 'Odysseus and the Sirens']

In Homer's epic, when nearing the Sirens, whose entrancing song would lead men to dash their ships on the rocks, Odysseus had his men tie him to the mast and plug their own ears with beeswax, so they could hear neither the Sirens nor any orders from him that would doom them (all of this because he was curious and couldn't resist listening himself!).

There is a well-known story among economists that Nobelist Thomas Schelling advised those who wanted to diet: "Write a large check to the American Nazi Party and put it in an addressed envelope. When you break your diet, mail it." (Steven Levitt writes that he heard the advice first-hand.) This sensible scheme to strengthen our incentive to stick with a commitment suffers from a bit of circularity: if you decide to break your diet (or other) commitment, what is to stop you from breaking your commitment to mail the check?

Yale economists Ian Ayres (a classmate of mine while getting our Ph.D.s) and Dean Karlan (also an MIT grad) have started a Web-based company, StickK.com, to help implement this tempting but difficult-to-implement scheme. The scheme is pretty simple: send them the check, they hold it in escrow, and if you break your commitment they mail it for you.

"But what", you say, "will force me to let them know I broke my commitment"? Here's where the time-honored mechanism of a trusted-third party referee comes in: set up your commiment in a way that can be verified, and then have a friend or other trusted third-party monitor you, and agree to notify StickK.com if you break your promise.

(Yes, yes, of course: what's to stop you from offering your buddy half of the money if she doesn't report you? Isn't recursion fun?)

This is a fun example of a principal-agent problem: you are both the principal and the agent, and you have what amounts to a hidden action problem. That is, you-as-principal can't enter an enforceable contract with you-as-agent to ensure performance, because the agent can take an "unobservable" action: instructing the principal not to enforce. The third-party mechanism transforms this into a symmetric-information problem with a verifiable, enforceable action.
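To make the structure concrete, here is a toy sketch of the escrow-plus-referee arrangement (the names and code structure are invented for illustration; this is not StickK.com's actual system):

```python
# Illustrative sketch (invented, not StickK.com's actual design): the
# escrow-plus-referee mechanism as a tiny state machine. The key move is
# that settlement is triggered only by the referee's report, removing the
# committer's hidden action of "instructing myself not to enforce".

class CommitmentContract:
    def __init__(self, committer, referee, stake, recipient):
        self.committer = committer
        self.referee = referee      # trusted third party who verifies
        self.stake = stake          # the check held in escrow
        self.recipient = recipient  # e.g., a cause you'd hate to fund
        self.settled = False

    def referee_reports(self, kept_commitment):
        """Only the referee's report can trigger settlement."""
        if self.settled:
            raise RuntimeError("contract already settled")
        self.settled = True
        if kept_commitment:
            return f"${self.stake} returned to {self.committer}"
        return f"${self.stake} mailed to {self.recipient}"

contract = CommitmentContract("me", "my friend", 500, "a cause I oppose")
print(contract.referee_reports(kept_commitment=False))
# Of course, the recursion noted above survives: nothing here stops the
# committer from bribing the referee to report True.
```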

(Thanks to Buzzy Nielsen for pointing me at StickK.com.)

Posted by jmm at 10:58 PM | Comments (0) | Permalink »

October 10, 2007

Is cheaper monitoring technology a good thing?

In settings in which the effort or actions of one party to a transaction (say, an employee, or a professional services provider like a consultant or a lawyer) cannot be easily observed or verified, we have a situation known as the hidden action problem. The asymmetry of information means that contracts and interactions will likely not be as efficient or productive as they would be if the action were equally observable (and verifiable) on both sides.

In the very large literature on this problem (in economics, management science, and law), one standard bit of received wisdom is that there is a direct trade-off between costly monitoring and providing costly incentives to perform that are self-enforcing (that is, incentives that make it in the self-interest of the party with the information advantage to act as if the information were equally shared). (See, e.g., Eisenhardt, K. (1989). Agency Theory: An Assessment and Review. Academy of Management Review, 14(1), 57-74.) As the cost of technologies or systems to monitor behavior falls for certain applications, principals will want to rely more on monitoring and less on inducement.
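A back-of-the-envelope illustration of that received wisdom (the functional forms and numbers are mine, not from the literature): suppose the principal can either contract on noisy output, paying a risk-averse agent a risk premium for bearing that risk, or pay a monitoring cost m to observe effort directly and contract on it.

```python
# Stylized monitoring-vs-incentives trade-off (my numbers, not from
# Eisenhardt): incentive pay costs the principal the risk premium the
# agent demands for bearing output risk; monitoring costs m. The cheaper
# option wins, so falling monitoring costs flip the choice.
risk_aversion = 2.0        # agent's coefficient of risk aversion
output_variance = 1.0      # noise in output around true effort
risk_premium = 0.5 * risk_aversion * output_variance   # cost of incentive pay

for m in [1.5, 0.75, 0.25]:            # monitoring technology getting cheaper
    choice = "monitor" if m < risk_premium else "use incentive pay"
    print(f"monitoring cost {m}: {choice}")
# Once m falls below the risk premium, the principal substitutes
# monitoring for self-enforcing incentives.
```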

One of my graduate students, Pieter Kleymeer, pointed me to Cataphora, a company selling what it claims is "the most comprehensive electronic evidence management and analysis platform available". They promote it both for litigation applications (who said what to whom about what, and when?) and also, for example, for monitoring the electronic communications of one's own employees.

Pieter asked me the following very good question: "What does this say about the inherent trust built into a contract? If we don't need contracts to incentivize behavior and instead directly monitor behavior, don't we lose trust or good faith between the parties?"

What Pieter raises is that following the cost curve down for monitoring technology may have adverse consequences. When people are induced to behave honestly because they want to (positive incentives) rather than because they can't get away with untrustworthy behavior and will be punished (monitoring), perhaps there is more and better social capital, which creates a positive feedback loop and other social benefits. (*Maybe* -- it's not necessarily the case that people respond "better" or differently to positive rather than negative incentives -- but it's a plausible conjecture that they might.) The immediate savings from using less expensive monitoring technology rather than providing costly incentives to encourage loyalty, honesty and trustworthiness may be outweighed by losses from allowing a culture to develop in which people are only trying to avoid getting caught. I don't know if there is any research literature on this possible structural cost from relying more on monitoring.

One of my colleagues in the Economics Department (Joel Slemrod) is fond of saying that the US has a voluntary income tax system: monitoring is so low that it makes rational sense for most people to cheat (more) on their taxes, yet compliance appears (based on lots of studies) to be relatively high. Somehow we have established a system of trust and civic responsibility in which enough people *want* to honestly pay their taxes that we raise enough revenue without more draconian monitoring and enforcement.
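The "rational to cheat" claim is just an expected-value comparison; with made-up numbers:

```python
# Made-up numbers to illustrate the compliance puzzle: for a narrowly
# selfish taxpayer, cheating pays whenever the expected penalty is
# smaller than the tax saved.
tax_saved_by_cheating = 1000
audit_probability = 0.01       # hypothetical low audit rate
penalty_if_caught = 5000       # hypothetical: back taxes plus fines

expected_penalty = audit_probability * penalty_if_caught   # = $50
print(expected_penalty < tax_saved_by_cheating)            # True
# Cheating "pays" in expectation, yet most people comply -- suggesting
# motives beyond material self-interest (trust, civic duty) are at work.
```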

Posted by jmm at 12:30 PM | Comments (0) | Permalink »