Doing Business with the Untrustworthy

Anticipating and perhaps forestalling a collapse of trust in others can improve decision making when the trustworthy do business with the untrustworthy.

Doing business among Quakers is simple. Their trusting and trustworthy character facilitates trade. In the eighteenth and early nineteenth centuries, a seller in London could ship goods across the ocean and trust that a trustworthy buyer would pay him when the goods arrived in Philadelphia. As Quakers grew prosperous, others sought them as trading partners. A common contention is that some of those new partners adopted Quakers as role models on the path to prosperity, becoming trustworthy in their own right.[1] But that may be an oversimplification.

It may be that the prosperity of a Quaker has less to do with his own trustworthiness and more to do with the trustworthiness of his Quaker partners. That is, while a Quaker can indeed prosper trading with other Quakers, an untrustworthy opportunist may earn extra profit trading among Quakers.

Empirical observations, laboratory experiments, and game theory[2] suggest business trades that seem honest at a particular moment can be part of a strategic plan by an opportunist. The opportunist can initially make honest trades to build a reputation for trustworthiness as a means to earn repeat business or personal referrals. Maintaining that reputation through honest trade might last most of the life of the opportunist, and beyond if he incorporates. But there may be extra profit in eventually breaking trust (milking the reputation). Anticipating and perhaps forestalling such a collapse of trust in others can improve decision making when the trustworthy do business with the untrustworthy.

Fool me once, …

“Fool me once, shame on you. Fool me twice, shame on me,” is an ancient folk proverb that can help the trustworthy do repeat business with the untrustworthy. Notice it is not “Fool me once, shame on me” because it may pay to gamble that your partner is trustworthy, or is at least motivated to act that way. “Fool me twice, shame on me” reminds one that being fooled once is sufficient to reveal your partner as untrustworthy. Thus if you cannot profit by doing business with someone known to be untrustworthy and if you don’t think they can somehow grow to be or to act trustworthy, then cut them off after their first offense.

Your threat of cutting off or punishing those known to be untrustworthy gives the untrustworthy an incentive to delay exploiting your trust, initially allowing the mutual benefits of cooperation. There are many contexts where initial cooperation is followed by eventual exploitation. In the lab, it appears in various game-theory experiments.[3] In a long-distance race, bicyclists or runners may cooperate for most of the race by taking turns leading and letting others follow in their slipstreams, with cooperation collapsing near the end of the race. And in college towns, "no checks allowed" signs are more common near the end of an academic year. Considering one context in detail may suffice to offer guidance for managing the rest.

Exploiting trust

Suppose you can hire a worker for each of the next five years. Should you employ that prospect? And if so, what terms should you offer? To make those decisions, suppose you two agree that the prospect must work hard during any year that he is on salary, but may work as he pleases if he is on profit sharing (a lower salary plus a piece rate). Before work begins, you two gather and verify the following data:

  1. The prospect prefers shirking (giving zero effort) while on salary to working hard while on salary; and prefers working hard on salary to working while on profit sharing; and prefers working on profit sharing to working for someone else.
  2. Each year, the prospect earns you 75 thousand if he works hard while on salary, and 25 thousand if he works while on profit sharing.
  3. Each year, you lose 125 thousand if the prospect shirks while on salary.

(For the academic reader, the next section shows how that data arises in a textbook principal-agent model of game theory. In that model, there is less profit from a worker on profit sharing because the employer must compensate the worker for the greater risk he faces when uncertain factors beyond his control affect the profits being shared.)

Option A: One simple employment option is offering the salary each year. The best possible case is if the prospect turns out to be trustworthy: he honors his agreement to work hard, and you profit 5×75 thousand over 5 years. The worst case is if the prospect turns out to be opportunistic: he breaks his agreement to work hard, and you lose 5×125 thousand.

Option B: Another simple employment option is offering profit sharing each year. Regardless of whether the prospect is trustworthy or opportunistic, you profit 5×25 thousand over 5 years. That lies in between the best-case and worst-case scenarios of Option A.

There are many textbook case studies[4] and much empirical research[5] [6] where experiments in profit sharing (Option B) yield higher profits than guaranteed salary (Option A). Profit sharing may seem like the best option when most workers are untrustworthy, but the presence of even a few trustworthy workers may cause the opportunistic workers to initially mimic the trustworthy workers. That suggests experimenting with a third option.

Option C: Employ the prospect on a year-to-year contingent salary: offer the prospect a salary for a year if the prospect has worked hard in every previous year, but offer the prospect profit sharing (a lower salary plus a piece rate) if he has ever shirked. How will the smart opportunist respond to that offer? Consider two possibilities. One response is the opportunist shirks in his first year, and so gets one year of shirking while on salary followed by 4 years working under profit sharing. The other response is the opportunist works hard in his first 4 years then shirks in his last year, and so gets 4 years of working hard while on salary followed by one year of shirking while on salary. Either way, the opportunist gets one year of shirking while on salary, but he chooses the latter response because his other 4 years are working hard while on salary, which he prefers to working while on profit sharing.

Putting it all together, the contingent salary (Option C) is the superior option in this context. In the best-case scenario of a trustworthy worker, the contingent salary (C) generates the same profit (5×75 thousand) as the guaranteed salary (A) and more profit than profit sharing (B). And in the worst-case scenario of an opportunistic worker, the contingent salary generates 4×75 thousand over the first 4 years (while the opportunist mimics the trustworthy) followed by a loss of 125 thousand in the last year, which exceeds the worst-case scenarios under the other two options. The superiority of the contingent salary is even stronger if employment extends beyond 5 years: the worst-case scenario under the contingent salary gets closer to the best-case scenario under the guaranteed salary because there are more years of the opportunist working hard but still only one year of the opportunist shirking. That reasoning suggests that employers that have already run successful experiments with profit sharing over guaranteed salary should further experiment with contingent salary for their long-term employees.
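
For readers who want to vary the horizon or the payoff figures, here is a minimal sketch in Python of the arithmetic behind Options A, B, and C. The per-year numbers are the illustrative ones from the list above; the function names are illustrative, not part of the original analysis.

```python
# Payoffs under the three employment options (figures in thousands).
# Per-year numbers are the illustrative ones assumed in the text.
HARD_ON_SALARY = 75     # employer profit when the worker works hard on salary
PROFIT_SHARING = 25     # employer profit when the worker is on profit sharing
SHIRK_ON_SALARY = -125  # employer loss when the worker shirks on salary

def option_a(years, trustworthy):
    """Guaranteed salary every year."""
    return years * (HARD_ON_SALARY if trustworthy else SHIRK_ON_SALARY)

def option_b(years, trustworthy):
    """Profit sharing every year; the worker's type no longer matters."""
    return years * PROFIT_SHARING

def option_c(years, trustworthy):
    """Contingent salary: salary while the record is clean, profit sharing
    after any shirk. A smart opportunist works hard until the final year,
    then shirks once."""
    if trustworthy:
        return years * HARD_ON_SALARY
    return (years - 1) * HARD_ON_SALARY + SHIRK_ON_SALARY

for years in (5, 10):
    for name, option in (("A", option_a), ("B", option_b), ("C", option_c)):
        print(f"{years} years, Option {name}: best case {option(years, True):>4}, "
              f"worst case {option(years, False):>5}")
```

Running the sketch with a 10-year horizon illustrates the closing gap described above: the worst case under Option C rises to 550 thousand, against a best case of 750 thousand under the guaranteed salary.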

A principal-agent model of profit sharing

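For the academic reader, a minimal version of that model can be sketched as follows. This is the standard linear-contract setup with a risk-averse worker found in managerial economics textbooks;[7] the functional forms below (linear wage, normal noise, constant absolute risk aversion) are the usual textbook assumptions, not a unique derivation of this article's figures.

```latex
% Output depends on effort e and on noise beyond the worker's control:
%   y = e + \varepsilon, \qquad \varepsilon \sim N(0, \sigma^2)
% The contract pays a base salary \alpha plus a piece rate \beta on output:
%   w = \alpha + \beta y
\begin{align*}
  \text{Worker's certainty equivalent:} \quad
    CE &= \alpha + \beta e - c(e) - \tfrac{1}{2}\, r \beta^{2} \sigma^{2} \\
  \text{Employer's expected profit:} \quad
    \Pi &= E[y - w] = (1 - \beta)\, e - \alpha
\end{align*}
```

Here c(e) is the worker's cost of effort, r his coefficient of absolute risk aversion, and σ² the variance of the noise. A pure salary (β = 0) imposes no risk on the worker but provides no incentive beyond the agreement to work hard, while profit sharing (β > 0) restores incentives at the cost of the risk premium term above, which the employer must cover in expected pay. That risk premium is why the profit-sharing figure above (25 thousand) falls short of the hard-work-on-salary figure (75 thousand).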

Conclusions

“An unthinking person believes everything, but the prudent one thinks before acting.” Proverbs 14:15 (International Standard Version)

Empirical observations, laboratory experiments, and game theory offer coherent guidance for a prudent person doing repeat business with potentially untrustworthy partners. Such a prudent person does not trust in their partners' character, but simply in their partners' ability to recognize their own self-interest.

  • Even untrustworthy partners may initially act as though trustworthy in order to earn repeat business or referrals. So, do not trust a partner with a large deal just because they have previously acted trustworthy in small deals.
  • Be especially careful toward the natural end of a business relationship because that is when untrustworthy partners are especially motivated to break agreements.
  • The number of periods where untrustworthy partners act as though trustworthy increases as the total number of periods of the relationship increases. So, seek partners with whom you can potentially do more repeat business. And, if possible, divide a big deal into a series of smaller deals (such as paying workers monthly rather than yearly).
  • Punishing those acting dishonestly is an incentive for the untrustworthy to delay exploiting your trust, and so may forestall a collapse of trust and cooperation.

  

[1] Surowiecki, James. The Wisdom of Crowds (New York: Anchor Books, 2005), 120.

[2] Kagel, J. H., and Roth, A. E. (eds.). The Handbook of Experimental Economics (Princeton, N.J.: Princeton University Press, 1995), 8-10, 26-28.

[3] Ibid.

[4] Samuelson, William F. and Marks, Stephen G. Managerial Economics (Hoboken, N.J.: Wiley, 2014), chapter 14.

[5] Banker, R., Lee, S., Potter, G., and Srinivasan, D. "An Empirical Analysis of Continuing Improvements Following the Implementation of a Performance-Based Compensation Plan." Journal of Accounting and Economics 30 (2000): 315-350.

[6] Lazear, Edward P. “Performance Pay and Productivity,” (July 1996). NBER Working Paper 5672.

[7] Samuelson, Managerial Economics.


Big Data Decision Making

The process we use to gather information in making decisions can be as important as the decisions themselves. Do you rely more on sophisticated analytics or intuition? Using a self-report exercise, this article assists the reader in recognizing their decision-making style and offers a framework to enhance the process.

 

One of the key roles of a leader is decision-making. Staff issues, moving forward with initiatives, managing costs, expanding revenues, and hiring are a few of the challenges associated with management, and all require effective decision-making. Intangibles such as stewardship, empathy, and strategic insight play a role as well. Clearly, the process of gathering information becomes critical to achieving quality outcomes. What evidence convinces leaders that they have accurate and sufficient data to make the right call?

The purpose of this article is to assist readers in recognizing their decision framework: how they gather information during the decision process and ways in which they can enhance the quality of their decisions. In other words, they will consider which data sources they are likely to rely on in finding solutions: facts, intuition, statistics, or some combination of these and other factors.

Before continuing, however, we believe readers can take greater advantage of this article by doing the following exercise. In reviewing the items below, it would be helpful to jot down or mark responses, rather than relying on recall. The opportunity to review these initial insights will integrate effectively with the recommendations expressed at the conclusion of the article.

A Thought-Provoking Exercise

The benefit of this exercise is to create an active rather than passive experience. Consider a recent substantive workplace decision you needed to make. Think about the information-gathering process you engaged to reach your decision. Please respond to the following questions:

  • Describe the situation leading up to the decision.
  • What process did you use to gather the information that was used in making the decision?
  • When did you recognize that you had sufficient information to make the decision?
  • To what extent did you rely on information from records, reports, etc.?
  • Did you seek advice from colleagues before making the decision?
  • Were you pleased with the outcome that resulted from the decision?

Okay, now put aside your notes and continue reading the article. At the conclusion, you will be asked to return to these comments and incorporate the knowledge presented in the following pages.

The Value of Data-Driven Decision-Making

In 2007, Ian Ayres, the author of Super Crunchers, described the value technology presents to improve decision-making through the availability of unlimited information. "We are living in an age when dispersed discretion is on the wane."[1] The movement to highly sophisticated quantitative models suggests the role of intuition will become as anachronistic as hi-fi sound systems. Fortune 500 companies such as eBay, Amazon, Walmart, and Facebook rely on advanced analytics in predicting customers' buying preferences, managing inventory, and collecting purchasing patterns. Airlines rely on algorithms to determine not only the schedule, frequency, and number of flights, but promotional programs as well.[2] With the advent of relatively low-cost technology, small and medium-size firms can also benefit from advanced analytics.

McKinsey Global Institute (MGI) stated that data are becoming the main factor of production, similar to physical or human capital. Their studies suggest that the use of sophisticated analytic models could result in a $300 billion per year savings to the American healthcare system.[3] In addition, the report described the value of these models for managers engaged in operational and tactical decisions as well as their strategic benefits for senior executives. For example, quantitative data can enhance the selection process. Rather than relying solely on interviews, recommendations from former supervisors, and a sense of culture fit, through analytics, an open position can be assessed using a number of variables thought to identify indicators of quality performance for the role. Once the ideal candidate characteristics are determined, assessment techniques can be administered to evaluate applicants and find the right fit.[4]

The use and value of analytics was clearly identified in Michael Lewis' book, Moneyball, which also became a successful box office film. Faced with a tight budget for acquiring professional baseball talent, the general manager of the Oakland Athletics relied on sophisticated data rather than observation and surface-level statistics to trade for players, which led the Athletics to a division title (Michael Lewis, Moneyball, W.W. Norton & Co., 2003) (YouTube: https://www.youtube.com/watch?v=yGf6LNWY9AI).

The evidence suggests that given the analytic tools available, the ability to crunch the numbers will raise decision quality and render intuition obsolete as a means of attaining desired outcomes. Although it is reasonable to assume that sophisticated analytical models, such as Big Data, are the answer to 21st-century complexity, there are those in the academic and business community who question the strict devotion to quantitative decision frameworks.

The Case for Intuition

A number of concerns have been raised, however, about overreliance on quantitative methods as the future of corporate decision-making. One assumption that drives the value of Big Data is the belief that the quantitative input used to generate output is, indeed, accurate and includes the appropriate variables that would result in effective decisions. The reliability of the numbers can come into play, and asking the wrong questions can impede the value of the results obtained. Advanced quantitative models will certainly offer more information; however, with a greater focus on granular data, the big picture could be overlooked. As pointed out in a recent article, "the sheer size of today's data means that companies need to be more careful than ever to treat data as a slave rather than a master."[5]

A number of studies have revealed that more information about a purchasing decision does not necessarily result in better choices. This has been shown to be the case, for example, in Wall Street stock choices, job selection, grocery shopping habits, and college test taking, as well as a number of other choices.[6] [7] In addition to individual decision-making conundrums associated with information overload, examples from the corporate world have also been identified. A few years ago, Cadbury, the British chocolate maker, produced an ad that was screened by the firm Millward Brown, the world's largest tester of advertising. The analytic research indicated that the ad, which featured a gorilla playing the drums with background music from a hit Phil Collins song, did poorly in tests with consumers.[8] The subject responses indicated the ad had limited brand appeal and awareness. The quantitative data clearly demonstrated a no-go recommendation. Even with Millward Brown's recommendation, Cadbury decided to run the ad anyway, with stunning success. Millions of online views, along with better perceptions of the Cadbury brand, resulted in higher sales. In this case, intuition, rather than rigorous quantitative analyses, resulted in a better outcome.

The medical profession is well noted for its reliance on algorithms for patient diagnoses. The practice of medicine is greatly aided by the availability of vast amounts of quantitative data in assessing physical conditions. One of the specialties where this advantage may be most prevalent is emergency medicine. However, research has demonstrated that experienced emergency room physicians often rely on intuition in carrying out protocol, particularly given the high stakes, high stress, rapidity, incomplete information, overwhelming data, and overlapping processes associated with the environment.[9] It is not unusual for an emergency room doctor to intuit that a symptom the medical tests dismiss is in fact a potential problem and, in some cases, a life-threatening one.

So, how do we sort out this conundrum? Is intuition indeed a relic of 20th-century management or a realistic approach to the complexity associated with strategic decision-making? Rather than attempting to determine whether sophisticated quantitative approaches offer better decision-making criteria than intuition, or vice versa, perhaps our attention should be directed toward a process that integrates the two in a manner that would result in better outcomes. The following section presents a practitioner perspective on the tension between rational and intuitive modes of decision-making.

Employee Schizophrenia: The Multiple Personalities of an Ad Agency

An example of the rational thought versus intuition enigma can be seen through an advertising industry quandary. Although the advertising business is often regarded for its fast pace and creativity, much of the day-to-day work performed within an agency is cyclical, methodical, data-oriented, and process-driven. Many an agency person can attest to client mandates to adhere to quantitative ad effectiveness benchmarks, segmentation research, tracking studies, and the like. These mandates tend to increase with the size of the client's business. In the right hands, these data can guide the agency and the marketer alike to make better-informed decisions, garner insights that generate a competitive advantage, and avoid otherwise costly mistakes.

There are also times when this data can be a hindrance and even a serious detriment to the output of the agency. Say, for example, a marketer has conducted a lengthy and costly customer segmentation study. Not surprisingly, plans will often be postponed and decisions will be delayed until the results of the research arrive. The promise of quantitative data creates an optimistic anticipation among the ranks, almost as if the results will reflect something of a silver-bullet quality. However, with rare exception, the actual results are far from magic, but rather beg more questions, add layers of complication, create confusion, and pose even more difficult decision-making challenges. It's never as clean and clear as people think it will be.

Data provide the promise of certainty. However, in business, certainty is not too dissimilar from a desert oasis; pursuing it typically results in disappointment. Interestingly, many an ad agency veteran will testify that agencies are at their best during either a) client crisis events or b) new business pitches. Both scenarios have several key factors in common: a lack of time, a lack of resources, and a lack of information. These factors tend to force people into intuitive decision-making mode. It is worth noting that the most effective agency people are able to toggle between the two modes, depending on the circumstances of the project.

Six Guides to Advancing Quality Decision-Making

The recommendations below may prove useful in resolving, in part, the conundrum associated with intuition in data-rich environments. These principles may allow the decision maker to pause and adopt a broader perspective in sorting out key issues in the process.

  1. Mind the mission: Be crystal clear about the goal. It is Management 101; clearly defined outcomes and objectives are essential in making effective decisions. With that, practice a certain amount of self-honesty to determine if the mission has shifted to supporting and/or validating the data, rather than using data to achieve the stated objective. This strategy can slow the decision-making process to allow the leader to be more focused on the goals to be accomplished.[10]
  2. Don’t treat data like a person: Data are inanimate and silent; treat them accordingly. Resist the temptation to give data a seat at the head of the table and a loud voice. Don’t be afraid to ask yourself, “Does this make sense?” Similarly, appoint the most appropriate people to the task. Nasr, CEO of Armedia (2015), describes the value of Subject Matter Experts (SMEs), who reflect on the richness of the data to substantiate the value of the information.[11]
  3. Listen to your gut, particularly in the early stages: Research suggests that physical sensations (i.e., gut feelings) can be an extremely reliable source for effective decision-making.[12] We often get that sense of "knowing" long before we are able to verbalize it. So, next time, take notice when your gut is talking and do not be so quick to discount it, but instead incorporate those feelings in a process that verifies their substance. Gladwell refers to the phenomenon of 10,000 hours of experience as a means of building intuitive expertise.[13] We may conclude, therefore, that "gut" reactions may have a place in the early stages of the decision process. Perhaps it is one's experience, based on multiple trials of the same or similar events, that can activate both acute perception of the issue at hand and a flurry of creativity that may expand alternative perspectives. However, as described above, there is likely to be added value if the decision-maker can utilize tools that "apply the brakes" before taking hasty action.
  4. Recognize emotions: Being aware of the emotions related to the issue associated with the decision can be helpful in teasing out the extent to which they may bias your action.[14] Optimism or pessimism surrounding the circumstances or individuals involved in the decision outcome can cloud an objective decision. The decision-maker needs to ask, “To what extent are my emotions driving my actions?”
  5. Reflective Inquiry: There is evidence to suggest that taking a step back allows the decision-maker to consider further available options.[15] [16] For example, journaling offers the opportunity to move intuition from mind to paper or electronic visualization. Writing a script in which the “author” examines the goals and initial reactions may result in widening the decision pool, leading to more effective outcomes. Questioning assumptions in the dialogue could provide insights that were not recognized in the initial process. The writing exercise allows the decision-maker to consider ways in which the initial judgment could be improved and, therefore, may portray the “landscape” in a broader manner.
  6. Devil’s Advocate: Intentionally seeking out a contrary opinion may strengthen the process.[17] Asking a member of the organization who has knowledge of the issues the question “What is wrong with this action?” may bring about an awareness of the limitations of the initial action and result in an improved decision.

Guiding Intuitive Decision-Making in Data-Rich Environments

In summarizing the dichotomy associated with rational thought versus intuition, please return to the exercise we asked you to engage in at the beginning of the article. The purpose of this "drill" was to examine the extent to which you used data gathering in making your most recent decision. The more you focused on analytic investigation ("What data do I need to collect?" "How do I make sense of the data?" "What actions do I take based on the analysis of the data?"), the more you embraced rational thought. On the other hand, the more you drew on feelings about the decision ("What does my experience tell me about this decision?" "Are there any lessons from the past that I can draw from?" "How does the decision feel to me?"), the more you relied on intuition. Both processes could have led to positive results, but the purpose of this article is to recognize the value of each and of their integration: combining rational thought and intuition to enhance the quality of decisions.

Intuitive decision-making hinges on two key factors: experience and time. It stands to reason that the more experience one has with a business, an industry, a consumer group, or so on, the more easily one will be able to intuit an appropriate course of action. Likewise, the less time and information one has, the more one is forced to rely upon intuitive sense.

 

[1] Ayres, I. (2007) Super Crunchers: Why thinking-by-numbers is the new way to be smart. Bantam: NY.

[2] Watson, H. (2013) The business case for analytics. BizEd, June.

[3] The Economist. (2011) Schumpeter: Building with big data. May 28, p. 74

[4] Kahneman, D. (2012) Thinking, Fast and Slow. Farrar, Straus and Giroux: NY.

[5] The Economist.

[6] Ariely, D. (2010) Predictably irrational: The hidden forces that shape our decisions. Harper Collins Publishers, NY.

[7] Kahneman, D.

[8] http://www.youtube.com/watch?v=NHtEyDrD4oA

[9] Coget, J.F., Keller, E. (2010) The critical decision vortex: Lessons from the emergency room. Journal of Management Inquiry, 19, 1.

[10] Kahneman, D.

[11] Nasr, J. (2015) Personal conversation with the author.

[12] Bechara, A., Damasio, H., Tranel, D. & Damasio, R. (1997) Deciding advantageously before knowing the advantageous strategy. Science, 275, 5304.

[13] Gladwell, M. (2008) Outliers: The story of success. Little, Brown & Co: NY.

[14] Coget.

[15] Ibid.

[16] Sadler-Smith, E. & Shefly, E. (2004) The intuitive executive: Understanding and applying ‘gut feel’ in decision-making. Academy of Management Executive, 18, 4.

[17] Ibid.


Thinking, Fast and Slow by Daniel Kahneman

Thinking, Fast and Slow

by Daniel Kahneman

Farrar, Straus and Giroux

New York, 2011

512 pages

5 stars: Stop what you're doing and read this book!

As a manager, are the decisions you make rational? How much would you bet that they are or are not? Does your formulation of problems and objectives influence how rational the decisions of others will be? This book can help you check your answers.

The author, Daniel Kahneman, has spent most of his professional life studying how people actually make decisions. His field is "behavioral economics." This book is the capstone of a career that has deeply penetrated the processes of brain functioning and challenged classical economic assumptions, work conducted largely with his longtime collaborator Amos Tversky, and for which Kahneman received the Nobel Memorial Prize in Economic Sciences in 2002. The book is thorough, clearly written, and easily understood. It offers important lessons not only for economists, but especially for managers and for anyone who cares about the value of the decisions they make.

Many managers are familiar with the early insights of Herbert Simon, also a Nobel Laureate, that alerted the economics world to some of the limits of rationality. For example, he observed that many successful managers frequently made decisions by "satisficing," i.e., picking the first solution that meets a criterion rather than holding out for the optimal solution.

Here, Kahneman greatly expands on Simon’s work to include an even broader range of psychological and behavioral aspects of decision making. He shows how there are not only rational aspects to decision making, but other human features as well. His challenge to economic orthodoxy is that the situation in which the decision maker finds himself does matter, as well as how he feels about it, whether the issue involves a gain or a loss, how a question is presented, what it is compared to, and other characteristics of the issue’s formulation.

Essential factors of effective decision making such as “accessibility,” “broad and narrow framing,” “key heuristics,” “anchoring,” and “attribute substitution” are carefully spelled out. These constitute in part “prospect theory” for economics. A central conclusion is that there are essentially two modes of thinking, one rapid and more emotional, essentially intuition, and another, slower, more difficult and more analytical. Both have value, but the payoff for managers is to know the difference, know where they fit, and become more aware of how and when to use them. Examples extend to such management issues as: when you can trust an expert, how to formulate business strategies, why labor negotiations are often difficult and, in golf, why you might try harder to avoid a bogey than to achieve a birdie.

You will find that this book contains more details of psychological research than a typical management offering and is thus not a quick read. However, it is definitely worth the time it takes to grab hold of its key findings and conclusions.


The Winner’s Curse and Optimal Auction Bidding Strategies

Auctions are popular mechanisms for the exchange of goods and services in the marketplace. Common examples of items offered at auctions include real estate, mineral rights, construction contracts, agricultural products, United States Treasury bills, and government procurement contracts.[1] Auctions are now even more commonplace in our personal lives, where online auction sites have made it possible for individuals to bid for and sell items en masse.[2]

Auctions are particularly useful whenever there is uncertainty about the value of an item to be sold and the seller is willing to accept whatever price the market (i.e., the bidders) establishes for the item. Unfortunately, the competition amongst bidders in auctions often results in a phenomenon termed the “winner’s curse,” an outcome in which the winner prevails by submitting a bid that is not only higher than competing bids, but also higher than the true value of the item. In this article, we discuss how this phenomenon can be modeled via simulation and how the simulation results can be used to develop strategies for hedging bids, creating an optimal balance between the probability of winning and the economic gain from the transaction.






Auction Settings

The study of auctions and bidding strategies is complicated by the great variety in types of auctions. Two of the most common types are the open auction, in which the bidding price is publicly announced in ascending order (i.e., an English auction) or descending order (i.e., a Dutch auction), and the closed auction, in which sealed bids are submitted simultaneously and the winner is the individual submitting the highest bid, paying either that bid (a first-price auction) or, less frequently, the second-highest bid (i.e., a Vickrey auction).[3]

Here, because of their predominance in business settings, the authors focus on bidding strategies for first-price “sealed bid” auctions, where either the highest bid wins or the lowest bid wins. The relatively limited amount of information available for bidders participating in these types of auctions makes the analysis of such bidding strategies all the more important.[4] The focus was further narrowed to sealed bid auctions where the highest bid wins the auction; however, it is straightforward to consider mirror cases where the lowest bid prevails, as would be the case with auctions for government procurement contracts, for example.

Regardless of auction type, one of two value models is appropriate:

  • Independent private values model: The value of the object at auction is different for the various bidders, for example, an antique to be purchased for a private collection.
  • Common value model: The value of the object is approximately the same for all rational bidders. This is the model most typical for business auctions, and it is also the model that gives rise to the “winner’s curse.”

The Winner’s Curse

Uncertainty exists in first-price sealed bid auctions with common item values for many reasons, including:

  • Bidders have access to different information,
  • Bidders interpret the same information differently, and
  • Valuation of items is a complicated and subjective process.

To see how such uncertainty can lead to the winner’s curse, consider the following example: Several firms are bidding for the rights to an oil exploration tract and each firm has developed its own internal estimate of its value. The firm winning this auction will typically be the one that produced the highest estimated value for the tract of land. However, the greater the submitted bid, the greater too is the likelihood that such a bid will exceed the true value of the tract, resulting in a “cursed” winner.

The winner’s curse was first identified, and then verified empirically, in the early 1970s for precisely this type of situation.[5] Since that time, economists, operations researchers, sociologists, and others have studied the phenomenon, confirming its existence in numerous empirical studies. The amount of the winner’s curse is the difference between the true value of the item being auctioned and the amount paid for it by the winning bidder.

An illustration of the curse commonly employed in the classroom setting is the auctioning of a jar of coins to student “bidders.” Almost always, the winning (i.e., the highest) bid exceeds the true value of coins contained in the jar, even though the average of all the bids is typically less than the true value, due to students’ risk aversion.[6] The winner’s curse has also been modeled mathematically, and its existence has been confirmed under very general conditions, including those cases where an object being auctioned has different values for the bidders involved and where competitors’ estimates are sometimes biased (i.e., higher or lower than the true value), for all types of auctions.[7]

Implications for Bidders

In view of the winner’s curse, how should a bidder behave in an auction to avoid or at least minimize its effect? Intuitively, it seems one should probably bid less aggressively. Analytical results for very simple auction models suggest that rational bidders in common value sealed bid auctions can generally avoid the winner’s curse if they presume that their estimate of an item’s value is the highest amongst all competitors and then bid some fraction of their original estimate.

In principle, there is no real cost associated with such a strategy, as losing bidders neither gain nor lose anything; moreover, taken to the extreme, bidders can completely eliminate the threat of the winner's curse by bidding such a small fraction of their estimate as to never win an auction. Such a conservative strategy would clearly not be viable for bidders seeking to generate a profit; therefore, the challenge is to determine the fraction of the estimated value that should be bid to optimally balance risk and reward. In addition, common basic findings in virtually all research on auctions indicate that bidders should be aware that the winner's curse is most severe in the following situations:

  1. Bidders have less information than their competitors,
  2. There is significant uncertainty about the true value of the item being auctioned, and
  3. There are a large number of bidders.

Under these conditions, the need for an optimal bidding strategy becomes more critical.

Developing an Optimal Bid

To illustrate how one can develop an optimal bid in an auction, consider the example of a two-bidder, first-price sealed bid auction for a tract of real estate. As is the case in most real estate auctions, we assume that the exact value of the tract of land is uncertain and that the bidders know enough about the parcel to describe its worth with a probability distribution over a range of likely values (e.g., from analysis of "comparables"). For simplicity, we assume that the true value of the tract can be described by the normal distribution (i.e., the bell curve) with an expected value of $1 million and a standard deviation of $200,000 (i.e., the tract is worth $1 million on average). We also assume that both bidders will base their respective bids on their own proprietary estimates of the value of the tract and that both bidders have generated an unbiased estimate of that value. However, because there is great subjectivity inherent in their estimation processes, there is also significant uncertainty in the values bid. Hence, the two unbiased bids are not deterministic (i.e., certain); instead, they derive from a normal probability distribution, with a mean of $1 million and a standard deviation of $200,000. In the end, the higher of the two bids will be deemed the winner of the auction, and the winning bidder will realize a "profit" equal to the value of the tract of land acquired, minus what she or he paid for it.

Even with this simple example, an analytical estimate of the winning bidder's expected profit would be quite difficult, requiring, for example, the derivation of the probability distribution for the maximum of two random normal variables, and the integration of the functional form for this distribution to determine the expected value.[8] An empirical solution, based upon historical data collected for repeated auctions of similar tracts of land, could be used instead; however, such data are usually difficult to obtain, as the information from sealed bid auctions is typically not publicized. It seems that a rational bidding strategy based upon quantitative analysis is not readily available to the practitioner.
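
One known special case gives a sense of both the difficulty and the size of the curse: if the two bids are independent draws from the same normal distribution, the expected value of the higher bid has a closed form. The sketch below applies it to the example's parameters, under the simplifying assumption that the bids are independent of each other.

```latex
% Expected value of the larger of two i.i.d. normal draws:
\begin{align*}
  X_1, X_2 &\sim N(\mu, \sigma^2) \ \text{i.i.d.} \\
  E[\max(X_1, X_2)] &= \mu + \frac{\sigma}{\sqrt{\pi}}
\end{align*}
% With \mu = \$1{,}000{,}000 and \sigma = \$200{,}000, the expected winning bid is
% about \$1{,}000{,}000 + \$200{,}000/\sqrt{\pi} \approx \$1{,}113{,}000: an expected
% overpayment on the order of \$113{,}000 when neither bidder hedges.
```

Spread across two symmetric bidders, each bidder's unconditional expected profit is roughly half that overpayment, broadly consistent with the loss of more than $50,000 reported in the discussion of Figure 1 below.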

Simulation as a Tool for the Analysis of Auctions

Fortunately, commercially available simulation packages, such as @RISK or Crystal Ball, can be used to simulate repeated random samplings from realistic probability distributions for the values of the tract and the bidders' estimates. The maximum of the two bids can be identified to determine a winner, and simple arithmetic can be used to calculate the winning bidder's profit. If this process is repeated a large number of times, and those results are averaged, the long-run expected profit for a bidding strategy can be determined computationally. In short, one can create one's own set of empirical data for the bidding analysis: a sort of computational empirical analysis.

In Figure 1, we summarized the results from such a simulation study (for the example above) by plotting the output data for both the expected profit and the probability of winning over a variety of bidder-specific scenarios, given various opposing strategies. We define a strategy in terms of “hedging,” where the hedging percentage is the reduction in the bidder’s expected value that is used to calculate the final bid submitted in the auction.[9] For example, a 10 percent hedge indicates that a bidder would submit 90 percent of his or her original expected value as a final bid for the auction.
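
As a concrete illustration, here is a minimal sketch of such a simulation in plain NumPy rather than a commercial package. It assumes one plausible reading of the setup, namely that each bidder's estimate equals the true value plus independent, unbiased noise; the parameter values match the example, while the function and variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000                       # number of simulated auctions
MU, SIGMA = 1_000_000, 200_000    # mean and std. dev. from the example

def simulate(hedge_1, hedge_2):
    """Bidder 1's expected profit and win probability in a two-bidder,
    first-price sealed bid auction, given each bidder's hedging percentage."""
    value = rng.normal(MU, SIGMA, N)             # true value of the tract
    est_1 = value + rng.normal(0, SIGMA, N)      # unbiased but noisy estimates
    est_2 = value + rng.normal(0, SIGMA, N)
    bid_1 = (1 - hedge_1) * est_1                # hedged final bids
    bid_2 = (1 - hedge_2) * est_2
    win_1 = bid_1 > bid_2
    profit_1 = np.where(win_1, value - bid_1, 0.0)  # losers neither gain nor lose
    return profit_1.mean(), win_1.mean()

# Sweep Bidder 1's hedge against an opponent who hedges 30 percent.
for hedge in (0.0, 0.1, 0.2, 0.3, 0.4, 0.5):
    profit, p_win = simulate(hedge, 0.30)
    print(f"hedge {hedge:4.0%}: expected profit {profit:>10,.0f}, P(win) {p_win:.2f}")
```

Curves like those in Figure 1 come from repeating this sweep over a grid of opponent hedges. With the noise assumption used here, the qualitative shape (a loss at zero hedge, an interior profit maximum, and a monotonically falling win probability) matches the figure, though the exact numbers depend on how the estimates are modeled.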

The simulation results shown in Figure 1 provide clear evidence of a winner’s curse; moreover, we can see the cost associated with the curse. For example, for the case where both bidders hedged their original bids by zero percent (i.e., they both bid their original estimated values), the expected profit is a loss of more than $50,000. And, as one might expect, both bidders are equally likely to “win” the auction. This can be seen at the extreme left-hand tails of the solid black and dashed black curves, respectively. While this is a desired outcome for the seller of the tract of land, it is clearly a suboptimal outcome for the bidders involved.

Assuming that sophisticated bidders are at least intuitively aware of the winner’s curse, we next consider the case where one or both bidders adopt different hedging strategies. Under these circumstances, we expect the probability of winning the auction to decline with increasing hedging percentage, holding the opposing bid constant.

This can be seen in Figure 1 by observing the following characteristics:

  1. The dashed lines, which show the probability of winning as a function of a bidder's hedge, all monotonically decrease as the magnitude of the percent hedge increases, and
  2. Those dashed lines are ordered from top to bottom in decreasing amounts of the opponent’s hedge amount.

A related, and perhaps more important observation, can be made from the solid profit curves in Figure 1; one can see that there is a maximum value associated with each of the curves. At the extreme left-hand side of each of the profit curves, the bidder suffers from the winner’s curse, since she or he realized a high probability of winning the auction, but paid too high a price for the tract. Conversely, at the right-hand extreme of each curve, conservative bidding offers the potential for high profit, and yet the conservative bid almost never results in a win for the bidder.





Figure 1: Expected Profit and P(Winning) vs. Hedge % for a Two-Bidder Auction





The greatest height achieved by each profit curve in Figure 1 offers an indication of the amount of hedge that would lead to maximum profit, given a particular opposition strategy. For example, we can see that profit is maximized when the bidder’s hedge is in the range of 20 to 40 percent, depending on the particular balance or imbalance between the bidder’s hedge and the competitor’s hedge. We believe this clearly demonstrates the extraordinary value of using simulation to understand auctions and to develop optimal bidding strategies.

Formulating a Bidding Strategy with Incomplete Information

If our example bidders could agree to share information and cooperate with each other in aggressively hedging their bids, they could drive the auction price down to a small fraction of its actual value and produce a large profit for the winning bidder, assuming the seller was obliged to sell to the higher bidder. Evidence of this can also be seen in Figure 1: the profit to the winning bidder would be in excess of $200,000 if both bidders agreed to hedge their original bids by 50 percent. It is also clear from the figure, however, that there would be a strong temptation for one of the bidders to deviate from such a plan (e.g., hedging slightly less than the agreed amount), thereby increasing her or his chances of winning the auction and receiving a handsome profit. Thus, even if such cooperation amongst bidders was considered fair and ethical, it is unlikely that it would be at all common in practice.

A more realistic scenario is an auction where bidders neither act in cooperation with each other nor share information; instead, they prepare rational bids, and they expect other bidders to behave in an equally rational manner. Thus, while we have seen that simulations can be a very useful tool for determining auction outcomes, given bidder strategies, we would like to address the following question: How should a bidder behave without knowledge of the strategy her or his opponent will use?

For auctions characterized by rational bidders with incomplete information, economic game theory can be used to explore the possibility of a dominant strategy for bidding in auctions. Unfortunately, the common assumptions of game theory must be simplified radically in order to find the analytical point of equilibrium between bidders. Nevertheless, some researchers have attempted to derive the optimum bid for a two-person auction game by assuming restrictive probability distribution forms for bids or the value of an auction item.[10], [11] By using simulation models, however, we are not bound by simplifications of or restrictions to the assumptions. To illustrate, the payoff table in Figure 2 was generated using output from repeated simulation runs of the example above, and the table shows the payoffs for our two bidders over a wide range of bidding strategies. A comparable graphic could be generated for virtually any probability distribution of values or bids, not simply the normal distribution, and for very general parameter specifications.

Once generated, the payoff table below can be used to search for the equilibrium point between two bidders. In the jargon of game theory, this is referred to as a Nash equilibrium: the point or points where bidders cannot unilaterally improve their expected gain by moving to a different strategy.[12]

  • If Bidder 1 hedges at zero percent, Bidder 2’s optimal strategy would be to hedge at 25 percent (green cell showing the maximum profit in column two).
  • If Bidder 2 hedges at 25 percent, Bidder 1’s optimal strategy would be to respond by changing to hedge at 25 percent (yellow cell showing the maximum profit for Bidder 1 along that row).
  • If Bidder 1 hedges at 25 percent, Bidder 2’s optimal strategy would be to respond by changing to hedge at 30 percent (orange cell).
  • Finally, if Bidder 2 hedges at 30 percent, Bidder 1's optimal strategy would be to respond by changing to hedge at 30 percent (see the blue cell).

The last combination of strategies (i.e., Bidder 1 hedges at 30 percent; Bidder 2 hedges at 30 percent) is a Nash equilibrium, because neither bidder can improve her or his profit by changing the hedge percentage. Note also that the profit for Bidder 1 ($111 thousand) is very close to the profit for Bidder 2 ($110 thousand), with the slight difference being a computational artifact of numerical simulation. (See Figure 2: Expected Profit Table for a Two-Bidder Auction.)
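
The best-response walk traced in the bullets above can be automated once a payoff table exists. The sketch below reuses the simulate() function from the earlier sketch to build a table on a coarse hedge grid, then searches for cells from which neither bidder would deviate; the grid, and the symmetry assumption that lets one table serve both bidders, are simplifications.

```python
# Build a simulated payoff table and search it for pure-strategy Nash equilibria.
# Assumes simulate() from the earlier sketch. The game is symmetric, so
# payoff[i][j] is Bidder 1's expected profit when Bidder 1 hedges hedges[i]
# and Bidder 2 hedges hedges[j]; Bidder 2's payoff in that cell is payoff[j][i].
hedges = [0.00, 0.05, 0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40]
payoff = [[simulate(h1, h2)[0] for h2 in hedges] for h1 in hedges]

def best_response(opponent_index):
    """Index of the own hedge that maximizes expected profit against a fixed
    opponent hedge (by symmetry, the same function serves either bidder)."""
    return max(range(len(hedges)), key=lambda k: payoff[k][opponent_index])

equilibria = [(hedges[i], hedges[j])
              for i in range(len(hedges))
              for j in range(len(hedges))
              if best_response(j) == i and best_response(i) == j]
print("Pure-strategy equilibria (Bidder 1 hedge, Bidder 2 hedge):", equilibria)
```

Under the noise assumptions of the earlier sketch, this search settles on a symmetric cell, in the same spirit as the 30 percent/30 percent equilibrium identified above, though the exact grid point depends on the distributional assumptions and on simulation noise.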

In this example, we assumed that our bidders acted rationally, and of course, bidders sometimes behave in irrational ways at auction, especially when the item up for auction holds sentimental or emotional value to a particular bidder.[13] Still, our analysis of the rational bidder case can be used to set the standard for more complex auction strategies. Also, to limit the length of our discussion, we considered a simple case with only two bidders; however, we could generalize our two-bidder example to cases where there are n bidders in an auction, in two ways. First, we could stay with a two-player model and assume that the opponent is the collective of all other bidders, who act in the same way. Alternatively, we could use the approach discussed here to rigorously model each bidder in an auction; that is, we could simulate each bidder in the n-bidder game. The difficulty with the latter approach lies in the computational burden, which increases as more bidders are added. Finally, under certain assumptions, it is also possible to model the case of n bidders as sequential auctions between pairs of bidders whose bids are repeated until equilibrium bidding strategies are reached.[14]

In each of these cases, the ability to simulate a two-bidder auction serves as the basic framework, and the information required to set up the simulation analysis is simply the estimated probability distribution functions for the value estimates for both the bidder and the competitor(s).

Conclusion

Given the prevalence of auctions in business today, it is important for decision-makers (i.e., bidders) to fully understand the nature of auctions and the winner’s curse. We have shown that simulation, an analytical tool from the management science field, can be of tremendous value in generating empirical evidence about auctions when actual data does not exist. Moreover, simulation can deliver insights for the formulation of effective bidding strategies. The results from a simulation analysis of the relatively simple example discussed in this paper, for instance, allow us to make several important observations about auctions:

  1. When bidders do not hedge their value of an item in a competitive auction, they will likely pay too high a price for an item, if they win an auction. The expected loss can be quantified under very general conditions using simulation modeling.
  2. To avoid the winner’s curse, rational managers should choose a valuation model carefully and then decide on an appropriate hedge for their final bid. The expected profit, as a function of hedging percentage, has a maximum at the point where the tradeoff between risk and reward is optimal. This optimal hedge can also be found using simulation.
  3. In the usual case where competitor-bidding strategy is unknown, a range of bidding strategies, for both the bidder and the competitor(s), can be simulated and then economic game theory can be used to determine the optimal bidding strategy. While this type of analysis cannot guarantee an outcome or provide assurance that the winner’s curse will be avoided, the resulting bidding strategy can provide the bidding decision-maker with the best opportunity for success.

As mentioned earlier, we have focused on the so-called common value model, where the value of the object, and thus the profit from acquiring it, should be approximately the same for all rational bidders. An interesting extension that is beyond the scope of this paper would be to consider the independent private values model, in which the value of the object at auction will be different for the various bidders, to see what effect the additional uncertainty about competitor valuation would have on bidding strategy.


[1] R. McAfee and J. McMillan, "Auctions and Bidding," Journal of Economic Literature, 25 (1987): 699-738.

[2] M. Rothkopf and S. Park, "An Elementary Introduction to Auctions," Interfaces, 31 (2001): 83-97.

[3] M. Rothkopf and R. Harstad, "Modeling Competitive Bidding: A Critical Essay," Management Science, 40 (1994): 364-384.

[4] W. Vickrey, "Counterspeculation, Auctions, and Competitive Sealed Tenders," Journal of Finance, 16 (1961): 8-37.

[5] E. Capen, R. Clapp, and W. Campbell, "Competitive Bidding in High-Risk Situations," Journal of Petroleum Technology, 23 (1971): 641-653.

[6] R. Thaler, "Anomalies: The Winner's Curse," The Journal of Economic Perspectives, 2 (1988): 191-202.

[7] S. Oren and A. Williams, "On Competitive Bidding," Operations Research, 23 (1975): 1072-1079.

[8] P. Milgrom and R. Weber, "A Theory of Auctions and Competitive Bidding," Econometrica, 50 (1982): 1089-1122.

[9] M. Rothkopf, "On Multiplicative Bidding Strategies," Operations Research, 28 (1980): 570-575.

[10] L. Friedman, "A Competitive Bidding Strategy," Operations Research, 4 (1956): 104-112.

[11] E. Dougherty and M. Nozaki, "Determining Optimum Bid Fraction," Journal of Petroleum Technology, 27 (1975): 349-356.

[12] B. Smith and J. Chase, "Nash Equilibria in a Sealed Bid Auction," Management Science, 22 (1975): 487-497.

[13] M. Rothkopf, "A Model of Rational Competitive Bidding," Management Science, 15 (1969): 362-373.

[14] S. Oren and M. Rothkopf, "Optimal Bidding in Sequential Auctions," Operations Research, 23, no. 6 (1975): 1080-1090.


Moral Markets: The Critical Role of Values in the Economy

By Paul J. Zak (Ed.)
Princeton University Press, 2008

5 stars: Stop what you’re doing and read this book!

Moral Markets is a collection of essays based on the premise that markets are driven by virtue, an ambitious argument in today’s climate of market distrust. The authors’ goal is to raise awareness among market stakeholders, a category that includes almost everyone, that markets are “good” today and have been throughout time. By no means is this an “airport” book on management; the reader is challenged to engage in a profound study of human behavior and values.

Moral Markets is edited by Paul Zak of Claremont Graduate University, who is known for his work in neuroeconomics, a transdisciplinary study of how decision making, risk, and trust produce good economic outcomes. The book is organized into five parts: Philosophical Foundations of Values; Nonhuman Origins of Values; The Evolution of Values in Society; Values and the Law; and Values and the Economy.

We are reminded early on that “moral markets” (that is, economies where exchange is fair, good, truthful, efficient, and productive) are really communities of individuals whose practice of personal responsibility has proven to yield the best possible outcome for society at large. The takeaway is that the good life of the individual translates to the good life of the community, and it is fueled by individual achievement through the maximization of talents. The authors affirm that moral markets are not only the place where society participates in good exchange; they also serve as a means for individuals to derive purpose from their life endeavors. They write, “Meaningful human activity is that which intends the good rather than stumbling over it on the way to merely competitive or selfish goals, and the predictable outcome of such behavior is not the mysterious result of an invisible hand but of our own good intentions, amply rewarded.”

Economies ebb and flow because of traditional business cycles. These same economies can shudder and collapse when the morality of their markets is compromised, a phenomenon more evident today than it has been in decades. The future of the economy does not hinge on a dose of public policy or even an infusion of morality. Instead, as Moral Markets asserts, we need to reevaluate the entire spectrum of values and virtues, from the core of the individual to the soul of the society. Moral Markets provides the serious student of business, government, and society with the necessary intellectual tools to reengineer their knowledge and practice of virtue, and perhaps to enlighten them as to why and how “goodness” ultimately serves society.


Knowledge is Power…

How could your organization more effectively warehouse and mine its information?

Information systems have traditionally assisted managers and decision-makers by providing better data, quicker access, better models, or more optimal solutions. Information systems have been used to help us do tasks more quickly and cost-effectively, and have been most helpful to managers in information gathering and decision-making. This is no longer enough, however. Today’s information systems must truly add value to the organization through the creation, capture, distribution, application, and leveraging of knowledge – or knowledge management.

The role of information technology is being transformed in four major ways…

  1. The traditional emphasis on fast access to data, gathered in functional departments, is shifting toward more centralized access to integrated and coordinated data to support decision-making.
  2. Information systems are focusing less on summarizing data in reports, and more on sifting through huge volumes of data for potential problems or opportunities, hidden patterns, and meanings.
  3. There is increased need for technologies that support effective communication, document sharing, knowledge sharing, and decision-making among groups – particularly those separated by time and distance.
  4. Systems are connected to the outside world via the Internet to facilitate communications, relationship building, and sharing of information and experiences with customers, clients, and industry partners.

The globalization of the world’s industrial economies is one reason for the changing role of knowledge management. Globalization greatly enhances the value of information to the firm and presents new challenges and complexities in communicating, controlling, and coordinating information and operations across the far-flung corporation. More tasks are now done in a distributed environment by people scattered across different countries, cultures, and time zones. The relatively recent explosion of electronic commerce has further increased the pace of business transactions and made it even easier for customers to quickly compare the price and quality offered by global competitors.

A second force changing the business environment is the transformation of major industrial economies into information-based service economies in which knowledge and information become the key ingredients in creating wealth. Knowledge and information work currently account for some 60 percent of the American gross national product, and nearly 55 percent of the labor force. The growing importance of knowledge management is evident in the changing roles of information systems in organizations.

Changing Roles of Information Systems

  1. Traditional: Fast access to functional information. Emerging: Centralized access to integrated, coordinated information.
  2. Traditional: Summarize data in reports. Emerging: Sift through data for patterns and relationships.
  3. Traditional: Support centralized, individual decision-making. Emerging: Support decentralized and group decision-making and information sharing.
  4. Traditional: Internal communication. Emerging: Broad-based communication and external relationship building.

Information is a Critical Strategic Resource

Information and information technology have become critical strategic resources for businesses. In order to bring all the necessary information and expertise to bear on a task or decision, work must increasingly be done cross-functionally, and by teams rather than individuals. The greater availability and importance of information is resulting in more decentralized decision-making and information sharing.

Technologies that gather, integrate, and facilitate centralized access to data are the foundation of many knowledge management activities. A relatively new and rapidly growing technology is an Enterprise Resource Planning (ERP) system. An ERP is a business management system that integrates all facets of the organization including inventory planning, manufacturing, sales, payroll, and finance so that they can become more coordinated through information sharing. This integration is critical because it means that, as a transaction is processed in one area, such as receipt of an order, its impact is immediately reflected in all related areas such as accounting, production scheduling, and purchasing.
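
The coordination an ERP provides can be pictured as a publish-subscribe loop: a transaction is entered once, and every interested module reacts to it. The sketch below is a minimal Python illustration of that idea, not any vendor’s actual architecture; the module names and event payload are invented.

```python
from collections import defaultdict

class EventBus:
    """A tiny publish-subscribe hub standing in for an ERP's integration layer."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self.subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self.subscribers[event_type]:
            handler(payload)

bus = EventBus()
# Each "module" registers its reaction to the same business event.
bus.subscribe("order_received", lambda o: print(f"accounting: invoice {o['id']} for ${o['qty'] * o['unit_price']}"))
bus.subscribe("order_received", lambda o: print(f"production: schedule {o['qty']} units of {o['sku']}"))
bus.subscribe("order_received", lambda o: print(f"purchasing: check materials for {o['qty']} x {o['sku']}"))

# Entering the order once triggers coordinated updates in all three areas.
bus.publish("order_received", {"id": 1001, "sku": "WIDGET-A", "qty": 50, "unit_price": 4.0})
```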

The second crucial characteristic of an ERP system is that the modules are designed to reflect a particular way of doing business based on a value chain view in which functional departments coordinate their work. Many businesses are finding this coordinated approach far preferable to the old way of doing business (by separate functional departments) because it greatly facilitates data gathering, integration, information sharing, and decision-making.

A data warehouse is another technology that supports centralized access to integrated information. Data warehousing is the creation and maintenance of a large special-purpose database containing current and unified data from all functional units, as well as easy-to-use query, analysis, and reporting tools. It pulls data from the various departmental systems, and sometimes from external sources as well, and puts them into a separate warehouse so that users can access and analyze the information without endangering the original systems.

This centralization of data about a company’s business, products, and customers eliminates redundancies and errors. It also allows employees to draw upon a wide range of information without having to make multiple requests of different departments. The fact that one standardized set of analytical, query, and reporting tools can now be used across the board greatly facilitates the transformation of data into useful information.
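
The extract-and-load pattern behind a warehouse can be sketched in a few lines. Below, an in-memory SQLite database stands in for the warehouse and two Python lists stand in for departmental systems; the source data and schema are invented for illustration.

```python
import sqlite3

# Departmental "systems" (invented sample data).
sales_system = [("2024-01-05", "WIDGET-A", 120.0), ("2024-01-06", "WIDGET-B", 80.0)]
web_orders   = [("2024-01-06", "WIDGET-A", 45.0)]

# The warehouse: a separate store, so queries never touch operational systems.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE revenue (day TEXT, sku TEXT, amount REAL, source TEXT)")

# Extract from each departmental source and load with a source tag.
warehouse.executemany("INSERT INTO revenue VALUES (?, ?, ?, 'store')", sales_system)
warehouse.executemany("INSERT INTO revenue VALUES (?, ?, ?, 'web')", web_orders)

# One standardized query tool now works across all departments' data.
for sku, total in warehouse.execute(
        "SELECT sku, SUM(amount) FROM revenue GROUP BY sku ORDER BY sku"):
    print(sku, total)
```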

Data Analysis Separates Gold from Dross

As the amount of available data increases, managers must find ways to turn all that data into useful information. Data mining is an advanced analytical approach for uncovering small nuggets of information within vast quantities of data; it applies a variety of techniques to identify patterns, correlations, and trends in massive data sets.

Data mining has many valuable applications such as market segmentation (identifying the common characteristics of customers who buy the same products from your company); customer churn (predicting which of your customers are likely to defect to a competitor); fraud detection (identifying which transactions are most likely to be fraudulent); interactive marketing (predicting what each individual accessing a Web site most wants to see); and market basket analysis (understanding what products or services are commonly purchased together, e.g., beer and pretzels).
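
At its simplest, the last of these, market basket analysis, is just co-occurrence counting over transactions, as in the sketch below (the transaction data are invented for illustration):

```python
from itertools import combinations
from collections import Counter

# Invented sample transactions; each basket is the set of items bought together.
transactions = [
    {"beer", "pretzels", "salsa"},
    {"beer", "pretzels"},
    {"milk", "bread"},
    {"beer", "pretzels", "milk"},
]

# Count how often each pair of products appears in the same basket.
pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequent pairs suggest products commonly purchased together.
for pair, count in pair_counts.most_common(3):
    print(pair, count)
```

Real systems apply the same idea at scale, with measures such as support and confidence layered on top of the raw counts.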

Artificial intelligence tools can also help draw meaning from data. Many people associate the term “artificial intelligence” with expert systems designed to capture the expertise of humans in a rule-based computer program. Expert systems can help businesses with knowledge management by capturing and codifying knowledge that might otherwise be lost due to the absence, retirement, resignation, or death of an acknowledged expert.

Neural networks are a different and more flexible form of artificial intelligence useful in knowledge management. Neural networks attempt to tease out meaningful patterns from vast amounts of data. They use statistical analysis to recognize relationships and can actually adapt as new information is received – a process called adaptive learning.
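
A single artificial neuron already exhibits this adaptive learning: each time new data arrive, it nudges its weights toward fewer mistakes. The sketch below is only a cartoon of the far larger networks used in practice, and the loan-style data and learning rate are invented for illustration.

```python
def predict(weights, bias, features):
    """Fire (1) if the weighted sum of inputs crosses the threshold, else 0."""
    return 1 if sum(w * x for w, x in zip(weights, features)) + bias > 0 else 0

# Invented examples: (income_score, debt_score) -> approve the loan?
data = [((0.9, 0.1), 1), ((0.8, 0.3), 1), ((0.2, 0.9), 0), ((0.3, 0.8), 0)]

weights, bias, rate = [0.0, 0.0], 0.0, 0.1
for _ in range(20):                      # present the data repeatedly
    for features, label in data:
        error = label - predict(weights, bias, features)
        weights = [w + rate * error * x for w, x in zip(weights, features)]
        bias += rate * error             # adapt whenever a prediction is wrong

print(predict(weights, bias, (0.85, 0.2)))   # high income, low debt: prints 1
print(predict(weights, bias, (0.25, 0.95)))  # low income, high debt: prints 0
```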

Neural networks enhance the organization’s knowledge base by suggesting solutions to specific problems that are too massive and complex for efficient analysis by human beings. For example, neural nets are used by BankAmerica to evaluate commercial loan applications. American Express uses neural nets to read handwriting on credit card slips, while Arco and Texaco use them to locate oil and gas deposits below the earth’s surface.

Groupware Links People and Information

Groupware is another application of information technology that has emerged to support the needs of global organizations and virtual teams. Groupware such as Lotus Notes offers e-mail, calendaring, group scheduling, Web access, and information management in a relatively easy-to-use and customizable environment.

Virtual teams working from different locations can set up discussion “databases,” accessible remotely over the Internet, that organize such things as e-mail discussion threads, spreadsheet files, and slide shows around central topics or tasks. In a corporate setting, one discussion database might summarize daily activities on a client project, another might provide a searchable set of names and contacts, another might include company “best practices” or guidelines, and another might contain client contracts, specifications, and communications.

In an educational environment, there might be a discussion database established for each individual course. Students in the course could use their virtual workspace to ask questions, post responses, get assignments and handouts, and submit work. Student project teams might also have their own workspaces, as could faculty committees or research teams who find it difficult to meet face to face. Databases can be customized to admit or restrict specific users as appropriate.

Another type of groupware is designed specifically to facilitate group meetings. Some researchers have estimated that middle managers spend 35 percent of their work week in meetings and that top managers spend 50-80 percent of their time in meetings. Many such meetings are characterized by unclear goals, unequal participation, and difficulty reaching consensus.

An Electronic Meeting System (EMS) consists of a computer-supported meeting room containing networked computers for each participant and a large public screen to facilitate common viewing of information. Participants can provide their input simultaneously and anonymously via computer. This has been shown to result in greater participation and more complete consideration of topics. The display of incoming inputs on the public screen during brainstorming also tends to stimulate a larger number of ideas.

EMS software contains tools designed to support activities such as team idea generation, organizing ideas, prioritizing ideas (voting, ranking), and policy development such as stakeholder identification. Each computer-based activity is followed by facilitated discussion. More recent versions of these tools can be used in a “different time, different place” environment with members participating remotely with the help of videoconferencing systems and the Web.

Information Technology Used in Supply Chain Management

Businesses recognize the growing importance of communicating and forming alliances with customers and suppliers so they can adapt quickly to the ever-changing environment. Information technology is increasingly being used strategically for supply chain management, which refers to the integration of supplier, distributor, and customer logistics requirements into one cohesive process. To manage the supply chain, a company tries to eliminate delays and cut the amount of resources tied up along the way, ideally creating more efficient customer-response systems as well.

Wal-Mart’s “continuous inventory replenishment system” captures sales data via point-of-sale terminals at the checkout counter and immediately transmits electronic restocking orders directly to its suppliers. This system allows the company to keep prices low, shelves well stocked, and overhead costs at only 15 percent of sales revenue. Enterprise resource planning (ERP) systems may also be extended outside the firm to increase coordination with supply chain partners.
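
The logic of such a replenishment loop is simple, as the invented miniature below suggests (this is a sketch of the general pattern, not Wal-Mart’s system; the thresholds and quantities are made up): each sale decrements on-hand stock, and when stock falls to a reorder point, an order is immediately emitted to the supplier.

```python
REORDER_POINT = 10   # reorder when shelf stock falls to this level (invented)
ORDER_QTY = 40       # standard restocking quantity (invented)

def record_sale(inventory, sku, qty_sold, send_order):
    """Process one point-of-sale event and restock if the shelf runs low."""
    inventory[sku] -= qty_sold
    if inventory[sku] <= REORDER_POINT:
        send_order(sku, ORDER_QTY)        # transmit electronic order to supplier
        inventory[sku] += ORDER_QTY       # inbound stock (delivery simplified away)

inventory = {"WIDGET-A": 25}
notify = lambda sku, qty: print(f"restocking order: {qty} x {sku}")
record_sale(inventory, "WIDGET-A", 8, notify)   # 17 left, no order
record_sale(inventory, "WIDGET-A", 9, notify)   # 8 left, order fires
print(inventory)                                # {'WIDGET-A': 48}
```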

The supply chain can go global as well. Adaptec Inc., a Silicon Valley-based computer chip company that obtains many of its products in East Asia, is using the Internet to give its suppliers access to purchase orders and factory-status updates. The company says this has made it more responsive, cutting the manufacturing cycle from 12 weeks to eight and saving millions in inventory costs.

Dell Computer is linking its entire supply chain via the Web. Chrysler’s Supplier Partner Information Network (SPIN) allows 3,500 of Chrysler’s 12,000 suppliers selective access to portions of its intranet where they can access current data on design changes, parts shortages, packaging information, and invoice tracking. SPIN can even automatically notify suppliers of critical parts shortages. Chrysler believes SPIN has reduced the time to complete various business processes by 25 to 50 percent.

Knowledge Management Requires Organizational Change

While the capabilities and roles of information technology are changing to address new business challenges, technology alone is insufficient for meeting the demands of the global and information-based economy. Knowledge management technologies necessitate fundamental changes in the organization itself in order to effectively integrate, communicate, and distribute information. The potential benefits of these information technologies can be realized only if their implementation goes hand in hand with changes in organizational structure and culture.
