Real Options: The Value Added through Optimal Decision Making

One of the primary responsibilities of a management team is to make decisions during the execution of projects so that gains are maximized and losses are minimized. Decision analysis is especially critical for projects with built-in flexibility, or options. This article explores how the well-known principles used in valuing options on financially traded assets can be combined with an intuitive approach based on familiar concepts from the field of decision analysis to quantify the value of this flexibility.


Introduction

One of the primary responsibilities of a management team is to make decisions during the execution of projects so that gains are maximized and losses are minimized. This is especially important for projects with built-in flexibility, such as those with options to expand operations in response to positive market conditions, to abandon an asset that is underperforming, to defer investment for a period of time, to suspend operations temporarily, to switch inputs or outputs, to reduce operational scale, or to resume operations after a temporary shutdown. By merging decision analysis with the well-known principles used in valuing options on financially traded assets, we can quantify the potential value associated with these types of options on real assets.

Asset Valuation

For quite some time, discounted cash flow (DCF) methods have been the primary approach used by practitioners for the valuation of projects and for decision making regarding investments in real assets. With the DCF approach, the net present value (NPV) of a project is calculated by discounting future expected cash flows at a given discount rate. As an example, consider a simple three-period project for which a pro forma cash flow sheet is shown in Figure 1.






Figure 1: Pro Forma Cash Flow Sheet for Simple Three-Period Project





This example might be representative of a typical industrial manufacturing application with a three-year production planning cycle under a forecasted market price environment. In the pro forma, the production and price forecast in each period translate to revenue, which can then be netted of production costs to arrive at the expected cash flow in each period. The cash flows should then be discounted at a rate that is commensurate with the riskiness of the project. In practice, this discount rate is often the weighted average cost of capital for the firm (WACC), based on the assumption that both the firm and the project have the same risk level. While this assumption may be valid for projects that mimic the risks associated with the firm as a whole, it may not be appropriate for unusual or innovative investment projects. In such cases, the practitioner must exercise judgment in choosing an appropriate discount rate for the project.[1]
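The mechanics of the DCF calculation described above can be sketched in a few lines. The cash flows below are hypothetical placeholders, not the figures from Figure 1, and the 10 percent rate is an assumed WACC:

```python
# Minimal DCF sketch: discount expected cash flows at a single rate.
def npv(rate, cash_flows):
    """Net present value, where cash_flows[0] occurs at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical project: $1M outlay, then three years of expected cash flows.
flows = [-1_000_000, 450_000, 420_000, 390_000]
project_npv = npv(0.10, flows)  # discounted at an assumed 10% WACC
```

A positive `project_npv` would support investment under the standard DCF decision rule, subject to the caveats about discount-rate choice discussed above.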

Unfortunately, this approach ignores the significant incremental value that can be derived from management’s response to conditions in the future. For example, if the product unit price in future periods increases or decreases significantly, relative to the expected prices used in Figure 1, it seems untenable to assume that the firm’s management would fail to respond to such a change. If the firm indeed has the flexibility, we might instead expect management to revise the production level accordingly. Thus, we could have different revenues and cash flows than the ones shown in Figure 1, and the resulting present value would change as well.

An approach that treats future decision-making opportunities as options can account for this value, but because these more advanced methods are less familiar to many managers, their widespread use has been slow to arrive in practice. In recent work, Copeland and Antikarov,[2] Copeland and Tufano,[3] and others have sought to increase the application of more advanced valuation approaches by introducing computational methods that are more accessible to practitioners. In this article, we discuss how this work can be further enhanced by applying an intuitive approach based on familiar concepts from the field of decision analysis.

Option Pricing

Option pricing methods were first developed to value financial options. However, the potential application of these methods to the valuation of options on real assets was soon identified and given the moniker "real options." Although hundreds of scholarly papers have been written on this topic, the complex mathematics required by option pricing techniques has unfortunately limited their appeal for many practitioners. A survey published in 2001 indicated that, while DCF valuation methods were used by over three-quarters of the corporate finance practitioners surveyed, only about one-quarter used a real options approach.[4]

Unlike the case with DCF analysis, in an option pricing approach, we do not assume deterministic (certain) expected values for the relevant asset or project uncertainty, and must therefore model how the value evolves over time. There are several different types of mathematical models, called stochastic processes, which have been developed for this purpose. To simplify the analysis of option valuation problems, we typically work with a discrete approximation of the selected stochastic process. A discrete model contains a limited number of outcomes for the uncertainty at regularly spaced intervals in time, rather than a continuous distribution of outcomes for all points in time. This way, the firm need only make decisions at the discrete points to optimally respond to the uncertainty as it evolves. These discrete models have been shown to closely approximate the exact solutions derived using stochastic calculus, without the need for advanced mathematics.

An early example of this type of discrete approach was the binomial lattice model developed by Cox, Ross, and Rubinstein[5] to value options to buy or sell financial instruments, such as stock. The lattice depicts two possible changes in value for the stock in each time period: a move up by a factor u or a move down by a factor d. An example of this type of binomial lattice is shown in Figure 2, where S is the current market price of the asset, q is the probability of an upward move, u is a factor greater than 1, and d is the reciprocal of u.





Figure 2: Three-period discrete binomial lattice model of stock price
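A lattice like the one in Figure 2 is easy to generate programmatically. In this sketch the values of S, u, and d are illustrative, not taken from the article; note that because d = 1/u, up and down moves recombine:

```python
# Build a recombining binomial lattice of asset prices.
def binomial_lattice(S, u, d, periods):
    """Return lattice[t][i]: the price after i up-moves out of t total moves."""
    return [[S * u**i * d**(t - i) for i in range(t + 1)]
            for t in range(periods + 1)]

# Illustrative parameters with d = 1/u, so the lattice recombines.
lattice = binomial_lattice(S=100.0, u=1.25, d=0.8, periods=3)
# After 3 periods there are only 4 distinct prices, matching Figure 2.
```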





To value an option with such a lattice, we start from the final time period and work backward through time, finding the value from exercise or deferral of the option at each node in each period until we arrive back at the starting point (time zero). At nodes where the value has gone up, the optimal decision for a call option (an option to buy the stock), for example, would generally be to exercise, while at nodes where the value has gone down, the optimal decision would be not to exercise. The opposite policies would generally be true for a put option (an option to sell the stock).
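The exercise-or-defer rollback described above can be sketched as follows for a call option. The parameters here are illustrative; p is a risk-neutral up-move probability and r a per-period risk-free rate, quantities whose estimation is a separate modeling step discussed later in the article:

```python
# Backward induction for a call option on a binomial lattice.
def american_call(S, K, u, d, p, r, periods):
    """At each node, take the better of exercising now (intrinsic value)
    or deferring (discounted expected value of next period's nodes)."""
    # Terminal payoffs: max(price - strike, 0) at each final node.
    values = [max(S * u**i * d**(periods - i) - K, 0.0)
              for i in range(periods + 1)]
    for t in range(periods - 1, -1, -1):
        values = [max(S * u**i * d**(t - i) - K,                    # exercise
                      (p * values[i + 1] + (1 - p) * values[i])
                      / (1 + r))                                    # defer
                  for i in range(t + 1)]
    return values[0]  # option value at time zero
```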

Note that we must accurately assess the level of risk associated with the option exercise decision at each node because it dictates how much future cash flows (option payoffs) should be discounted during the backward induction process. This presents a challenge because the risk level is not constant, but is instead specific to each node in the lattice. Option pricing theory provides us with different methods for addressing this problem, and in the next section we will discuss one such method that can be applied in a decision-tree framework.

Applying Decision Trees to Solve for Option Value

We can construct a binomial tree that is equivalent to the binomial lattice in Figure 2, with the only difference being that branches do not recombine in the binomial tree. Therefore, the multiple paths that lead to the four possible outcomes in Figure 2 are all explicitly shown in Figure 3.









Figure 3: Three-period discrete binomial tree model









With this type of tree, we can then model decision making about options in discrete time with decision nodes in the manner of standard decision tree analysis (DTA) familiar to many practitioners. Nau and McCardle[6] and Smith and Nau[7] studied the connection between DTA and standard lattice-based option pricing methods and demonstrated that the two approaches yield the same results, as long as the risk level is correctly specified throughout the tree in the DTA approach.

To adjust for the risk level in the DTA approach, we use a different set of transformed probabilities, p and 1-p, for the up and down outcomes at each chance node, respectively. These are the probabilities that a risk-neutral investor would assign to the two outcomes, and they are therefore often called "risk-neutral" probabilities. The value obtained from solving a decision tree that is transformed with risk-neutral probabilities can be interpreted as the value that a rational risk-neutral investor would assign to the project. Under such risk-neutral conditions, the need for estimating the risk level at any point in the tree is eliminated, and we can simply discount all cash flows at the risk-free discount rate.

There are several different ways to estimate the up and down movements and risk-neutral probabilities, all of which incorporate information about the uncertainty, or "volatility," of outcomes associated with the project. Perhaps the most common method is to follow the convention used by Cox, Ross, and Rubinstein, in which the up and down movements at each step are u = e^(σ√Δt) and d = 1/u, respectively, where σ is the volatility of asset returns per time increment in the tree and Δt is the length of the time increment. Once u and d have been determined, the risk-neutral probability of an up move at each node in the tree is p = (1 + rΔt − d)/(u − d), where r is the risk-free rate, and the corresponding probability of a down move is simply 1 − p. These values for u, d, and p are based on the assumption that the value over time evolves according to a Geometric Brownian Motion (GBM), a common stochastic process for modeling financial values. Details associated with the binomial approximation of a GBM stochastic process can be found in Hull.[8]
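The Cox-Ross-Rubinstein step parameters can be computed directly from these formulas. Note that the linear discounting term (1 + rΔt) follows the convention in this article; some texts use e^(rΔt) instead:

```python
import math

# Cox-Ross-Rubinstein step parameters for a binomial step of length dt.
def crr_parameters(sigma, r, dt):
    """Return (u, d, p): up factor, down factor, risk-neutral up-probability."""
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (1.0 + r * dt - d) / (u - d)
    return u, d, p
```

For the parameters used later in the article's example (σ = 30 percent, r = 5 percent, Δt = 1), this yields u ≈ 1.35, d ≈ 0.74, and p ≈ 0.51.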

We emphasize that only three parameters are needed to specify this discrete approximation: the estimate of the current deterministic value of this project (for the starting point of the tree), the estimated volatility of the returns from the project (for the up and down values in the tree), and the risk-free rate (for the probabilities in the tree).

An Example

If an initial investment of $1 million is required to commence the project shown in Figure 1, the resulting NPV is $55,000, assuming the firm’s WACC or investment hurdle rate is 10 percent. From a deterministic DCF perspective, the expected future cash flows provide an internal rate of return of 13.7 percent on this investment. Since the project NPV is positive and the rate of return exceeds 10 percent, this indicates that the project is a good investment opportunity; however, there may also be many other projects competing for funding under the firm’s capital budget. Thus, it is important to obtain an accurate valuation of each project that includes all sources of value, including the managerial flexibility to optimize outcomes.

Suppose, for example, that instead of being locked in to the forecasted production levels shown in the pro forma cash flow sheet, the firm can expand production in response to changes in the product unit price in years one and two if it so chooses. Specifically, we assume the firm has the option to increase production by 20 percent after year one at a cost of $160,000, and after year two at a cost of $62,500. From the real options perspective, these investment opportunities are analogous to two independent call options on an incremental 20 percent increase in production capacity. We assume that the optional investments at the end of years one and two will only be exercised if they are justified by the price and the estimates of the remaining project value at those points in time, and thus these investments can only add to the project’s deterministic NPV.

To value the expansion options, as suggested by Copeland and Antikarov, we use the present value of the project without options as the underlying asset for the options. We already have the beginning value given in Figure 1 ($1.055 million), and therefore need only to estimate the volatility in order to construct a discrete stochastic model of project value. It is typically not possible to estimate the volatility for real assets using market information; however, we can instead simulate the pro forma cash flow sheet to generate a set of synthetic returns for the project, entering the project uncertainties as random variables rather than deterministic expected values. In this example, we would enter random variables in each period in the Price row in Figure 1 (using functions from simulation software applications such as @RISK™ or Crystal Ball™). The return from period 0 to period 1, for example, can then be calculated by dividing the present value in period 1 (currently shown as a fixed value of $1.161 million in Figure 1) by the present value in period 0 (a fixed value of $1.055 million) and taking the logarithm of this ratio. When the spreadsheet is simulated for a large number (>1,000) of iterations, the different random prices produced in each iteration yield a probability distribution, including a mean and standard deviation, for the return (instead of a single fixed value). The volatility σ of the project's present value is then equal to the standard deviation of the returns. In many cases, the volatility will change from period to period, so the simulation should include an output for the return in each period, not just the return from period 0 to period 1.[9]
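The simulation procedure above can be sketched without spreadsheet software. In this hedged sketch, the price model, cash flow figures, and discount rate are all illustrative (they do not reproduce Figure 1); the essential pattern is drawing random prices, recomputing the period-1 present value each iteration, and taking the standard deviation of ln(PV1/PV0):

```python
import math
import random

def simulate_volatility(pv0, iterations=10_000, seed=42):
    """Estimate project volatility as the standard deviation of simulated
    log-returns from period 0 to period 1 (hypothetical price model)."""
    rng = random.Random(seed)
    returns = []
    for _ in range(iterations):
        # Hypothetical uncertain unit prices feeding a two-period payoff.
        price1 = rng.lognormvariate(math.log(10.0), 0.25)
        price2 = rng.lognormvariate(math.log(10.0), 0.25)
        pv1 = 50_000 * price1 + 50_000 * price2 / 1.10  # PV at period 1
        returns.append(math.log(pv1 / pv0))
    mean = sum(returns) / len(returns)
    var = sum((x - mean) ** 2 for x in returns) / (len(returns) - 1)
    return math.sqrt(var)

sigma_estimate = simulate_volatility(pv0=1_000_000)
```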

For this illustration, we assume that a simulation of the pro forma cash flow sheet has provided us with a volatility estimate of 30 percent for all periods, and also that the risk-free discount rate r is 5 percent per year. We will model the project in one-year time increments, so Δt = 1, and we have u = e^(0.30√1) = 1.35, d = 1/1.35 = 0.74, and p = 0.51 as the parameters needed to construct a binomial model for project value.

The resulting three-period (T1, T2, T3) decision tree for the project value, without options and starting from a value of $1.055 million at t = 0, is shown in Figure 4. The values displayed above and below each branch in the tree are the discounted present value and cash flow, respectively. For example, the value shown above the up branch of T1 ($1.354 million) is $1.055 million multiplied by u (1.35) and discounted at 5 percent, while the value below the branch ($676,000) is the value above the branch multiplied by the cash flow ratio for period 1 (0.5, as shown in Figure 1). Figure 4 shows that the tree without options can be "rolled back" to verify its starting value.





Figure 4: The binomial tree for project value (without options)





Next, the real options in the project can be modeled simply by adding decision nodes to the tree shown in Figure 4. Specifically, we insert nodes after time periods one (Opt1) and two (Opt2) for the decisions about whether to expand production. The solution to the tree with these decision nodes added is shown in Figure 5, which indicates that the expected present value of the project with options is $1.105 million, increasing the NPV to $105,000.
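The rollback with expansion decisions can be sketched as follows. This is a hedged simplification: the decision at each intermediate node is modeled as keeping the current value V or paying a cost to scale the remaining value by an expansion factor. All parameters are illustrative, and the sketch does not reproduce the article's $1.105 million result, which depends on cash flow details not restated here:

```python
# Binomial rollback of project value with expand-or-not decision nodes.
def value_with_expansion(V0, u, d, p, r, periods, factor, cost):
    """Roll back project value; at every node of every intermediate period,
    choose max(keep V, expand to factor*V at the given cost)."""
    values = [V0 * u**i * d**(periods - i) for i in range(periods + 1)]
    for step in range(periods):
        rolled = [(p * values[i + 1] + (1 - p) * values[i]) / (1 + r)
                  for i in range(len(values) - 1)]
        if step < periods - 1:  # decision nodes exist at periods 1..T-1 only
            rolled = [max(v, factor * v - cost) for v in rolled]
        values = rolled
    return values[0]
```

Since expansion is only exercised when it pays, the value with the option can never fall below the value without it, mirroring the article's observation that these options can only add to the project's deterministic NPV.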





Figure 5: The solution to the binomial tree (with options)





We also note that the optimal decision policy is obvious from the graphic view of the solved decision tree, whereas it must be inferred from a binomial lattice representation. Notice, for example, that production should be expanded if the expected value of the project moves up during the first time period. Additionally, the only case in which production should not be expanded after the second period is when the project value has decreased in both periods one and two.

Conclusions

This example shows how an approach using decision-analysis methods provides a straightforward yet flexible way to apply option-valuation techniques. The solution shown in Figure 5 was obtained using the software application DPL™, but the basic approach can be implemented using virtually any commercially available decision-analysis package. We refer the interested reader to Brandao, Dyer, and Hahn[10],[11] and Smith[12] for more details and other examples of the application of this valuation approach. We believe that decision-analysis techniques provide managers with more intuition for solutions to valuation problems, and will ultimately lead to greater utilization of advanced valuation methods. This will be critical in an increasingly competitive business environment where the ability to accurately assess project and asset values, including the incremental value related to a project's embedded options, will heavily influence difficult investment portfolio decisions.


[1] Grinblatt, M. and S. Titman, Financial Markets and Corporate Strategy, (New York: Irwin/McGraw-Hill, 2nd Edition, 2001).

[2] Copeland, T. and V. Antikarov, Real Options, (New York: Texere LLC, 2003).

[3] Copeland, T. and P. Tufano, “A Real-World Way to Manage Real Options,” Harvard Business Review, 82 No. 3 (2004): 90-99.

[4] Graham, J. and C. Harvey, “The Theory and Practice of Corporate Finance: Evidence from the Field,” Journal of Financial Economics, 60 (2001): 187-243.

[5] Cox, J., S. Ross, and M. Rubinstein, “Option Pricing: A Simplified Approach,” Journal of Financial Economics, 7 (1979): 229-263.

[6] Nau, R. and K. McCardle, “Arbitrage, Rationality and Equilibrium,” Theory and Decision, 33 (1991): 199-240.

[7] Smith, J. and R. Nau, “Valuing Risky Projects: Option Pricing Theory and Decision Analysis,” Management Science, 41, No. 5 (1995): 795-816.

[8] Hull, J., Options, Futures and Other Derivatives, (New Jersey: Prentice Hall, 2003).

[9] See Brandao, Dyer and Hahn (2005b) for a detailed discussion of simulating pro forma cash flow sheets to obtain volatility estimates for real assets with uncertain variables.

[10] Brandao, L., J. Dyer, and J. Hahn, “Using Binomial Decision Trees to Solve Real-Option Valuation Problems,” Decision Analysis, 2 (2005b): 69-88.

[11] Brandao, L., J. Dyer, and J. Hahn, “Response to Comments on Brandao, et al (2005),” Decision Analysis, 2 (2005a): 103-105.

[12] Smith, J., “Alternative Approaches for Solving Real Options Problems,” Decision Analysis, 2 (2005): 89-102.


The Role of Finance in the Strategic-Planning and Decision-Making Process

The fundamental success of a strategy depends on three critical factors: a firm’s alignment with the external environment, a realistic internal view of its core competencies and sustainable competitive advantages, and careful implementation and monitoring.[1] This article discusses the role of finance in strategic planning, decision making, formulation, implementation, and monitoring.


Any person, corporation, or nation should know who or where they are, where they want to be, and how to get there.[2] The strategic-planning process utilizes analytical models that provide a realistic picture of the individual, corporation, or nation at its “consciously incompetent” level, creating the necessary motivation for the development of a strategic plan.[3] The process requires the five distinct steps outlined below, and the selected strategy must be sufficiently robust to enable the firm to perform activities differently from its rivals or to perform similar activities in a more efficient manner.[4]

A good strategic plan includes metrics that translate the vision and mission into specific end points.[5] This is critical because strategic planning is ultimately about resource allocation and would not be relevant if resources were unlimited. This article aims to explain how finance, financial goals, and financial performance can play a more integral role in the strategic planning and decision-making process, particularly in the implementation and monitoring stage.

The Strategic-Planning and Decision-Making Process

1. Vision Statement

The creation of a broad statement about the company’s values, purpose, and future direction is the first step in the strategic-planning process.[6] The vision statement must express the company’s core ideologies—what it stands for and why it exists—and its vision for the future, that is, what it aspires to be, achieve, or create.[7]

2. Mission Statement

An effective mission statement conveys eight key components about the firm: target customers and markets; main products and services; geographic domain; core technologies; commitment to survival, growth, and profitability; philosophy; self-concept; and desired public image.[8] The finance component is represented by the company’s commitment to survival, growth, and profitability.[9] The company’s long-term financial goals represent its commitment to a strategy that is innovative, updated, unique, value-driven, and superior to those of competitors.[10]

3. Analysis

This third step is an analysis of the firm’s business trends, external opportunities, internal resources, and core competencies. For external analysis, firms often utilize Porter’s five forces model of industry competition,[11] which identifies the company’s level of rivalry with existing competitors, the threat of substitute products, the potential for new entrants, the bargaining power of suppliers, and the bargaining power of customers.[12]

For internal analysis, companies can apply the industry evolution model, which identifies takeoff (technology, product quality, and product performance features), rapid growth (driving costs down and pursuing product innovation), early maturity and slowing growth (cost reduction, value services, and aggressive tactics to maintain or gain market share), market saturation (elimination of marginal products and continuous improvement of value-chain activities), and stagnation or decline (redirection to fastest-growing market segments and efforts to be a low-cost industry leader).[13]

Another method, value-chain analysis, clarifies a firm’s value-creation process based on its primary and secondary activities.[14] This becomes a more insightful analytical tool when used in conjunction with activity-based costing and benchmarking tools that help the firm determine its major costs, resource strengths, and competencies, as well as identify areas where productivity can be improved and where re-engineering may produce a greater economic impact.[15]

SWOT (strengths, weaknesses, opportunities, and threats) analysis is a classic model of internal and external analysis. It provides management with information to set priorities and to fully utilize the firm’s competencies and capabilities to exploit external opportunities,[16] to determine the critical weaknesses that need to be corrected, and to counter existing threats.[17]

4. Strategy Formulation

To formulate a long-term strategy, Porter’s generic strategies model [18] is useful as it helps the firm aim for one of the following competitive advantages: a) low-cost leadership (product is a commodity, buyers are price-sensitive, and there are few opportunities for differentiation); b) differentiation (buyers’ needs and preferences are diverse and there are opportunities for product differentiation); c) best-cost provider (buyers expect superior value at a lower price); d) focused low-cost (market niches with specific tastes and needs); or e) focused differentiation (market niches with unique preferences and needs).[19]

5. Strategy Implementation and Management

In the last ten years, the balanced scorecard (BSC)[20] has become one of the most effective management instruments for implementing and monitoring strategy execution, as it helps to align strategy with expected performance and stresses the importance of establishing financial goals for employees, functional areas, and business units. The BSC ensures that the strategy is translated into objectives, operational actions, and financial goals and focuses on four key dimensions: financial factors, employee learning and growth, customer satisfaction, and internal business processes.[21]

The Role of Finance

Financial metrics have long been the standard for assessing a firm’s performance. The BSC supports the role of finance in establishing and monitoring specific and measurable financial strategic goals on a coordinated, integrated basis, thus enabling the firm to operate efficiently and effectively. Financial goals and metrics are established based on benchmarking the “best-in-industry” and include:

1. Free Cash Flow

This is a measure of the firm’s financial soundness and shows how efficiently its financial resources are being utilized to generate additional cash for future investments.[22] It represents the net cash available after deducting investments and working capital increases from the firm’s operating cash flow. Companies should utilize this metric when they anticipate substantial capital expenditures in the near future or need to follow through on implemented projects.
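The definition above reduces to a simple calculation. The input figures here are hypothetical:

```python
# Free cash flow: operating cash flow less investments and increases in
# working capital, per the definition above.
def free_cash_flow(operating_cash_flow, capital_expenditures,
                   working_capital_increase):
    return operating_cash_flow - capital_expenditures - working_capital_increase

fcf = free_cash_flow(500_000, 200_000, 50_000)  # hypothetical inputs
```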

2. Economic Value-Added

This is the bottom-line contribution on a risk-adjusted basis and helps management to make effective, timely decisions to expand businesses that increase the firm’s economic value and to implement corrective actions in those that are destroying its value.[23] It is determined by deducting the operating capital cost from the net income. Companies set economic value-added goals to effectively assess their businesses’ value contributions and improve the resource allocation process.
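Following the description above (income less a charge for the capital employed), economic value-added can be sketched as follows. The figures are hypothetical, and the choice of after-tax operating profit as the income measure is a common convention rather than something the article specifies:

```python
# Economic value-added: income less a charge for the capital employed.
def economic_value_added(net_operating_profit_after_tax, invested_capital,
                         cost_of_capital):
    capital_charge = invested_capital * cost_of_capital
    return net_operating_profit_after_tax - capital_charge

eva = economic_value_added(300_000, 2_000_000, 0.10)  # hypothetical inputs
```

A positive result indicates a business that is adding economic value; a negative result flags one that is destroying it.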

3. Asset Management

This calls for the efficient management of current asset (cash, receivables, inventory) and current liability (payables, accruals) turnovers, as well as enhanced management of working capital and the cash conversion cycle. Companies must utilize this practice when their operating performance falls behind that of industry benchmarks or benchmarked companies.
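The cash conversion cycle mentioned above is a standard summary of these turnovers: days of inventory plus days of receivables, less days of payables. The day counts below are hypothetical:

```python
# Cash conversion cycle: how long cash is tied up in operations.
def cash_conversion_cycle(days_inventory, days_receivables, days_payables):
    """All arguments are in days; a lower result means cash is freed sooner."""
    return days_inventory + days_receivables - days_payables

ccc = cash_conversion_cycle(days_inventory=60, days_receivables=45,
                            days_payables=30)  # hypothetical inputs
```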

4. Financing Decisions and Capital Structure

Here, financing is limited to the optimal capital structure (debt ratio or leverage), which is the level that minimizes the firm’s cost of capital. This optimal capital structure determines the firm’s reserve borrowing capacity (short- and long-term) and the risk of potential financial distress.[24] Companies establish this structure when their cost of capital rises above that of direct competitors and there is a lack of new investments.
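The quantity being minimized above is the firm's weighted average cost of capital. A standard after-tax formulation, with hypothetical inputs, is:

```python
# After-tax weighted average cost of capital (WACC).
def wacc(equity, debt, cost_of_equity, cost_of_debt, tax_rate):
    """Weight each financing source by its share of total capital; the debt
    cost is reduced by the tax shield on interest."""
    total = equity + debt
    return ((equity / total) * cost_of_equity
            + (debt / total) * cost_of_debt * (1 - tax_rate))

rate = wacc(equity=600, debt=400, cost_of_equity=0.12,
            cost_of_debt=0.06, tax_rate=0.25)  # hypothetical inputs
```

The optimal capital structure is the debt/equity mix that minimizes this rate, subject to the financial-distress risk noted above.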

5. Profitability Ratios

This is a measure of the operational efficiency of a firm. Profitability ratios also indicate inefficient areas that require corrective actions by management; they measure profit relationships with sales, total assets, and net worth. Companies must set profitability ratio goals when they need to operate more effectively and pursue improvements in their value-chain activities.
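The three profit relationships named above translate directly into ratios. The figures below are hypothetical:

```python
# Profitability ratios: profit relative to sales, total assets, and net worth.
def profitability_ratios(net_income, sales, total_assets, net_worth):
    return {
        "net_margin": net_income / sales,          # profit vs. sales
        "return_on_assets": net_income / total_assets,
        "return_on_equity": net_income / net_worth,
    }

ratios = profitability_ratios(net_income=80, sales=1_000,
                              total_assets=800, net_worth=400)
```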

6. Growth Indices

Growth indices evaluate sales and market share growth and determine the acceptable trade-off of growth with respect to reductions in cash flows, profit margins, and returns on investment. Growth usually drains cash and reserve borrowing funds, and sometimes, aggressive asset management is required to ensure sufficient cash and limited borrowing.[25] Companies must set growth index goals when growth rates have lagged behind the industry norms or when they have high operating leverage.

7. Risk Assessment and Management

A firm must address its key uncertainties by identifying, measuring, and controlling its existing risks in corporate governance and regulatory compliance, the likelihood of their occurrence, and their economic impact. Then, a process must be implemented to mitigate the causes and effects of those risks.[26] Companies must make these assessments when they anticipate greater uncertainty in their business or when there is a need to enhance their risk culture.

8. Tax Optimization

Many functional areas and business units need to manage the level of tax liability undertaken in conducting business and to understand that mitigating risk also reduces expected taxes.[27] Moreover, new initiatives, acquisitions, and product development projects must be weighed against their tax implications and net after-tax contribution to the firm’s value. In general, performance must, whenever possible, be measured on an after-tax basis. Global companies must adopt this measure when operating in different tax environments, where they are able to take advantage of inconsistencies in tax regulations.

Conclusion

The introduction of the balanced scorecard emphasized financial performance as one of the key indicators of a firm’s success and helped to link strategic goals to performance and provide timely, useful information to facilitate strategic and operational control decisions. This has led to the role of finance in the strategic planning process becoming more relevant than ever.

Empirical studies have shown that a vast majority of corporate strategies fail during execution. The above financial metrics help firms implement and monitor their strategies with specific, industry-related, and measurable financial goals, strengthening the organization’s capabilities with hard-to-imitate and non-substitutable competencies. They create sustainable competitive advantages that maximize a firm’s value, the main objective of all stakeholders.


[1] M.E. Porter, “What is Strategy?” Harvard Business Review, 74, no. 6 (1996). [purchase required]

[2] D. Abell, Defining the Business: The Starting Point of Strategic Planning, (New Jersey: Prentice-Hall, 1980).

[3] J.S. Bruner, The Process of Education: A Landmark in Education Theory, (hyperlink no longer accessible). (Boston: Harvard University Press, 1977).

[4] J.A. Pearce and R.B. Robinson, Formulation, Implementation, and Control of Competitive Strategy, (New York: Irwin McGraw-Hill, 2000).

[5] C.S. Clark and S.E. Krentz, “Avoiding the Pitfalls of Strategic Planning,” Healthcare Financial Management, 60, no. 11 (2004): 63–68.

[6] T. Jick and M. Peiperl, Managing Change: Cases and Concepts, (New York: Irwin/McGraw-Hill, 2003).

[7] J.C. Collins and J.I. Porras, “Building Your Company’s Vision,” Harvard Business Review, 74, no. 5 (1996).

[8] Pearce and Robinson.

[9] J.A. Pearce and F. David, “Corporate Mission Statement: The Bottom Line,” The Academy of Management Executive, 1, no. 2 (1987): 109–116.

[10] R.K. Johnson, “Strategy, Success, a Dynamic Economy, and the 21st Century Manager,” The Business Review, 5, no. 2 (2006).

[11] M.E. Porter, “How Competitive Forces Shape Strategy,” Harvard Business Review, 57, no. 2 (1979).

[12] Ibid.

[13] A.A. Thompson, A.J. Strickland, and J.E. Gamble, Crafting and Executing Strategy, (New York: McGraw-Hill/Irwin, 2009).

[14] Pearce and Robinson.

[15] Thompson, Strickland, and Gamble.

[16] B. Jovanovic and G.M. MacDonald, “The Life Cycle of a Competitive Industry,” The Journal of Political Economy, 102, no. 2 (1994): 322–347.

[17] C.A. Lai and J.C. Rivera, Jr., “Using a Strategic Planning Tool as a Framework for Case Analysis,” Journal of College Science Teaching, 36, no. 2 (2006): 26–31.

[18] M.E. Porter, Competitive Strategy: Techniques for Analyzing Industries and Competitors, (New York: The Free Press, 1980).

[19] Thompson, Strickland, and Gamble.

[20] R.S. Kaplan and D.P. Norton, “Using the Balanced Scorecard as a Strategic Management System,” Harvard Business Review, 74, no. 1 (1996).

[21] Ibid.

[22] Peter Grant, “How Financial Targets Determine Your Strategy,” Global Finance, 11, no. 3 (1997): 30–34.

[23] Ibid.

[24] Sidney L. Barton and Paul J. Gordon, “Corporate Strategy: Useful Perspective for the Study of Capital Structure?” The Academy of Management Review, 12, no. 1 (1987): 67–75.

[25] B.T. Gale and B. Branch, “Cash Flow Analysis: More Important Than Ever,” Harvard Business Review, July–August (1981).

[26] H.D. Pforsich, B.K.P. Kramer, and G.R. Just, “Establishing an Effective Internal Audit Department,” Strategic Finance, 87, no. 10 (2006): 22–29.

[27] Q. Lawrence, “Hedging in Perspective,” Corporate Finance, 115, no. 36 (1994).


IT MATTERS: Measuring Success

The success of any great moral enterprise does not depend upon numbers.

William Lloyd Garrison

Since the development of the abacus in ancient China, humankind has searched relentlessly for numerical approaches to informed decision making. That progression quickened with the development in 1946 of the Electronic Numerical Integrator and Computer (ENIAC),[1] built by Army Ordnance to compute World War II ballistic firing tables. Many view the ENIAC as the first practical electronic computer. Since then, we have witnessed a steady, exponential increase in computing power and network bandwidth, along with a simultaneous decrease in both the size and cost of computing power.

Gordon Moore’s 1965 prediction that computing power would essentially double every 18 to 24 months for the same cost has, astonishingly, held true. Attempting to capitalize on these vast improvements in computer performance, businesses developed powerful decision support systems and collected even more numbers (i.e., data). And now, those information technologies allow business managers to digitize, classify, and evaluate hundreds of strategic variables, providing them with quick and accurate solutions to common business problems.

While this capability is clearly a benefit to decision-makers, it can also create an inherent risk: namely, that the variables easiest to measure and track via information technology and quantitative models can assume greater importance than they deserve. Sometimes, what is critical is also difficult, if not impossible, to measure and, unfortunately, tends not to be considered in the final analysis. In this article we discuss some of the extraordinary benefits that IT and quantitative analysis bring to the decision-making process, and we also challenge the reader to consider, thoughtfully, the inherent risk of reducing complex decisions to measurable, calculable mathematical models.

How IT and Numbers Can Help Us

Information technologies have, undeniably, had a positive impact on economic growth over the last century. One need look no further than a daily newspaper to see that information technology has radically changed today’s cultures and the way people think. Advanced information systems and networks enable the nearly immediate transmission of digital information in such forms as photographs, streaming video, graphs, numerical indices, financial facts, market analyses, expected values, etc. All of this information flows through and nourishes the global economy as blood does the human body. Numbers, the information systems in which those numbers are stored, and the analytical procedures we use to make sense of those numbers are fundamental to progress in modern societies.

This constant flow of information has at least three distinct advantages for business. It provides 1) an increased awareness of opportunities, 2) an increased awareness of inefficiencies, and 3) an increased ability to predict, monitor, and track performance. Information technologies make it possible for managers to identify opportunities in the form of emerging markets, demand for new products, and new customer service initiatives.


For example, premium ice cream company Cold Stone Creamery has a difficult product mix. The staple of their product line is a collection of ice cream dishes recommended by the store. However, by allowing customers to mix and match “add-ins” such as chocolate chips, fresh strawberries, and sour gummy worms into their choice of ice cream flavors, Cold Stone Creamery has an almost limitless set of possible product combinations. Unfortunately, the legacy IT systems at Cold Stone Creamery were not able to record what add-ins were mixed with which flavors. By integrating a new Customer Relationship Management (CRM) system, the company is now able to quickly identify new trends in flavor mixes from their broad base of customers and offer these new flavors as in-store recommendations for new customers. This “sense-and-respond” capability has contributed to Cold Stone Creamery’s dramatic growth from one to 12,000 stores in ten years.[2]

Information Technologies Help Eliminate Inefficiencies, Track Performance

The application of information technologies also permits business organizations to identify and replace established processes that are embedded with inefficiencies. An exciting example of this technology is known as “business intelligence” (BI). BI methodologies make it possible for firms to measure, collect, analyze and act on information on a grand scale. An example of the benefits of BI can be seen at Continental Airlines, whose various BI initiatives have saved the organization more than $500 million over the past six years.[3] Continental’s COO, Larry Kellner, argues that “BI is critical to the accomplishment of our business strategy and has created significant business benefits.”

Information systems such as its “Demand-driven Dispatch” system have helped Continental become a more efficient organization. Prior to developing the Demand-driven Dispatch system, Continental rarely changed flight schedules, plane assignments, or maintenance routines. This inflexibility led to frequent inefficiencies in the form of oversold flights, empty planes, and frustrated customers. With the Demand-driven Dispatch system, however, Continental now combines data from revenue forecasts with flight schedule information to identify promising new sources of additional revenue. For instance, the system might recommend larger planes for routes with high demand, or a reduction in the number of flights on unprofitable routes. These systems have helped Continental improve its performance since 1996, when it was rated among the “worst” of the major U.S. air carriers.

Finally, information technologies have increased a firm’s ability to predict, monitor, and track performance. A powerful example of this capability surfaced in the fall of 2005 as America watched various U.S. agencies forecast the path, strength, and potential for peril of hurricanes Katrina in Louisiana and Mississippi and Rita in Texas. Days before the storms came ashore, analysts were able to assess risks and warn inhabitants in areas that would likely be affected. Such capabilities were unheard of just a decade or two ago.

These advances have been possible only because of the ongoing development and application of IT resources. Undoubtedly, business has benefited greatly from rational judgments based on “hard evidence,” i.e., numbers, rather than on “gut reactions,” intuition, or myths.

What Numbers Cannot Measure

Where is the life we have lost in living? Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?
T. S. Eliot

When one recognizes the extraordinary value to business of numbers and the information systems that store them, it is easy to see why the maxim at our headiest consulting firms goes something like this: “If it can be performed, it can be measured. If it can be measured, it can be managed.” We ascribe inordinate importance to numbers in our personal and professional lives. Professional presentations usually include charts, summary statistics, financial projections, and expected monetary values. Every day it seems that someone in the media or government proffers a number that will help us better comprehend our world. We saw this recently in the U.S. when the hurricanes threatened. The National Weather Service immediately queried their information systems looking for the numbers that would help them select an optimum strategy.

Our infatuation with numbers has been described most eloquently by Antoine de Saint-Exupéry:

Grown-ups love figures. When you tell them you’ve made a new friend, they never ask you any questions about essential matters. They never say to you “What does his voice sound like? What games does he love best? Does he collect butterflies?” Instead they demand “How old is he? How much does he weigh? How much money does his father make?” Only from these figures do they think they have learned anything about him.[4]

Why then, in spite of all the counting, do we continue to find ourselves disillusioned, confused, and surprised by the “unexpected”? Saint-Exupéry observed that perhaps it is because numbers neither capture everything of importance nor tell the whole story. This is especially true when we insist upon relying on numbers to the exclusion of good judgment or intuition, or when we find that we have asked the wrong questions and therefore collected the wrong data. In fact, the relative ease with which many variables can be identified, measured, and tracked with IT may hinder effective decision making. Decision makers may erroneously focus their attention on the easily measured variables and models rather than giving adequate attention to those elements of a decision that are difficult, even impossible, to measure. While this decision-making strategy may lead to a mathematically optimal solution, it may not suggest the wise or prudent response.

We described above how Cold Stone Creamery, Continental Airlines, and the National Weather Service use information technologies to identify new opportunities, isolate inefficiencies, and predict, monitor, and track performance. Many companies have followed suit, creating massive data warehouses of customer-based information on what products their customers buy, how often they buy those products, and the stores from which they purchase the products. Yet a true understanding of how their customers came to make those decisions is still anyone’s guess.

Such “measurement above meaning”[5] is not necessarily an asset; neither does it lead to good decision making. As Fortune columnist Michael Schrage argues, “just because single, left-handed, blond customers who drive Volvos purchase 1,450 percent more widgets on alternative Thursdays than do their married, non-blond, right-handed, domestic car-driving counterparts does not a marketing epiphany make.”[6]

Managers must realize that the number crunching, the cost/benefit analyses, and the net present value calculations can often be of great value; but they can also leave us wanting. In the final analysis, after the numbers have been counted, the information systems built, and the predictions made, the ultimate success of a strategic decision may well depend on factors that have escaped measure. Many of the terribly complex problems that managers encounter cannot be solved using information from numbers alone. In those circumstances, intuition, profound process knowledge, sound judgment, moral reasoning, and a zeal for managing people will also be essential.

Summary

When one considers the tremendous capabilities of information technology, it is easy to see why IT professionals place great value on numbers. Indeed, business strategists from all disciplines, including the authors of this article, have championed this very noble cause.

Encouraged by the many successes, however, we have perhaps put too much emphasis on the value of measurable information. In our haste to measure performance and manage uncertainty, we have been slow to recognize that the information we collect, manipulate, and analyze often has limited value. For example, society is well served by the advances in information technology that enable experts to simulate a variety of hurricane scenarios and to prioritize levee construction projects based upon estimates of potential damage.

However, once the simulations are complete and the numbers are “in,” it is up to leaders with strong resolve, sound judgment, and moral reasoning to weigh the immeasurable costs to human life and well-being that a natural disaster can inflict and to make the difficult and morally weighty decisions about which projects to fund. When building mathematical models or information technologies, it is difficult to place value on some very important things: the elderly left behind, the children separated from their parents, or the untoward effects of pollutants on the environment. Unfortunately, because these “values” cannot be easily measured or included in a simulation model, they tend to be ignored. And if they are ignored, we not only risk making poor decisions, we are also in danger of creating a world where only those values that can be “seen” will matter. We endorse numbers, and urge our readers to use them.

We also recognize, however, as Peter Drucker points out, that tools and concepts are mutually interdependent. That is, the tools we use to analyze our business often force us to see our business differently. As our tools develop and change, so do our businesses.[7] To see rightly, then, we need to look beyond the numbers, calling upon a broader set of values and making sure that we have asked the right questions, created metrics with meaning, and made judgments based on both mathematical and moral reasoning.


[1] Weik, Martin H. (1961). The ENIAC Story. Retrieved October 6, 2005, from http://ftp.arl.mil/~mike/comphist/eniac-story.html (no longer accessible).


[2] Schuman, E., (2005). From Gooey Designs to GUI, IT Helps Ice Cream Chain Deliver, CIO Insight, September.

[3] Anderson-Lehman, R., Watson, H.J., Wixom, B., & Hoffer, J. (2004) Continental Airlines Flies High with Real-Time Business Intelligence. MIS Quarterly Executive. 3 (4): 163-176.

[4] Saint-Exupéry, A. (1993). The Little Prince, translated by Katherine Woods. New York, NY: Harcourt Brace Jovanovich.

[5] Schultze, Q. (2002). Habits of the High-Tech Heart, Grand Rapids, MI: Baker Academic Press.

[6] Schrage, M. (1999). “Sixteen Tons of Information Overload,” Fortune, August: 244.

[7] Drucker, P.F. (1998). “The Information Executives Truly Need,” Harvard Business Review, February, 73 (1): 54-62.



Decision Tree Analysis

This tree calculates the potential economic impact on Gerber based on the outcome of the CPSC investigation and on whether Gerber takes a proactive or reactive approach. Outcomes are based on estimates of the gain or loss for each outcome and the anticipated likelihood that each scenario will occur. The most positive outcome, shown on the uppermost branch, results from a proactive approach to the issue by Gerber and a favorable report on phthalates by the Consumer Product Safety Commission.

back to article

[Interactive decision tree no longer available. Columns: Estimated Gain/Loss, Estimated Probability, Adjusted Gain/Loss, Estimated Outcome. The root node, “Phthalates Controversy,” branches into the Proactive and Reactive alternatives; each leads to a Favorable or Unfavorable CPSC report, and each report outcome leads to a sales Increase or Decrease.]


How Gerber Used a Decision Tree in Strategic Decision-Making

Decision trees can assist executives in making strategic decisions.

Management invariably encounters situations in which uncomfortable decisions must be made. In some cases, the difficulty may be that, although certain alternative choices are clear, the consequences of these choices are not readily apparent. One possible tool for a manager in such a situation is decision tree analysis.

A decision tree is a graphical diagram consisting of nodes and branches. The nodes are of two types. The first is a rectangle that represents the decision to be made. The branches emanating from decision nodes are the alternative choices with which the manager is faced. One and only one alternative can be implemented. The second type of node is a circle. Circles represent chance nodes. That is, the alternatives emanating from chance nodes have some element of uncertainty as to whether or not they will occur. The primary benefit of a decision tree is that it provides a visual representation of the choices facing the manager.

Analytic Considerations

The first task of the manager is to identify the decision that needs to be made in a given situation. Next, the manager must list all the possible alternative actions that could be implemented to “solve” the problem. These alternatives are connected to the decision node as straight lines emanating from the node. The next step is to identify all the possible consequences that could occur as a result of each alternative being implemented. Since these consequences carry some element of uncertainty, the manager needs a way to evaluate the likelihood that they will occur; the end goal is a probability for each consequence. The best basis for these probabilities is past experience with similar outcomes. Often, however, no such experience is available to the manager. In these cases, the best tool is the collective wisdom of experts about how likely a particular consequence is to occur in the future. Using an appropriate consensus-building technique, estimates from a panel of experts can be combined or averaged to produce a probability for each consequence. The only requirement is that the probabilities of the set of consequences emanating from a chance node must sum to one.

The next step is to evaluate the end result of each possible alternative in concert with the consequences identified for each alternative. This step results in a monetary figure that would be obtained if this course of action were implemented. This step is accomplished for each possible alternative. Finally, the entire tree is evaluated by employing a technique known as mathematical expectation in order to select the most beneficial alternative.
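The rollback procedure just described, choosing the best alternative at each decision node and taking the probability-weighted average at each chance node, can be sketched in a few lines of code. The tree below is a generic illustration, not the Gerber example; the node names, payoffs, and probabilities are invented for demonstration.

```python
# Minimal sketch of decision-tree rollback using mathematical expectation.
# A decision node picks the best branch; a chance node averages branches
# weighted by probability. All figures are illustrative.

def rollback(node):
    """Return the expected monetary value of a tree node."""
    kind = node["type"]
    if kind == "payoff":
        return node["value"]
    if kind == "chance":
        probs = [p for p, _ in node["branches"]]
        # Probabilities at a chance node must sum to one.
        assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
        return sum(p * rollback(child) for p, child in node["branches"])
    if kind == "decision":
        # Only one alternative can be implemented; choose the best.
        return max(rollback(child) for _, child in node["branches"])
    raise ValueError(f"unknown node type: {kind}")

# A toy two-alternative decision with uncertain consequences:
tree = {
    "type": "decision",
    "branches": [
        ("act now", {
            "type": "chance",
            "branches": [
                (0.6, {"type": "payoff", "value": 100_000}),
                (0.4, {"type": "payoff", "value": -40_000}),
            ],
        }),
        ("wait", {"type": "payoff", "value": 10_000}),
    ],
}

print(rollback(tree))  # expected value of the best alternative (about 44,000 here)
```

The same function evaluates a tree of any depth, since each node is rolled back recursively before its parent is evaluated.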

Product Planning at Gerber

Gerber Products, Inc., the well-known baby products company, recently used decision tree analysis in deciding whether to continue using the plastic known as polyvinyl chloride or, more commonly, PVC. The situation involved a number of organizations, including the environmental group Greenpeace, the U.S. Consumer Product Safety Commission, the toy and plastics industries, and the general public.

PVC is a composite plastic material used in numerous household, commercial, and medical products, including food storage containers, toys, and medical tubing. To make PVC soft and pliable, chemical plasticizers known as “phthalates” are added. In the latter half of 1998, Greenpeace announced that it had conducted scientific testing on phthalates and found them to be carcinogenic in lab rats. Further, Greenpeace claimed that the chemical leaches from the plastic over time and voiced particular concern with “products that were aimed at small children and used to suck on or chew on.” Although phthalates have been used in plastic for over 30 years, and there are no known cases of phthalates causing health problems, Greenpeace’s press release was strategically timed to coincide with the Christmas toy season, thereby guaranteeing maximum media coverage. As expected, it was immediately picked up by the television networks and, in fact, the ABC show 20/20 devoted an entire segment to the possible health risks of phthalates.

The problem grew worse for Gerber when the media focused specifically on products made for oral use by children. Gerber, the largest producer of nipples, pacifiers, and feeding products in the U.S., produced some 75 different products containing phthalates and was under considerable pressure to respond publicly to the investigation.

Decisions

Gerber management had to evaluate all of the current information, weigh the consequences of each action, and proceed on the most prudent course to ensure as limited an interruption in business as possible. Gerber knew that a vast body of scientific evidence indicates that phthalates are completely safe. However, once the Greenpeace announcement was publicized, the Consumer Product Safety Commission was spurred to issue a press release expressing new doubts. As the focus gradually fell on items children put in their mouths, and large toy manufacturers like Mattel and Disney began to distance themselves from phthalates, the spotlight of the CPSC fell squarely on Gerber. A month before Christmas, the CPSC informed Gerber it would issue a press release advising parents of the potential dangers of phthalates, and Gerber would be named as one of the companies involved. This is the point at which Gerber implemented a decision tree.

Gerber basically faced two choices, neither of which was particularly beneficial. The firm could be reactive, wait for the announcement, and gauge consumer response before deciding on a course of action, or it could be proactive and aggressively pursue resolution of the problem regardless of the public’s response to the report. The CPSC would either issue a recall of all products containing phthalates (shown on the decision tree as the unfavorable response) or merely express concern, in which case the public response would be minimal (shown on the decision tree as favorable).

Gerber projected eight possible outcomes on its decision tree. If the firm reacted proactively by discontinuing use of all phthalates, and the CPSC simply issued a warning, Gerber predicted an 80 percent chance that the public would react favorably to Gerber’s responsiveness, causing sales to increase over competitors who reacted more slowly. A potential nationwide revenue increase of $1 million was entered into the decision tree. Given a proactive response and a favorable CPSC report, Gerber also recorded a 20 percent chance that sales would decline by $1 million due to the sensationalistic nature of the press coverage.

If the CPSC report was negative and a recall was issued, Gerber predicted a 25 percent likelihood that it could preserve current sales through a proactive response. On the other hand, the firm placed a 75 percent probability that a recall would hurt sales by $1.25 million.

Four more outcomes were projected in the event that Gerber waited for the CPSC report before taking action. With a favorable report and a delayed response, there was thought to be a 25 percent chance that sales would remain flat, along with a 75 percent chance that sales would decline by $2 million.

The worst-case scenario was one in which Gerber remained passive and the CPSC report called for a recall. In that case, Gerber optimistically predicted a 20 percent probability that it could still increase sales by taking advantage of companies that were less prepared for the report, gaining approximately $0.5 million. However, the firm assigned an 80 percent probability that significant volume would be lost.

Using a decision tree like the one shown at the end of this article, Gerber concluded that its best option was to be proactive and initiate its own solutions without waiting for the CPSC report. Then, Gerber hoped for a favorable report that would provide a strategic advantage over competitors who opted for a different path.
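The branch-level expected values can be checked directly from the figures quoted above. Note two gaps in the article: the dollar size of the loss in the reactive/recall case and the probability of a favorable CPSC report are never stated, so the `ASSUMED_RECALL_LOSS` and `P_FAVORABLE` values below are purely illustrative assumptions, not Gerber’s numbers.

```python
# Expected values for the Gerber branches quoted in the article
# (figures in millions of dollars; negative = loss).
ev_proactive_favorable   = 0.80 * 1.0 + 0.20 * (-1.0)    # +0.60
ev_proactive_unfavorable = 0.25 * 0.0 + 0.75 * (-1.25)   # -0.9375
ev_reactive_favorable    = 0.25 * 0.0 + 0.75 * (-2.0)    # -1.50

# The article gives no dollar figure for the 80%-probability loss in the
# reactive/recall case; -3.0 is a purely hypothetical placeholder.
ASSUMED_RECALL_LOSS = -3.0
ev_reactive_unfavorable = 0.20 * 0.5 + 0.80 * ASSUMED_RECALL_LOSS

# The probability of a favorable CPSC report is also not stated;
# assume 50/50 for illustration only.
P_FAVORABLE = 0.5
ev_proactive = (P_FAVORABLE * ev_proactive_favorable
                + (1 - P_FAVORABLE) * ev_proactive_unfavorable)
ev_reactive = (P_FAVORABLE * ev_reactive_favorable
               + (1 - P_FAVORABLE) * ev_reactive_unfavorable)

print(f"proactive: {ev_proactive:+.4f}M, reactive: {ev_reactive:+.4f}M")
```

Under these assumptions the proactive branch carries the higher expected value, consistent with Gerber’s conclusion; varying the assumed figures shows how sensitive the choice is to the unstated recall loss.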

Some Branches May Be Missing from the Tree

As this example shows, decision trees enable managers to use available data to conceptualize and articulate possible scenarios of future events even though important pieces of information may still be missing.

Unfortunately, some important branches of the tree, such as ethical considerations, may be missing since they are not easily converted into financial terms. But the ability to predict economic outcomes in this way can still play a vital role in management decision-making.

Editor’s Note

Subsequent to these actions by Gerber, phthalates were approved for use in toys and other products by the U.S. Consumer Product Safety Commission, the U.S. Food & Drug Administration, and the American Council on Science and Health. An ACSH panel of scientists and physicians headed by former U.S. Surgeon General Dr. C. Everett Koop took the extra step of underscoring the benefits associated with the use of the phthalate DEHP in medical applications because of its demonstrated reliability.

See the Gerber Decision Tree and Analysis here.
