Editor’s Note

In this Editor’s Note we invited Professor Rick Hesse to contribute an article on using spreadsheets. Dr. Hesse has taught management science using spreadsheets since 1982 in both engineering and business schools, and at both the graduate and undergraduate levels. Spreadsheet analysis is a very visual technique to help improve business operations and profitability.







On the 25th anniversary of the electronic spreadsheet, the good news is that spreadsheets have had a positive impact on the way businesses operate behind the scenes, from finance and accounting to operations, marketing, and human resources, and they save time. The bad news is that training is woefully inadequate and haphazard. The ugly news is that spreadsheets are full of errors, even though a majority of firms use them in some way for financial reporting that falls under the Sarbanes-Oxley Act.

The Good

History

The real breakthrough for business use of personal computers (PCs) was the development of VisiCalc, the first electronic spreadsheet, and its successors: Multiplan, Lotus 1-2-3, Excel, and Quattro Pro. These programs display results on the PC screen in a spatial, visual way that the average person can easily view and understand. There is no need for a compiler; results are immediate.

Programming is no longer linear, relegated to the realm of an elitist functioning behind the scenes, or to an arcane way of thinking (computing) “behind closed doors.” It is now an open, common-sense, real-time, and spatial way of looking at calculations. In addition, the spreadsheet allows for a collaborative learning approach, since templates are shared within and across companies versus taking the individualistic approach to computer programming. Users can now design their own layouts using a few simple rules, and thus employees become rule-makers themselves.[1]

Benefits

In its January 2006 cover story, BusinessWeek highlighted Excel as one of the necessary tools for business analysts.[2]

Excel, the de facto standard, is an essential tool in business today. Wayne Winston, an MBA professor at Indiana University, has trained thousands of middle managers in industry to use Excel. He estimates conservatively that there are a million analysts in Fortune 500 companies who could benefit from such training, and that analysts do 90 percent of their analysis with Excel.[3]

For example, numerous students in my MBA Quantitative Analysis class have reported success with models or Excel skills they learned:

  • After demonstrating how he could use LOOKUP tables and data analysis tools, one student working at an aerospace company received a promotion even though he was not yet eligible in terms of length of service.
  • After learning how to set up a capital budgeting model using Excel's Solver, one student, a manager at a non-profit, was able to have an assistant come in early each morning to allocate funds; the student then simply checked the results during the day.
  • Another student, after completing her MBA, received a bonus from her company, but got an unsolicited employment offer from another firm. Because her company did not want to lose her Excel and Access skills, it matched and increased that offer.
  • Another student changed jobs in the field of reliability testing. In his new job he did not have specialized software, but knew that Excel should be able to do a curve fit for the Weibull distribution. He emailed for help, and I was able to retrofit one of my curve-fitting templates to give the correct results. A few months later the student was finally able to get the expensive software, which confirmed the Excel results.
  • Another student found the Data Sort option in Excel saved her over two hours a day of work.
  • In her work for the Army Corps of Engineers, my daughter taught a colleague how to use a pivot table and thereby reduced eight hours of work on a single spreadsheet to two minutes.
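The Weibull curve fit mentioned above relies on a classic spreadsheet-friendly trick: taking logarithms twice turns the Weibull distribution function into a straight line, so an ordinary least-squares fit recovers the shape and scale parameters. The following is a minimal sketch of that approach in Python (a hypothetical reconstruction of the idea, not the actual template):

```python
import math

def weibull_fit(samples):
    """Estimate Weibull shape (beta) and scale (eta) by the linearized
    least-squares method often used in spreadsheets:
    ln(-ln(1 - F)) = beta*ln(x) - beta*ln(eta)."""
    xs = sorted(samples)
    n = len(xs)
    # Bernard's median-rank approximation supplies the plotting positions F.
    pts = [(math.log(x),
            math.log(-math.log(1 - (i + 1 - 0.3) / (n + 0.4))))
           for i, x in enumerate(xs)]
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    # Ordinary least-squares slope gives beta; the intercept gives eta.
    beta = (sum((p[0] - mx) * (p[1] - my) for p in pts)
            / sum((p[0] - mx) ** 2 for p in pts))
    eta = math.exp(mx - my / beta)
    return beta, eta
```

In a spreadsheet, the same computation is two helper columns (the log-log transforms) plus SLOPE and INTERCEPT formulas.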

The Bad

In business, spreadsheets are more “caught” than “taught.” Because spreadsheets look so simple, it is often thought that ordinary, non-programming business people will just grasp how to use them. Employees often share among themselves little tricks and traps along with their spreadsheet templates, but formal training is not offered in most businesses.

As college graduates entered the workforce, it was thought that they would infuse each organization with these new skills and obviate the need for training in using spreadsheets. As a result, training businesses faltered, and it was left to Human Resource (HR) departments to provide any necessary training internally. However, such departments did not (and do not) provide instruction in the functional skills of finance, accounting, marketing or management science, nor do such programs offer very effective training in fully utilizing spreadsheets. Consequently, HR was ignoring the hidden costs of not training for these skills.[4]

One of the main issues is that even though employees and employers know that spreadsheets save time and increase efficiency, there is no way to measure the cost-benefit.

With automated drawing and drafting, there are benchmarks for how long it takes to develop a drawing and how many revisions are necessary, which demonstrate the worth of programs like AutoCAD. However, spreadsheets have no such benchmarks or measures of how much time is saved. Research has shown that logical reasoning skills significantly increase after just six weeks of training on spreadsheets.[5] However, how do you quantify the financial benefit of that skill?

My experience in 10 years at Pepperdine teaching almost 1,000 fully-employed MBA students has not shown any noticeable improvement overall in computer or spreadsheet skills among entering students. While some students are quite adept at learning the skills, others are totally clueless.

Employers are usually amazed at what these students learn to do with spreadsheets, and some are dubbed by their employers the "queen" or "king" of spreadsheets. The danger then becomes that the employee spends more time helping other employees than getting his or her own work done.

Only a small percentage of learners can pick up a manual, watch a video, or attend a training session and become proficient in using spreadsheets. The majority of learners often need someone to coach them through the different functions or aspects they need for a particular task. Over time they then accumulate enough familiarity to competently handle spreadsheets. (Even I learn something new every semester from my students, which I in turn incorporate into the quantitative courses.)

Perhaps one reason that training companies and their materials failed to help people learn to use spreadsheets is because of a computer mainframe, linear mentality. They thought that by teaching several commands or functions, a trainee should then know how to program. Rather, it is a matter of taking an employee’s work and laying it out to see where the connections are.

At Pepperdine, dedicated staff members of academic computing tried to develop training materials and seminars for our graduate students in business, education, and psychology. However, since most of the IT staff do not know these fields, they were ineffective in applying their computer programming knowledge to practical applications. Students often comment they learn much more about using spreadsheets in a quantitative, finance, or accounting class because it applies to their work.

What we have been left with is training by business and engineering schools, mostly for undergraduates. Even in MBA programs, the assumption is that students know spreadsheets or will pick up the knowledge from their classmates or at an introductory session. Textbooks that use spreadsheets generally do not spend much time teaching the basics and count on the fact that using them in an application will teach more than the traditional approach to teaching programming.

The Ugly

Ray Panko, at the University of Hawaii, has done extensive research in the area of spreadsheet errors and has made some startling discoveries.[6] His research shows that:

  • One percent of formula cells have errors
  • Ninety-five percent of U.S. firms (and 80 percent in Europe) use spreadsheets in some form for financial reporting, and thus are subject to the Sarbanes-Oxley Act
  • Ninety-four percent of 88 spreadsheets audited in seven studies contained errors

Additionally, in 2003, Fannie Mae admitted to making a $1.2 billion error in calculating third quarter earnings on a spreadsheet, while a former vice-president of HealthSouth admitted to inflating earnings by $3.5 billion with a false spreadsheet made up for auditors.[7]

There have been numerous articles warning of the dangers of spreadsheet errors.[8] Most companies have no error-checking procedures, unlike computer programming, which counts errors per thousand lines of code. Many organizations have no procedures for developing and then updating spreadsheets.

Further compounding this problem are the built-in errors in spreadsheets such as Excel, which concern many statisticians.[9] In a previous article I have already warned about the improper trend fits provided by Excel for nonlinear models.[10]
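The trend-fit problem is easy to demonstrate. An exponential trendline is typically computed by regressing ln(y) on x, which minimizes error in log space rather than in the original units, so it is generally not the least-squares fit to the data itself. The sketch below illustrates the gap with assumed data (not figures from the cited article): a coarse search near the log-transform fit finds parameters with equal or lower squared error in original units.

```python
import math

# Assumed data, roughly following y = 2^x with small perturbations.
xs = [1, 2, 3, 4, 5]
ys = [2.0, 4.1, 7.9, 16.5, 31.0]

def sse(a, b):
    """Sum of squared errors of the model y = a*exp(b*x) in ORIGINAL units."""
    return sum((y - a * math.exp(b * x)) ** 2 for x, y in zip(xs, ys))

# The typical trendline approach: ordinary regression of ln(y) on x.
n = len(xs)
lys = [math.log(y) for y in ys]
mx, my = sum(xs) / n, sum(lys) / n
b_log = (sum((x - mx) * (ly - my) for x, ly in zip(xs, lys))
         / sum((x - mx) ** 2 for x in xs))
a_log = math.exp(my - b_log * mx)

# Coarse grid search around the log-fit for the direct least-squares fit.
best_sse = min(sse(a_log * (1 + i / 100), b_log * (1 + j / 100))
               for i in range(-20, 21) for j in range(-20, 21))
# best_sse can never exceed sse(a_log, b_log), and is usually strictly lower:
# the log-space fit systematically underweights the largest y values.
```

The practical lesson is the same one the cited article draws: a trendline computed from transformed data should be checked against the residuals in the original units before it is used for forecasting.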

It is clear that employees need enough knowledge of spreadsheets to:

  • Be able to use a template correctly,
  • Develop a useful working template, and/or
  • Change a working template[11]

In addition, it is clear that spreadsheets should be checked for errors, either through built-in checks or by peer review.
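Part of that checking can be automated. As a sketch of the idea (the cell data and the single rule shown are hypothetical, not a real auditing tool), a reviewer might scan exported formula text for hard-coded "magic numbers," one of the most common errors of commission:

```python
import re

# Flags a run of digits unless it is part of an uppercase cell
# reference such as A10 or $B$2.
HARDCODED = re.compile(r"(?<![A-Z$\d])\d+(?:\.\d+)?")

def audit(cells):
    """Return the references of cells whose formulas embed numeric constants.

    `cells` maps a cell reference to its formula text, as it might be
    exported from a workbook for review."""
    return sorted(ref for ref, formula in cells.items()
                  if formula.startswith("=") and HARDCODED.search(formula))

flagged = audit({
    "B2": "=A1*1.05",      # hard-coded growth rate -> flagged for review
    "B3": "=SUM(A1:A10)",  # plain range reference  -> clean
    "B4": "=B2+B3",        # cell references only   -> clean
})
```

A rule like this does not replace a second reviewer; it simply gives the reviewer a short list of cells to question first.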

Recommendations

Seek out the best of your company’s spreadsheet developers to form an ad hoc committee and appoint them to do the following:

  • Design training materials that are specific to your business to illustrate functions and commands in Excel
  • Set standards for developing, documenting, and changing spreadsheets
  • Require at least two people to check spreadsheets for errors of omission, commission, and transmission
  • Use the each-one/teach-one method (make sure that each person who is taught passes the knowledge on to at least one coworker)
  • Take spreadsheets seriously: when spreadsheets are used effectively, they offer tremendous efficiencies to your workforce

[1] J.E. Baker, S.J. Sugden, “Spreadsheets in Education: The First 25 Years,” Spreadsheets in Education, (2003).

[2] S. Baker with B. Leak. “Math Will Rock Your World,” BusinessWeek, (January 23, 2006): 54-60.

[3] Wayne L. Winston. “Executive Education Opportunities,” OR/MS Today, August (2001).

[4] A. Bellinger. “Tackling the Hidden Cost of Spreadsheets,” IT Training (July 2005): 14.

[5] S. E. Kruck, J. J. Maher, and R. Barkhi. “Framework for Cognitive Skill Acquisition and Spreadsheet Training.” Journal of End User Computing, 15, no. 1, (2003): 20-38.

[6] Ray Panko. “Facing the Problem of Spreadsheet Errors,” Decision Line, (October 2006): 8-10.

[7] J. P. Caulkins, E. L. Morrison, and T. Weiderman. “Are Spreadsheet Errors Undermining Decision-Making in Your Organization?” Nonprofit World, 24, no. 3, (May-June 2006): 26-28.

[8] D. A. Hicks. “The Value of Graphics in Communication,” IIE Solutions, 30, no. 7, (July 1998): 18-20; H. Howe, M. G. Simkin. “Factors Affecting the Ability to Detect Spreadsheet Errors,” Decision Sciences Journal of Innovative Education, 4, no. 1, (2006): 101-122; S. E. Kruck. “Testing Spreadsheet Accuracy Theory,” Information & Software Technology, 48, no. 3, (2006): 204-213; C. Randles. “Spreadsheets: Often Don’t Add Up,” Design News, 60, no. 4, (2005): 24.

[9] B. D. McCullough, B. Wilson. “On the Accuracy of Statistical Procedures in Microsoft Excel 2003,” Computational Statistics & Data Analysis, 49, no. 4, (2005): 1244-1252.

[10] Rick Hesse. “Incorrect Non-Linear Trend Curves in Excel,” Foresight, 1, no. 3, (2006).

[11] T. J. McGill, J. E. Klobas. “The Role of Spreadsheet Knowledge in User-Developed Application Success,” Decision Support Systems, 39, no. 3, (2005): 355-369.


IT MATTERS: The IT Governance Road Map

Managers have made numerous attempts to increase the understanding of how IT operates, and more importantly, how IT can be used to leverage the business and provide a competitive advantage for the firm. IT managers need to assess their capabilities by asking the following questions: 1) Are IT managers able to support the firm in obtaining its objectives? 2) Are they capable of keeping up with the constantly changing market environment? 3) Are they up to date on the latest and greatest technology trends and offerings to the marketplace? 4) Are they flexible enough to understand and lead business process changes as needed? 5) Are they capable of judiciously helping to manage the firm’s risk?










Information technology (IT) offers firms many opportunities to enhance or transform their products, services, markets, work processes, and business relationships. Such efforts, however, require carefully orchestrated efforts between the firm’s technology and business specialists. It is often the case that the ways in which the firm utilizes IT and the impact that IT has on a firm’s performance have been carefully guided by well-thought-out IT governance policies and procedures. Interestingly, the Meta Group recently reported that more than 80 percent of Global 2000 firms do not have a formal governance committee in place.[1] The analyst firm also predicts that 50 percent of companies will attempt to improve their IT governance policies this year. According to the Meta Group, firms having better than average IT governance policies can realize at least a 20 percent higher return on assets than organizations with weaker governance.[2]

What is IT Governance?

IT governance is defined as “the decision rights and accountability framework for encouraging desirable behavior in the use of IT.”[3] IT governance is seen as a framework that ensures that information technology decisions consider the business’ goals and objectives. Similar to ways in which corporate governance aids the firm in ensuring that key decisions are consistent with corporate vision, values and strategy, IT governance ensures that IT-related decisions match companywide objectives.

IT governance has primarily been driven by the need for the transparency of enterprise risks and the protection of shareholder value. The overall objective of IT governance is to understand the issues and the strategic importance of IT, so that the firm can maintain its operations and implement strategies to enable the company to better compete now and in the future. Hence, IT governance aims at ensuring that expectations for IT are met and that IT risks are mitigated. IT governance exists within corporations to guide IT initiatives and to ensure that the performance of IT meets the following corporate objectives:

  • Alignment of IT to support business operations and sustain advantages;
  • Responsible use of IT resources;
  • Appropriate identification and management of IT-related risks;
  • Facilitation of IT’s aid in exploiting opportunities and maximizing benefits.[4]

A structured IT governance committee or policy, along with corporate managers, helps ensure that IT is synchronized with the business and delivers value to the firm. IT governance also aids companies in instituting formal project approval processes and performance management plans.

Firms typically make five types of IT decisions:[5]

  1. IT principles decisions dictating the role of IT in the enterprise.
  2. IT architecture decisions on technical choices and directions.
  3. IT infrastructure decisions on the delivery of shared IT services.
  4. Business application requirements decisions for each project.
  5. IT investment and prioritization decisions.

To successfully make these five types of decisions, firms must develop and implement IT governance mechanisms. There are three general categories of IT governance mechanisms and techniques,[6] which include 1) decision making, 2) process assignment, and 3) communication approaches. A recent study asked more than 250 Chief Information Officers (CIOs) how IT governance was enacted within their organizations.[7] Utilizing the three general categories of governance mechanisms, the table below summarizes the techniques used by the firms:

GOVERNANCE MECHANISMS

Decision-Making Structures

  • Business/IT relationship managers
  • IT leadership committee composed of IT executives
  • IT council composed of business and IT executives
  • Executive or senior management committee
  • Process teams with IT members
  • Architecture committee
  • Capital approval committee

Alignment Processes

  • Tracking of IT projects and resources consumed
  • Service-level agreements
  • Formal tracking of the business value of IT
  • Chargeback arrangements

Communication Approaches

  • Office of the CIO or office of IT governance, which may:
    work with managers who fail to follow the rules;
    publicize announcements from senior management;
    manage and monitor Web-based portals and intranets for IT.

Despite the fact that corporations are beginning to experience success with implementing IT governance mechanisms to better manage their IT resources, individual governance mechanisms cannot alone promise the successful implementation and execution of IT governance policies and procedures. Companies must be able to better understand the complex playing field of their competitive environment and be able to put together a reliable set of governance techniques that are simple, are easily shared and implemented, and that engage managers who make key decisions for the company.

These mechanisms provide firms, at a minimum cost, with the coordination, control, and trust that is needed to manage and utilize their IT related resources. Hence, well-developed and implemented IT governance mechanisms help firms to establish coordinated mechanisms that link IT-related objectives and goals to measurable goals. IT governance also helps to provide the necessary checks and balances to better manage and mitigate risk, standardize practices, streamline procedures, and improve returns on technology resources and assets.

IT Governance: A Continuous Process. IT governance can be seen as the continuous process of aligning corporate and IT strategy. It helps to shape organizational changes over time and should be tightly tied to corporate governance procedures and regulations. IT governance is intended to safeguard the organization against criminal activity inside and outside the organization, and firms must develop and implement strategies and processes to carry out that governance.

IT Governance at Different Layers of the Organization. IT governance is typically the primary responsibility of the board of directors and executive management (including the Chief Information Officer). It is an integral part of enterprise governance and consists of the leadership and organizational structures and processes that ensure that the organization’s IT sustains and extends the organization’s strategies and objectives.

IT governance should typically address IT-related risks and opportunities at different layers of the organization. IT managers should solicit input for the development of IT governance policies and procedures, since such governance affects employees within different layers of the organization and across different business functions. All employees, from front-line employees and their managers to the executives of the board of directors, should contribute to the enforcement of IT governance policies and procedures.

Ten Action Items to Consider When Establishing IT Governance

1. Define your company’s direction on IT governance. In this step, the goal of the firm is to identify and define the strategic and tactical IT governance roles and responsibilities. Ensure that your firm has documented roles and responsibilities of the board, the executives, and the IT strategy committee. Identify and specify how priorities are set, how resources are allocated, and by whom, and how projects are tracked. In addition, include senior managers from both the IT and business divisions when you establish your direction; these individuals serve as the key champions to disseminate and encourage the adoption of IT governance procedures and policies within their divisions. Identifying champions from both sides of the business decreases the likelihood of a disconnect between business objectives and IT capabilities.

2. Determine an IT governance implementation plan. The firm requires an effective action plan that matches specific circumstances with needs. It is of foremost importance for the board to take ownership of IT governance and determine the direction that managers should follow. Such decisions are efficiently made by ensuring that the board operates with IT governance in mind:

  • Ensure that IT issues, plans, and wins are on the Board’s agenda.
  • Uncover IT issues by challenging management’s activities with regard to IT.
  • Guide managers by helping to align IT initiatives with real business needs.
  • Highlight the potential impact on the business of IT-related risks.
  • Insist that IT performance be measured and reported to the Board.
  • Establish an IT strategy committee that is responsible for communicating IT issues between the Board and managers.
  • Insist that the firm utilize a common approach to employing a management framework for IT governance.

3. Identify champions who have a vested interest. Assign clear responsibilities for each type of IT decision to individuals who can accept accountability for the outcomes of those decisions. Constrain the number of decision-making structures when determining how IT resources are acquired, utilized, and discarded.









4. Ensure cross-coordination and responsibilities for IT decisions. The previously listed five types of IT decisions are often distributed across the firm, so corporations need to consider overlapping responsibilities in the decision-making bodies. Overlapping memberships coordinate decisions throughout the enterprise and often ensure that the strategic objectives of managers filter down to decisions made at the individual project level.

5. Create an IT governance road map and plan for long-term strategies. IT governance should be integrated with the firm’s broader, strategic enterprise governance goals. An IT governance approach helps the board and management understand the strategic implications of IT and assists in ensuring that the enterprise can sustain its operations and implement the strategies required to extend its operations for future growth. Avoid the “do it all” syndrome to which most organizations fall prey.

6. Walk before trying to run: Target short-term IT governance goals and wins. After the firm has identified and developed a strategic IT governance road map, perhaps identify short-term IT governance issues that can serve as quick wins to get the organization jump-started on its IT governance policy and regulation enforcement. These quick wins will provide a good indication of the possibilities and challenges associated with implementing sound IT governance; they also help to uncover corporate barriers that need to be addressed before long-term strategies can be implemented. Such wins will also help to provide evidence that IT governance procedures and policies can aid and protect the organization, as well as further establish the credibility for implementing IT governance policies.

7. Go to the place: Identify and manage IT-related risks and opportunities. Do your homework and understand what it is that your users need and determine how such needs affect ways in which IT is used within the corporation. In doing so, you can uncover IT-related risks and opportunities. Instead of pretending to understand instances of IT’s improper and ineffective use, go to the place where there is pain within the organization. Pay your users a visit to personally experience their IT-related difficulties. Another suggestion for identifying corporate IT risks or opportunities is to survey your users. They can be one of the best sources of input for identifying security gaps or inappropriate use of IT.

8. Revisit IT governance policies on a regular basis. Once a firm has designed a feasible set of IT governance mechanisms, governance can remain in place until a change in strategic direction or a business opportunity redefines what the firm sees as desirable use of IT resources. However, opportunities sometimes arise that are not fully (or partially) addressed in the IT governance policies and procedures. When this situation occurs, the IT governance policies must be revisited to address these situations.

9. Increase the transparency of your IT governance. One of the most significant factors that can influence the success of IT governance policy and procedures is the number of employees who can accurately describe the company’s IT governance policies. IT executives and their staffs must engage in proactive conversations with business people and IT users to better understand corporate needs. One suggestion to promote IT governance in your firm is to boost the public relations activities of the IT department. For example, consider producing and distributing an annual report from the IT department that explains and shares the firm’s IT governance and future strategic goals and plans.

10. Establish exceptions to processes in the governance processes. Occasionally business situations or opportunities occur that are not governed or addressed by the firm’s IT governance policies. Such occurrences arise simply because IT governance may prohibit particular actions, or perhaps IT governance policies may be out of date. Establish a process for the firm to follow if the need arises to update or to provide an exception to the IT governance policies that are in place.

Conclusion

IT governance exists to assist enterprise leaders in their responsibility to make IT successful in supporting the firm’s goals and mission. IT governance helps firm executives to raise awareness and understanding among employees. Such governance also helps provide guidance and tools to boards of directors, executive managers, and CIOs to ensure that IT is appropriately aligned with corporate goals and policies and that IT meets and exceeds expectations of the firm. Over the next 40 years, IT leadership will move from serving as an individual contributor on the corporate team to being a full member of the team. The huge burden of the CIO ensuring that IT is effectively managed will become a company and board-level responsibility. However, this change will be more easily accomplished if IT governance is fully incorporated and is properly enforced within companies.


[1] SearchCIO.com, 1/11/2005, “Executive Guide: IT Governance.”

[2] Ross, Jeanne, and Weill, Peter. “Recipe for Good Governance,” CIO Magazine, 15 June 2004, 17, (17).

[3] Ibid. Ross & Weill.

[4] “Board Briefing on IT Governance,” 2nd edition, IT Governance Institute, 2003.

[5] Ibid., Ross & Weill.

[6] Ibid.

[7] “Effective IT Governance Mechanisms,” CIO Magazine, June 15, 2004.


IT MATTERS: Or more correctly, use of IT matters…

What matters about Information Technology (IT)? How does IT add value to a firm? Who should take responsibility for IT management? These are a sampling of the questions to be answered in this and upcoming issues that will explore why and how IT matters to business and management.












The closing years of the last century saw heavy investments in IT that were motivated by “Y2K,” “dot com” and “ERP” (Enterprise Resource Planning) mania. During these opening years of the current century, many business executives have become increasingly concerned about the levels of IT spending in their firms, questioning the business impact they have received from prior investments in IT and viewing proposals for expensive new IT initiatives with great trepidation. From their perspectives, such concerns have been legitimized by the now largely dismissed “IT Doesn’t Matter” debate and its claim that IT has become a commodity that no longer conveys strategic advantage.

However, the real problem is not that IT has become a commodity, or even that IT is ubiquitous. The real problem is the extent to which most firms continue to struggle (and often fail) in their attempts to extract positive business impact from their IT systems. The root cause is management’s failure to get these systems used in ways that actually create value for the firm. Such value can only emerge when IT systems are used in ways that are consistent with the business objectives that motivated and justified the IT system investment in the first place.

What accounts for this failure? For the most part, I consider the poor track record of IT business impact to be an outcome of two related problems with how IT has been managed. First, the dominant focus in IT has been on the activities associated with the development and acquisition of IT systems, at the great expense of attention paid to the activities necessary to get IT systems used appropriately. This focus has resulted in a dominant philosophy of IT management that has often been referred to as “If we build it, they will come.” Very often, IT systems were built, but nobody came to use them. And when nobody uses an IT system, there is no possible way that business impacts can ever occur.

Second, the traditional norm has been for business executives to abdicate responsibility for the management of IT entirely, delegating that task to IT professionals. Many reasons account for this norm, including a lack of understanding of, and comfort with, IT on the part of business executives; a perception that IT projects are tangential to executives’ primary responsibilities; and an enthusiasm among IT professionals for demonstrating their own importance to the business.

Certainly, IT professionals have an important role to play in the management and operation of IT resources and in the execution of IT projects. However, they should not assume primary responsibility for every stage of the process through which firms acquire and deploy new IT systems. In particular, IT professionals should not, and cannot, be responsible for the business impact of IT systems. Why? Because IT professionals do not have the management authority to bring about the process and behavior changes necessary to achieve actual use of these new IT systems and consequently to realize the desired business impact of IT.






Photo: Nik Frey






Two Shifts in Thought Regarding IT

Both of the aforementioned problems can be addressed by two simple shifts in the prevailing thinking regarding how the acquisition and deployment of IT should be managed. Both shifts are motivated by a fundamental tenet of management thinking: Start with the goal.

First, the dominant focus of IT management attention needs to shift from acquiring/building IT systems to ensuring that IT systems are used. Since the ultimate goal of most IT projects is real business impact (the exceptions are IT infrastructure renewal projects), and since business impact can only emerge if IT systems are used, then the dominant focus of management attention during IT projects must be on getting IT systems used in ways that are consistent with the business objectives of the project. This simple shift results in some powerful insights. For example, when the dominant focus is on getting an IT system used, it becomes apparent that less complex, easier-to-use, and possibly cheaper technologies may offer far greater potential for business impact than do the all-bells, all-whistles, best-of-breed “TLA systems”** du jour.

This simple shift in IT management philosophy is consistent with the long understood importance of “user involvement” in IT projects. However, the traditional IT project philosophy argues for “user involvement in IT development projects.” Let’s turn that notion on its head. The ultimate importance of IT use over IT development suggests that the underlying IT project philosophy should be switched to “developer involvement in IT use projects,” i.e. projects that aim to bring about use of IT that is consistent with desired business outcomes. As an aside, the “End-User Development” movement had similar motivations, but resulted in a plethora of “locally optimized” applications of dubious quality at the expense of firm-wide business impact.

Second, and most important, if the ultimate goal of an IT project is to achieve a business objective, then the ultimate management responsibility for that project should rest with those responsible for achieving business objectives: business executives. In my teaching and consulting engagements, I advocate that there should be no “IT projects with business objectives,” only “business projects that are IT enabled.” Business projects should be the responsibility of the appropriate business executives.

To bring these points together, I suggest that only business executives have authority over the users whose ultimate adoption and use of an IT system is fundamental to achieving the business objective being pursued. This assertion adds further credence to the notion that business executives should have primary responsibility and stewardship for “IT-enabled business projects.”

The Role of IT Professionals

Is there a role for IT professionals? Absolutely! They have a variety of critical roles, ranging from business-technology visioning, vendor management, IT architecture, and infrastructure management to project management. But my key point is that ultimate management responsibility and accountability for the business outcomes of an IT-enabled business project must rest with a business executive. What’s more, the extent to which IT permeates virtually every business and organization process today makes it nearly impossible to separately manage the IT-enablement from the process itself.

Equipping Executives to Manage IT

Having argued the case that business executives must take primary responsibility for IT-enabled business projects, I must point out that the unfortunate reality is that most business executives are not well equipped to do so. The tradition within business and management education and within corporate executive development programs has been to prepare business executives for business roles and to hand off IT projects to their IT professionals. Core MBA classes have typically educated future business professionals to be participants (or informed observers) in IT projects, but not to assume management responsibility for IT-enabled business projects. It is important that executive education programs instead prepare business executives to assume the responsibility of leading their firm’s IT-enabled business projects in competitive environments in which the effective management of IT does matter.


**”TLA” refers to “Three-Letter Acronym.” The IT industry has a near obsession with three-letter acronyms to refer to categories of IT systems (e.g. MRP, ERP, CRM, SCM).


The Cost of Lost Data

The cost of lost data from computers is substantial. Businesses must be proactive in protecting this important resource.

The Nature of the Problem

All computer users are familiar with the problem of lost data. Fortunately, most such incidents are relatively inconsequential, representing only a few minutes of lost work or the deletion of unnecessary files. However, sometimes the nature of the lost data is critical, and the cost of lost data is substantial. As reliance on information and data as economic drivers for businesses continues to increase, owners and managers are subject to new risks. One study reports that a company that experiences a computer outage lasting for more than 10 days will never fully recover financially and that 50 percent of companies suffering such a predicament will be out of business within 5 years.[1]

Levels of Risk

Of course, the value of lost data varies depending on their application, as well as the potential value that can be captured from use of the data. The loss of computer code, for example, represents a significant loss of value because computer code must be rewritten by highly skilled and highly paid software developers. In contrast, the loss of a customer history database would represent a less significant episode of data loss, assuming original source copies of the information are available. In this case, although the data would need to be re-keyed, it could be done by lower skilled and lower paid data entry personnel.

Using available data sources, this paper attempts to quantify the costs associated with episodes of data loss in the aggregate for the US economy. Implications of these findings for business owners and managers will also be discussed.

PCs in Use

Companies increasingly rely on data in a distributed environment. Therefore, the examination of data loss here will focus on the level of the personal computer.[2] US businesses use an estimated 76.2 million PCs to aid in the production of goods and services. Laptops are relied upon more and more, with a current installed base of 15.2 million units, or about 20 percent of all business PCs. The number of desktops in use currently totals approximately 61.0 million units.[3]

Episodes of Data Loss

Statistics on data loss are sparse. Data loss incidents can be hardware- or software-related, so both must be considered to estimate the magnitude of data loss. This study therefore combines two data sources to estimate the magnitude of data loss in the US: (1) claims data from an insurance company that insures computer hardware; and (2) survey data from a company that specializes in data recovery.[4] Estimates from this combination suggest that the most common cause of data loss is hardware failure, accounting for 40 percent of data loss incidents; these include losses due to hard drive failure and power surges. Human error accounts for 30 percent of data loss episodes, including the accidental deletion of data as well as accidental damage to the hardware, such as that caused by dropping a laptop. Software corruption, which might include damage caused by a software diagnostic program, accounts for 13 percent of incidents. Computer viruses, including boot sector and file-infecting viruses, account for 6 percent of episodes. Theft of hardware, especially prevalent with laptops, accounts for 9 percent of incidents. Finally, hardware destruction, which includes damage caused by floods, lightning, and fire, accounts for 3 percent of all episodes. The relative magnitudes of the different types of data loss are illustrated in Figure 1.

These data may be mapped to census (“installed base”) data on computers to estimate the number of severe data loss episodes that occur each year. Table 1 reports the results of this mapping, estimating 4.6 million episodes of severe data loss per year. Reflected in these data are significant differences in the incidence of data loss between laptops and desktops. While less than two percent of desktops are likely to experience an episode of data loss each year, the corresponding rate for laptops is greater than ten percent.
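
To make this mapping concrete, the sketch below multiplies the installed base by per-machine incidence rates. The rates shown are illustrative placeholders consistent with the text’s bounds (“less than two percent” for desktops, “greater than ten percent” for laptops), not the exact figures behind Table 1’s 4.6 million estimate, so the result below differs from that total.

```python
# Illustrative sketch of mapping installed-base counts to annual episodes
# of severe data loss. The incidence rates are hypothetical placeholders,
# not the exact rates used in Table 1.

DESKTOPS = 61.0e6   # installed desktop PCs in US businesses
LAPTOPS = 15.2e6    # installed laptop PCs

desktop_rate = 0.018  # assumed: "less than two percent" per year
laptop_rate = 0.105   # assumed: "greater than ten percent" per year

episodes = DESKTOPS * desktop_rate + LAPTOPS * laptop_rate
print(f"Estimated annual episodes: {episodes / 1e6:.1f} million")  # 2.7 million
```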

The Cost of a Data Loss Incident

An episode of severe data loss will result in one of two outcomes: either the data are recoverable with the assistance of a technical support person, or the data are permanently lost and must be rekeyed.[5] A calculation of the average cost of each data loss incident must take into account both possibilities. The ability to recover data depends on the cause of the data loss episode. The permanent loss or theft of a laptop whose data have no tape backup will result in permanently lost data. In addition, fire or flood damage can also make the possibility of data recovery very remote. For other causes of data loss, data recovery specialists are becoming more adept at restoring inaccessible data.[6] Taking into account all causes of data loss, evidence suggests that in 83 percent of the cases, data may be recovered.[7]

The first cost of data recovery to be considered is that of hiring a computer support specialist for the recovery effort. If a computer support specialist is employed within the company, both the number of hours needed to recover the data and the cost of employing this individual must be taken into account. The most recent information from the Bureau of Labor Statistics indicates that the average computer support specialist earns an estimated $28.10 an hour, including both salary and benefits.[8] The time needed to recover data may vary greatly. If a data backup exists and is readily accessible, the time needed may be very short. At the other end of the spectrum, if the data are corrupted on the hard drive, several days may be required to retrieve them.

If the average time needed to recover lost data is approximately six hours, the cost of using an employee to recover lost data is approximately $170. However, if a firm does not employ a specialist who is able to retrieve lost data, the company must go to an outside firm to attempt data recovery. Outside data recovery specialists can be much more expensive than in-house sources, sometimes costing two to three times as much. Thus, taking into account that an outside specialist must often be used in data recovery attempts, one can conservatively estimate the minimum cost of outside technical support to recover lost data to be around $340.

During the time in which the attempt to recover data is underway, an individual is unable to access his or her PC, thereby reducing productivity, which in turn impacts company sales and profitability. This opportunity cost, lost productivity due to computer downtime, impacts a company’s income statement just as do other more common and explicit costs. Lost productivity represents missed opportunities for income generation. Some employees are directly involved in sales and revenue production; others serve in more supportive or indirect roles. Economics teaches that each employee’s productivity, or contribution to firm revenue, can be approximated using the individual’s compensation.[9] Available data sources suggest that individuals who use computers at work earn an average of $36.20 an hour in wages and benefits.[10] Thus, $36.20 an hour for six hours totals approximately $217.[11]

The final cost to be accounted for in a data loss episode is the value of the lost data if the data cannot be retrieved. As noted earlier, this outcome occurs in approximately 17 percent of data loss incidents. The value of the lost data varies widely depending on the incident and, most critically, on the amount of data lost. In some cases the data may be re-keyed in a short period of time, which translates to a relatively low cost. In other cases, recovering or reconstructing the lost data may take hundreds of man-hours over several weeks. Such prolonged effort could cost a company thousands, even potentially millions, of dollars.[12] Although it is difficult to measure the intrinsic value of data precisely, and the value of different types of data varies, several sources in the computer literature suggest that 100 megabytes of data are worth approximately $1 million, translating to $10,000 for each megabyte of lost data.[13] Using this figure, and assuming the average data loss incident results in 2 megabytes of lost data, one can calculate that such a loss would cost $20,000. Factoring in the 17 percent probability that an incident results in permanent data loss, one can further predict that each data loss episode carries a $3,400 expected cost from permanently lost data.

Added together, the costs due to technical services, lost productivity, and the value of lost data bring the expected cost for each data loss incident to $3,957. (See Figure 2.) It should be noted that most data loss incidents (approximately 83 percent) result in much lower average costs ($557), but in the smaller portion of cases in which the data are permanently lost, the average costs are estimated to be much higher ($20,557). In addition to highlighting the significant costs involved in re-keying data, these figures reflect the importance that data play in creating value for businesses. Once data are lost, those value-creating opportunities are also lost. These losses are multiplied in a networked environment. A survey conducted in 2001 by Contingency Planning Research reports that the majority of companies estimate the average cost of computer network downtime to exceed $50,000 an hour, and for some companies that figure rises to over $1,000,000 per hour.[14]
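
The dollar figures in the preceding paragraphs can be reproduced with a few lines of arithmetic. The sketch below simply restates the article’s numbers, treating the six-hour recovery time and the doubling for outside support as the text’s own assumptions, and rounding intermediate figures the way the text does:

```python
# Sketch of the per-incident cost arithmetic, with the same rounding of
# intermediate figures as in the text.

HOURS = 6                  # assumed average recovery time
SUPPORT_WAGE = 28.10       # computer support specialist, $/hour (BLS)
USER_WAGE = 36.20          # computer-using employee, $/hour (BLS)

in_house_support = round(HOURS * SUPPORT_WAGE, -1)  # $168.60 -> ~$170
outside_support = 2 * in_house_support              # ~$340, conservative
lost_productivity = round(HOURS * USER_WAGE)        # $217.20 -> $217

P_PERMANENT = 0.17         # share of incidents with unrecoverable data
LOST_DATA_VALUE = 20_000   # 2 MB permanently lost at $10,000 per MB

expected_cost = outside_support + lost_productivity + P_PERMANENT * LOST_DATA_VALUE
print(f"Expected cost per incident: ${expected_cost:,.0f}")  # $3,957
```

The two scenarios in the text fall out of the same terms: a recovered incident costs $340 + $217 = $557, while a permanent loss costs $557 + $20,000 = $20,557.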

Total Annual US Data Loss Costs

When information on data loss episodes is mapped along with the cost data, an estimate of aggregate data loss may be obtained. This calculation, reported in Table 2, estimates that annual data losses to PCs cost US businesses $18.2 billion.[15] This estimate represents an increase from a 1999 study that estimated the annual cost of lost data to be $11.8 billion.[16] Although it is difficult to measure with precision the cost of lost data, and the analysis is sensitive to the assumptions that underlie its calculations, there are several reasons to believe that $18.2 billion is a conservative estimate. First, that figure does not take into account costs that are difficult to quantify, such as lost sales and reputation damage a firm may experience during an extended period of computer downtime. In addition, research in the field of network economics suggests that extra costs would be incurred if a data loss incident occurs to two or more PCs on a network. Such additional cost is due to the interdependence and reliance that each computer user experiences when working with other computer users. As noted earlier, research on incidents of server downtime suggests that such costs can be significant. Finally, it is important to note that these figures do not include any collateral costs that may be incurred in some instances of data loss, such as when damaged hardware must be replaced.
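
The aggregate estimate follows directly from the per-incident figure: roughly 4.6 million episodes per year at an expected cost of $3,957 each.

```python
# Aggregate annual US cost of PC data loss (Table 2's bottom line).
EPISODES = 4.6e6        # severe data-loss episodes per year (Table 1)
EXPECTED_COST = 3_957   # expected cost per incident, in dollars

annual_cost = EPISODES * EXPECTED_COST
print(f"Annual US cost: ${annual_cost / 1e9:.1f} billion")  # $18.2 billion
```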

Trends and Implications

What trends are likely to impact the potential for data loss in the future? Available evidence suggests that PC users are more likely now than ever before to use power surge protectors and virus protection software.[17] In addition, Safeware, a company that specializes in insuring PCs, reports that computer thefts appear to be declining as a percentage of computer loss incidents.[18] This is positive news. However, it is the opinion here that two trends will drive the annual amount of data loss upward: (1) increased reliance on laptops, which are much more likely to suffer episodes of data loss than are desktops, particularly from accidental damage due to dropping; and (2) more data stored in smaller spaces, as hard drive capacity continually increases. Conservative estimates place the rate of data growth at 80 percent per year.[19] Not only is the amount of data increasing, but business reliance on data is also rising.[20]

Implications from this research are clear. Business managers should invest in technologies that can reduce the possibility of data loss. Examples include antivirus software and back-up systems. PCs should be password protected to reduce the value of a stolen PC to a potential thief. Computer-tracking services, which serve as a sort of “LoJack for laptops,” also act as a theft deterrent.

However, even in the face of strong protection measures, some episodes of data loss will inevitably occur. Plans to deal with such episodes can shorten recovery times. And although back-up protocols are common for server-located data, plans to protect data in a distributed environment are less commonplace.[21] Since the technologies available to back up data are often reasonably priced, cost does not necessarily present a stumbling block to preventing permanent data loss. A simple and essentially zero-cost back-up procedure involves copying data to writable CDs using the software pre-loaded on most PCs. IT staff should hold training sessions on such protocols, because their effectiveness ultimately relies on individual users following through. A saying that precedes the advent of the computer is appropriate here: an ounce of prevention is worth a pound of cure.
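
As an illustration of the kind of low-cost back-up routine described above, the sketch below copies a working directory to a dated backup folder. The paths and function name are hypothetical, not part of any product mentioned in the article.

```python
# Minimal sketch of a user-level backup routine: copy a working
# directory to a dated backup folder. Paths are placeholders.
import shutil
from datetime import date
from pathlib import Path

def backup(src: str, dest_root: str) -> Path:
    """Copy src into dest_root/backup-YYYYMMDD and return the new path."""
    dest = Path(dest_root) / f"backup-{date.today():%Y%m%d}"
    shutil.copytree(src, dest, dirs_exist_ok=True)
    return dest
```

In practice such a routine would run on a schedule (Task Scheduler, cron), but the article’s point stands either way: the hard part is user adherence, not tooling.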


Endnotes

[1] Jon William Toigo, Disaster Recovery Planning: Managing Risk and Catastrophe in Information Systems, (Yourdon Press, 1989).

[2] Data, as defined here, are the bytes that reside on personal computers. Data could be more broadly defined as all digital media, but such a consideration is beyond the scope of this study. See Simon Forge, JP Morgenthal, and Richard Ptak, Manager’s Guide to Distributed Environments: From Legacy to Living Systems, (New York: John Wiley & Sons, 1998).

[3] Data in this paragraph from the Computer Industry Almanac, 2003; U.S. Dept. of Commerce, National Telecommunications and Information Administration, A Nation Online: How Americans Are Expanding Their Use of the Internet, http://www.ntia.doc.gov/ntiahome/dn/, February 2002; and, John Spooner, News.com, “Laptops gain in PC Market,” August 20, 2001.

[4] See Safeware, The Insurance Agency, Inc., “2000 Safeware Loss Study,” 2001; and, ONTRACK Data International, Inc., “Understanding Data Loss,” 2001, (2003).

[5] This assumes that the vast majority of computer users are unable to recover from a severe data loss incident without the assistance of a technical support individual. The household equivalent of this assumption would be that the vast majority of households could not recover from a severe plumbing problem without a plumber. This also assumes that the original source information is available in hard copy or some other form from which it can be rekeyed.

[6] The Data Recovery Group reports that they are able to recover inaccessible data in 95% of incidents. See http://www.datarecoverygroup.com, (2003). It is noted that data recovery firms may have an incentive to overestimate their success rates.

[7] An 83 percent recovery rate is reported by Denise Deveau, “Lost all your data? Time to Call the Experts,” The Globe and Mail, February 25, 2000.

[8] From Bureau of Labor Statistics, Employer Costs for Employee Compensation, March 2003, and Occupational Employment Statistics Survey, (2001).

[9] The reasoning is fairly intuitive: a company will pay an employee as long as the individual adds to firm profits. The company will stop paying an employee when the revenue generated from that individual is exactly equal to the compensation paid.

[10] The average compensation for white collar workers as reported in Employer Costs for Employee Compensation, Bureau of Labor Statistics, (March 2003).

[11] It may be claimed that an individual could move on to other tasks, which in turn would only reduce their productivity by a fraction of this amount. However, it is common that the PC user must work closely with the data recovery specialist in the recovery effort. In addition, productivity could be hampered for several days if the computer must be sent to an outside specialist.

[12] A National Computer Security Association (now Trusecure Corporation) survey reported that for an average engineering department it would cost $100,000 to rebuild 20 megabytes of data.

[13] For example, see Stuart Hanley, “Keep Those Data Protection and Recovery Options Open,” Storage Management Solutions, November 1997; and ONTRACK Data International, Inc., “The Data Recovery Solution,” (1998).

[14] See Contingency Planning Research, 2001 Cost of Downtime Survey, http://www.contingencyplanningresearch.com/2001%20Survey.pdf, (2002).

[15] This could be considered the annual data loss estimate for the year 2003. However, although this study aims to utilize the most up-to-date data sources available, some data are from years prior to 2003. Thus, $18.2 billion represents the best estimate of annual data loss, based on the most recent sources available.

[16] David Smith, “The Cost of Lost Data.” Storage Management Solutions No. 4 (1999): 60-2.

[17] Trusecure Corporation, 7th Annual ICSA Labs’ Virus Prevalence Survey, (2002).

[18] Safeware, The Insurance Agency, Inc., “2002 Safeware Loss Study,” (2003).

[19] Jon William Toigo, “Storage Disaster: Will You Recover?” Network Computing, (March 5, 2001).

[20] Toigo, 2001.

[21] Forge, Morgenthal and Ptak, 1998.


Using Dashboard Based Business Intelligence Systems

User-friendly business intelligence systems help increase organizational participation as well as improve bottom-line performance.

Business Intelligence Systems Defined

Business intelligence systems (BIS) are interactive computer-based structures and subsystems intended to help decision makers use communication technologies, data, documents, knowledge, and analytical models to identify and solve problems. The new generation of BIS offers the potential for significantly improving operational and strategic performance for organizations of various sizes and types.

During the 1990s, most large organizations engaged in enterprise data warehousing projects. The scope of these efforts ranged from combining multiple legacy systems to developing user interface tools for analysis and reporting. The data warehouse is the underlying structure that is used to generate a variety of reports and analyses. In the past, business intelligence amounted to a set of weekly or monthly reports that tended to be unconnected.

Two salient features of the new generation of BIS are integration and visualization. Typically, this information flow is presented to the manager via a graphics display called a Dashboard. A BIS Dashboard serves the same function as a car’s dashboard: it reports key organizational performance data and options on a near-real-time, integrated basis. Some BIS industry pundits claim that Dashboards are simply “eye candy” for executive managers, suggesting that these systems are merely a new fad promoted by consultants and vendors. While these claims may have some merit, Dashboard-based business intelligence systems do provide managers with access to powerful analytical systems and tools in a user-friendly environment. Furthermore, they help support organization-wide analysis and integrated decision making.[1]

The first executive dashboards actually went into organizations around 1985. We called them executive information systems at the time. And they had limited success because they were executive systems — the chairman of Merck would have it on his desk — but then that was it. What we’re seeing today are management dashboards, which have been pushed down through the organization, providing relevant information to a particular manager. At Southwest Airlines, they call them cockpits, and they’re specialized, so that the guy in charge of putting peanuts on airplanes gets a different view than the guy who’s in charge of purchasing jet fuel. But they all see what planes are flying where.

~ John Kopcke
Hyperion Solutions Corp.*

Typically, BIS can be categorized into two major types: model-driven and data-driven. Model-driven systems tend to utilize analytical constructs such as forecasting, optimization algorithms, simulations, decision trees, and rules engines. Data-driven systems deal with data warehouses, databases, and online analytical processing (OLAP) technology. A data warehouse is a database that is constructed to support the decision making process across an organization. There may be several databases or data marts that make up the data warehouse. OLAP is increasingly utilized by managers to help process and evaluate large-scale data warehouses and data marts.
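
As a rough sketch of the data-driven style, the snippet below performs an OLAP-style rollup over a handful of hypothetical data-mart rows, aggregating a measure (sales) along two dimensions (region and quarter). The field names and figures are invented for illustration.

```python
# Minimal sketch of an OLAP-style rollup over hypothetical data-mart rows.
from collections import defaultdict

rows = [
    {"region": "West", "quarter": "Q1", "sales": 120_000},
    {"region": "West", "quarter": "Q2", "sales": 135_000},
    {"region": "East", "quarter": "Q1", "sales": 90_000},
    {"region": "East", "quarter": "Q2", "sales": 110_000},
]

# Roll up the sales measure along the (region, quarter) dimensions.
cube = defaultdict(int)
for row in rows:
    cube[(row["region"], row["quarter"])] += row["sales"]
    cube[(row["region"], "ALL")] += row["sales"]   # subtotal per region

print(cube[("West", "ALL")])  # 255000
```

A real OLAP engine does this over millions of rows with pre-computed aggregates, but the drill-down idea is the same: the manager moves between the “ALL” subtotal and the per-quarter cells.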

In five years, 100 million people will be using information-visualization tools on a near daily basis. And products that have visualization as one of their top three features will earn $1 billion per year.

~ Ramana Rao, founder and chief technology officer,
Inxight Software Inc.*

Today, there is an ongoing requirement for more precise decision making because of increased global competition. Generally speaking, decision making should be based on an evaluation of current trends, historical performance metrics, and forecast planning. New and improved BIS continue to emerge to help meet these ongoing requirements.[2]

Within three years, users will begin demanding near-real-time analysis relating to their business — in the same fashion as they monitor stock quotes online today. Monthly and even daily reports won’t be good enough. Business intelligence will be more focused on vertical industries and feature more predictive modeling instead of ad hoc queries.

~ Thomas Chesbrough, executive vice president
Thazar*

BIS Developments

BIS vendors are offering a variety of new systems that provide the necessary links and end-user interfaces for managers to access and receive selective information such as competitor behavior, industry trends, and current decision options. To increase organizational acceptance and use, these new systems feature distributed decision making, which helps leverage organizational visibility. Specific attention is being given to the user interface, as highlighted by the following list of standard end-user features:[3]

  • Filter, sort, and analyze data.
  • Formulate ad hoc and predefined reports and templates.
  • Provide drag-and-drop capabilities.
  • Produce drillable charts and graphs.
  • Support multiple languages.
  • Generate alternative scenarios.

Dashboards

There are a number of approaches for linking decision making to organizational performance. For example, in the manufacturing industry, decisions may focus on resource allocation optimization and waste reduction, as supported by the Lean Manufacturing Methodology. From a decision maker’s perspective, the new BIS visualization tools such as Dashboards and Scorecards provide a useful way to view data and information. Outcomes displayed include single metrics, graphical trend analysis, capacity gauges, geographical maps, percentage share, stoplights, and variance comparisons. A “Dashboard”-type user interface design allows presentation of complex relationships and performance metrics in a format that is easily understandable and digestible by time-pressured managers. More specifically, such interface designs significantly shorten the learning curve and thus increase the likelihood of effective utilization. Figure 1 presents an example of a dashboard design.

Figure 1: Example of a Dashboard
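One of the display types mentioned above, the stoplight, reduces a metric to a green/yellow/red status against a target. The sketch below shows one way such a classification might work; the 10 percent warning band is an assumption for illustration, not any vendor's actual rule:

```python
# Hypothetical "stoplight" classification: green at or above target, yellow
# within a warning band just below it, red otherwise. Thresholds invented.
def stoplight(actual, target, warn_band=0.10):
    if actual >= target:
        return "green"
    if actual >= target * (1 - warn_band):
        return "yellow"
    return "red"

print(stoplight(102, 100))  # green
print(stoplight(95, 100))   # yellow (within 10% of target)
print(stoplight(80, 100))   # red
```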

Scorecards

A “scorecard” is a custom user interface that helps optimize an organization’s performance by linking inputs and outputs both internally and externally. (The Balanced Scorecard is the specific methodology associated with the Kaplan and Norton model.)[4] To be effective, the scorecard must link into the organization’s vision. Over the next few years the differences between dashboards and scorecards will become increasingly blurred as these interface structures become fully integrated. Figure 2 illustrates the current adoption of BIS throughout the organization.

Figure 2: BIS Adoptions by Management Area

Figure 3 illustrates the basic structure of how the Dashboard fits into the decision making process. The Dashboard integrates the data warehouses and analytical models directly into the decision making process. This is a continuous process based on ongoing environmental scanning and feedback from current performance metrics, e.g., inventory turns. Behind the graphical interface lie the supportive analytical systems such as statistical analysis for data validation, combined forecasting algorithms, and expert systems for decision options analysis and recommendations.

Figure 3: The Dashboard Interface Structure
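The inventory-turns metric cited above is a simple ratio: cost of goods sold divided by average inventory for the period. A short illustrative computation, in which all figures and the alert threshold are invented:

```python
# Inventory turns = cost of goods sold / average inventory for the period.
# All figures and the target threshold below are invented for illustration.
def inventory_turns(cogs, beginning_inventory, ending_inventory):
    average_inventory = (beginning_inventory + ending_inventory) / 2
    return cogs / average_inventory

turns = inventory_turns(cogs=1_200_000,
                        beginning_inventory=260_000,
                        ending_inventory=340_000)
print(turns)             # 4.0
needs_attention = turns < 6.0   # dashboard flag against an assumed target
print(needs_attention)   # True
```

A dashboard would feed a value like `turns` into a gauge or stoplight display rather than printing it, but the underlying arithmetic is this simple.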

The Importance of Training

Training at all levels is a key ingredient in the successful application of BIS. In many applications, training occurs at the last minute and is geared simply toward how to use the system for specific assignments. Intensive training before, during, and after system implementation helps create the cultural change needed to maximize acceptance throughout the organization.[5] Training simulators represent one approach for both improving system utilization and increasing organizational buy-in.

Within two to three years, companies will ditch the traditional model of making business adjustments on a quarterly basis. Instead, they’ll use business intelligence and performance management tools to make real-time shifts in strategy to respond to changes in the marketplace.

~ Rob Ashe, president and chief operating officer
Cognos Inc.*

Some current technical challenges facing this evolving industry are presented below:[6]

  • Integrating optimization based models with enterprise resource planning systems.
  • Developing an observation oriented approach to data modeling that includes manual and automated processing.
  • Combining decision support, knowledge management, and artificial intelligence in a data warehousing framework.
  • Designing intelligent agents that can be used to support decision making processes.
  • Formulating adaptive and cooperating systems that use evaluation and feedback to improve the decision making process.

Additionally, speech recognition represents a significant development for improving the human/computer interface. Specifically, a speech interface would allow managers to handle a greater volume of decisions as well as to explore a broader range of unstructured decision applications.[7]

In the next two years, business intelligence capabilities will become more democratized, with a far greater number of end users across the enterprise using the tools to get better visibility into the performance of their segment of the business. Think of it as executive dashboards for worker bees.

~ Steve Molsberry, senior consultant
Stonebridge Technologies Inc.*

Applications

Highlighted below are some specific examples in which dashboards have been successfully applied to improve organizational performance. Following each abstract is a link that will take you to the actual study.

  • Hospital Bed Management – The current crisis in the nation’s health care system has triggered an intensified focus on increasing productivity and reducing costs. Two primary goals of a hospital bed management dashboard system are to optimize bed resources and reduce emergency department wait times. The system consists of a number of modules, including both bed placement and data mining models. Specific displays include real-time bed availability forecasts and capacity alerts. In many respects this BI system acts like an air traffic controller for hospital beds: it both schedules patient bed assignments and facilitates the transfer of patients from other departments. (Bed Management)
  • Conflict of Interest Assessment – Prior to taking on a new client, many law firms routinely check throughout the organization to determine the potential for a conflict of interest. Historically, this has required many hours of staff effort, with the possibility of errors that could significantly affect operating performance. This dashboard-based system, which connects attorneys and staff, automatically checks organizational records, reducing operating expenses and improving worker productivity. Specifically, the system has cut the time to conduct conflict checks by 75%. (Conflict)
  • Product Development Management – Historically, measuring the performance of ongoing product/service development (PD) has been a hit-or-miss proposition. This inconsistency has often led to significant overruns and, in some cases, total failure. Estimating product development cycle time is key to any effective assessment process. A typical PD dashboard system is designed to report results to date as well as to indicate the potential for continuing success or failure. Project compliance is one key PD dashboard metric: a gauge reports the fraction of new product launches that occurred on schedule and on budget. Another standard dashboard metric is the fraction of products/services that have received a favorable trade journal review. Additionally, the dashboard should be able to identify new product/service opportunities. (Product (hyperlink no longer accessible))
  • Financial Management – Many financial and investment organizations have concluded that real-time updates of key performance metrics such as revenues and profits are essential to remaining competitive in today’s marketplace. Traditionally, many organizations have relied on quarterly reports to support the decision making process, a practice that has often led to uneven performance. A financial dashboard provides an integrated, real-time overview of performance that can be directly correlated to the business model. Specific metrics include balance sheets, income statements, and competitor performance. Additionally, the dashboard can display alerts identifying negative trends that require immediate attention.
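To make the bed-management idea concrete, here is a toy sketch, not the actual system described above: it forecasts occupancy with a simple moving average and raises a capacity alert. The data, window size, and 90 percent threshold are all assumptions for illustration:

```python
# Toy sketch (not the actual system): forecast tomorrow's occupied beds with
# a moving average and alert when the forecast nears capacity. The data,
# window size, and 90% threshold are all invented for illustration.
def forecast_occupancy(history, window=3):
    """Naive moving-average forecast of the next day's occupied beds."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def capacity_alert(history, capacity, threshold=0.90):
    """True when forecast occupancy reaches threshold * capacity."""
    return forecast_occupancy(history) >= threshold * capacity

occupied = [172, 180, 185, 190, 194]           # occupied beds per day
print(round(forecast_occupancy(occupied), 1))  # 189.7
print(capacity_alert(occupied, capacity=200))  # True (forecast > 180 beds)
```

A real system would use far richer forecasting models and live admission/discharge feeds; the point is only that the dashboard alert is driven by a forecast compared against capacity.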

Each of these applications was developed based on a well designed business intelligence strategy.

Building the Business Intelligence Strategy

Developing an effective business intelligence strategy is predicated on three key drivers: perceived value, organizational utilization, and a cost-effective solution. The development of a BIS strategy should be tied to specific organizational performance goals and operational objectives.[8] Examples of the latter include increasing customer retention and reducing turnover of key personnel. The proposed solution must be adaptable, scalable, and maintainable. A phased implementation schedule is often best, since it tends to minimize risk and increase organizational acceptance; such an approach allows elements of the system to be checked out prior to full system deployment.

Presented in the following list are the major steps involved in developing an effective BIS strategy:

  • Establish BIS objectives. (Specifically, what do you want the system to do?)
  • Evaluate the current in-house support capability, including the present system’s architecture.
  • Perform a gap analysis on existing data systems, including response time.
  • Identify alternative technical solutions.
  • Formulate an implementation timeline.
  • Conduct organizational “Town Hall” meetings to solicit ideas and to enhance the cultural climate for change.
  • Determine the need for outsourcing support.

Outsourcing some or all of the implementation process can offer significant benefits to organizations with limited internal technical capabilities or an already strained IT department. Outsourcing also brings the latest in technological development. A first step when considering outsourcing is to assess the organization’s internal infrastructure. This assessment is essential since BIS applications can become very expensive whether developed internally or outsourced. The initial investment for developing a BIS ranges from $1 million to $20 million plus, depending on organizational goals, current IS capabilities, and the projected number of users. The annual system operating expenses can often equal a significant proportion of the initial investment.
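The cost figures above can be turned into a rough planning estimate. The sketch below assumes, purely for illustration, that annual operating expense runs at 40 percent of the initial investment; the article states only that it can equal "a significant proportion" of that investment:

```python
# Rough planning arithmetic for the figures above. The 40% annual operating
# ratio is an assumption; the article says only "a significant proportion."
def bis_total_cost(initial_investment, years, annual_ratio=0.40):
    """Initial investment plus `years` of annual operating expense."""
    return initial_investment + years * annual_ratio * initial_investment

low = bis_total_cost(1_000_000, years=3)     # about 2.2 million
high = bis_total_cost(20_000_000, years=3)   # about 44 million
print(f"3-year cost range: ${low:,.0f} to ${high:,.0f}")
```

Even under this simple model, multi-year operating expense dwarfs the initial outlay, which is why the internal-infrastructure assessment described above matters before committing.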

Table 1 presents a list of selected BIS vendors. (This list does not imply an endorsement of any vendor; these are presented as examples only.) Generally, it is a good idea to start the selection process by developing a request for proposal (RFP). A number of standard RFP formats (no longer accessible) are available on the Internet. Obtaining multiple bids will both ensure a competitive process and serve as a forum for generating additional ideas and technical approaches. Keep in mind that only 50% of all IT-oriented projects are completed on budget and on time. A careful check of the references cited by the vendor is essential.

Table 1: Selected BIS Vendors

Conclusion

  • The use of BIS throughout most organizations is increasing as a result of growing global competitive pressure. Improved user-friendliness through graphic interfaces is a primary characteristic of the new generation of BIS applications. Specifically, managers require interactive interface systems, such as dashboards, that are easy to understand and use. Organizational integration represents another important characteristic of BIS.
  • Current industry challenges include improving system integration and developing cooperative and adaptive systems that automatically incorporate feedback and evaluation into the decision making process. More specifically, real-time speech recognition represents a new technology for improving the human/computer interface that is essential for managers at all levels.
  • Developing a BIS strategy involves three key issues: perceived value, organizational utilization, and a cost-effective solution. The development of a BIS strategy should be tied to specific organizational performance goals. With a carefully crafted plan, organizations can realize significant gains in productivity and insight into the marketplace.
  • Ongoing management training is essential for ensuring the continued effective use of the BIS. Simulation is one training strategy that provides an effective and dynamic structure for introducing and supporting new BIS applications. Many organizations should consider outsourcing when implementing their BI strategy. The initial investment for a BIS can range from $1 million to $20 million plus, depending on the specific operational requirements.
  • Some potential implementation barriers include failure to establish viable performance metrics, failure to fund adequate post-system training, and failure to obtain organizational “buy-in.”

*Quotations are from “The Future of Business Intelligence,” Computerworld.com.

[1] Kobana Abulkari and V. Job, “Business Intelligence in Action,” CMA Management, 77, Issue 1, (March, 2003): 15.

[2] Eric Bonabeau, “Don’t Trust Your Gut,” Harvard Business Review, 81, Issue 5, (May, 2003): 116.

[3] John Orefica, “Moving to the Next Level,” Health Management Technology, 22, Issue 17, (July, 2001): 46.

[4] Robert S. Kaplan and David P. Norton, “Using the Balanced Scorecard as a Strategic Management System,” Harvard Business Review, 74, Issue 1 (January/February, 1996): 75.

[5] Sarah F. Gale, “For ERP Success, Create a Culture Change,” Workforce, 81, Issue 9, (September, 2002): 88.

[6] Christer Carlsson and Efraim Turban, “DSS Directions for the Next Decade,” Decision Support Systems, 33, Issue 2, (June, 2002): 105.

[7] Carl M. Rebman, Milam W. Aiken, and Casey G. Cegielski, “Speech Recognition in the Human-Computer Interface,” Information & Management, 40, Issue 6 (July, 2003): 509.

[8] Sanjay K. Singh, Hugh Watson, and Richard T. Watson, “EIS Support for the Strategic Management Process,” Decision Support Systems, 33, Issue 1, (May 2002): 71.


IT MATTERS: Portal Combat

Although there is no doubt that 2002 and the first quarter of 2003 have been extremely difficult for software companies, recent figures surprisingly indicate that the enterprise portal software (EPS) market is continuing to grow.

IDC analysis even indicates that this growth may continue over the next few years (www.portalsmag.com). According to that research, five factors are primarily contributing to this trend:

  1. Software vendors recently added EPS to their portfolio of products, and EPS is penetrating their installed base.
  2. More companies are beginning to understand the business benefits of EPS.
  3. As EPS is deployed to improve specific business processes, the benefits are increasingly easier to measure.
  4. The increased marketing dollars invested in educating the prospect base are beginning to pay off for some software companies.
  5. Broader adoption and deployment of EPS across the enterprise is spurring sales for companies that originally deployed only to a single department.

For firms considering EPS deployment, it is useful to know what these survey data suggest about EPS trends in the marketplace. For example, it is helpful to know what types of companies are making this investment, as an indication of experience in an industry sector. Furthermore, understanding who within these companies is leading the implementation, and which departments are targeting the change, also lends useful insight.

In the first instance, IDC survey results have indicated that those companies involved in portal initiatives were highly concentrated in financial services, process manufacturing and business and legal services industries (Figure 1). Interestingly, these data also indicate that cross-functional teams typically lead the portal implementation project, followed by the CIO (Figure 2). Such information may suggest the importance of considering the business user as well as including the IT department in decision making. In addition, these data suggest that companies are focusing on addressing specific business problems using EPS technology.

Additional data indicate that perhaps portals are being deployed to address needs in specific departments as well as specific business needs (Figure 3). Such departmental focus suggests a staged implementation strategy for the majority of the companies surveyed. This staged implementation enables companies to limit their risk and investment before expanding the scope of a given project. The departments that are generally first to target the software are corporate management, human resources and marketing.

Finally, it appears that there is typically a discrepancy in the initial implementation between employees who access the portal versus those who use it daily (Figure 4). Such usage difference suggests that one of the biggest challenges in portal implementation is involving employees in the process in order to capture the features and functions that they need to create greater job efficiencies and effectiveness.

The business process and department focus indicate that the EPS buyer should assess current capabilities for supporting decisions before and after a software purchase/implementation of this magnitude. Although a specific department may be funding an initial implementation, future use and deployment should also be considered at the outset to ensure long-term viability as well as to limit short-term risk. Furthermore, simply planning to provide all employees with access to corporate information is insufficient. Instead, involving employees in cross-functional teams is a trend that is shaping the marketplace. Corporate benefits can be realized by providing access to custom applications or legacy systems and improving information sharing among employees, suppliers and customers. Ultimately, as e-business evolves, EPS software will either meet the challenges of such change or lose out on market opportunities.


IT MATTERS: Web Services Prevail Despite Travail

Tech slump yields economies for supply-chain and sell-chain automation.

The notion that e-business got dumped along with tech stocks clearly misses the reality of how much business has been transformed electronically. Even though IT budgets have been stripped, many information technologies are now fundamental to business practice. In fact, many Internet-based technologies survived and even thrived in the down-turn.

The reason is simple. These technologies transform costly, clumsy, painful business processes into processes that are faster and cheaper. Even though there is an up-front cost, the bottom line is impacted enough over time that these technologies and the efficiencies they provide are now often operationally essential for firms to remain competitive in an ever more competitive environment. They not only impact the bottom line in the short term; they may also win customers in the long term.

Two major forces are reshaping e-business. The first is bundling, where applications once sold separately are now sold together. The second is outsourcing, in which software previously sold as a product is now sold as a service. In essence, e-business got harder for the technology competitors selling the wares, because of increased competition for lower budgets, and it got smarter for the companies employing the technologies. The competition fueled more reasonable prices, better service, and better products. However, it may still be difficult, given a limited budget, to assess what your company might need and what efficiencies can be achieved within specific cost constraints. To help evaluate this, Table 1 compares various e-business components that medium-to-large enterprises are using to compete.

Certainly efficiencies in the supply-chain and the sell-chain enable firms to remain competitive during lean times. The current technologies are all about getting the right information to the right people. A good portion of this is increasingly being achieved via web-services. These services are fueled by the reality that full integration is still a long way off so it is still not possible for servers at different companies to really share and exchange information spontaneously without considerable effort.

Web services are arising as the key technology for enabling this interaction, if sometimes only superficially. It is still impossible to connect your bank, airline, rental car, and frequent-flier program seamlessly, but web-service technologies are certainly moving in that direction. The interesting thing about achieving these efficiencies through web services is that there is nothing ‘flashy’ about the technology. The tools that make this possible are not at all glamorous, but boring nuts-and-bolts technologies. Maybe this means that while the honeymoon is over and e-business solutions may not be as ‘sexy’ as once considered, more importantly they are stable and integral to our daily business lives.

Table 1. Basic eBusiness Tools and Costs

Data Storage and Management

  • Storage Systems (hardware): $0.03–$0.15 per megabyte (for a large 4-terabyte e-business application, the range is in the $1.5 million area)
  • Database Software: $11,000–$40,000 per CPU

Network Hardware

  • Carriers: $13,000–$23,000 per month
  • Routers: $10,000–$55,000, depending on processing, data ports, software, and memory
  • Load Balancers: $4,500 (for eight 100-megabit ports) to $30,000 (more for gigabit Ethernet ports and advanced functionality)
  • Switches: $4,000–$20,000 (pay more for more ports)

Web Applications

  • Web Server Software: free; should be included at no charge with an application server
  • Application Server Software: $8,000–$35,000 per CPU
  • Enterprise Portal Software: $100–$200 per seat (the more applications you want to integrate, the higher the price)
  • Integration Software: $300,000–$500,000
  • Content Management Software: $250,000 (licensing only; integration can cost up to five times the cost of the software)

Business Applications

  • Supply Chain Management: $1 million–$5 million (for a midrange retailer)
  • Customer Relationship Management: $5,000 per user (list price)
  • Procurement: $500,000 (without customization or integration)
  • Financials: $190,000 (for a midsized company purchasing 70 seats with mixed read-only and full access)

Security

  • Authentication: $20 per user (enterprise-wide deployments can cost considerably more)
  • Encryption: $11,000–$18,000 (costs range from $10 to $200 per user depending on the complexity and scalability required)
  • Disaster Recovery: $100,000 and up (depending on the complexity of the system)
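As a quick use of the table, the per-component prices can be rolled up into a low/high budget estimate. The component mix and quantities below are hypothetical, chosen only to show the arithmetic:

```python
# Rolling up selected component prices from Table 1 into a low/high budget
# estimate. The component mix and quantities below are hypothetical.
components = {                              # (low, high) in US dollars
    "database software, 2 CPUs": (22_000, 80_000),
    "application server, 2 CPUs": (16_000, 70_000),
    "portal software, 500 seats": (50_000, 100_000),
    "integration software": (300_000, 500_000),
}

low = sum(lo for lo, _ in components.values())
high = sum(hi for _, hi in components.values())
print(f"Estimated range: ${low:,} to ${high:,}")   # $388,000 to $750,000
```

Note how integration software dominates the total, which is consistent with the table's warning that integration can cost several times the software license itself.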


Go Directly to Jail?

Critical thinking means analyzing available information and considering legal and ethical implications of decisions.

The Enron/Andersen debacle demonstrates yet again how easy it is for basically decent people to step over ethical and legal lines in the pursuit of success. From what we know, these were not people who set out deliberately to defraud their employees of their pensions or shareholders of their investments. In many ways they were pillars of the community, supporters of charities and the arts, creators of jobs. They loved their families. But they made decisions that eventually had disastrous effects for many people, including themselves. At least some of them are likely to face criminal charges as a result of their actions.

They are not unique. Every year business people find themselves on the wrong side of the law as a consequence of decisions they have made. Most cases are not as high profile as the Enron case, but as an attorney who has often defended business clients, I know their stories all too well. These are not people who sit down and deliberately formulate some strategy to commit a crime. Instead, they usually find themselves in the justice system because of one or more of the following three factors:

  • Incremental decisions made without consideration of where they are leading,
  • A sense of corporate collectivity that leads to a loss of the sense of individual ethical accountability, and
  • A lack of critical thinking skills that might have helped them deal with each of the other factors.

Moreover, these factors are frequently inter-related.

Decisions follow other decisions, often without a conscious awareness of their consequences and the alternatives that are being eliminated. The effects are often incremental, and, of themselves, may seem innocuous. The problem is that these decisions are often made without an awareness of other decisions being made within the same organization and how they all will interact. Sometimes they are decisions not to act because the options are unpleasant, and sometimes they are decisions to ignore something because no one thought that this might be a real problem. Sometimes the decision not to act is a form of not exercising personal responsibility. Because there has not been a clear analysis of possible – and at times, probable – consequences, people find themselves in a place they never intended to be, a place with few good options left.

Cognitive Levels and Critical Thinking

Psychological literature suggests that there are several different cognitive levels or activities that need to be differentiated when talking about critical thinking. These are:

  • Recall - Recalling previously-memorized or experienced data.
  • Comprehension - Expressing recalled information in one’s own terms.
  • Application - Reordering the comprehended information and reaching a conclusion or coming up with a solution to some situation based on that information. Legal reasoning, for example, is largely the process of applying legal principles to a particular situation.
  • Analysis - Breaking the information into parts, with each part considered separately as well as in terms of how it fits into the larger whole.
  • Synthesis - Integrating all of the other levels. Usually it involves application to a new problem. Synthesis is required when there is not a single right answer to a problem, but the best answer still needs to be found.

Incremental and Uncoordinated Decisions

The Story: As one example, a chemist in a manufacturing company decided to add a trace of acetone to the water used in an outdoor humidifying system as a means of disposing of used acetone from the manufacturing process. It amounted to less than one percent of the water in one sprinkler head out of several that were spraying water, and no one noticed. However, the process was automatic, and over the course of several years no one checked on the amount of acetone being used, even though manufacturing processes were changed. In time, employees began to complain about the odor. Management did not think critically. They did not check the humidifying process or investigate the water source; they assumed the employees were just looking for things to complain about. When some employees finally went to the County Department of Health with complaints about the odor, it was discovered that the water from that sprinkler head was now more than three percent acetone, certainly an irritant if not a health hazard.

When County officials came out to check the humidifying system, they also observed hundreds of barrels of hazardous waste being stored even though the company did not have a hazardous storage permit. (The latter situation arose largely because the paperwork to apply for the permit had been lost, and the manager decided it was too much work to redo it.) This discovery triggered a wider investigation and a request for more information. The manager did not respond to the County’s request within the required time frame because he was not ready to reveal that the company had deferred making some required environmental modifications, largely to save money in that fiscal year; year-end bonuses were tied to the division’s financial performance. The environmental violations in themselves were not major, but the combination of decisions (including one by the safety officer not to expeditiously complete and forward to management a long memo he had written detailing their environmental problems) added up to a major case and jail time for two executives. There had been no critical thinking about the consequences of all of these decisions, or the lack thereof.

Application: Looking at this series of decisions, no one seemingly even recalled the fact that the acetone level was linked to a particular manufacturing process, let alone moving to the stage of comprehending what that meant or analyzing what would need to be done to limit the percentage, or deciding if it was even appropriate to dispose of the acetone this way. No one comprehended that employees might complain over a lack of management action, or analyzed what their actions would look like to government authorities. No one synthesized the information that once the authorities became involved, many facets of the company business would come under scrutiny and that it would be well to respond quickly and affirmatively to limit the liabilities. Although the managers were well-educated and decent people, they did not employ good critical thinking skills.[1]

Corporate Collectivity and Personal Accountability

Especially in large organizations there is often a sense of “corporate collectivity” that encourages some people to lose their sense of accountability as well. When many persons touch the process or product without having fixed accountability for the final product or decision, critical-thinking questions such as potential risk, risk assessment, and risk management do not have the same sense of importance they would have were one individual responsible for the entire task or the ultimate decision. No one has specific responsibility for challenging what appears to be a deficient process or decision. Indeed, the group norms may discourage questioning the efforts of others. It becomes easy to rationalize and assume that if there is a problem, someone else — somewhere — must be expected to catch it. “It is not my responsibility. It is not something I decide.” It is difficult to make people recognize that in their willingness not to openly challenge something that they sense is problematic or unethical, they are often, in fact, making a decision to adjust their personal ethical standards.

The Story: As an example of both a lack of critical thinking and the passing of ethical responsibility, consider the case of another client, Bill, a hard-working blue-collar independent contractor for a tool distributorship similar to Snap-On Tools. Bill had some credits from a junior college, but no degree. He was a good salesman in his field, a friendly guy, and most people liked him and trusted him. Similarly, Bill trusted most of the people he met. Along the way, Bill met a teenager with a gift for salesmanship (something he should have recognized and comprehended given his own activities) who had just opened his own carpet-cleaning business. As a favor, Bill allowed this young man to use his resale number to purchase cleaning supplies. He also let him use his company checking account before helping him set up his own business account. The young man soon became very successful, and shortly thereafter took his business public.

In what Bill assumed to be a gesture of thanks for his help, the young man asked him to join his board of directors, a position Bill accepted even though he had little business training and little understanding of the legal responsibilities of a board member. The board had a core group of business people who were close to the owner. There were also others, like Bill, who were long-time friends but were ill-prepared for this type of activity. The latter group basically went along with whatever the owner and his coterie proposed. They did not ask questions at board meetings. In fact, Bill did not even really understand what was happening when the company began to engage in insurance and securities fraud, although had he been willing to ask about things he did not understand, he could have found out much more.

At one point the cleaning company submitted fraudulent bills to an insurance company for chemicals that were allegedly to be used on a major restoration project, listing Bill’s company as the one submitting the claim. Initially the insurance company did not check on the claim and paid the bogus bill with a check made out to Bill’s business. He did not question why his company was involved. Rather than informing the insurance company that he had not filed a claim, he endorsed the check over to the cleaning company, believing it was the appropriate payee. He genuinely thought that he was doing the right thing, but he was charged with bank fraud because he had endorsed a check to which he knew he was not entitled. When the government started investigating, he was also convicted of a securities violation because, on the IPO registration of the cleaning company, he had allowed them to say that his company had sold more than $1 million worth of product to the cleaning company.

Application: Bill initially made the decision to let the young man use his resale number as a favor to a friend. But even without a college degree, he knew, or should have known, that this was not legal. The same was true of the decision about the checking account. The decision to accept the board appointment, even though he realized he did not have the business background that others on the board had, was something he should have questioned, but did not. He was flattered to be a part of something that was receiving glowing reviews, and so ignored the implications of his increasing involvement. When he had questions at board meetings, he chose (another decision) not to ask them rather than appear ignorant. There were plenty of signs that things were not proper, but he went along, accepting the idea that if everyone else thought it was all right, he did not need to worry about being accountable. He let himself be used.

Bill was beginning to feel uneasy about some of this, but he was either unable or unwilling to think critically about what was happening. He could have comprehended much of it, but he had accepted the notion that it was really the responsibility of the others on the board, who understood these things better than he did, to make the ethical judgments. He passed the buck. As a result, he was sentenced to two years in prison.

The Refusal to Think Critically

There are many other examples that could be recounted. Let me give only one. Joe was a pillar of his community, owned a family business, and was a good family man and church member. Yet, because he did not ask the right questions, he sold chemicals to people who were making methamphetamines. He decided to believe them when they said they were buying the chemicals to use in servicing automobiles, even though they wanted the chemicals delivered after hours, paid in cash, had no business address or business identification, and did not use a proper vehicle to transport chemicals. Blinded by the prospect of easy money, he also rationalized that he was not accountable if he did not specifically know that the chemicals he was selling were being used to manufacture the drug. He chose not to ask the questions that would have forced him to critically evaluate his actions. He now faces the potential of a significant prison sentence, where he can ponder his lack of critical thinking.

Conclusion

These were not bad individuals deliberately trying to defraud or commit crimes. But they did not think through where their decisions could lead, and they were willing to ignore even their own professed ethical standards in pursuit of the golden ring of money, status, or acceptance. They did not make the effort to comprehend, analyze, or synthesize the information that was right before them. Had they been in the habit of critical thinking, it is much less likely they would have found themselves starting down the path of decisions that led them into the justice system.

Think about it.


[1] For a more detailed description and analysis of this case, see Magasin, M., & Gehlen, F. (1999). “Unwise Decisions and Unanticipated Consequences.” Sloan Management Review, 41(1), pp. 47-60.


Using Internet Portals to Manage the Information Deluge

The COPE model can help you choose the portal that’s right for you.

Imagine if libraries had no indexing systems and librarians simply put materials away on the nearest shelf. Or what if there were no shelves and every library devised a different method of stacking books on the floor? This is largely the reality of the Internet. Although the Internet has extensive information resources, there is no standard indexing scheme, and people can put things pretty much anywhere they like.

To make Internet resources more accessible, netropreneurs began developing websites that concentrate information for users. From this idea arose the Internet portal: a single gateway for managing the massive flood of online information through a new generation of management tools. Portals can now provide a competitive edge for managers who use them to obtain business information, communicate, and organize daily activities.

What makes Internet portals useful to the business practitioner? According to one review it is their ability to deliver content in a useful way, facilitate communication, and create a sense of community among users. These services bring users to the portal over and over again.

Many popular portals are free services that generate revenue by carrying banner ads. Yahoo!, Excite, and Lycos each offer free portals that deliver information in a highly customizable format and provide a suite of useful communication and organization tools. These portals can be accessed over the Internet from any location at any time. Individuals or organizations needing specialized applications, or access to proprietary data, can implement custom-designed portals from firms such as Plumtree, SAP, and Oracle. These systems can interface with a firm’s existing information system to deliver information to selected users.

For the purposes of harnessing information, practitioners need to COPE (Bjorner, Susanne. “The Empty Glass and Too Much Water: Controlling Data Overflow.” Online Magazine, March 1998).

Table 1

C.O.P.E.

C – Consolidate time and information management tools as much as possible.

O – Organize time and tasks with as few tools as possible.

P – Personalize information.

E – Edit the amount and type of incoming information.

A variety of portals offer this potential, and a new rating scale based on the COPE model can be used to determine which portal is most suited to a particular business setting. Portal services useful to business people can be broken down into four major categories:

  1. Time Management
  2. Information Management
  3. Information Resources
  4. Communication Services

Table 2 rates the most popular portals based on the general criteria of time/information management, business/information resources, personalization, and communication. Other services provided by some portals, such as shopping, auctions, and ticket sales, are less relevant to the typical businessperson.
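A COPE-style rating scale can be operationalized as a simple weighted score across the four criteria. The sketch below is illustrative only: the weights, the 0-5 feature scores, and the portal names are assumptions for demonstration, not figures taken from the ratings in this article.

```python
# Hypothetical COPE scoring sketch. Weights and scores are illustrative
# assumptions, not data from the article's portal ratings.
from dataclasses import dataclass


@dataclass
class PortalRating:
    name: str
    consolidate: int  # 0-5: how well tools are consolidated in one place
    organize: int     # 0-5: time/task organization features
    personalize: int  # 0-5: how much the information can be customized
    edit: int         # 0-5: filtering/editing of incoming information


# Assumed relative importance of each COPE criterion (sums to 1.0).
WEIGHTS = {"consolidate": 0.3, "organize": 0.3, "personalize": 0.2, "edit": 0.2}


def cope_score(p: PortalRating) -> float:
    """Weighted sum of the four COPE criteria, rescaled to 0-100."""
    raw = (WEIGHTS["consolidate"] * p.consolidate
           + WEIGHTS["organize"] * p.organize
           + WEIGHTS["personalize"] * p.personalize
           + WEIGHTS["edit"] * p.edit)
    return raw / 5 * 100


# Hypothetical candidate portals being compared.
portals = [
    PortalRating("Portal A", consolidate=5, organize=4, personalize=5, edit=3),
    PortalRating("Portal B", consolidate=4, organize=4, personalize=3, edit=2),
]
best = max(portals, key=cope_score)
```

A manager could re-weight the criteria to match personal priorities, for example raising the Edit weight for someone drowning in incoming information, and re-rank the candidate portals accordingly.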

Table 2
Time/Information Management Ratings

Time & Information Management    Yahoo   Excite   Lycos   Netcenter
Calendars                          X       X        X        X
Address Books                      X       X        X
To Do Lists                        X       X        X        X
Notepads                           X       X        X
PDA/mobile links                   X       X        X        X
Briefcases                         X       X
Channels/Links                     X       X

Information Resources            Yahoo   Excite   Lycos   Netcenter
Maps                               X       X        X
Travel/weather information         X       X        X
Stocks Info                        X       X        X
News                               X       X
Small Business News                X
Business Tips                              X
News Clipper Services              X
Traffic Reports                    X