CUNY Brooklyn College            Department of Computer and Information Science
CIS 763X: Software Methodology
with Dr. D. Kopec                           Fall, 2000


 


Software Failure:  Management Failure

by Steven Flowers

Government Computer Projects

Presented by Harold Gadiare




 

Introduction

Computerized information systems are at the heart of all modern organizations.  Governments are, without a doubt, the largest users of information technology.  They spend billions of dollars a year acquiring and maintaining information systems.  Unfortunately, governments are not always the most effective or successful users of technology.  There are many examples of government computer projects that have failed dramatically.  This report summarizes the specific nature of government information systems developments and highlights the lessons that can be learned from some recent information systems failures.
 

Are government computer projects different?

Government computer systems are generally different from most commercial developments.  Because of the nature of the services that governments provide to their constituents, for example, administering tax and other national programs, the systems they must develop are often far removed from standard commercial systems.  Government information systems differ from commercial systems in five major respects:

Size – Government systems are often intended to serve large populations of millions of people.  This means that the systems must often handle huge amounts of data.

One-Offs – Because government applications tend to be unique, agencies are often forced to create their own systems rather than purchase ready-made ones.

Complexity – The systems that governments create are often based on complex legislation that has built up over many years.  When combined with the requirements necessary to put them into operation, the result is a system of incredible complexity.

Very long development timescale – The large scale on which governments operate, and the slow speed at which they effect change, mean that the development of computer systems is usually protracted.

Very high cost – With all the factors mentioned above – size, custom-built applications, complexity, and long timescales – government systems can’t help but command a very high price tag.

The US IRS Tax System Modernization project is a key example that illustrates these differences.  Since its inception in 1986, the IRS has been engaged in this project to create a fully automated and modernized tax-processing system.  Between 1986 and 1990 the IRS spent $80 million on the project.  The project is expected to have a final cost of $23 billion and to last until the year 2008.

The process of building the large and complex information systems required by governments is often very different from the development of most commercial systems.  Some of the factors include:
 

1. Technology-led – There is a tendency for government information systems to be technology-led rather than oriented towards satisfying the needs of the users and the public they are intended to serve.

2. Low-cost solution not sought – The preference within government projects for solutions that make use of leading-edge technologies indicates that there may be a tendency to adopt high-cost approaches to systems rather than low-cost solutions.

3. Custom systems rather than packaged solutions preferred – Because of the unique nature of many core governmental information systems, there is a tendency to assume that all government systems require a custom rather than a packaged solution.

4. Short-term tenure of managers overseeing projects – Continuity of the management overseeing large information systems developments is essential.  However, management changes occur frequently in government projects, with adverse effects on the project.

5. Priorities may be refocused – Changes in government policies that occur during the development of a system can cause a shift in focus.

6. Imposition of external deadlines – The effective dates for changes in government policies, which are usually selected for political reasons, are often the deadlines by which information systems must be operational.

7. Highly bureaucratic decision-making processes – The bureaucratic decision-making process in government is probably good for the accountability of public funds.  However, such a level of bureaucracy often creates organizational structures unsuited to the effective management of information systems projects.

8. High-level of public interest and oversight – Government projects are carried out in the public arena surrounded by a level of interest and public access unknown in commercial developments.


In 1992, the General Accounting Office of the US produced a summary report of problems with government computer projects.  The top four problems identified were:
 

a) Inadequate management of the information systems lifecycle.

b) Ineffective oversight and control of Information Resources Management (IRM).

c) Cost overruns

d) Schedule delays


These problems, along with the other factors mentioned above, are illustrated in a brief review of three recent information systems projects: the Wessex RISP system and the Field System in the UK, and the Veterans Benefits Administration system in the United States.  These projects represent different types of computer projects, planned at different times by different governments in different countries.  However, similar mistakes were made, and similar lessons can be learned from each project.
 

The Wessex Regional Information Systems Plan (RISP)

This project is an example of a grand computerization scheme that became a major public scandal, five years and £43 million ($64.5 million) after it started.

At the time of the RISP development, Wessex Regional Health Authority was responsible for the provision of healthcare for a major part of southern England.  The Regional Health Authority (RHA) was divided into ten Districts, each of which provided healthcare for its local area.

The RISP project evolved from a decision to develop an information systems strategy that would cover all information requirements within the entire Wessex RHA.  The project was adopted in 1984 with its aim being: “to use modern technology in order to optimize the use of information in the continuing improvement of the effectiveness and efficiency of clinical and other health services.”  The main points of the plan were:
 


There were many problems within the RISP development.  During the development, external auditors presented a series of reports that were critical of various aspects of the project, but no significant action was taken to correct the problems they identified.  For example, on numerous occasions it was evident that there were conflicts of interest in the procurement process.  Digital Equipment Corporation (DEC) was selected out of five proposals to supply hardware for the project.  Then the Andersen Consulting/IBM consortium, one of the losing bidders, was appointed to advise on software for the DEC-based system.  Ultimately, Andersen Consulting/IBM ended up with the contract for both hardware and software.

There was also a lack of a clear definition of the scope of RISP.  As a result, there were difficulties in budgeting and controlling expenditure on the project.  The official estimate for the project was between £24 million and £33 million ($36-50 million).  However, the internal, unofficial estimate was that the project would cost not less than £80 million ($120 million).

RISP was a technology-led project that was pursued even though there was practical evidence that it was not working out.  It was reported that within Wessex senior management there was a near-obsessional belief in RISP.  It was evident that a full project plan was never developed and that many of the sub-projects were authorized on an ad-hoc basis.  There was also poor supervision of the consultants, who essentially managed the entire project.

RISP was abandoned in 1990, five years after it started.  By this time £43 million ($64.5 million) had been spent on the project, £20 million ($30 million) of which was wasted.  There was only partial implementation of three of the five core systems.  In the end, Wessex RHA still did not have the information system that it perceived as essential to the effective management of its operations.
 

The Field System

This case is an example of a system that was built by one organization (the Department of Employment) on behalf of others (the Training and Enterprise Councils, or TECs) and then rejected by its intended users once installed, all at a cost to taxpayers of £48 million ($72 million).  It is also an example in which there was a decision to develop a custom system when packaged software could have provided an equal level of functionality at much lower cost.

Plans for The Field System (TFS) stemmed from a 1988 review by the Department of Employment of the systems operating within its regional offices.  The decision was made to replace the five stand-alone systems that had evolved over the years with a single integrated system.  It was expected that the new system would supply the needs of each field office, be more usable, and provide better-quality information to improve decision making.  TFS was to be implemented between June 1990 and April 1992.  Development of TFS for the field offices was already underway when the government announced the establishment of the TECs.  The TECs were commercial organizations, independent of the government, that were to assume the work of the Department of Employment’s field offices.

The business case prepared for TFS was seriously flawed.  It assumed that it was necessary to provide the new TECs with a common information system rather than allow them to determine their own needs.  The prudent approach at this point would have been to allow the TECs to begin their operations with manual systems until they were in a position to determine their own needs.  Instead, the Department decided to adapt the Field System already under development.  The business case for TFS ignored the following risks:
 


An external consultant commissioned to review the project reported that the plans were over-ambitious and of doubtful feasibility given all the risks.  The consultant also found that the personnel assigned to the project lacked the experience essential to the management of a project of this scale.

In spite of these reports, development continued.  Because of the lack of user involvement, the users’ needs were never clearly defined and therefore were not met.  Testing was ineffective, and early versions of the software were released with a large number of errors.  The TECs reported that the system was slow, cumbersome, and not user-friendly.  Of the 71 TECs, most were not using all the facilities of the system, and six were not using it at all, having preferred to purchase their own systems.

An external review of TFS done in September 1992 by the National Audit Office found that many of the TECs intended to reduce reliance on the system or replace it within a year.  In that same month the Department of Employment announced it would withdraw from the Field System.  The Field System had three key deficiencies that contributed to its failure: it was not subjected to rigorous and realistic risk assessment, it did not have the best possible user involvement from the outset, and it did not have effective project management.
 

Veterans Benefits Administration

This case examines how a timely review by the US General Accounting Office (GAO) of plans to modernize information systems within the Veterans Benefits Administration (VBA) saved it from becoming another IS failure.

The VBA is part of the US Department of Veterans Affairs (VA) and is responsible for the distribution of non-medical benefits to approximately 27 million veterans and their dependents.  In 1991, the VBA handled over 3.5 million compensation and pension (C&P) claims.  By the end of the 1980s, the VBA’s existing mainframe computer system could no longer fully support all the VBA’s programs.  Consequently, the VBA embarked upon a modernization project to replace its existing system with a decentralized system that would integrate all information about a veteran, enabling claims to be processed more quickly.  The VBA also expected to improve the services it offered through the use of document imaging, workflow systems, and expert systems.  The VBA’s Information Resources Management (IRM) office was to lead the project, and design and development would run in parallel with the acquisition of hardware and software.

The GAO review of the VBA’s modernization plans found that they were flawed in numerous ways.  First, it was a technology-led project: the VBA had planned to purchase hardware and software before its information architecture or information requirements were defined.  Second, the current business process was poorly understood.  The major part of the work of the VBA is processing C&P claims, with the average claim taking 151 days to process.  The VBA’s IRM office attributed the delays to the paper-based system and expected to reduce them with the use of an imaging system linked to workflow software.  However, a GAO study showed that 55 percent of the time to process a claim was taken up in waiting for someone to work on the claim.  Therefore, the proposed plan would result in only marginal improvements of 6 to 12 days.  Third, the goals of the system were unclear: the VBA had not outlined any specific levels of service to be achieved, nor had any performance indicators been established.  Finally, there was poor communication between those responsible for creating the new system (the IRM office) and the C&P managers who were to use it.  The C&P managers did not see how the new system would address their business problems and believed that the IRM office was acquiring technology solutions without regard to their needs.

The GAO recommended that the VBA postpone the procurement of any hardware or software under the modernization scheme until all the major flaws were addressed.
 

Lessons from Government Computer Projects

Two types of lessons can be extracted from this review of government computer projects: organizational lessons and project lessons.
 

Organizational Lessons

First, a strategic vision and coherent strategic planning must go hand in hand.  It is just as pointless to possess a strategic vision that cannot be realized as it is to try to realize a system that has no strategic context.  Second, positive and consistent leadership plays a key role in the definition and realization of an information systems strategy.  And third, structures should be in place to provide informed oversight of the progress of systems developments.
 

Project Lessons

There are dangers in adopting a technology-led solution to an organizational problem.  It is better to encourage user participation, first to define the problem and then to assist in developing an appropriate solution.  Also, continuity of management is crucial to the success of a major computer project.  Frequent changes in management at the project level act to undermine the accountability of individuals.
 

System Failure – What can be learned?

In order to learn something from failed systems, we must examine the factors that played a part in their failure.  Information systems projects of any size are highly complex combinations of organizational, financial, technical, human, and political factors.  The interaction between these factors determines the success or failure of a project.  These factors form the basis for the identification of a set of generic failure factors called critical failure factors (CFFs) (see Fig. 1 below).
 
 

    Fig. 1: Critical failure factors (CFFs)

    Organizational context
    • Hostile culture
    • Poor reporting structure

    Management of the project
    • Over-commitment
    • Political pressure

    Conduct of the project
    • Initiation phase: technology focused; lure of the leading edge; complexity underestimated
    • Analysis phase: poor consultation; design by committee; technical ‘fix’ for a management problem; poor procurement
    • Development phase: staff turnover; competency; communication
    • Implementation phase: receding deadlines; inadequate testing; inadequate user training

CFFs may be useful in providing a set of cautionary pointers to those involved in IS development.  They can help managers to quickly identify troubled projects and enable them to take appropriate remedial action.  CFFs that are in a less than optimal state will increase the chance of an IS development becoming either a failure or a disaster.
 

Conclusion

The words of Bill Gates sum it up best: “When you’re failing, you’re forced to be creative, to dig deep and think hard, night and day.  Every company needs people who have been through that.  Every company needs people who have made mistakes and then made the most of them.” (The Guardian, April 1995)  Unfortunately, the IS failures that are made public are only the tip of a very large iceberg; many more system failures are quietly buried away.  This trend should not be allowed to continue.  The industry needs to grow up and start learning from its mistakes instead of pretending they never happen.



 
 

References

Flowers, Steven, Software Failure: Management Failure, John Wiley & Sons, Chichester, England (1996)
