©Information Disciplines, Inc., Chicago
July, 2002 for Compsac conference (reformatted for the web with links, July, 2009)
Abstract

We generally associate the designation "legacy system" with the most inflexible and unmaintainable sort of application developed by the most unenlightened programmers for an obsolete mainframe computer. Today, however, despite decades of dramatic breakthroughs in software development methodologies, many organizations are surprised and disappointed to discover that they have replaced those old applications with expensive new ones that are just as costly to maintain. We're still developing legacy applications!
Responsibility for this alarming situation is shared by software vendors, academic programs, fad methodologists, and contract development firms. Fortunately, a remedy is still well within the reach of disciplined management in user organizations.
We often assume that such a legacy application was designed and originally developed more than a decade ago by a team of developers who lacked knowledge of the tools, techniques, and methodologies that today's professionals take for granted. That assumption, however, is invalid. Growing evidence from large and small organizations shows that many newly developed applications, even those that exploit the latest breakthrough methodologies, exhibit those same characteristics.
Naturally, that comes as a surprise and serious disappointment to the managers who sponsor those projects in a user organization. Not only does the organization fail to get the high-quality system it expected, but also:
Most software developers today have a degree or at least some formal background in computer science or management information systems. Employers criticize academic programs for placing insufficient emphasis on real-world applications, but that's by no means their worst shortcoming. Many institutions, including some of the most prestigious, place little emphasis on software quality, often none at all.
Students become accustomed to getting an A grade on any program that runs to completion, produces the right answer, and uses the prescribed algorithm. Those students may later be shocked to discover that there's far more to software development than that. Worse, they may never discover it, instead pursuing remunerative careers in body-shop contractor firms, where their shoddy end products do little short-term damage to their employer's bottom line but inflict immense long-term harm on their employer's customers.
Some instructors are themselves unaware of essential quality criteria. Students not only report getting astonishingly naïve guidance, but also complain of being penalized by their instructor (more likely by a teaching assistant) for applying long-established good practices.
Even those instructors who are well aware of good techniques, confronted with huge enrollments, seldom have time to evaluate each student's work critically. As a result, many students get no feedback on their work except points off for deviations from some orthodox solution.
Many academic programs fail to practice what they preach. For example:
Although many graduates of such programs go on to do high-quality work, others, not surprisingly, join the ranks of mediocre developers using new tools and techniques to produce new legacy software.
In many organizations today, management's attention is focused more and more on extremely short-range performance. Managers avoid making investments that yield payback beyond the current quarter.
That emphasis has led to a dismantling of the information systems infrastructure that was originally intended to foster quality and productivity in system development. Many younger managers lack understanding of the value of such infrastructure, viewing it as bureaucratic red tape. They can easily earn their bosses' approval by eliminating such "overhead" from their budgets, before moving on to another role in the organization. Their successors will pay a big price, but no one in the organization ever associates cause and effect.
Some organizations that had established a sensible date representation standard in the 1970s, for example, ended up with costly Y2K compliance problems after a later manager discarded the earlier standards.
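The ambiguity such a date standard was meant to prevent is easy to demonstrate. The sketch below is illustrative only (the class and method names are invented): it parses a two-digit year the way much legacy code did, leaving the library to guess the century.

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Calendar;

public class TwoDigitYear {
    // Parse a legacy two-digit year field. SimpleDateFormat resolves the
    // century with a sliding window: roughly 80 years back and 20 years
    // forward from the moment the format object is created.
    static int parsedYear(String yy) {
        try {
            Calendar c = Calendar.getInstance();
            c.setTime(new SimpleDateFormat("yy").parse(yy));
            return c.get(Calendar.YEAR);
        } catch (ParseException e) {
            throw new IllegalArgumentException(yy, e);
        }
    }

    public static void main(String[] args) {
        // A four-digit standard ("yyyy") leaves nothing to guess; two digits
        // force the library to pick a century on the program's behalf.
        System.out.println(parsedYear("02")); // resolves to 2002 only by convention
    }
}
```

A standard mandating four-digit years removes the guesswork entirely, which is exactly what the discarded 1970s standards had done.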
In the past few years, the term "quality assurance" in recruiting advertisements has evolved to mean little more than testing or finding bugs ("defects") in a nearly complete software product. The discredited 1960s cliché "Any program that works is better than one that doesn't" is making an amazing comeback.
Of course, finding and correcting bugs has little to do with the actual quality of software, in terms of its future maintainability and long-term robustness. Indeed, an urgent patch to a discovered operational defect may well undermine the structure of a program or a database, and nudge the software further toward a legacy-like status.
In the 1960s, just about every programming organization had its own set of programming standards for techniques to be used or avoided in Cobol or Fortran. Today Java, C++, and Visual Basic (VB) offer the programmer 100 times more choices (ways to go wrong) than Fortran or Cobol did. Nevertheless, very few organizations have invested in comparable in-house standards for Java, C++, or VB programming techniques.
The explanation for the surprising indifference to programming standards lies partly in naïve expectations about the benefits yielded by newer technologies:
The explosion of new methodologies has spawned a corresponding growth in the number of contract development firms claiming to specialize in those methodologies. When your organization engages the world's greatest experts in client-server application architecture to design and develop a client-server application, you ought to feel confident that the software you'll get will reflect the vendor's reputation.
You'll be sorely disappointed, however. Command of the details of some methodology is quite different from expertise in organizing and documenting high-quality software. The contractor's staff may consist of people who have impressive detailed knowledge of some set of facilities but hardly a clue how to put them all together to produce maintainable software.
The recent failures of many such firms are due more to financial overreaching and incompetent marketing than to customers' reaction against their poor-quality products. Assuming that the delivered application initially works, it may take years for a customer to realize that their contractor has delivered a brand-new legacy system.
Every year or two, we're treated to yet another major dramatic breakthrough (MDB) in software development methodology. MDBs fall into two categories:
Some recent MDBs actually return to some of the worst discredited practices of the distant past! Younger developers who have no memory of 1959's user-programmer iterative collaboration or of 1969's "Victorian Novel" requirements specifications are now eagerly embracing remarkably similar "innovations".
Here are three MDBs of the recent past that, along with some positive impact, are directly contributing to the ongoing production of legacy software.
Having been blessed by the Object Management Group (OMG), the Unified Modeling Language (UML) has become so widely accepted that many organizations consider it a requirement for every project and demand UML fluency from every job applicant. Uncertainty remains, however, over just what problems UML solves.
The main issue centers on confusion between analysis and design, an old problem that UML makes worse. Although UML provides a rich repertoire of modeling tools for one software developer to communicate to other software developers, it's practically useless for communicating a system specification to the sponsoring end users. Instead UML enthusiasts emphasize two mechanisms for documenting the user's requirements and securing the user representatives' concurrence: use cases and "want lists" of desired features.
Although UML claims to be object-oriented, there's nothing at all object-oriented about either of the above. You can do object-oriented analysis (OOA) without use cases, and you can prepare use cases without mentioning objects. Apart from superficial notation, use cases take us back to the discredited sequential-narrative technique of the 1960s, called by DeMarco [3] the Victorian Novel approach to system specification.
A serious practical problem with use cases and want lists in particular, and with UML in general, is that there's neither a clear starting point nor a clear end to the analysis (detailed requirements or external specification) phase. The systems analysts just keep writing use cases and drawing related diagrams until they can't think of any more. Then they hope that the developers can build a coherent application system from the resulting pile of documentation and that the sponsoring users can make sense of it. The lack of a definite end to systems analysis is actually cited as an advantage by a sizable community who denigrate formal specifications in favor of trial-and-error incremental iteration.
Although the UML is put forth as an industry standard, it has now become closely associated with one dominant vendor of computer assisted software engineering (CASE) tools. The three pioneering contributors to UML, called "the three amigos" by insiders, have now all gone to work for that company.
The amigos place stress on distinguishing between a language [4] (UML) and a process (life-cycle methodology), asserting that UML is independent of the choice of life-cycle. They then go on to put forth a companion life-cycle methodology [5], UP (Unified Process), that further steers a project toward a new legacy system. In particular:
One of the latest fad methodologies is so-called extreme programming (XP), a disciplined form of iterative incremental development [6]. It came out of the Smalltalk community, and reflects a style and approach that worked well for projects suited to Smalltalk's strengths and weaknesses [7].
XP returns to the dominant style of 1950s application development, based on close collaboration between a programmer and a problem-sponsor user, with little if any formal written specification. Such incremental cooperative development has been shown to work well for producing single-program applications with simple database designs, even where the single program is large and complicated. For more complicated systems, however, XP's shortcuts lead both to likely loss of project control and to ongoing maintenance nightmares.
An alarming element of XP is the euphemism refactoring to describe what we do after we discover that we've developed a lot of code based on a faulty design structure. Of course, programmers should always be open to undoing a bad design choice, but when you don't know where you're headed at the start, regrettable dead-ends are inevitable.
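To make the term concrete, here is a minimal, invented illustration of what refactoring names: restructuring code around a better design while preserving its observable behavior, in this case replacing a type-code conditional with polymorphism.

```java
public class RefactoringSketch {
    // Before: a type code plus a conditional, the sort of structure that
    // tends to get duplicated wherever another shape kind must be handled.
    static double areaByTypeCode(String kind, double a, double b) {
        if (kind.equals("rectangle")) return a * b;
        if (kind.equals("triangle"))  return a * b / 2;
        throw new IllegalArgumentException(kind);
    }

    // After: the variation moves behind an interface, so callers never
    // branch on the kind. Observable behavior is unchanged; that
    // preservation is the defining property of a refactoring.
    interface Shape { double area(); }
    static class Rectangle implements Shape {
        final double w, h;
        Rectangle(double w, double h) { this.w = w; this.h = h; }
        public double area() { return w * h; }
    }
    static class Triangle implements Shape {
        final double base, height;
        Triangle(double base, double height) { this.base = base; this.height = height; }
        public double area() { return base * height / 2; }
    }

    public static void main(String[] args) {
        Shape s = new Rectangle(3, 4);
        // Same answer before and after the restructuring.
        System.out.println(areaByTypeCode("rectangle", 3, 4) == s.area());
    }
}
```

The author's point stands either way: disciplined restructuring is valuable, but a project that begins with no design destination guarantees itself a great deal of it.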
The past half-dozen years have brought us two unforeseen phenomena:
Java began as a simple programming language, and has evolved into a monstrously complex operating platform. Recruiting advertisements now demand fluency in EJB, JSP, JINI, JDBC, JFC, and other fragmented technologies we hadn't heard of in 1995. At the core lies the original badly flawed Java language that imposes unreasonably high maintenance costs.
Java's most destructive impact has been a growing distortion of the essence of object-oriented programming, OOP. A novelty 15 years ago, object-orientation has justifiably taken its firm place as today's mainstream approach to design and programming. Although Java partisans proclaim that Java is a pure object-oriented language ("In Java everything is an object!"), when we examine a typical Java program we nearly always find:
As a result, most large programs written in Java fail to exploit the benefits of OOP while at the same time paying a huge price in unnecessary complexity. Those characteristics are exhibited not only in typical programs developed by in-house staff or outside contractors, but also in articles and textbooks from Java experts. Indeed, even the vendor's official Java libraries contain more than their share of misguided design and atrocious code.
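A hypothetical sketch of that contrast (the payroll example and all its names are invented, not drawn from any real library): the first class is Java written as Fortran, a bag of static routines operating on loose values; the second packages the same rule as an actual object, with data and behavior together.

```java
// Procedural style in Java clothing: a "class" that is only a namespace
// for static routines, with the data passed around separately.
public class PayrollUtils {
    public static double grossPay(double hours, double rate) {
        double pay = hours * rate;
        if (hours > 40) pay += (hours - 40) * rate * 0.5; // time-and-a-half overtime
        return pay;
    }
}

// Object-oriented style: the timesheet owns its data, and the overtime
// rule has exactly one home instead of being copied wherever pay is computed.
class Timesheet {
    private final double hours, rate;
    Timesheet(double hours, double rate) { this.hours = hours; this.rate = rate; }
    double grossPay() {
        double pay = hours * rate;
        if (hours > 40) pay += (hours - 40) * rate * 0.5;
        return pay;
    }
}
```

Both compute the same answer; the difference is in who must know the rule, which is precisely where long-term maintainability is won or lost.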
Note that UML, XP, and Java usually work well for small applications. Of course, almost any approach works if the problem is small enough. Very few of the legacy systems that torment us now or will torment us in the future, however, are small.
Some cynical managers have given up. It's just the nature of software development and maintenance, they concede, to be utterly unpredictable and uncontrollable.
A few aspects of software development are indeed beyond the control of an organization of professional developers. We can bring only slight influence to bear upon academic institutions. We can't predict when the next MDB will arrive or what its impact will be. We feel helpless.
Let's firmly reject such defeatism. Our organization can be the exception. What it takes is management discipline in applying proven techniques throughout the application development life cycle. Here are a few things we should do:
Legacy systems, as that pejorative term is commonly understood, never were inevitable. We have to struggle harder and harder to avoid them, but it's a struggle we can win.