From structured to unstructured to anti-structured . . .

Systems Analysis Methodology Sliding Backwards

by Conrad Weisert, June 30, 2006
© Information Disciplines, Inc., Chicago

NOTE: This article may be circulated freely as long as the copyright notice is included.


Before 1978: What led to "structured analysis"?

For a couple of decades most textbooks and courses on systems analysis emphasized a structured approach to documenting (or modeling) a proposed new system. Since structured programming had already become established by 1978 as an enlightened way of designing, coding, and testing computer programs, the world was then ready for practically anything with "structured" in its name.

Structured Systems Analysis was largely a reaction against widespread informal methods of specifying a proposed computer application that were leading to notorious large-scale project fiascos until the mid-1970s. The central feature of those older methods was a massive sequential narrative description of what the proposed system was to do, which DeMarco called the "Victorian novel" approach to specifying a system. Other parts of the specification were record layouts, report (or display) mock-ups, and flowcharts. The total bulk might be several hundred pages.

Typical sponsoring end users who had to approve the specifications would:

  • read carefully the short overview sections to confirm that the proposed system was intended to solve the problem they believed they had,

  • skim uncomprehending through the main body of the material,

  • praise the analysts' efforts, announce that the analysts had correctly captured the organization's requirements, and approve the rest of the project.

Feeling overwhelmed by the bulk and intimidated by its quasi-technical flavor, busy user representatives would rarely raise issues. Omissions, inconsistencies, and misunderstandings would then go undetected until the late stages of system testing, when correcting them was very expensive and disruptive. Structured systems analysis was embraced by organizations as a reliable and predictable way of assuring that the developers were solving the users' real problem, and of avoiding loss of project control, massive schedule slippage, and huge cost overruns.

Pioneering Books

Structured analysis was first popularized by:
  • Chris Gane & Trish Sarson: Structured Systems Analysis: Tools and Techniques, 1979, Prentice Hall, ISBN 0-13-854547-2.

  • Tom DeMarco: Structured Analysis and System Specification, 1978, Yourdon, ISBN 0-917072-07-3.
Although the examples are weak and the mechanics are primitive, both are still worth reading for a professional systems analyst. More recent books, such as the Robertsons' and Hay's, draw upon and extend the classic SA principles set forth in those earlier works, as does the modern approach recommended on this web site.

The Structured Revolution — What does "structured" mean?

Structured Systems Analysis (SA) as presented by Gane & Sarson and by DeMarco turned out to be nothing at all like structured programming, but it caught on nevertheless. The central ideas that constitute structuredness include these six criteria:
  1. A definite place to start, both for the analyst creating the specification and for anyone reading it.

  2. A definite and obvious place for every piece of relevant requirements information and clear relationships among all components of the system specification.

  3. Avoidance of repetition.

  4. Consistency among all components of the system specification.

  5. Avoidance of vagueness and ambiguity at all levels.

  6. A definite end with assurance that the specification is, with respect to known requirements, complete.

None of those criteria applied to the pre-1978 hit-or-miss methods. All of them are generally considered to be beneficial, even essential to project success — so much so that anyone who proposes any other technique for specifying user requirements is obliged either to show that the proposed technique satisfies them or to explain why they no longer matter.

The above criteria don't require any particular version of structured analysis or any specific documentation technique. If you don't like, say, entity-relationship diagrams, you may substitute an equivalent component and still call your approach structured, as long as the criteria are satisfied.

SA Deliverables

Note that "Structured Analysis" (SA) is not a way of doing systems analysis, but just a set of conventions for documenting and presenting the results (or deliverables) of systems analysis. Those results are called by various names, including:

  • Detailed user requirements
  • Functional specifications
  • External system design
  • Business system design
  • System model

The complete structured specification, prepared by one or more systems analysts, is aimed at two audiences:

  1. Responsible representatives of the sponsoring end users, who must understand and approve the specification and (usually) fund the project.

  2. Programmers and other technical professionals who are to build (or purchase and install) the software components of the system.

When produced by competent systems analysts, a structured system specification is understandable in complete detail to a non-technical reader. No one needs to take an orientation course in how to understand a structured system specification, nor should any reader feel too intimidated to raise questions.

Use Cases: The beginning of the end of structure

If structured was the magic word for the 1970s, then object-oriented was the magic word for the 1990s. As soon as object-oriented programming became well established, methodologists began seeking ways to adapt some of its concepts to "object-oriented systems analysis" (OOA).

Despite the efforts of those methodologists to disseminate their ideas through books, courses, and presentations, it soon became clear that no one knew exactly what OOA was and that its concepts, useful and intriguing as they were, couldn't fully replace structured analysis as a complete set of tools for modeling an external system specification (or users' requirements). Knowledgeable systems analysts adopted certain notions from OOA, e.g. class diagrams, to supplement SA. However, those who tried to go further by adopting OOA in its entirety as a replacement for SA found that the user representatives couldn't understand the documentation. We needed something else.

The something else turned out to be use cases¹, a return to the 1960s narrative emphasis (DeMarco's "Victorian novel") in specifying a new application system.

Meanwhile, Booch and Rumbaugh were ironing out the superficial differences between their versions of OOA. The result, "Unified Modeling Language" (UML), including OOA with use cases, was then blessed as a standard by the Object Management Group, a self-appointed standards body supported by corporations having an interest in object-oriented techniques and tools.

Early books on OOA

Object-oriented analysis was the subject of these early books:

  • Peter Coad & Edward Yourdon: Object-Oriented Analysis, Yourdon Press, 1991, ISBN 0-13-629981-4.
  • Grady Booch: Object-Oriented Analysis and Design, Benjamin Cummings Publishing, 1994, ISBN 0-8053-5340-2.
  • James Rumbaugh: Object-Oriented Modeling and Design, Prentice Hall, 1991, ISBN 8-1203-1046-2.
Each book employed different graphical notations and different terminology, and two of them strongly emphasized design over analysis. They're of interest to today's readers mainly as historical curiosities.

While UML works well for modeling an internal system design and for communicating among the developers, it's next to useless for communicating with the non-technical sponsoring end users. Without special training they just don't understand the arcane notations and the multiplicity of unconnected diagrams and "views". As a consequence, desperate application system developers were tempted either to fall back on the old narrative specification methods or to abandon complete up-front specification in favor of developing the system incrementally, letting the users react to each piece.

User stories: The death of structure

The incremental approach is fundamental also to the so-called agile methodologies, such as Extreme Programming (XP). Users' requirements are captured, if at all, through a set of small "stories" intended to describe chunks of functionality.

The problem is that stories are not limited to chunks of functionality. Examples, even from the sources that strongly promote them, include "stories" that may be:

  • descriptions of chunks of functionality,
  • definitions of data items,
  • statements of business rules.

Some of them even contain a mixture of two or three of the above. In particular, data definitions are now often embedded in the details of functionality descriptions; we noted that problem last year in Implicit Data Dictionaries are Dangerous! Furthermore, such embedded information may get repeated in multiple stories, sometimes with subtle differences that may go undetected until late in the project. There is no longer an obvious logical place in a project's repository of information for a business rule or a data definition; they're scattered about in stories where we wouldn't think to look for them. Stories, then, not only don't support structure, as defined by our six criteria; they actually repudiate and undermine structure.

But wait. It gets worse.

Leading proponents of user stories reject the goal of avoiding vagueness and ambiguity. On the contrary, they recommend omitting rigorous detail and suggest that a story need only be a reminder to hold a "conversation" with the user representatives about a particular issue!

That conversation may or may not lead to the creation of "supporting documentation" (i.e. the real requirements specification), but XP offers little guidance on the form such documentation should take, when it must be created, or how it should be reviewed and approved. As in 1968, anything goes!


1 -- Use cases were popularized by Ivar Jacobson, a Swedish business executive, in The Object Advantage, Addison Wesley, 1995, ISBN 0-201-42289-1. Dr. Jacobson later joined the Rational Corporation, now owned by IBM.


Last modified July 12, 2006