From structured to unstructured to anti-structured . . .
by Conrad Weisert, June 30, 2006
© Information Disciplines, Inc., Chicago
NOTE: This article may be circulated freely as long as the copyright notice is included.
For a couple of decades most textbooks and courses on systems analysis emphasized a structured approach to documenting (or modeling) a proposed new system. Since structured programming had already become established by 1978 as an enlightened way of designing, coding, and testing computer programs, the world was then ready for practically anything with "structured" in its name.
Structured Systems Analysis was largely a reaction against widespread informal methods of specifying a proposed computer application that were leading to notorious large-scale project fiascos until the mid 1970s. The central feature of those older methods was a massive narrative sequential description of what the proposed system was to do, which DeMarco called the "Victorian novel" approach to specifying a system. Other parts of the specification were record layouts, report (or display) mock-ups, and flowcharts. The total bulk might be several hundred pages.
Typical sponsoring end users who had to approve such a specification, feeling overwhelmed by its bulk and intimidated by its quasi-technical flavor, would rarely raise issues. Omissions, inconsistencies, and misunderstandings would then go undetected until the late stages of system testing, when correcting them was very expensive and disruptive.
Pioneering books: Structured analysis was first popularized by the books of Gane & Sarson and of DeMarco.
Structured systems analysis was embraced by organizations as a reliable and predictable way of assuring that the developers were solving the users' real problem, and of avoiding loss of project control, massive schedule slippage, and huge cost overruns.
Structured Systems Analysis (SA) as presented by Gane & Sarson and by DeMarco turned out to be nothing at all like structured programming, but it caught on nevertheless. The central ideas that constitute structuredness include these six criteria:
Note that "Structured Analysis" (SA) is not a way of doing systems analysis, but just a set of conventions for documenting and presenting the results (or deliverables) of systems analysis. Those results are called by various names, including:
None of those criteria applied to the pre-1978 hit-or-miss methods. All of them are generally considered to be beneficial, even essential to project success — so much so that anyone who proposes any other technique for specifying user requirements is obliged either
The above criteria don't require any particular version of structured analysis or any specific documentation technique. If you don't like, say, entity-relationship diagrams, you may substitute an equivalent component and still call your approach structured, as long as the criteria are satisfied.
The complete structured specification, prepared by one or more systems analysts, is aimed at two audiences: the non-technical sponsoring end users who must approve it, and the developers who will design and build the system.
When produced by competent systems analysts, a structured system specification is understandable in complete detail to a non-technical reader. No one needs to take an orientation course in how to understand a structured system specification, nor should any reader feel too intimidated to raise questions.
If structured was the magic word for the 1970s, then object-oriented was the magic word for the 1990s. As soon as object-oriented programming became well established, methodologists began seeking ways to adapt some of its concepts to "object-oriented systems analysis" (OOA).
Despite the efforts of those methodologists to disseminate their ideas through books, courses, and presentations, it soon became clear that no one knew exactly what OOA was and that its concepts, useful and intriguing as they were, couldn't fully replace structured analysis as a complete set of tools for modeling an external system specification (or users' requirements). Knowledgeable systems analysts adopted certain notions from OOA, e.g. class diagrams, to supplement SA. However, those who tried to go further by adopting OOA in its entirety as a replacement for SA found that the user representatives couldn't understand the documentation. We needed something else.
The something else turned out to be use cases, a return to the 1960s narrative emphasis (DeMarco's "Victorian novel") in specifying a new application system.
Meanwhile, Booch and Rumbaugh were ironing out the superficial differences between their versions of OOA. The result, "Unified Modeling Language" (UML), including OOA with use cases, was then blessed as a standard by the Object Management Group, a self-appointed standards body supported by corporations having an interest in object-oriented techniques and tools.
Early books on OOA: Object-oriented analysis was the subject of several early books.
While UML works well for modeling an internal system design and for communicating among the developers, it's next to useless for communicating with the non-technical sponsoring end users. Without special training they just don't understand the arcane notations and multiplicity of unconnected diagrams and "views". As a consequence, desperate application system developers were tempted either:
Of course, an incremental approach rules out not only the purchase of application software products, which have become the mainstream solution for many organizations, but also replacements for many major obsolete systems that can't logically be moved a chunk at a time.
The incremental approach is fundamental also to the so-called agile methodologies, such as extreme programming. Users' requirements are captured, if at all, through a set of small "stories" intended to describe chunks of functionality.
The problem is that stories are not limited to chunks of functionality. Examples, even from the sources that strongly promote them, include "stories" that may be:
Some of them even contain a mixture of two or three of the above. In particular, data definitions are now often embedded in the details of functionality descriptions; we noted that problem last year in Implicit Data Dictionaries are Dangerous! Furthermore, such embedded information may get repeated in multiple stories, sometimes with subtle differences that go undetected until late in the project. There is no longer an obvious logical place in a project's repository of information for a business rule or a data definition; they're scattered about in stories where we wouldn't think to look for them. Stories, then, not only fail to support structure, as defined by our six criteria; they actually repudiate and undermine it.
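The duplicated-definition hazard can be sketched concretely. In this hypothetical example (the story texts, the term "customer number," and the dictionary entry are all invented for illustration), two stories each embed their own format claim for the same data item, and checking those claims against a single data-dictionary entry exposes the subtle disagreement:

```python
import re

# Hypothetical user stories, each embedding its own definition
# of "customer number" -- with a subtle inconsistency.
stories = {
    "story-17": "As a clerk I enter the 8-digit customer number to look up an account.",
    "story-42": "As a manager I export accounts keyed by the 7-digit customer number.",
}

# A data dictionary records the rule once, in one logical place.
DATA_DICTIONARY = {"customer number": "8-digit"}

def embedded_definitions(stories, term):
    """Collect every format claim about `term` embedded in story text."""
    found = {}
    for name, text in stories.items():
        m = re.search(r"(\d+-digit)\s+" + re.escape(term), text)
        if m:
            found[name] = m.group(1)
    return found

claims = embedded_definitions(stories, "customer number")
conflicts = {s: c for s, c in claims.items()
             if c != DATA_DICTIONARY["customer number"]}
print(conflicts)  # story-42 disagrees with the dictionary entry
```

With the definition scattered across stories, nothing forces the two claims to agree; with a central dictionary entry, the conflict is mechanically detectable, which is precisely the point of keeping data definitions in one place.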
But wait. It gets worse.
Leading proponents of user stories reject the goal of avoiding vagueness and ambiguity. On the contrary, they recommend omitting rigorous detail and suggest that a story need only be a reminder to hold a "conversation" with the user representatives about a particular issue!
That conversation may or may not lead to the creation of "supporting documentation" (i.e. the real requirement specification), but XP offers little guidance on the form such documentation should take, when it must be created, or how it should be reviewed and approved. As in 1968, anything goes!
Last modified July 12, 2006