5 Towards object technology

Extendibility, reusability and reliability, our principal goals, require a set of conditions defined in the preceding chapters. To achieve these conditions, we need a systematic method for decomposing systems into modules. This chapter presents the basic elements of such a method, based on a simple but far-reaching idea: build every module on the basis of some object type. It explains the idea, develops the rationale for it, and explores some of the immediate consequences.

A word of warning. Given today’s apparent prominence of object technology, some readers might think that the battle has been won and that no further rationale is necessary. This would be a mistake: we need to understand the basis for the method, if only to avoid common misuses and pitfalls. It is in fact frequent to see the word “object-oriented” (like “structured” in an earlier era) used as mere veneer over the most conventional techniques. Only by carefully building the case for object technology can we learn to detect improper uses of the buzzword, and stay away from common mistakes reviewed later in this chapter.

5.1 THE INGREDIENTS OF COMPUTATION

The crucial question in our search for proper software architectures is modularization: what criteria should we use to find the modules of our software? To obtain the proper answer we must first examine the contending candidates.

The basic triangle

Three forces are at play when we use software to perform some computations:

[Figure “The three forces of computation”: a triangle linking Action, Object and Processor.]
102 TOWARDS OBJECT TECHNOLOGY §5.1 To execute a software system is to use certain processors to apply certain actions to certain objects. The processors are the computation devices, physical or virtual, that execute instructions. A processor can be an actual processing unit (the CPU of a computer), a process on a conventional operating system, or a “thread” if the OS is multi-threaded. The actions are the operations making up the computation. The exact form of the actions that we consider will depend on the level of granularity of our analysis: at the hardware level, actions are machine language operations; at the level of the hardwaresoftware machine, they are instructions of the programming language; at the level of a software system, we can treat each major step of a complex algorithm as a single action. The objects are the data structures to which the actions apply. Some of these objects, the data structures built by a computation for its own purposes, are internal and exist only while the computation proceeds; others (contained in the files, databases and other persistent repositories) are external and may outlive individual computations. Processors will become important when we discuss concurrent forms of computation, in which several sub-computations can proceed in parallel; then we will need to consider two or more processors, physical or virtual. But that is the topic of a later chapter; for the moment we can limit our attention to non-concurrent, or sequential computations, relying on a single processor which will remain implicit. This leaves us with actions and objects. The duality between actions and objects — what a system does vs. what it does it to — is a pervasive theme in software engineering. A note of terminology. Synonyms are available to denote each of the two aspects: the word data will be used here as a synonym for objects; for action the discussion will often follow common practice and talk about the functions of a system. 
The term “function” is not without disadvantages, since software discussions also use it in at least two other meanings: the mathematical sense, and the programming sense of subprogram returning a result. But we can use it without ambiguity in the phrase the functions of a system, which is what we need here.

The reason for using this word rather than “action” is the mere grammatical convenience of having an associated adjective, used in the phrase functional decomposition. “Action” has no comparable derivation. Another term whose meaning is equivalent to that of “action” for the purpose of this discussion is operation.

Any discussion of software issues must account for both the object and function aspects; so must the design of any software system. But there is one question for which we must choose — the question of this chapter: what is the appropriate criterion for finding the modules of a system? Here we must decide whether modules will be built as units of functional decomposition, or around major types of objects.

From the answer will follow the difference between the object-oriented approach and other methods. Traditional approaches build each module around some unit of functional decomposition — a certain piece of the action. The object-oriented method, instead, builds each module around some type of objects.
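To make the contrast concrete, here is a minimal sketch (an invented illustration, not code from this chapter; all names are hypothetical) of the two decomposition criteria applied to a toy last-in, first-out store:

```python
# Functional decomposition: each routine is a piece of the action, and the
# chosen representation (a plain list) is visible to every routine and client.
def push(store, item):
    store.append(item)

def pop(store):
    return store.pop()

# Object-type decomposition: the module is built around the type itself;
# the representation is a private detail that only the class knows about.
class Stack:
    def __init__(self):
        self._items = []            # hidden representation

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

    def count(self):
        return len(self._items)
```

If the representation later changes (say, to a linked structure), the second style confines the change to the class body; in the first, every routine and every client that relies on the exposed list is affected.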
This book, predictably, develops the latter approach. But we should not just embrace O-O decomposition because the title of the book so implies, or because it is the “in” thing to do. The next few sections will carefully examine the arguments that justify using object types as the basis for modularization — starting with an exploration of the merits and limitations of traditional, non-O-O methods. Then we will try to get a clearer understanding of what the word “object” really means for software development, although the full answer, requiring a little theoretical detour, will only emerge in the next chapter.

We will also have to wait until the next chapter for the final settlement of the formidable and ancient fight that provides the theme for the rest of the present discussion: the War of the Objects and the Functions. As we prepare ourselves for a campaign of slander against the functions as a basis for system decomposition, and of corresponding praise for the objects, we must not forget the observation made above: in the end, our solution to the software structuring problem must provide space for both functions and objects — although not necessarily on an equal basis. To discover this new world order, we will need to define the respective roles of its first-class and second-class citizens.

5.2 FUNCTIONAL DECOMPOSITION

We should first examine the merits and limitations of the traditional approach: using functions as a basis for the architecture of software systems. This will not only lead us to appreciate why we need something else — object technology — but also help us avoid, when we do move into the object world, certain methodological pitfalls such as premature operation ordering, which have been known to fool even experienced O-O developers.
Continuity

A key element in answering the question “should we structure systems around functions or around data?” is the problem of extendibility, and more precisely the goal called continuity in our earlier discussions (see “Modular continuity”, page 44). As you will recall, a design method satisfies this criterion if it yields stable architectures, keeping the amount of design change commensurate with the size of the specification change.

Continuity is a crucial concern if we consider the real lifecycle of software systems, including not just the production of an acceptable initial version, but a system’s long-term evolution. Most systems undergo numerous changes after their first delivery. Any model of software development that only considers the period leading to that delivery and ignores the subsequent era of change and revision is as remote from real life as those novels which end when the hero marries the heroine — the time which, as everyone knows, marks the beginning of the really interesting part.

To evaluate the quality of an architecture (and of the method that produced it), we should not just consider how easy it was to obtain this architecture initially: it is just as important to ascertain how well the architecture will weather change.

The traditional answer to the question of modularization has been top-down functional decomposition, briefly introduced in an earlier chapter (top-down design was sketched in “Modular decomposability”, page 40). How well does top-down design respond to the requirements of modularity?
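The continuity criterion can be illustrated with a small sketch (an invented example, not from the text; the class and the date formats are hypothetical). When a module is built around an object type, a change in one part of the specification stays confined to one place in the design:

```python
# Continuity sketch: the textual date format is known only to this class, so a
# specification change (say, from "DD/MM/YYYY" to ISO "YYYY-MM-DD") means
# editing only the constructor, not every routine that handles dates.
class Date:
    def __init__(self, text):
        # Assumes ISO "YYYY-MM-DD" input; this is the single place to
        # change if the specified external format evolves.
        self.year, self.month, self.day = (int(part) for part in text.split("-"))

    def is_before(self, other):
        return (self.year, self.month, self.day) < (other.year, other.month, other.day)
```

The design change (one constructor) remains commensurate with the specification change (one format), which is exactly what continuity demands.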
Top-down development

There was a most ingenious architect who had contrived a new method for building houses, by beginning at the roof, and working downwards to the foundation, which he justified to me by the like practice of those two prudent insects, the bee and the spider.
Jonathan Swift: Gulliver’s Travels, Part III, A Voyage to Laputa, etc., Chapter 5.

The top-down approach builds a system by stepwise refinement, starting with a definition of its abstract function. You start the process by expressing a topmost statement of this function, such as

[C0] “Translate a C program to machine code”

or:

[P0] “Process a user command”

and continue with a sequence of refinement steps. Each step must decrease the level of abstraction of the elements obtained; it decomposes every operation into a combination of one or more simpler operations. For example, the next step in the first example (the C compiler) could produce the decomposition

[C1]
  “Read program and produce sequence of tokens”
  “Parse sequence of tokens into abstract syntax tree”
  “Decorate tree with semantic information”
  “Generate code from decorated tree”

or, using an alternative structure (and making the simplifying assumption that a C program is a sequence of function definitions):

[C'1]
  from
    “Initialize data structures”
  until
    “All function definitions processed”
  loop
    “Read in next function definition”
    “Generate partial code”
  end
  “Fill in cross references”
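The shape of refinement [C1] can be pictured in code. The sketch below is illustrative only: the four routines are drastically simplified stand-ins (invented for this sketch, not a real compiler), showing the topmost statement [C0] re-expressed as a combination of its four refinements:

```python
def tokenize(source):
    # "Read program and produce sequence of tokens" (toy version: whitespace split)
    return source.split()

def parse(tokens):
    # "Parse sequence of tokens into abstract syntax tree"
    # (a trivial tagged tuple stands in for a real tree)
    return ("program", tokens)

def decorate(tree):
    # "Decorate tree with semantic information"
    kind, tokens = tree
    return (kind, tokens, {"token_count": len(tokens)})

def generate(decorated_tree):
    # "Generate code from decorated tree" (toy one-instruction-per-token output)
    kind, tokens, info = decorated_tree
    return [f"PUSH {t}" for t in tokens]

def translate(source):
    # Topmost statement [C0], now a combination of its four refinements
    return generate(decorate(parse(tokenize(source))))
```

Each placeholder routine would itself be expanded in later refinement steps, exactly as the text describes for the remaining incompletely expanded elements.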
In either case, the developer must at each step examine the remaining incompletely expanded elements (such as “Read program …” and “All function definitions processed”) and expand them, using the same refinement process, until everything is at a level of abstraction low enough to allow direct implementation.

We may picture the process of top-down refinement as the development of a tree. Nodes represent elements of the decomposition; branches show the relation “B is part of the refinement of A”.

[Figure “Top-down design: tree structure”: the topmost functional abstraction at the root, refined into sequence, loop and conditional nodes. This figure first appeared on page 41.]

The top-down approach has a number of advantages. It is a logical, well-organized thought discipline; it can be taught effectively; it encourages orderly development of systems; it helps the designer find a way through the apparent complexity that systems often present at the initial stages of their design.

The top-down approach can indeed be useful for developing individual algorithms. But it also suffers from limitations that make it questionable as a tool for the design of entire systems:

• The very idea of characterizing a system by just one function is subject to doubt.

• By using as a basis for modular decomposition the properties that tend to change the most, the method fails to account for the evolutionary nature of software systems.

Not just one function

In the evolution of a system, what may originally have been perceived as the system’s main function may become less important over time.

Consider a typical payroll system. When stating his initial requirement, the customer may have envisioned just what the name suggests: a system to produce paychecks from the appropriate data. His view of the system, implicit or explicit, may have been a more ambitious version of this: