Development Methodologies
The term software development methodology is used to describe a framework for the development of information systems. A particular methodology is usually associated with a specific set of tools, models and methods that are used for the analysis, design and implementation of information systems, and each tends to favour a particular lifecycle model. Often, a methodology has its own philosophy of system development that practitioners are encouraged to adopt, as well as its own system of recording and documenting the development process. Many methodologies have emerged in the past few decades in response to the perceived need to manage different types of project using different tools and methods. Each methodology has its own strengths and weaknesses, and the choice of which approach to use for a given project will depend on the scale of the project, the nature of the business environment, and the type of system being developed. The following sections describe a small number of software development approaches that have evolved over the years.
Structured Systems Analysis and Design Method (SSADM)
The Structured Systems Analysis and Design Method (SSADM) is a highly structured and rigorous approach to the analysis and design of information systems. It is one of a number of methodologies that arose in response to the large number of information system projects that either failed completely or did not adequately fulfil customer expectations.
Early large scale information systems were often developed using the Cobol programming language together with indexed sequential files to build systems that automated processes such as customer billing and payroll operations. System development at this time was almost a black art, characterised by minimal user involvement. As a consequence, users had little sense of ownership of, or commitment to, the new system that emerged from the process. A further consequence of this lack of user involvement was that system requirements were often poorly understood by developers, and many important requirements did not emerge until late in the development process, leading to costly re-design work having to be undertaken. The situation was not improved by the somewhat arbitrary selection of analysis and design tools, and the absence of effective computer aided software engineering (CASE) tools.
Structured methodologies use a formal process of eliciting system requirements, both to reduce the possibility of the requirements being misunderstood and to ensure that all of the requirements are known before the system is developed. They also introduce rigorous techniques to the analysis and design process. SSADM is perhaps the most widely used of these methodologies, and is used in the analysis and design stages of system development. It does not deal with the implementation or testing stages.
SSADM is an open standard, and as such is freely available for use by companies or individuals. It has been used for UK government information systems development since 1981, when it was first released, and has also been used by many companies in the expectation that its use will result in robust, high-quality information systems. SSADM is still widely used for large-scale information systems projects, and many proprietary CASE tools are available that support SSADM techniques.
The SSADM standard specifies a number of modules and stages that should be undertaken sequentially. It also specifies the deliverables to be produced by each stage, and the techniques to be used to produce those deliverables. The system development life cycle model adopted by SSADM is essentially the waterfall model, in which each stage must be completed and signed off before the next stage can begin.
SSADM techniques
SSADM revolves around the use of three key techniques that derive three different but complementary views of the system being investigated. The three different views of the system are cross referenced and checked against each other to ensure that an accurate and complete overview of the system is obtained. The three techniques used are:
- Logical Data Modelling (LDM) - this technique is used to identify, model and document the data requirements of the system. The data held by an organisation is concerned with entities (things about which information is held, such as customer orders or product details) and the relationships (or associations) between those entities. A logical data model consists of a Logical Data Structure (LDS) and its associated documentation. The LDS is sometimes referred to as an Entity Relationship Model (ERM). Relational data analysis (or normalisation) is one of the primary techniques used to derive the system's data entities, their attributes (or properties), and the relationships between them. A simple illustrative sketch of a logical data model is given after this list.
- Data Flow Modelling - this technique is used to identify, model and document the way in which data flows into, out of, and around an information system. It models processes (activities that act on the data in some way), data stores (the storage areas where data is held), external entities (an external entity is either a source of data flowing into the system, or a destination for data flowing out of the system), and data flows (the paths taken by the data as it moves between processes and data stores, or between the system and its external entities). A data flow model consists of a set of integrated Data Flow Diagrams (DFDs), together with appropriate supporting documentation.
- Entity Behaviour Modelling - this technique is used to identify, model and document the events that affect each entity, and the sequence in which these events may occur. An entity behaviour model consists of a set of Entity Life History (ELH) diagrams (one for each entity), together with appropriate supporting documentation.
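To make the first of these views more concrete, the sketch below shows the kind of information a logical data model records, using a purely hypothetical customer-ordering example: two entities, their attributes, and the one-to-many relationship between them. SSADM itself captures this information in Logical Data Structure diagrams and supporting documentation rather than in code; the Python representation is only an illustration.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical entities for a customer-ordering system. Each dataclass stands
# for an entity on the Logical Data Structure (LDS); its fields stand for the
# entity's attributes.

@dataclass
class Customer:
    customer_id: int      # unique identifier (primary key)
    name: str
    address: str

@dataclass
class Order:
    order_id: int         # unique identifier (primary key)
    customer_id: int      # each order is placed by exactly one customer
    order_date: str
    total_value: float

@dataclass
class LogicalDataModel:
    """Records the one-to-many relationship: a Customer places zero or more Orders."""
    customers: List[Customer] = field(default_factory=list)
    orders: List[Order] = field(default_factory=list)

    def orders_for(self, customer_id: int) -> List[Order]:
        # Traverse the relationship from the 'one' end (Customer) to the 'many' end (Order).
        return [order for order in self.orders if order.customer_id == customer_id]
```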
SSADM's structured approach
Activities within the SSADM framework are grouped into five main modules. Each module is sub-divided into one or more stages, each of which contains a set of rigorously defined tasks. SSADM's modules and stages are briefly described in the table below.
| Module | Stage | Description |
|---|---|---|
| Feasibility Study (module 1) | Feasibility (stage 0) | The high-level analysis of a business area to determine whether a proposed system can cost-effectively support the business requirements identified. A Business Activity Model (BAM) is produced that describes the business activities and events, and the business rules in operation. Problems associated with the current system, and the additional services required, are identified. A high-level data flow diagram is produced that describes the current system in terms of its existing processes, data stores and data flows. The structure of the system data is also investigated, and an initial LDM is created. |
| Requirements Analysis (module 2) | Investigation of Current Environment (stage 1) | The system's requirements are identified and the current business environment is modelled using data flow diagrams and logical data modelling. |
| Requirements Analysis (module 2) | Business System Options (stage 2) | Up to six business system options are presented, of which one will be adopted. Data flow diagrams and logical data models are produced to support each option. The option selected defines the boundary of the system to be developed. |
| Requirements Specification (module 3) | Definition of Requirements (stage 3) | Detailed functional and non-functional requirements (for example, the levels of service required) are identified, and the required processing and system data structures are defined. The data flow diagrams and logical data model are refined and validated against the chosen business system option, and then validated against the entity life histories, which are also produced during this stage. Parts of the system may be produced as prototypes and demonstrated to the customer to confirm correct interpretation of requirements and obtain agreement on aspects of the user interface. |
| Logical System Specification (module 4) | Technical System Options (stage 4) | Up to six technical options for the development and implementation of the system are proposed, and one is selected. |
| Logical System Specification (module 4) | Logical Design (stage 5) | The logical design of the system, including user dialogues and database enquiry and update processing, is undertaken. |
| Physical Design (module 5) | Physical Design (stage 6) | The logical design and the selected technical system option provide the basis for the physical database design and a set of program specifications. |
SSADM is well-suited to large and complex projects where the requirements are unlikely to change significantly during the project's life cycle. Its documentation-oriented approach and relatively rigid structure make it inappropriate for smaller projects, or those for which the requirements are uncertain, or are likely to change because of a volatile business environment.
Rapid Application Development
Rapid application development (RAD) is an iterative and incremental software development process that is designed to produce software as quickly as possible. The term tends to refer to a range of techniques geared to the rapid development of applications, such as the use of various application development frameworks. RAD was an early response to more structured and formal approaches like SSADM, which were not felt to be appropriate for projects undertaken within a highly volatile and evolving business environment.
The philosophy behind RAD is that there is an acceptable trade-off between the speed of development and the overall functionality or performance of the software delivered. Put another way, RAD can deliver a working solution that provides 80% of the required functionality in 20% of the time required by more traditional approaches. Two major benefits of this approach are that the customer gets to see results very quickly, and that a production version of the software will be available within a relatively short time frame, greatly reducing the likelihood that the customer's business environment will have undergone significant change by the time the new system is delivered. The downside is that some of the desirable (but non-essential) features of the software may be sacrificed in order to speed development, and the performance of the resulting system, while acceptable, may not be optimal. System acceptance is based upon the system achieving the agreed minimum functionality and usability.
A RAD team is usually small (maybe six or so people including developers and users), and developers are usually required to be both experienced and multi-skilled, since they will be combining the roles of analyst, designer and programmer. The project begins with an initial Joint Application Development (JAD) meeting during which developers and customer representatives determine the initial requirements of the system and agree a time frame in which a prototype system will be ready. The developers design, build and test a prototype system that reflects these initial requirements. The customer then evaluates the prototype system to determine how far it meets their requirements, and what functionality or features need to be improved or added.
A focus group meeting then takes place, during which the customer reports back to the development team. The requirements specification is revised to incorporate new features and improvements, and the time frame for the next iteration is agreed. Features that are deemed to be of secondary importance may, by negotiation, be dropped from the new requirements specification if they will negatively impact on the time frame for the new prototype. The cycle of iterations and focus group meetings continues until a final prototype is accepted by the customer as a production version of the new system.
The Rapid Application Development life cycle
The "time box" in which an iteration occurs is short (usually from a few days up to about three weeks). Documentation of requirements and design documentation is usually restricted to notes taken from meetings, rather than the formal documentation associated with more structured methodologies, and will consist of the minimum documentation required to facilitate the development and maintenance of the system. The entire life cycle is relatively short (usually a few months), and should result in a steady convergence between the customer's concept of the new system and that of the development team, resulting in a workable business solution that is fit for its intended purpose.
One of the benefits claimed for RAD was that, because customers often had only a vague idea of what they wanted, the availability of a working prototype would help to crystallise their thoughts in this respect and enable them to evolve a more definitive set of requirements. Whereas some system development methodologies attempted to determine the complete set of requirements in advance in an attempt to eliminate future changes to the scope of the project, RAD was able to incorporate change as part of an evolutionary development process.
RAD leveraged the benefits of a number of software development tools in order to speed up the development process, including a range of computer aided software engineering (CASE) tools. Code re-use, the use of object oriented programming languages, and the utilisation of third-party software components were all embraced by RAD developers. Fourth generation visual programming languages, the forerunners of today's integrated development environments (IDEs), were used to create the graphical user interface (GUI), while code production was further accelerated through the use of an appropriate application programming interface (API) that provided much of the base code for the application.
RAD tended to be used successfully for projects that did not have a high degree of criticality, and where the trade off between a short time frame for development on the one hand, and quality and performance on the other, was acceptable. It was not suitable for systems where optimal performance was required, or that had to interoperate with existing systems. The flexibility of RAD lay in the ability to produce results quickly and adapt specifications to meet an evolving set of customer requirements. From the customer's point of view, seeing a working prototype early on in the proceedings helped them to focus on what they did or didn't want from the system, and the continuing dialogue with the development team meant that developers had a good understanding of the customer's requirements.
The speed of development and the relatively small size of development teams tended to result in reduced development costs, although the absence of classic project milestones sometimes made it difficult to accurately measure progress. Today, some of the principles of RAD have been adopted by practitioners of agile development methods, themselves a response to an increasingly volatile business environment.
Agile software development
Agile software development refers to a group of loosely related software development methodologies that are based on similar principles. Notable examples include the Unified Software Development Process (USDP) and Extreme Programming (XP). Agile methodologies are characterised by short life-cycle iterations (typically measured in weeks), with minimal planning and documentation. The goal is to deliver working software to the customer at the end of each cycle. Each iteration involves a number of phases including planning, requirements analysis, implementation, and testing. This incremental, iterative approach helps to reduce overall risk, while enabling the output of the project to be adapted to meet changing requirements or circumstances. Documentation is generally limited to what the customer requires. Agile methodologies have evolved as an alternative to more traditional, process-driven methodologies.
The emphasis is on frequent (usually daily) face-to-face communication within the project team, and between the project team and the customer. Written documentation is of secondary importance, and meetings are usually formal but brief. Project teams are typically small (usually less than a dozen people) to facilitate communication and collaboration. Where agile methods are applied to larger projects, the different parts of a project may be allocated to several small teams of developers.
Each iteration of the project life cycle results in the production of working software, which is then evaluated by the customer before the next iteration begins. The production of working software, rather than the completion of extensive project documentation, is seen as the primary measure of progress. The software produced at the end of an iteration has been fully developed and tested, but embodies only a subset of the functionality planned for the project as a whole. The aim is to deliver functionality incrementally as the project progresses. Further functionality will be added, and existing features will be refined, throughout the life of the project.
Agile methods are favoured over more structured methodologies for projects where requirements are not initially well defined, or are likely to change over the lifetime of the project. They work well where the project team is small, and comprised of experienced developers. Both the project management techniques and the development tools used are selected on a project-by-project basis, allowing the overall process to be tailored to the needs of a particular project. The short duration of iterations and the absence of a rigid set of requirements or design documentation allow developers to respond quickly to changing requirements and circumstances. The emphasis on constant interaction between customers and developers provides continual feedback that helps to keep the project on track.
Agile methods are not so well suited to large-scale projects where the requirements are well defined, where the business environment is relatively non-volatile, or where the predominant organisational culture is intolerant of a lack of structure or documentation. The emphasis on frequent face-to-face communication as an essential element of the development process means that agile methods do not lend themselves easily to projects that are distributed over a wide geographical area, or that require large teams of developers. Critics of agile methods have also pointed out the difficulties that may arise in terms of negotiating a contract or determining the cost of a project where the scope of the project is not initially well-defined, and requirements are unclear.
Unified Software Development Process (USDP)
The Unified Software Development Process (USDP) is an iterative and incremental software development process framework which, it is claimed, can be adopted for the development of both small and large scale information systems. The development cycle is divided into four main phases:
- Inception - this is usually a fairly short phase that is primarily used to establish the scope and objectives of the project. It lays down both the overall aims and the specific functional objectives, such as being able to log into and out of the system. These specific functional objectives are referred to as use cases. The phase will also identify one or more candidate architectures for the system, identify risks, and determine a preliminary project schedule and cost estimate. The end of the inception phase is marked by the Objective milestone.
- Elaboration - in this phase, most of the system requirements, the known risks, and the system architecture are established. Use case diagrams, conceptual diagrams and package diagrams are created. A partial implementation of the system is produced in a series of short, time-boxed iterations that includes the core architectural components, and establishes an executable architecture baseline. The other deliverable from this phase is a blueprint for the next phase (the construction phase) that includes estimates of the cost and the time required for completion. The end of the elaboration phase is marked by the Architecture milestone.
- Construction - the remaining parts of the system are built on the foundations laid down in the previous phase in a series of short, time-boxed iterations, each resulting in a software release. A number of common Unified Modelling Language (UML) diagrams are used during this phase for the purpose of specifying, visualising, constructing and documenting the system. The end of the construction phase is marked by the Initial Operational Capability milestone.
- Transition - the system is deployed to its operational environment and user community. Feedback from an initial release may lead to further iterations within the transition phase that incorporate further refinements to the system. This phase may also include activities such as data conversion and user training. The end of the transition phase is marked by the Product Release milestone.
The system architecture describes the various functional subsystems that make up the system, such as those responsible for handling input and output, data communications, and information reporting, and the interactions between them and the rest of the system. A risk is any obstacle to success (e.g. insufficient or inexperienced personnel, lack of funding, or severe time restrictions). Each iteration results in a single release of the system, although there can be one or more intermediate builds within a single iteration. The feedback from each release is used to shape future iterations.
The unified process defines six core process disciplines:
- Business modelling
- Requirements
- Analysis & Design
- Implementation
- Testing
- Deployment
Most iterations will include some work in most of the process disciplines. The relative emphasis placed on each activity, and the effort it requires, will change over the course of the project. This is illustrated by the following diagram.
The Unified Software Development Process lifecycle
Extreme Programming
Extreme Programming (or XP) is an agile software development methodology that takes traditional software engineering practices to "extreme" levels to achieve a development process that is more responsive to customer needs than traditional methods, while creating good quality software. Changing requirements are seen as an inescapable feature of software development projects in an increasingly unpredictable business environment. XP practitioners believe that a software development methodology that embodies the capacity to adapt to changing requirements is a more realistic approach than trying to define all of the requirements at the start of a project. Rapidly changing requirements demand shorter development life cycles, and are incompatible with traditional methods of software development.
Individual developers are assigned specific tasks, and are responsible for their completion. No code is written until unit tests have been designed for individual code components and subsystems. The customer is responsible for defining appropriate acceptance tests that are subsequently used to validate the software produced during an iteration. At the end of an iteration, the development team delivers a working system to the customer. The system may not be complete, but all functionality implemented works. A further meeting is scheduled to plan the next iteration, and the cycle begins again.
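As a simple illustration of the test-first discipline described above, the sketch below uses Python's standard unittest module. The `InvoiceCalculator` component and its expected behaviour are invented for the example; in practice the tests would be written before the component and would fail until it had been implemented. A minimal implementation is included here only so that the sketch runs.

```python
import unittest

class InvoiceCalculator:
    """Minimal implementation, written to make the tests below pass."""
    def __init__(self):
        self._lines = []

    def add_line(self, quantity, unit_price):
        # Record the value of one invoice line.
        self._lines.append(quantity * unit_price)

    def total(self):
        return sum(self._lines)

class TestInvoiceCalculator(unittest.TestCase):
    # These tests express the required behaviour and are written first.
    def test_total_of_empty_invoice_is_zero(self):
        self.assertEqual(InvoiceCalculator().total(), 0)

    def test_total_sums_all_lines(self):
        calc = InvoiceCalculator()
        calc.add_line(quantity=2, unit_price=10.0)
        calc.add_line(quantity=1, unit_price=5.0)
        self.assertEqual(calc.total(), 25.0)

if __name__ == "__main__":
    unittest.main()
```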
The Extreme Programming methodology encompasses a set of values, principles and practices designed to facilitate the rapid development of high-quality software that satisfies customer requirements.
The twelve core practices of XP are described below.
- The planning game - the development team collaborates with the customer to produce working software as quickly as possible. The customer produces a list of the required system features, each described in a user story, which gives the feature a name and outlines its functionality. User stories are typically written on index cards. The development team estimates the effort required to code each story, and how much effort the team can deliver in a single iteration. The customer decides which user stories to implement, and in what order, as well as how often to produce a production release of the system. A simplified sketch of this story-selection process is given after this list.
- Small releases - the first iteration produces a working software release that embodies the functionality identified by the customer as being the most essential. Subsequent iterations add additional features as requested by the customer. Iterations are of fixed length (typically from two to three weeks).
- System metaphor - each project has an organising metaphor, which provides an easy to remember naming convention.
- Simple design - the simplest possible design is used that will satisfy customer requirements. Because of the high probability of changes to requirements, only currently known requirements will be considered.
- Test driven development - unit tests are written by developers to test functionality as they write code. Acceptance tests are specified by the customer to test that the overall system is functioning as expected. All tests must be successfully completed before software is released.
- Refactoring - any duplicate or unnecessary code generated in a coding session is eliminated, fostering the utilisation of re-usable code.
- Pair programming - all code is written by two programmers working together on one computer, with the aim of producing high quality code. One person will focus on coding while the other will focus on strategic issues.
- Collective code ownership - no one programmer "owns" a code module. Any developer can be required to work on any part of the code at any time.
- Continuous integration - all changes are integrated into the system daily. Integration testing must be successfully carried out before further integration occurs.
- Sustainable pace - developers are expected to be able to go home on time. Excessive overtime is taken as a sign that something is wrong with the development process.
- Whole team - the development team has continuous access to a customer representative.
- Coding standards - all programmers are expected to write code to the same standards. Ideally, it should not be possible to tell which member of the development team has written a code module simply by examining the code.
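As an illustration of the planning game described in the first practice above, the sketch below shows one simple way of selecting the user stories for an iteration from customer-assigned priorities and developer estimates. The story names, the effort unit ("ideal days") and the greedy selection rule are assumptions made for the example, not anything prescribed by XP.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class UserStory:
    name: str
    estimate: int   # developer estimate of effort, in hypothetical "ideal days"
    priority: int   # customer-assigned priority: 1 = most important

def plan_iteration(stories: List[UserStory], capacity: int) -> List[UserStory]:
    """Pick the highest-priority stories that fit within the team's estimated
    capacity for one iteration (a deliberately simplified planning game)."""
    planned, effort = [], 0
    for story in sorted(stories, key=lambda s: s.priority):
        if effort + story.estimate <= capacity:
            planned.append(story)
            effort += story.estimate
    return planned

# A hypothetical backlog of user stories (the "index cards").
backlog = [
    UserStory("Log into the system", estimate=2, priority=1),
    UserStory("Place an order", estimate=5, priority=2),
    UserStory("Print monthly report", estimate=4, priority=4),
    UserStory("Search order history", estimate=3, priority=3),
]

for story in plan_iteration(backlog, capacity=10):
    print(story.name)   # the stories scheduled for the next two-to-three-week iteration
```

In practice the customer, not an algorithm, makes the final choice of stories; the point of the sketch is simply that each iteration's content is constrained by the team's estimated capacity.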
Extreme Programming may be appropriate for relatively small-scale projects where the requirements change rapidly, or where some initial development is needed before previously unforeseen implementation problems can be determined. It may not work so well for larger projects, or projects where the requirements are unlikely to change.