Development

EDBOK Guide
Body of Knowledge: Document Systems Development Lifecycle
Lifecycle Category: Development
Content Contributor(s): Neil Merchant, M-EDP
Original Publication: August 2014
Copyright: © 2014 by Xplor International
Content License: CC BY-NC-ND 4.0

What is Development?

The Development or Build phase in the systems development lifecycle is where the project deliverables start life. In the preceding steps the project objectives have been outlined, the business case developed and the detailed analysis conducted. Design has laid down the overall structure of the solution and produced a detailed set of application, software and/or product specifications. Now the construction starts.

In a hardware installation project it is getting the equipment on the ground, laying it out, installing power, air-conditioning and waste extraction, and getting it all ready for integration testing. In application development, it is the file/database creation, coding, composition or manipulation tool configuration, JCL writing and unit testing. A commercial software development project uses a similar process to application development.

Hardware Installation

Preparatory Work: Any necessary planned upgrades to air conditioning, power supplies, waste extraction and other infrastructure should be completed before equipment installation.

Equipment Delivery and Assembly: Any old equipment should be relocated, or decommissioned and removed, once any remaining workload has been successfully migrated to other hardware. It may be necessary for old and new hardware to co-exist for a period of time, which should be allowed for when planning the use of floor space.

New equipment should be delivered ready for installation. Once onsite it can be moved to its target position as defined in the Design phase (make sure floor loading has been checked), assembled and connected to waste extraction, power and servers as appropriate.

Service clearances should be checked and floor markings (for walkways, clear areas, etc.) painted.

Commissioning and Setup: Once installed, the equipment will be subjected to a range of commissioning tests by the vendor. Inserters will be configured for planned jobs and those configurations saved. Printers will be put through a variety of tests which will vary based on the print technology.

Training: If the equipment is new to the customer or has new operator features, initial training should be delivered by the vendor. Out of this training and unit testing, new or revised operational procedures should be prepared.

Unit Testing: Once commissioning and training are complete, the vendors will hand the equipment over to the customer, and unit testing can begin. This typically involves test runs using copies of existing work or standard test packs. The goals are to familiarize operators with the equipment, to compare output quality and performance, and to identify initial problems.

Any new stocks, inserts, and envelopes should also be tested at this stage, to ensure: compatibility with specifications; reliable processing through the new equipment without wrecks and jams; and satisfactory ink or toner performance.

Temperatures around new equipment should be checked to identify any hot spots and allow remedial action.

At this point, hardware installation is complete, and the project can progress to the Test phase.

Application Development

The development team’s job is to take the architecture and design specifications that result from the business requirements analysis and build an executable solution. These specifications may define the software to be used, programming languages, database tools, hosting and other constraints, as well as the process structure and flow (i.e. the steps in the job and the structure of individual programs).

Reuse of common modules and exposed services may also be included in the specifications. If not, care should be taken in the Build phase to identify such opportunities.

Architecture and Design may have identified the need for new commercial software (composition or manipulation tools). If so, the initial installation and testing of these may better be considered as a separate project: selection and acquisition may take several months, followed by installation, training and a pilot application. Developing a mission-critical application at first use brings significant additional risk to the project schedule.

Most applications consist of three functional layers – data, business logic and presentation. This is certainly true in transaction document applications, even though business logic and presentation are generally achieved in the same (composition) step. These three layers are discussed below.

Data: Data in this context consists of two major elements: document objects, used repeatedly across many documents, and variable data, unique to each document.

Document Objects: Document objects will generally be prepared or sourced for reuse during this phase. These objects may include the set of fonts specified in the document design guide, company or product logos, blocks of text defining product features, legal terms and conditions, customer servicing and other details, and barcodes and other document integrity marks. These may need to be produced in versions for print, web and mobile use, and all will need to be unit-tested.

Variable Data: Given the volumes typically involved, and the potential diversity of sources (e.g. account database, customer database, CRM systems, marketing/targeting systems) variable data extraction is an important element to get right. It is typically handled in the first step in the overall process, generally known as Extract, Transform and Load (ETL). Poor choice of tools, and inefficient use of appropriate tools, can lead to excessive processing times that can jeopardize meeting service level agreements.

Normalization and pre-formatting of currency amounts, telephone numbers, addresses and other data elements is good practice at this stage, and also provides the opportunity for data validation (for example confirming that all necessary data elements are present, and checking for reasonableness in currency amounts). Intelligent integration and structuring of extracted data from the various sources into the input for the composition or manipulation step can again provide performance efficiency.
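
As an illustration only, the following minimal Python sketch shows the kind of presence checks, reasonableness checks and pre-formatting that might be applied to each extracted record; the field names and the balance limit are hypothetical, not taken from any particular specification, and records are assumed to arrive as simple string fields.

  import re
  from decimal import Decimal, InvalidOperation

  REQUIRED_FIELDS = ("account_id", "customer_name", "street_address", "balance")
  MAX_REASONABLE_BALANCE = Decimal("1000000.00")  # hypothetical reasonableness limit

  def validate_and_normalize(record):
      """Check one extracted record (a dict of strings) and return a normalized copy."""
      # Presence check: every required field must be supplied and non-empty.
      for field in REQUIRED_FIELDS:
          if not record.get(field, "").strip():
              raise ValueError("missing required field: %s" % field)

      normalized = dict(record)

      # Reasonableness check and pre-formatting of the currency amount.
      try:
          balance = Decimal(record["balance"])
      except InvalidOperation:
          raise ValueError("balance is not a valid amount: %r" % record["balance"])
      if abs(balance) > MAX_REASONABLE_BALANCE:
          raise ValueError("balance outside reasonable range: %s" % balance)
      normalized["balance"] = "{:,.2f}".format(balance)

      # Normalize any phone number to digits only.
      if record.get("phone"):
          normalized["phone"] = re.sub(r"\D", "", record["phone"])

      return normalized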

Many composition toolsets offer ETL capabilities that include adapters for popular database systems (Oracle, DB2 etc.). Depending on the complexity of the need these may be fit for purpose. In more complex, demanding or unusual circumstances it may be better to develop the ETL functionality in-house.

Business logic such as sorting or splitting for service-level, mailing and print-operational reasons may also be applied at the ETL stage, as can rules that define additional content based on the account holder's behavior as reflected in the transactions incurred. This content can either be included later by means of a metadata trigger added to the account's data records, or pulled in at this stage.
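
A minimal sketch of the metadata-trigger approach in Python; the spend threshold, field names and trigger codes are purely illustrative assumptions:

  TRAVEL_OFFER_THRESHOLD = 500.00  # hypothetical spend threshold for the marketing rule

  def add_content_triggers(account):
      """Attach metadata triggers that the composition step can later use to pull in content."""
      triggers = []

      # Behavioral rule: total card spend in the period selects a marketing message.
      card_spend = sum(t["amount"] for t in account["transactions"] if t["type"] == "card")
      if card_spend >= TRAVEL_OFFER_THRESHOLD:
          triggers.append("TRAVEL_INSURANCE_OFFER")

      # Operational rule: very long documents are routed to a different print/insert stream.
      if len(account["transactions"]) > 200:
          triggers.append("LARGE_DOCUMENT_STREAM")

      account["content_triggers"] = triggers
      return account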

Careful attention to developing data integrity checks during the application’s Build phase is also key to a successful, high quality deliverable.

Business Logic

Business logic, the set of business rules, is generally applied at the composition or manipulation step, and implemented using the tools provided by the composition or manipulation product set. The majority of these product sets are functionally rich, with graphic front ends that greatly simplify the development process. They are far more complex than products such as the MS Office suite, and training in their use is essential. If the tool is new to the development team, professional services assistance from the vendor for the first project is advisable. Learning hands-on how to use the product from an expert, in accordance with its design philosophy rather than your own preconceptions or misconceptions, will pay dividends.

For composition, a data dictionary is generally built first, identifying the contents of each input data record type field-by-field, establishing relationships between them, and assigning attributes.
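
How the dictionary is captured varies by toolset, but conceptually it is a field-by-field description of each input record type. A hypothetical fragment, expressed here as a Python structure purely for illustration:

  # Hypothetical data dictionary fragment: one entry per field of a "TRANSACTION" record type.
  DATA_DICTIONARY = {
      "TRANSACTION": [
          {"field": "posting_date", "type": "date",     "format": "YYYY-MM-DD"},
          {"field": "description",  "type": "text",     "max_length": 40},
          {"field": "amount",       "type": "currency", "decimals": 2, "signed": True},
          {"field": "account_id",   "type": "text",     "relates_to": ("ACCOUNT", "account_id")},
      ],
  }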

The document can now be built using the designs and style guide that will have been developed much earlier in the project, usually with a GUI-based tool. The elements of the document – address box, product and account information areas, and tabulated transaction areas – are built up along with functions such as pagination rules. Elements from the data dictionary can be positioned appropriately on the document design, and processing rules applied using the composition tool's functionality.

The specifications may also call for other outputs to be emitted from the composition process, such as integrity or control files for managing the downstream print, insert, mail, and email processes and to help track the processing of each document throughout its journey. Again, current composition toolsets have the capability to create these.

In the case of a print-stream manipulation project, the requirements are generally simpler. These are a few examples:

  • reversing the order of documents in a file,
  • address validation and correction; adding or changing barcodes, and
  • use of white space to include marketing content.

As with composition, the tools now available are generally sophisticated and functionally rich, and the required manipulations can generally be achieved without resorting to writing traditional code. Service providers, for example, often use print-stream manipulation tools to reformat the barcodes and other document integrity marks in “print ready” files received from clients to their own standards.

Presentation

Where does the presentation layer begin?

It begins somewhere in the composition step.

Most composition tools separate the application of business logic from the creation of presentation streams. They generate an intermediate internal file format containing presentation intent, which is then processed into specific presentation formats for different languages and media: print-streams (e.g. PDF, AFPDS, PCL); email notifications (HTML, text); and web and mobile presentation (typically XML or PDF), where the data is stored and then served up on demand (i.e. it goes back into the data layer). One of the joys of composition tools is that the actual generation of a presentation stream comes “in the box”, eliminating the need for programming work: it only requires an option or parameter setting, and possibly the purchase of one or more additional emitters.

JCL/Control Scripts

Control scripts are required to run the application for unit testing of the developers’ deliverables. In a mainframe environment these are usually JCL (Job Control Language) scripts; in UNIX environments shell scripts (*.sh, *.ksh, *.bsh) are usually used; and on a Windows platform they may be batch files (*.bat). For the Test phase, more extensive scripts are required to run the entire suite, including allocating output files, clean-up and error handling. These scripts may be written during Build, or their creation may be an early activity in the Test phase.
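
Whatever the platform, the responsibilities are the same: allocate output locations, run the job, check the outcome, handle errors and clean up. Purely for illustration, here is a minimal sketch of those responsibilities in Python; the executable name, arguments and directories are hypothetical, and in practice the script would be written in JCL, shell or batch as described above.

  import shutil
  import subprocess
  import sys
  from pathlib import Path

  WORK_DIR = Path("work")      # hypothetical scratch area
  OUTPUT_DIR = Path("output")  # hypothetical output allocation

  def run_job(input_file):
      """Allocate output locations, run the (hypothetical) composition step, and clean up."""
      WORK_DIR.mkdir(exist_ok=True)
      OUTPUT_DIR.mkdir(exist_ok=True)

      # Run the unit under test; the executable name and options are illustrative only.
      result = subprocess.run(
          ["compose_statements", "--input", str(input_file), "--output-dir", str(OUTPUT_DIR)]
      )

      # Basic error handling: a non-zero return code stops the run with a message.
      if result.returncode != 0:
          print("job failed with return code %d" % result.returncode, file=sys.stderr)
          sys.exit(result.returncode)

      # Clean up the scratch area once the job has completed successfully.
      shutil.rmtree(WORK_DIR, ignore_errors=True)

  if __name__ == "__main__":
      run_job(sys.argv[1])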

QA Activities

Typical quality assurance activities in the Build phase include re-use checks, code and configuration reviews, document layout reviews, and integrity checks (document counts).

Re-use Checks

Whether developing in-house ETL code or using composition/manipulation tools, there may be reusable modules, data dictionaries, design templates or entire applications available for reuse. In the case of ETL, for example, the formatting of data elements (date, currency, etc.) will likely be consistent with other documents, and that formatting code should be reusable. With good prior design of data and templates, there may be very little coding work needed to implement a new product's statement into an existing family.
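
For example, a small shared formatting module of the following kind (a Python sketch; the function names and the date style are assumptions) can be unit-tested once and then reused by every document application in the family:

  from decimal import Decimal

  # Hypothetical shared formatting helpers, reusable across document applications.

  def format_currency(amount, symbol="$"):
      """Render an amount for display, e.g. format_currency('1234.5') -> '$1,234.50'."""
      return "%s%s" % (symbol, "{:,.2f}".format(Decimal(amount)))

  def format_date(value):
      """Render a datetime.date in the house style, e.g. '5 January 2014'."""
      return "%d %s %d" % (value.day, value.strftime("%B"), value.year)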

Code and Configuration Reviews

Code walkthroughs with suitably skilled colleagues not otherwise involved in the project can highlight misinterpretations of specifications, pick up coding errors and introduce better ways of achieving the desired functionality.

Walkthroughs with composition toolsets are more difficult, as much of the work is interactive at the toolset GUI and not amenable to review: here, the vendor’s professional services team may be of help.

Document Reviews

Although the range of output available from the (unit) testing conducted during Development will be limited, reviews against the document design, checking print placement, formatting, and pagination, can highlight overt errors and confirm the interpretation of the specification.

Unit Testing

The role of unit testing is to demonstrate that the unit of code (program, composition rule-set, etc.) behaves according to the specification, and is ready to go forward to subsequent, more extensive, testing procedures. What follows is a basic example to demonstrate the purpose; unit testing will be designed to match the specific circumstances of every project and piece of code.

Valid files: Testing with valid input files will demonstrate correct behavior and the outputting of files for subsequent printing or validation.

Empty files: Testing with one or more empty input files will confirm correct behavior – i.e. detection of the empty file and an orderly termination with a suitable error message.

Data out of range: Testing with input files that contain data out of range in one or more ways (e.g. address line length, currency amount in excess of the value specified, date invalid or unreasonable, phone number with too few/many digits, number of transactions in excess of the specified limit) will confirm that these conditions are detected and handled as specified.

Missing data: Testing with input files that, for example, have no street address (for a document destined to be mailed), no customer name, or a missing final balance amount, will demonstrate that these conditions too have been checked and correctly catered for.

All required output types: Test output should be generated for every desired language and target medium, and rendered if possible at this stage.
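
A minimal sketch of such tests in Python's unittest framework; the validation routine shown is a stand-in for the real unit under test, and the field names and limit are hypothetical.

  import unittest

  # Stand-in for the unit under test; in practice this would be the real
  # extraction/validation routine (the names and limit here are hypothetical).
  def validate_record(record):
      if not record:
          raise ValueError("empty record")
      for field in ("customer_name", "street_address", "balance"):
          if field not in record:
              raise ValueError("missing field: " + field)
      if abs(float(record["balance"])) > 1000000:
          raise ValueError("balance out of range")
      return record

  class ValidateRecordTests(unittest.TestCase):
      def test_valid_record_is_accepted(self):
          record = {"customer_name": "A Smith", "street_address": "1 Main St", "balance": "10.00"}
          self.assertEqual(validate_record(record), record)

      def test_empty_record_is_rejected(self):
          with self.assertRaises(ValueError):
              validate_record({})

      def test_out_of_range_balance_is_rejected(self):
          with self.assertRaises(ValueError):
              validate_record({"customer_name": "A Smith", "street_address": "1 Main St",
                               "balance": "9999999.99"})

      def test_missing_address_is_rejected(self):
          with self.assertRaises(ValueError):
              validate_record({"customer_name": "A Smith", "balance": "10.00"})

  if __name__ == "__main__":
      unittest.main()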

Integrity checks (document counts): Integrity checks throughout the document journey are essential in the production of variable data documents. At every stage, documents should be counted, and the counts passed on to the next stage for comparison with subsequent counts. Although it’s not generally good practice to conduct financial calculations within document systems, hash totaling transaction amounts to check against the final balance supplied from the systems of record is another way of catching errors.
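
A minimal Python sketch of the count-and-hash-total idea (the record layout is hypothetical): each stage recomputes the figures and compares them with those handed on from the previous stage.

  from decimal import Decimal

  def integrity_totals(documents):
      """Return (document count, hash total of transaction amounts) for one processing stage."""
      count = len(documents)
      hash_total = sum(
          (Decimal(t["amount"]) for doc in documents for t in doc["transactions"]),
          Decimal("0"),
      )
      return count, hash_total

  def check_stage(documents, expected_count, expected_hash_total):
      """Compare this stage's totals with those passed on from the previous stage."""
      count, hash_total = integrity_totals(documents)
      if count != expected_count:
          raise RuntimeError("document count mismatch: %d vs %d" % (count, expected_count))
      if hash_total != expected_hash_total:
          raise RuntimeError("hash total mismatch: %s vs %s" % (hash_total, expected_hash_total))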

Control file checks: If control files are included in the specification, their content should be reviewed in detail. At this stage it may not be possible to test them directly with the downstream steps in the process, but they should be validated as far as possible.

Phase Deliverables

The deliverables from the Build phase are reports documenting and confirming that the phase activities and tasks have been completed successfully. Depending on the nature of the project, these may include:

  • all hardware installed successfully and subjected to initial tests,
  • sample base stock, inserts and envelopes received and tested,
  • application code written and unit tested,
  • software, such as composition or manipulation tools, configured and unit tested,
  • all output from unit test reviewed and checked,
  • list of QA activities conducted and of any errors or problems detected and fixed, and
  • checklist of phase gating criteria and compliance.