How can a minimum acceptance standard for the internal and external quality of outsourced work be established? What should be the basis when working with an agile offshore outsourcing company?
This article aims to establish a rigorous approach to working with outsourcing companies, whether with local partners or with delocalised agile offshore outsourcing in Bangalore, Ho Chi Minh City, Buenos Aires, etc.
The procedure illustrated here defines validation steps that must be followed strictly as part of hiring external collaborators for development.
Working with an agile offshore outsourcing company
Frequent, functionally complete deliveries are required, at least every week. The typical delivery model follows agile methodologies, with user stories defined as functional units.
A typical example could be the first-installation login functionality, since it is functionally small but exercises enough architectural elements to allow evaluation and monitoring by the customer. The corollary of the previous point is that the supplier must dedicate adequate resources to feed the feedback mechanism described.
Delivery occurs by pushing the source code, along with all the elements necessary for a properly functioning runtime (e.g. binary dependencies), to the client's git repository. The project will be set up as a separate module with an automated build script. This script allows any developer working on the client's project to build it, in a repeatable and deterministic way, in only 2 steps:
- check out the sources of the project
- run the automated build script
Any system dependencies (e.g. installing a web server, a Java virtual machine, or build tools such as an SDK compiler, Maven or Ant) will have their installation steps detailed in a README text file placed at the root of the project.
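As a sketch, a minimal README at the project root might look like the following; the project name and commands are hypothetical, what matters is that system dependencies and the two build steps are spelled out:

```markdown
# Example Project

## System dependencies
- JDK 8 or later (`javac` on the PATH)
- Maven 3.x

## Build
1. `git clone <repository-url> && cd example-project`
2. `mvn package` — compiles, runs the unit tests and produces the deliverable artifact
```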
In the presence of a database, or another mechanism for persisting operational data over time, the automated build will have to establish a database versioning system that preserves existing data, in anticipation of continuous delivery once the application is running in a production environment. The build mechanism is independent of the development environment (IDE) used: build scripts generated by the IDE will in no way be considered automated builds.
The build mechanism will implement, or at least provide, resource-processing configuration for automatic deployments (e.g. tasks that copy one configuration file or another depending on the environment specified in the target (Ant) or profile (Maven)). The build will have a separate task to generate code coverage reports for the unit tests.
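For instance, environment-dependent resource processing can be sketched with Maven profiles; the property and file names below are illustrative, not prescribed by this procedure:

```xml
<!-- pom.xml fragment: select a configuration file per environment -->
<profiles>
  <profile>
    <id>dev</id>
    <properties>
      <config.file>config-dev.properties</config.file>
    </properties>
  </profile>
  <profile>
    <id>prod</id>
    <properties>
      <config.file>config-prod.properties</config.file>
    </properties>
  </profile>
</profiles>
```

A developer (or the deployment pipeline) then builds with `mvn package -Pprod`, and the build copies the file referenced by `${config.file}` into the artifact.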
CLEANING THE WORKSPACE
The workspace, from the moment of the pull from version control, must have the correct settings to ignore all resources generated by running the automated build script. For example, running the `git status` command must show the same result on a clean checkout (just after the pull) and after running the build script. Likewise, all IDE-specific resources will be ignored.
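Assuming a git-based workflow with Maven and a common IDE, a `.gitignore` along these lines keeps `git status` identical before and after a build (the entries are typical examples, not an exhaustive list):

```gitignore
# build output
target/

# IDE-specific resources
.idea/
*.iml
.classpath
.project
.settings/
```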
All code written by the supplier requires unit test coverage following xUnit standards. The following coverage values apply:
- > 80% – optimal coverage
- 60–80% – good coverage
- 40–60% – low coverage, which requires a case-by-case explanation from the supplier
- < 40% – insufficient coverage; the delivery is void
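One way to enforce such a threshold automatically, assuming a Maven build that uses the JaCoCo plugin (the version and the 60% line-coverage limit below are illustrative):

```xml
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.11</version>
  <executions>
    <execution>
      <goals>
        <goal>prepare-agent</goal>
        <goal>report</goal>
        <goal>check</goal>
      </goals>
      <configuration>
        <rules>
          <rule>
            <element>BUNDLE</element>
            <limits>
              <limit>
                <counter>LINE</counter>
                <value>COVEREDRATIO</value>
                <!-- fail the build below 60% line coverage -->
                <minimum>0.60</minimum>
              </limit>
            </limits>
          </rule>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```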
In no case will the contribution of functional tests be counted towards unit coverage; counting them is, in fact, a clear indication that the supplier has not understood Test Driven Development.
A clear separation between unit, integration and functional tests is also required, with a different suite for each type.
Code coverage will also require a uniform distribution of test cases across happy-path cases, borderline cases and sad-path cases (error handling). Test code will have to meet the same quality standards as production code, described below.
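To illustrate the distribution required above, here is a minimal sketch; the `Calculator.divide` unit and the test names are hypothetical, with one happy-path, one borderline and one sad-path case for the same unit:

```java
// Hypothetical unit under test: integer division with explicit error handling.
class Calculator {
    static int divide(int dividend, int divisor) {
        if (divisor == 0) {
            throw new IllegalArgumentException("divisor must not be zero");
        }
        return dividend / divisor;
    }
}

class CalculatorTest {
    // Happy path: typical valid input.
    static void divide_typicalOperands_returnsQuotient() {
        if (Calculator.divide(10, 2) != 5) throw new AssertionError();
    }

    // Borderline case: negative dividend with the smallest valid divisor.
    static void divide_negativeDividendByOne_returnsDividend() {
        if (Calculator.divide(-7, 1) != -7) throw new AssertionError();
    }

    // Sad path: invalid input must fail fast with a clear error.
    static void divide_zeroDivisor_throwsIllegalArgument() {
        try {
            Calculator.divide(1, 0);
            throw new AssertionError("expected IllegalArgumentException");
        } catch (IllegalArgumentException expected) {
            // the error case is exercised explicitly, not ignored
        }
    }
}
```

All three cases exercise the same unit, so coverage alone would already be high with only the happy path; the distribution requirement is precisely what forces the borderline and error cases to be written.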
For “service-oriented” projects, e.g. an API / Web Service, suites of automated functional tests reproducing the application's use cases are required. Each bugfix for a bug notified by the Quality department must, alongside the solution, include an automated test that reproduces the bug, unless this is manifestly impossible and agreed with those responsible for the client's architecture.
Projects should have mechanisms for runtime inspection such as logs, trace and debug output, configurable via configuration files. Where possible, we recommend using standard mechanisms (log4j, etc.). The binaries of these frameworks are considered binary dependencies of the project, and must be under version control or equivalent systems (e.g. a Maven repository).
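As a sketch of configuration-driven logging with log4j 1.x (the `com.example.billing` package name is a placeholder), verbosity can be changed per environment without recompiling:

```properties
# log4j.properties — levels adjustable per environment, no code change needed
log4j.rootLogger=INFO, stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} %-5p [%c] %m%n

# raise verbosity for a single package while debugging
log4j.logger.com.example.billing=DEBUG
```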
No deliveries showing any level of code duplication, other than accidental, will be accepted. A code duplication detection tool (e.g. CPD) will be used for this purpose. The source code must consistently follow the naming convention defined by the customer.
In particular, the following should be avoided:
- Abbreviations, except for very common concepts or technology acronyms (e.g. HTML)
- Hungarian notation.
The code will comply with the best practices of object orientation, even in the case of procedural technologies. The following basic guidelines will be checked by generating code metrics:
- most methods have fewer than 10 lines of code (low LOC/NOM)
- all class attributes are private or protected, with accessor methods only where necessary
- classes exhibit low coupling and high cohesion
Above all, the following will be minimized:
- control coupling
- common / global coupling
- temporal coupling
For the same reason, classes that are stateless at the instance level (static singletons) will be minimized. Low code complexity (cyclomatic complexity, CCN) will be pursued, favoring polymorphic solutions over conditional logic. Faced with very large modules, the extraction of new modules (classes) will be preferred as a mechanism for code reuse.
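As a sketch of “favoring polymorphic solutions over conditional logic” (the shape types are invented for illustration), a switch on a type code is replaced by subtype polymorphism, removing the conditional and its cyclomatic cost:

```java
// Instead of: switch (shape.type) { case CIRCLE: ...; case SQUARE: ...; }
// each subtype carries its own behavior.
interface Shape {
    double area();
}

class Circle implements Shape {
    private final double radius; // private state, as the attribute guideline requires

    Circle(double radius) { this.radius = radius; }

    public double area() { return Math.PI * radius * radius; }
}

class Square implements Shape {
    private final double side;

    Square(double side) { this.side = side; }

    public double area() { return side * side; }
}
```

A caller holds a `Shape` reference and never branches on the concrete type; adding a new shape means adding a class, not editing every conditional.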
Calls to “illogical” methods such as accessors (getters / setters) will be minimized, favoring the distribution of logic across the object-oriented topology. Exceptions will be used as the error handling mechanism where the technology allows it. Error codes as method outputs are to be avoided, unless sufficiently motivated.
Exception handling will be unified, ideally only at the borders between applicative contexts. A large number of local exception handlers will be avoided, unless local handling is needed for resource or infrastructure recovery at those borders.
The use of empty catch blocks is considered a serious mistake.
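A minimal sketch of the boundary rule (all names invented): domain code throws and does not catch; the application boundary is the single place that translates the exception, and its catch block is never empty:

```java
// Domain code fails fast and lets the exception propagate.
class AccountService {
    int balanceFor(String accountId) {
        if (accountId == null || accountId.isEmpty()) {
            throw new IllegalArgumentException("accountId is required");
        }
        return 100; // stub value for illustration
    }
}

// Application boundary: the only layer that catches and translates.
class AccountController {
    private final AccountService service = new AccountService();

    String handleBalanceRequest(String accountId) {
        try {
            return "balance=" + service.balanceFor(accountId);
        } catch (IllegalArgumentException e) {
            // never an empty catch: translate into a boundary-level response
            return "error: " + e.getMessage();
        }
    }
}
```

Only `AccountController` contains a try/catch; intermediate layers stay free of local exception handling.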
In addition to the rules just described, typical test code patterns will be applied, including:
- no more than one assertion per test method, unless they are equality assertions on several fields of the same object (avoiding assertion roulette)
- test code will be reused through Test Helpers or Abstract Test Cases
- API calls to the system under test will be encapsulated in utility / private methods; in particular, the constructor of a class should appear only once throughout the test code
- no classes will be tested in an indirect way
- test fixtures and constants will be reused through Abstract Test Cases and Test Helpers
Additionally, the Osherove naming convention will be used, with the only difference that the names will be camelCase.
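A sketch of two of these patterns (the `Order` type and helper are hypothetical): the constructor call is encapsulated in a single creation method, and the equality assertions on several fields of one object stay inside one test method:

```java
class Order {
    final String id;
    final int quantity;

    Order(String id, int quantity) {
        this.id = id;
        this.quantity = quantity;
    }
}

class OrderTestHelper {
    // The only place the constructor appears in test code: if its
    // signature changes, only this method needs updating.
    static Order anOrder() {
        return new Order("O-1", 2);
    }
}

class OrderTest {
    // Osherove-style camelCase name: unitOfWork_scenario_expectedBehaviour
    static void createOrder_defaultFixture_fieldsArePopulated() {
        Order order = OrderTestHelper.anOrder();
        // Several equality checks on one object are allowed; mixing
        // unrelated assertions here would be assertion roulette.
        if (!order.id.equals("O-1")) throw new AssertionError("id");
        if (order.quantity != 2) throw new AssertionError("quantity");
    }
}
```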
The concepts of good design and domain-oriented architecture will be applied, according to the type of project. In particular: access to data (where required) will be hidden behind a data access layer, which hides the persistence mechanism itself (whether file, NoSQL or relational database). Persistence Ignorance will be preferred over Persistence by Reachability. In general, a layered design with inversion of control, injected by using interfaces, will be applied. By way of example, at least 3 “classical” layers are identified:
- application logic (service layer)
- domain logic
- infrastructure (e.g. persistence)
The domain layer must not depend directly on infrastructure details (such as persistence), which are encapsulated in domain services and decoupled via interfaces or abstract classes. Service layers have little logic, mostly coordination calls between the domain(s) and other services (e.g. persistence): the thin service layer concept is applied. Reuse of existing code, including third-party code, will be attempted where licensing conditions allow, assessing adaptation costs. The macro-architectural patterns most appropriate for each case (e.g. MVC for a simple web application), agreed with the customer, shall apply. All data input requires at least format validation, performed not (or not only) in the persistence layer, but in the domain layer, following the fail-fast principle. The requirements for information security and internationalization will be assessed for each project by agreement with those responsible for the client's architecture.
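The layering above can be sketched as follows (all names invented): the domain owns the persistence interface, the infrastructure implements it, the dependency is injected, and input format is validated fail-fast in the domain rather than in persistence:

```java
import java.util.HashMap;
import java.util.Map;

// Domain layer: owns the abstraction, knows nothing about the storage engine.
interface CustomerRepository {
    void save(String email);
    boolean exists(String email);
}

class CustomerRegistration {
    private final CustomerRepository repository; // injected: inversion of control

    CustomerRegistration(CustomerRepository repository) {
        this.repository = repository;
    }

    void register(String email) {
        // fail fast: format validated in the domain, not (only) in persistence
        if (email == null || !email.contains("@")) {
            throw new IllegalArgumentException("invalid email: " + email);
        }
        repository.save(email);
    }
}

// Infrastructure detail: could equally be a file, NoSQL or relational database.
class InMemoryCustomerRepository implements CustomerRepository {
    private final Map<String, Boolean> rows = new HashMap<>();

    public void save(String email) { rows.put(email, true); }

    public boolean exists(String email) { return rows.containsKey(email); }
}
```

A thin service layer would merely wire `CustomerRegistration` to other services; swapping the storage engine means writing another `CustomerRepository` implementation, with no change to the domain.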
The client's Quality department will functionally validate the delivery according to two parameters: compliance with the functional specifications, and the absence of “critical” bugs or errors that invalidate the delivery. The supplier is therefore required to:
- review the functional specifications before and during development
- perform manual acceptance testing before the end of the development iteration, to exclude macroscopic errors
This allows the Quality department to validate versions with a minimum guaranteed quality, reducing the cost of the validation procedures.
DELIVERY APPROVAL PROCEDURE
The supplier makes the delivery as specified. The project manager notifies the person responsible for the client's software architecture of the delivery. The delivery is then passed to the Quality department for functional validation. Once the delivery is validated, the revision is marked with a tag on the validated delivery in the version control repository.
At Apiumtech we consider this highly detailed technical procedure necessary to ensure a correct relationship between the customer and the external company.
It is much better to make all these points clear before starting a relationship.
In fact … they are the guarantee of success.