LDRA Testbed is a unique quality control tool that provides powerful source code testing and analysis facilities for the validation and verification of software applications. It is invaluable where software is required to be reliable, rugged and as error-free as possible, and its use brings substantial time, cost and efficiency savings.
It is a powerful and fully integrated tool suite, which enables advanced software analysis techniques to be applied at key stages of the development lifecycle.
At the heart of the LDRA tool suite is LDRA Testbed, providing the core static and dynamic analysis engines for both host and embedded software. LDRA Testbed enforces compliance with coding standards and provides clear visibility of software flaws that might typically pass through the standard build and test process to become latent problems. In addition, extensive test effectiveness feedback is provided through structural coverage analysis reporting facilities which support the requirements of the DO-178B standard up to and including Level-A.
LDRA Testbed utilises its own proprietary parsing engine, giving ultimate flexibility for tailoring the tool to meet user requirements and take advantage of new analysis techniques.
LDRA Testbed was the first tool to be utilised for certification to the Federal Aviation Administration’s DO-178B standard for both airborne and ground-based systems.
In 1998, the Motor Industry Software Reliability Association (MISRA) published the MISRA C standard to promote the use of “Safe C” in the motor industry. This standard is now being adopted throughout the software industry as a basis for encouraging good programming practice, focussing on coding rules, complexity measurement and code coverage to ensure well-designed, well-tested code that is safe in service. LDRA Testbed is the only tool available to enable static rule checking, complexity and dynamic analysis for MISRA C compliance.
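As a purely illustrative sketch of the kind of coding rules involved (no specific MISRA C rule numbers are quoted here, and both functions are invented for the example), consider a non-compliant fragment alongside a compliant-style rewrite:

```c
#include <stdint.h>

/* Illustrative only: the implicit narrowing conversion and multiple
   exit points below are the kind of constructs MISRA-style rules flag. */
int16_t clamp_noncompliant(int32_t v)
{
    if (v > 100) return 100;     /* multiple exit points              */
    if (v < 0)   return 0;
    return v;                    /* implicit int32_t -> int16_t conversion */
}

/* Compliant-style rewrite: explicit conversions, braces on every
   block and a single point of exit. */
int16_t clamp_compliant(int32_t v)
{
    int16_t result;
    if (v > 100) {
        result = (int16_t)100;
    } else if (v < 0) {
        result = (int16_t)0;
    } else {
        result = (int16_t)v;
    }
    return result;
}
```

Both versions compute the same result; the difference a rule checker cares about is that the second makes every conversion and control-flow decision explicit.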
Companies are becoming more and more aware that improvements in the software development process can bring about significant savings in the costs of development and maintenance of software. This all stems from efficient development of well constructed, documented and tested software.
For developers, project leaders and senior managers, the administration of software quality can be time-consuming and difficult.
LDRA Testbed solves these problems by enabling managers to easily gather information concerning the system under development, from which they can make a sound judgement as to whether the software meets the required quality standards.
A user-defined quality model can be set up based on industry standards, against which all source code analysed can be compared.
LDRA Testbed analyses the use of both local and global variables, as well as procedure parameters.
This information is then presented in graphical displays and textual reports, which clearly identify any problems with variable usage. The method can be applied across unit or system boundaries, enabling quicker identification of errors.
Independent studies have shown this technique, known as Data Flow Analysis, to be one of the most cost-effective ways to remove bugs from software.
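The anomalies such an analysis looks for can be sketched in a few lines of C; the function below is an invented example, not tool output. A “DU” anomaly is a variable defined but never used, a “DD” anomaly is a definition overwritten with no intervening use, and a “UR” anomaly (not shown, since it is undefined behaviour in C) is a variable used before it has ever been defined:

```c
/* Invented example exhibiting two classic data-flow anomalies. */
int anomaly_demo(int input)
{
    int unused = input * 2;   /* DU anomaly: defined, never used        */
    int result = 0;           /* DD anomaly: this definition is...      */
    result = input + 1;       /* ...overwritten with no intervening use */
    return result;
}
```

The function still compiles and runs correctly, which is precisely why such anomalies tend to survive the normal build process and need static analysis to surface them.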
In order to deliver reliable software and avoid high maintenance costs, it is vital that software testing is carried out at code level.
Through automatic source code instrumentation, LDRA Testbed reports on the areas of code that were not executed at run time. This facilitates quick identification of missing test data.
When an error has been identified by the test data, LDRA Testbed will show exactly which code areas were executed through textual and graphical reports. These features save time in fixing the error and re-testing.
Through LDRA Testbed’s measurement of these coverage metrics, testing strategies can be implemented and enhanced to reach a desired coverage standard. This will greatly increase confidence in the tested code.
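The idea of source-level instrumentation can be sketched as follows; the probe array and its placement are invented stand-ins for the counters a coverage tool would inject, not LDRA Testbed’s actual instrumentation:

```c
/* Hypothetical coverage probes: each index identifies one branch or
   block, and a zero count after the test run flags unexecuted code. */
unsigned long probe[3];

int abs_value(int x)
{
    int r;
    if (x < 0) {
        probe[0]++;       /* true branch taken     */
        r = -x;
    } else {
        probe[1]++;       /* false branch taken    */
        r = x;
    }
    probe[2]++;           /* function exit reached */
    return r;
}
```

Running the function with only positive inputs would leave `probe[0]` at zero, which is exactly the “code not executed at run time” report described above.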
With LDRA Testbed, execution of tests can be monitored through instrumentation in a Host/Target environment where the target can be an embedded system, a mainframe, a simulator or emulator, or a real-time operating system (RTOS).
System and Integration Testing
LDRA Testbed is able to analyse source files and interfaces across an entire system or subsystem to check for mismatches between interfaces during integration testing.
Graphical and textual reports provide accurate results at either system level or for individual elements in the system. This enables the uniform enforcement of standards across the system or project.
LDRA Testbed helps to speed up regression testing by analysing the coverage achieved by test data sets, reporting both the smallest set of test cases that retains the current level of coverage and the redundant test cases that add nothing to it.
By utilising this facility, previous analysis runs are not repeated unnecessarily, thereby saving testing resources. This results in more efficient testing, reducing costs to a minimum.
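One plausible way to compute such a reduced set is a greedy set cover over per-test coverage data; the bitmask representation, function name and figures below are invented for the sketch and do not reflect LDRA Testbed’s internal algorithm:

```c
/* Each test case is represented by a bitmask of the branches it
   covers.  reduce_tests() returns a bitmask of the tests to keep;
   every test left out is redundant for coverage purposes. */
static int popcount32(unsigned x)
{
    int n = 0;
    while (x != 0u) { n += (int)(x & 1u); x >>= 1; }
    return n;
}

unsigned reduce_tests(const unsigned cov[], int ntests)
{
    unsigned covered = 0u, kept = 0u;
    for (;;) {
        int best = -1, best_gain = 0;
        int i;
        for (i = 0; i < ntests; i++) {
            int gain = popcount32(cov[i] & ~covered);
            if (gain > best_gain) { best_gain = gain; best = i; }
        }
        if (best < 0) {
            break;                 /* no remaining test adds coverage */
        }
        kept    |= 1u << (unsigned)best;
        covered |= cov[best];
    }
    return kept;
}
```

For example, with coverage masks {0x3, 0xC, 0xF, 0x1}, the third test alone covers all four branches, so the other three would be reported as redundant.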
To ensure that software maintenance is carried out effectively, it is essential that there is a full understanding of code functionality and the current state of its quality.
LDRA Testbed provides an invaluable aid to software maintenance by enabling the visualisation of source code through Callgraphs and Flowgraphs. This leads to a quicker understanding of the unit or system code structure, enabling changes to be made without introducing errors.
LDRA Testbed also highlights any unreachable code that can be removed, making the code easier to understand and more efficient.
Infeasible code can also be identified and removed to increase efficiency and maintainability.
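Both conditions can be shown in a small invented example; the flagged lines compile cleanly yet contribute nothing at run time, which is why tool support helps to find them:

```c
/* Invented examples of unreachable and infeasible code. */
int status_code(int err)
{
    if (err != 0) {
        return -1;
    }
    return 0;
    return 1;                    /* unreachable: control never arrives here */
}

int infeasible_branch(int x)
{
    if ((x > 10) && (x < 5)) {   /* infeasible: never true for any x */
        return 99;
    }
    return x;
}
```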
LDRA Testbed automatically produces accurate and up-to-date detailed documentation which includes Callgraphs, Flowgraphs and cross-referencing listings.
This documentation can be imported into word processing packages or included in code comments.
Accurate and easily generated documentation ensures that code can be understood, enhanced, maintained and re-documented throughout its life-cycle.
LDRA Testbed’s quality metrics give an important insight into the current state of the software. Complexity, test path density, structure, comments, dataflow anomalies and other valuable information highlight areas of the software that may need more attention.
LDRA Testbed analyses the interdependencies of data items through the unit or system’s source code on a procedure-by-procedure basis for all paths.
Analysis reports give a breakdown of the functionality of each variable. This in-depth analysis enhances confidence when making source code changes.