LDRA Testbed® - System Tests and Coverage

LDRA Testbed is a unique quality control tool that provides powerful source code testing and analysis facilities for the validation and verification of software applications. It is invaluable wherever computer software is required to be reliable, rugged and as error-free as possible, and its use brings substantial time, cost and efficiency savings.

It is a powerful and fully integrated tool suite, which enables advanced software analysis techniques to be applied at key stages of the development lifecycle.

Overview

At the heart of the LDRA tool suite is LDRA Testbed, providing the core static and dynamic analysis engines for both host and embedded software. LDRA Testbed enforces compliance with coding standards and provides clear visibility of software flaws that might typically pass through the standard build and test process to become latent problems. In addition, extensive test effectiveness feedback is provided through structural coverage analysis reporting facilities which support the requirements of the DO-178B standard up to and including Level-A.

LDRA Testbed utilises its own proprietary parsing engine, giving ultimate flexibility for tailoring the tool to meet user requirements and take advantage of new analysis techniques.

Independent studies have shown that use of LDRA Testbed can reduce reported bugs by up to 75% and improve testing efficiency by 46%.

LDRA Testbed was the first tool to be utilised for certification to the Federal Aviation Administration’s DO-178B standard for both airborne and ground-based systems.

In 1998, the Motor Industry Software Reliability Association (MISRA) published the MISRA C standard to promote the use of “Safe C” in the motor industry. This standard is now being adopted throughout the software industry as a basis for encouraging good programming practice, focussing on coding rules, complexity measurement and code coverage, ensuring well-designed and tested code which is safe in service. LDRA Testbed is the only tool available to enable static rule checking, complexity measurement and dynamic analysis for MISRA C compliance.
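
As a simple, hypothetical illustration (not taken from the MISRA standard or the LDRA documentation), the fragment below shows the kind of constructs a MISRA C rule checker would typically flag: a switch clause that falls through without a break, and an assignment that implicitly narrows a wider type.

    /* Hypothetical fragment illustrating constructs that a MISRA C
       rule checker would typically flag. */
    #include <stdint.h>

    uint8_t scale(uint32_t raw)
    {
        uint8_t level;

        switch (raw / 100u)
        {
        case 0u:
            level = 1u;     /* missing break: implicit fall-through to the next clause */
        case 1u:
            level = 2u;
            break;
        default:
            level = raw;    /* implicit narrowing conversion from uint32_t to uint8_t */
            break;
        }
        return level;
    }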

For further information on LDRA Testbed and availability please complete the LDRA reply form or email info@ldra.com.

Process Improvement

Companies are becoming more and more aware that improvements in the software development process can bring about significant savings in the costs of development and maintenance of software. This all stems from efficient development of well constructed, documented and tested software.

Standards Enforcement

For developers, project leaders and senior managers, the administration of software quality can be time-consuming and difficult.

LDRA Testbed solves these problems by enabling managers to easily gather information concerning the system under development, from which they can make a sound judgement as to whether the software meets the required quality standards.

A user-defined quality model can be set up based on industry standards, against which all source code analysed can be compared.

Error Detection

LDRA Testbed analyses the use of both local and global variables, as well as procedure parameters.

This information is then presented in graphical displays and textual reports, which clearly identify any problems with variable usage. This method can be applied across unit or system boundaries, enabling quicker identification of errors.

Independent studies have shown this technique, known as Data Flow Analysis, to be one of the most cost-effective ways to remove bugs from software.
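
For illustration, the hypothetical fragment below contains the kinds of anomaly that data flow analysis is designed to report: a variable that may be read before it has been assigned, and a variable that is assigned but never subsequently used.

    /* Hypothetical fragment containing data flow anomalies of the kind
       that data flow analysis is designed to report. */
    int average(const int *samples, int count)
    {
        int sum;          /* never initialised                    */
        int spare = 0;    /* assigned but never subsequently used */
        int i;

        for (i = 0; i < count; i++)
        {
            sum += samples[i];   /* 'sum' may be read before it is defined */
        }
        return (count > 0) ? (sum / count) : 0;
    }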

For further information on LDRA Testbed and availability please complete the LDRA reply form or email info@ldra.com.

Software Testing

In order to deliver reliable software and avoid high maintenance costs, it is vital that software testing is carried out at code level.

Code Coverage

If code coverage is not monitored, errors may remain in code that has not been executed by any test data.

Through automatic source code instrumentation, LDRA Testbed reports on the areas of code that were not executed at run time.  This facilitates quick identification of missing test data.

When an error has been identified by the test data, LDRA Testbed’s textual and graphical reports show exactly which code areas were executed.  These features save time in fixing the error and re-testing.

Through LDRA Testbed’s measurement of these coverage metrics, testing strategies can be implemented and enhanced to reach a desired coverage standard.  This will greatly increase confidence in the tested code.
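
Conceptually, source instrumentation inserts probe calls at decision points so that execution can be recorded at run time. The sketch below illustrates the idea with hypothetical probe names; it is not LDRA Testbed's actual instrumentation scheme.

    /* Sketch of the instrumentation idea: each probe call records a branch
       outcome so that untested paths can be identified afterwards. */
    #include <stdio.h>

    static int probe_hits[4];                  /* one counter per probe point */
    static void probe(int id) { probe_hits[id]++; }

    int clamp(int value, int limit)
    {
        probe(0);                              /* function entry       */
        if (value > limit)
        {
            probe(1);                          /* true branch taken    */
            value = limit;
        }
        else
        {
            probe(2);                          /* false branch taken   */
        }
        probe(3);                              /* function exit        */
        return value;
    }

    int main(void)
    {
        (void)clamp(5, 10);                    /* only exercises the false branch */
        for (int id = 0; id < 4; id++)
        {
            printf("probe %d: %d hit(s)\n", id, probe_hits[id]);
        }
        return 0;                              /* probe 1 reports 0 hits: test data is missing */
    }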

Host/Target Testing

With LDRA Testbed, execution of tests can be monitored through instrumentation in a Host/Target environment, where the target can be an embedded system, a mainframe, a simulator or emulator, or a real-time operating system (RTOS).

System and Integration Testing

LDRA Testbed is able to analyse source files and interfaces across an entire system or subsystem to check for mismatches between interfaces during integration testing.

Graphical and textual reports provide accurate results at either system level or for individual elements in the system. This enables the uniform enforcement of standards across the system or project.
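
A typical integration-time interface mismatch (a hypothetical example, shown here as two separate source files) is a caller and callee that disagree about a parameter or return type across translation units; a C linker will usually accept this, but whole-system analysis can flag it.

    /* sensor.c -- definition of the interface */
    long read_sensor(long channel)
    {
        return channel * 10L;
    }

    /* control.c -- a caller that declares the interface differently */
    extern int read_sensor(int channel);    /* mismatch: int vs long */

    int poll(void)
    {
        return read_sensor(3);              /* may appear to work on some targets,
                                               but the interfaces do not match   */
    }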

Regression Testing

LDRA Testbed helps to speed up regression testing by analysing the coverage achieved by test data sets, reporting on the smallest set that retains the current level of coverage and redundant test cases that do not add to the existing test coverage.

By utilising this facility, previous analysis runs are not repeated unnecessarily, thereby saving testing resources.  This results in more efficient testing, reducing costs to a minimum.
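
One way to picture the test-set reduction is a greedy selection over recorded coverage data: a test case is kept only if it covers something that no previously kept test covers. The sketch below uses invented coverage data and is not LDRA Testbed's actual algorithm or report format.

    /* Sketch of greedy test-set reduction over recorded coverage data. */
    #include <stdio.h>

    #define NUM_TESTS    4
    #define NUM_BRANCHES 6

    /* covered[t][b] == 1 means test t executed branch b (invented data) */
    static const int covered[NUM_TESTS][NUM_BRANCHES] = {
        {1, 1, 1, 0, 0, 0},   /* test 0                                */
        {1, 1, 0, 0, 0, 0},   /* test 1: covers nothing new, redundant */
        {0, 0, 0, 1, 1, 0},   /* test 2                                */
        {0, 0, 0, 0, 1, 1},   /* test 3                                */
    };

    int main(void)
    {
        int seen[NUM_BRANCHES] = {0};

        for (int t = 0; t < NUM_TESTS; t++)
        {
            int adds_coverage = 0;
            for (int b = 0; b < NUM_BRANCHES; b++)
            {
                if (covered[t][b] && !seen[b]) { adds_coverage = 1; }
            }
            if (adds_coverage)
            {
                for (int b = 0; b < NUM_BRANCHES; b++)
                {
                    if (covered[t][b]) { seen[b] = 1; }
                }
                printf("keep test %d\n", t);
            }
            else
            {
                printf("test %d is redundant\n", t);
            }
        }
        return 0;
    }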

For further information on LDRA Testbed and availability please complete the LDRA reply form or email info@ldra.com.

Software Maintenance

To ensure that software maintenance is carried out effectively, it is essential that there is a full understanding of code functionality and the current state of its quality.

Code Visualisation

LDRA Testbed provides an invaluable aid to software maintenance by enabling the visualisation of source code through Callgraphs and Flowgraphs.  This leads to a quicker understanding of the unit or system code structure, enabling changes to be made without introducing errors.

LDRA Testbed also highlights any unreachable code that can be removed, making the code easier to understand and more efficient.

Infeasible code can also be identified and removed to increase efficiency and maintainability.
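
As a small, hypothetical illustration, the statement after the return below can never execute; analysis of the control flow graph identifies such unreachable code so that it can be removed.

    /* Hypothetical example of unreachable code found by control flow analysis. */
    int absolute(int x)
    {
        if (x < 0)
        {
            return -x;
            x = 0;      /* unreachable: follows an unconditional return */
        }
        return x;
    }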

Documentation

LDRA Testbed automatically produces accurate and up-to-date detailed documentation, including Callgraphs, Flowgraphs and cross-reference listings.

This documentation can be imported into word processing packages or included in code comments.

Accurate and easily generated documentation ensures that code can be understood, enhanced, maintained and re-documented throughout its life-cycle.

Maintainability

LDRA Testbed’s quality metrics give an important insight into the current state of the software: complexity, test path density, structure, comments, data flow anomalies and other valuable information highlight areas of the software that may need more attention.

Data Tracking

LDRA Testbed analyses the interdependencies of data items through the unit or system’s source code on a procedure-by-procedure basis for all paths.

Analysis reports give a breakdown of the functionality of each variable.  This in-depth analysis enhances confidence when making source code changes.

For further information on LDRA Testbed and availability please complete the LDRA reply form or email info@ldra.com.

Contact Details

Email: info@ldra.com
Tel EMEA: + 44 (0) 151 649 9300
Tel USA: +1 (855) 855 5372
Tel India: +91 80 4080 8707
