However, as embedded devices become more ubiquitous, engineers face significant challenges in ensuring the quality of these devices on time and on budget. To keep pace with this rapidly evolving complexity while also ensuring safe, high-quality vehicles, development teams must evolve the methods their organizations use to design and validate embedded software.
The embedded software challenge
A modern automobile contains millions of lines of code, spanning functionality from engine control to navigation to automated seat adjustment. While no off-highway vehicle currently incorporates this many ECUs, National Instruments expects this level of software integration to reach agriculture and construction equipment soon.
Everyone who has used a webpage, phone app, or other consumer software product is aware that errors in software are common and expected. The challenge is that in the consumer space, these errors are typically resolved with a quick reset of the application with relatively little impact to the user (lost data excluded).
However, when this paradigm is applied to an electromechanical system, these errors can have catastrophic consequences if they are present when the customer operates the machine.
To remain competitive, or even present in the marketplace, manufacturers of these machines must be able to find and eliminate errors before they are passed on to the customer. Unfortunately, the competitive pressures of the off-highway vehicle market do not allow unlimited time or budget to validate the software in these systems.
This dichotomy necessitates that organizations continuously evaluate and improve their development practices and harness the latest technologies to both ensure product quality and meet time-to-market requirements.
Evolving to validate intelligent systems
As these development teams evolve their approaches to address the quality challenge for these intelligent systems, there are three key areas where they are achieving improvements in operational efficiency and reducing the overall cost of quality.
One of these areas is the expansion of test automation to increase the accuracy and quantity of testing. Technicians manually adjusting knobs and dials on a box or screen cannot keep pace with the additional testing necessary to ensure the quality of these software-enabled systems.
To increase test throughput, engineers are writing test scripts that automate usage scenarios, complete with fault scenarios. To expand the value of these scripts, test engineers are parameterizing them and publishing them in shared libraries that can be reused across different configurations of the machines.
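As an illustration, a parameterized test script of this kind might look like the following minimal sketch. The machine configurations, signal names, and limits here are hypothetical, and the measurement function is a simulated stand-in for a real hardware interface:

```python
import itertools

# Hypothetical machine configurations; in practice these would come from
# a shared, version-controlled library of test components.
CONFIGS = [
    {"model": "loader-a", "idle_rpm": 850, "max_rpm": 2200},
    {"model": "harvester-b", "idle_rpm": 900, "max_rpm": 2600},
]
THROTTLE_POINTS = [0.0, 0.5, 1.0]

def read_engine_rpm(config, throttle):
    # Stand-in for a real hardware-in-the-loop measurement; here we
    # simulate a linear throttle-to-RPM response for illustration.
    return config["idle_rpm"] + throttle * (config["max_rpm"] - config["idle_rpm"])

def run_suite():
    """Run one usage scenario against every machine configuration."""
    results = []
    for config, throttle in itertools.product(CONFIGS, THROTTLE_POINTS):
        rpm = read_engine_rpm(config, throttle)
        passed = config["idle_rpm"] <= rpm <= config["max_rpm"]
        results.append((config["model"], throttle, passed))
    return results
```

Because the test logic is written once and parameterized over configurations, adding a new machine variant means adding one entry to the configuration library rather than writing a new script.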
As this area continues to evolve, automated test-script generation promises that even test-case creation can be accomplished more efficiently, with high-level abstraction tools intelligently creating test scripts based on expected usage patterns and potential software vulnerabilities.
Another area where organizations are improving operational efficiency is in the scope-of-use of their test IP and the correlation of results produced by this IP. Organizations are removing the silos separating development teams at different phases of the process by making provisions for the sharing of test components such as simulation models, stimulus profiles, and report templates.
This reuse of test components enables developers to standardize the way they stimulate, visualize, and analyze data across development phases, reducing the development and maintenance cost of these systems. The end result is more consistent information, more test capacity, and greater insight into product performance as it transitions through the development process from concept to production.
In addition to connecting the various groups performing test functions across the process, organizations are also making investments to bridge the worlds of design and test, connecting them more tightly without inhibiting their individual execution.
Historically, other than high-level coordination of project efforts, communication between these worlds has typically been accomplished through manual and often error-prone processes that offer few opportunities for collaboration and traceability.
Staying competitive in rapidly changing markets
To fully realize the investments in test automation and test component reuse, organizations must evolve the way design and test teams coordinate efforts and communicate results.
Evolving tools to enable end-to-end traceability and collaboration across teams allows greater visibility into future resource demands and into the upstream and downstream impact of defects and change requests.
By incorporating traceability between testing and life-cycle management, development assets can be linked to and managed alongside the test cases in the quality plan, and test results can be made available to all teams and linked to test cases and requirements. This evolution allows teams to make more informed decisions and reduce their time to confidence in quality.
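A minimal sketch of such a traceability model might link requirements to test cases and test results, making coverage gaps and failing requirements queryable. The identifiers and field names here are hypothetical, not those of any particular life-cycle management tool:

```python
from dataclasses import dataclass

# Hypothetical minimal traceability model: requirements link to test
# cases, and test results link back to the cases that produced them.
@dataclass
class Requirement:
    req_id: str
    text: str

@dataclass
class TestCase:
    case_id: str
    covers: list  # requirement IDs this case traces to

@dataclass
class TestResult:
    case_id: str
    passed: bool

def uncovered_requirements(requirements, cases):
    """Requirements with no test case tracing to them."""
    covered = {rid for c in cases for rid in c.covers}
    return [r.req_id for r in requirements if r.req_id not in covered]

def failing_requirements(cases, results):
    """Requirements whose linked test cases have a failing result."""
    case_to_reqs = {c.case_id: c.covers for c in cases}
    failed = set()
    for res in results:
        if not res.passed:
            failed.update(case_to_reqs.get(res.case_id, []))
    return sorted(failed)
```

With these links in place, both design and test teams can answer the same questions from the same data: which requirements lack coverage, and which are implicated when a test fails.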
Through investments in test automation, test component reuse, and traceability and collaboration among all members of the development team, organizations can achieve the operational efficiency necessary to exceed their customers' quality expectations while continuing to innovate, without putting project timelines and budgets at risk.