The stringent design assurance guidelines imposed by DO-254 for custom micro-coded devices such as FPGAs present significant verification challenges within the avionics community. As the complexity of an FPGA design increases, so do the verification activities needed to satisfy the verification objectives of DO-254. As defined in the guidance, verification process activities may be satisfied through a combination of methods such as peer reviews, simulation analyses, and tests. For design assurance levels (DAL) A and B, it is critical that all FPGA pin-level requirements be verified through simulation and hardware tests, and that evidence of the results be documented and provided.
This article describes several significant challenges that can be encountered when verifying FPGA pin-level requirements during board-level testing under DO-254 guidelines. More importantly, it proposes a methodology that augments board-level testing to overcome these challenges.
FPGA testing challenges

Traditionally, FPGA testing is performed at the board level. The board contains the FPGA under test as its primary component, implementing the most complex functions and controls, and possibly even the main intellectual property (IP) of the board. The FPGA is also interconnected with other components on the board, and with the lack of test headers on the FPGA pins, visibility and controllability at the FPGA pin level are limited. At times the board even contains multiple FPGAs, and verification at the board level without first stabilizing each FPGA individually can lead to many problems and long project delays. A simplified example of a board under test is shown in Figure 1.
Figure 1: Board under test (Board level view)
Several significant challenges arise when verifying FPGA pin-level requirements at the board level, including the following:
• Matching the simulation and testing results, which involves re-mapping and comparing the hardware outputs to the corresponding RTL simulation results and tracing them to the design requirements, is considerably challenging (a minimal comparison sketch follows this list). Since most traditional testing methods do not allow driving the hardware with all combinations of stimulus defined by code coverage for RTL simulation, not all requirements can be traced, and additional analysis is usually required.
• Creating test vectors to verify requirements is a manual effort that can take three to six months for DAL A/B designs. For RTL simulation, testbenches are created to verify specific requirements, and the challenge is finding a way to reuse them to shorten the time spent during hardware testing.
• The FPGA design implementation process may introduce errors in the functionality or timing of the design under test (DUT); sometimes even an update of the synthesis tool can introduce such errors. Since the target device cannot contain any additional debug modules (e.g., probes used by JTAG debug tools), analyzing and debugging these problems is difficult.
• Visibility and controllability at the FPGA pin level are limited. The FPGA under test is interconnected with other components on the board; in addition, the board might not have enough test headers on the FPGA pins, and logic analyzers may not be able to capture all required data.
• Multiple testing environments are needed to verify different sets of test cases. This also involves manual rerouting of wires and cables, which is prone to human error. Documenting this process is also a challenge.
• Automation of the verification process is critical and can significantly reduce project completion time and overall project cost. Typically, multiple test cases must be developed, executed, and analyzed to verify the design requirements and any design change or fix. Handled manually, this process can take months to complete.
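To make the first challenge concrete, the following is a minimal sketch of the kind of re-mapping and comparison script a verification team might write. It is illustrative only: the CSV formats, the channel names, and the PIN_TO_RTL mapping table are assumptions, not part of the methodology described in this article.

```python
# Hypothetical sketch: re-map captured hardware outputs to RTL signal
# names and compare them cycle by cycle against simulation results.
# The file formats and the PIN_TO_RTL table are illustrative assumptions.
import csv

# Assumed mapping from logic-analyzer channel names to RTL signal names.
PIN_TO_RTL = {
    "CH0_DATA_OUT": "u_top/data_out",
    "CH1_DATA_VALID": "u_top/data_valid",
}

def load_rows(path):
    """Load a CSV with a header of signal names and one row per clock cycle."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def compare(hw_rows, sim_rows):
    """Yield (cycle, rtl_signal, hw_value, sim_value) for each mismatch."""
    for cycle, (hw, sim) in enumerate(zip(hw_rows, sim_rows)):
        for pin, rtl in PIN_TO_RTL.items():
            if hw[pin] != sim[rtl]:
                yield cycle, rtl, hw[pin], sim[rtl]

if __name__ == "__main__":
    hw = load_rows("hw_capture.csv")        # logic-analyzer export
    sim = load_rows("rtl_sim_golden.csv")   # RTL simulation results
    for cycle, sig, hw_v, sim_v in compare(hw, sim):
        print(f"cycle {cycle}: {sig} hw={hw_v} sim={sim_v}")
```

Even with such a script, each mismatch still has to be traced back to a design requirement, so the re-mapping step reduces but does not remove the analysis effort.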
Verifying FPGA-level requirements is imperative, but performing this process at the board level is quite challenging and at times not feasible. The methodology proposed in this article provides a feasible Hardware-in-the-Loop (HIL) testing environment that tests the target FPGA at speed by reusing the simulation testbench as the source of test inputs.
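As a rough illustration of that reuse, the sketch below replays a simulation stimulus dump through a hardware test controller. The stimulus file format and the apply_vector/sample_outputs controller interface are assumptions made for the example; a real at-speed setup would typically stream the vectors into on-board memory rather than apply them one call at a time.

```python
# Hypothetical sketch: replay RTL-testbench stimulus against the target
# FPGA through a hardware test controller. The controller interface
# (apply_vector / sample_outputs) is an illustrative assumption.

def read_stimulus(path):
    """Yield one input vector per clock cycle from a simulation dump,
    assumed here to contain lines of space-separated name=value pairs."""
    with open(path) as f:
        for line in f:
            if line.strip():
                yield dict(pair.split("=", 1) for pair in line.split())

class LoopbackController:
    """Trivial stand-in for real lab hardware so the sketch is runnable."""
    def apply_vector(self, vector):
        self._last = vector           # drive FPGA inputs (stubbed)
    def sample_outputs(self):
        return self._last             # capture FPGA outputs (stubbed)

def run_hil_test(controller, stimulus_path, log_path):
    """Apply each simulation-derived vector and log the sampled outputs
    so they can later be compared against the RTL simulation results."""
    with open(log_path, "w") as log:
        for cycle, vector in enumerate(read_stimulus(stimulus_path)):
            controller.apply_vector(vector)
            log.write(f"{cycle} {controller.sample_outputs()}\n")

if __name__ == "__main__":
    run_hil_test(LoopbackController(), "tb_stimulus.txt", "hil_results.log")
```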