A Survey on Formal Verification Techniques for Safety-Critical Systems-on-Chip
The high degree of miniaturization in the electronics industry has, for several years, been a driver pushing embedded systems into new fields and applications. One example is safety-critical systems, where the compact form factor helps to reduce costs and allows for the implementation of new techniques. The automotive industry is a prime example of a safety-critical area with a sharp rise in the adoption of microelectronics. With it came the creation of the ISO 26262 standard
with the goal of guaranteeing a high level of dependability in the designs. Other areas in the safety-critical domain have similar standards. However, these standards are mostly guidelines to ensure that designs reach the desired dependability level, without explicit instructions on how to do so. In the end, a design's success in fulfilling the standard is the result of a thorough verification process. Naturally, the goal of any verification team dealing with such important designs is complete coverage as well as conformity with the standards, but since these are complex hardware designs, complete functional verification is a difficult task. Of the several techniques that exist to verify hardware, each with its pros and cons, we studied six that are well established in academia and in industry. They fall into two categories: simulation, which needs extremely large amounts of time, and formal verification, which needs unrealistic amounts of resources. We therefore conclude that a hybrid approach offers the best balance between simulation (time) and formal verification (resources).

Along with these advantages, functional safety standards are in use to help guarantee a minimum level of dependability, since, in these fields, human lives are at stake. Nevertheless, these standards, e.g., ISO 26262 for automotive applications and DO-254 for the aerospace industry, are mostly general guidelines that steer the design process toward the desired dependability level. The "real" procedures that ensure this result is achieved usually take shape in a thorough verification process. During the design of hardware systems, verification teams use several techniques to verify functionality and to assure that no errors remain when the product is ready.
However, in the development flow of critical systems, further measures must be taken, as these systems constantly deal with human lives. Therefore, techniques that can somehow prove that a design conforms to its specification or to a standard are valuable additions to the verification flow. Such tools employ formal techniques based on mathematical methodologies, and with them it is possible to guarantee that portions of an architecture are correct. Formal Verification (FV) provides methods and techniques to mathematically prove the correctness of a system, such as Theorem Proving [6,7] and Model Checking [8,9], and can be adopted early in the development process. Theorem Proving relies heavily on higher-order logic and uses mathematical structures to build formulas that correspond to the behavior of the system. The verification process, then, is the evaluation of these formulas. Projects of any complexity can apply this method, since it deals not with states but with formulas. However, it demands from the user a high degree of knowledge both of the design under verification (DUV) and of logic. Moreover, since this process involves complex formulas, it is difficult to automate: the user must perform the proving systematically, creating the formulas, feeding them to the tool, and analyzing the results [10,11]. Model Checking verifies properties against a model to prove that the design conforms to its specification. The main advantage of this technique is that it reports illegal paths, known as counterexamples, which support the correction of bugs. One problem inherent to this technique, however, is state space explosion, since the number of states used to model the system grows exponentially with the number of variables [12,13]. To counter this problem, techniques to reduce the state space footprint exist, allowing the verification of larger projects.
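To make the idea of model checking concrete, the sketch below implements a minimal explicit-state checker in Python. This is a toy illustration only, not one of the cited tools: the breadth-first search, the 3-bit counter "design", and the safety property are all invented for the example. It shows the two behaviors described above: when the property fails, the checker returns the shortest counterexample trace; the exhaustive enumeration of states is also exactly where state space explosion comes from.

```python
from collections import deque

def model_check(initial, successors, is_safe):
    """Breadth-first search over the reachable state space.

    Returns None if every reachable state satisfies is_safe, or the
    shortest path (a counterexample) to an unsafe state otherwise.
    """
    parent = {initial: None}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if not is_safe(state):
            # Reconstruct the counterexample path back to the initial state.
            trace = []
            while state is not None:
                trace.append(state)
                state = parent[state]
            return list(reversed(trace))
        for nxt in successors(state):
            if nxt not in parent:   # each state is visited once
                parent[nxt] = state
                queue.append(nxt)
    return None  # the property holds on every reachable state

# Toy design: a 3-bit counter that wraps around (states 0..7).
succ = lambda s: [(s + 1) % 8]

# Safety property "the counter never reaches 7" fails, so the checker
# reports the counterexample trace 0, 1, ..., 7.
print(model_check(0, succ, lambda s: s != 7))
# A property on an unreachable state (9) is proven: the result is None.
print(model_check(0, succ, lambda s: s != 9))
```

Note that `parent` stores every reachable state: with one extra state variable per bit, the dictionary doubles in size, which is the exponential growth the survey refers to.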
Other techniques used in the verification flow of hardware systems are Equivalence Checking, Simulation, and Semiformal Verification. However, these techniques need a more mature version of the architecture to begin the verification process. Equivalence Checking compares different levels of the same implementation to guarantee that they are functionally the same. This step is commonly performed after the netlist is generated, where the generated description is compared to the register transfer level (RTL) code that describes it. Furthermore, this technique checks the consistency of an optimization after its implementation or between different abstraction levels; in such a case, a tool compares a higher-level functional description of the system to the generated or implemented low-level RTL code. Some caution is necessary here, as the correlation between these implementations is not direct. Simulation is widely used in industry, due to its easy adoption and the good picture it gives of the system's behavior. However, because it depends on input vectors to stimulate the circuit, as the number of input signals increases, more test vectors are needed to ensure that the system is completely correct and complies with the specifications. Thus, simulation cannot exhaustively cover the state space of large systems. In addition to formal verification techniques, which suffer from scalability problems, there are also hybrid approaches that combine the superior coverage of formal verification with the scalability of simulation. This survey aims to provide a detailed overview of the most widely used formal verification techniques and discusses their value for the verification of critical SoCs.
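The contrast between equivalence checking and simulation can be sketched in a few lines of Python. This is a hypothetical toy example, not how commercial tools work (they operate on netlists and RTL, not Python functions): two structurally different descriptions of a 1-bit full adder stand in for the "high-level" and "low-level" implementations. The exhaustive check is complete, but only because the input space is tiny; the simulation-style check samples random vectors and can miss a mismatch that lies outside the sample.

```python
import itertools
import random

# "Golden" high-level description: a 1-bit full adder.
def spec(a, b, cin):
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

# Structurally different "netlist-level" version of the same adder,
# written in propagate/generate form.
def impl(a, b, cin):
    p = a ^ b            # propagate
    g = a & b            # generate
    return p ^ cin, g | (p & cin)

def equivalence_check(f, g, n_inputs):
    """Compare f and g on all 2**n_inputs input vectors.

    Complete, but the cost doubles with every added input, so this is
    feasible only for small designs (real tools avoid the enumeration).
    """
    for vec in itertools.product((0, 1), repeat=n_inputs):
        if f(*vec) != g(*vec):
            return vec   # counterexample input vector
    return None          # functionally equivalent

def simulate(f, g, n_inputs, n_vectors, seed=0):
    """Simulation-style check: only a random sample of input vectors,
    so any mismatch outside the sample goes undetected."""
    rng = random.Random(seed)
    for _ in range(n_vectors):
        vec = tuple(rng.randint(0, 1) for _ in range(n_inputs))
        if f(*vec) != g(*vec):
            return vec
    return None

print(equivalence_check(spec, impl, 3))  # None: the adders agree everywhere
```

With 3 inputs the exhaustive loop runs 8 times; at 64 inputs it would need 2**64 vectors, which is why simulation falls back to sampling and why equivalence checking tools rely on symbolic rather than enumerative methods.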