What does formal verification mean?
Formal verification means proving that software satisfies its requirements. The concept is not limited to the IT industry; it applies to electronics and embedded systems as well. Formal techniques are applied to verify the system, mainly at the design and development stages of the software lifecycle.
Formal verification uses mathematical techniques to convincingly argue that a piece of software implements a specification. A specification may be very domain-specific, such as "a robot may not injure a person or, through inaction, allow a person to come to harm," or it may be generic, such as "no execution of the system dereferences a null pointer." So far we know very little about how to verify specifications that refer to entities in the physical world, as opposed to entities inside software applications.
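As a toy illustration of a generic specification like "no execution dereferences a null pointer," the sketch below states a property ("division never crashes on a zero divisor") and checks it exhaustively over a small bounded input space, the way a bounded checker would. All names here are our own illustrative choices, not a real tool's API.

```python
def safe_div(a, b):
    """Return a / b, guarding against a zero divisor."""
    if b == 0:
        return None          # guard: refuse instead of crashing
    return a / b

def property_holds(a, b):
    """Spec: safe_div never raises ZeroDivisionError, for any input."""
    try:
        safe_div(a, b)
        return True
    except ZeroDivisionError:
        return False

# Bounded check: enumerate every pair in a small finite domain.
domain = range(-5, 6)
assert all(property_holds(a, b) for a in domain for b in domain)
print("property verified over the bounded domain")
```

A real verifier would establish the property for *all* inputs, not just a finite sample; the exhaustive loop stands in for that reasoning on a domain small enough to enumerate.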
For example, logical equivalence checking is a form of formal verification that verifies the logical equivalence of RTL, gate-level, and transistor-level netlists. It offers a faster verification cycle against the technical design specification, because it does not rely on generating functional vectors for simulation in order to spot bugs.
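A minimal sketch of the idea behind equivalence checking: two descriptions of the same circuit, a behavioral "RTL-style" expression and a "gate-level" one built only from NAND gates, are shown equivalent by exhausting the truth table. The names are illustrative; real tools use BDDs or SAT solving rather than enumeration.

```python
from itertools import product

def rtl_xor(a, b):
    # "RTL-style" description: behavioral exclusive-or.
    return a ^ b

def nand(a, b):
    return 1 - (a & b)

def gate_xor(a, b):
    # "Gate-level" description: XOR built from four NAND gates.
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

def equivalent(f, g, n_inputs):
    # Exhaustive truth-table comparison over all input vectors.
    return all(f(*bits) == g(*bits)
               for bits in product((0, 1), repeat=n_inputs))

print(equivalent(rtl_xor, gate_xor, 2))  # → True
```

Note that no simulation vectors were hand-written: the check covers every input by construction, which is exactly the appeal over simulation-based verification.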
Applications of formal verification
Let’s consider embedded systems design. Here, formal verification is mainly used to verify the equivalence of the design at different stages of the ASIC flow, making sure that no unintended logical changes are introduced.
An RTL (Register-Transfer Level) design needs proper verification to check whether it contains any defects or bugs. Often the functionality is unchanged but the coding style is revised, or certain coding rules have to be followed. Formal verification can be used here to ensure that the new RTL matches the old RTL. For small projects, formal verification takes less time, as we do not need to run all the test cases.
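The old-versus-new RTL comparison can be mimicked in miniature. Below, a multiplexer is restyled from an if/else form into a pure boolean expression, and an exhaustive check confirms the two agree on every input. This is a toy stand-in for what an equivalence checker does at scale, with names of our own choosing.

```python
from itertools import product

def mux_old(sel, a, b):
    # Original "RTL": behavioral if/else multiplexer.
    if sel:
        return a
    return b

def mux_new(sel, a, b):
    # Restyled "RTL": same function, written as a boolean expression.
    return (sel & a) | ((1 - sel) & b)

# Equivalence check: the restyle must not change behavior.
ok = all(mux_old(s, a, b) == mux_new(s, a, b)
         for s, a, b in product((0, 1), repeat=3))
print(ok)  # → True
```

Because only the coding style changed, the check passes; had the restyle altered behavior on even one input combination, the exhaustive comparison would catch it.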
The basic aim of formal verification is to eliminate the chance of certain sorts of bugs in one exceedingly narrow layer of the system: the layer being verified. An application may satisfy every property we ask of it, but once we compile it with a buggy compiler, run it on a defective operating system, or execute it on a faulty processor, the resulting problems still have to be discovered through testing. To extend formal verification across a wide range of layers, we would need verified operating systems, compilers, libraries, and chips, chained together into a correctness proof for a realistic software stack, which has not yet happened.
The types of requirements or specifications that can be verified include functional correctness, protocols, real-time behavior, memory safety, security, and the robustness of the software. Examples of formal verification methods are deductive verification, modeling tools, simulators, static analysis, and so on.
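As a concrete illustration of verifying a protocol property, the sketch below is a toy explicit-state model checker (all names are our own). It enumerates every reachable state of a two-process lock protocol and checks the mutual-exclusion invariant: both processes are never in their critical section at once.

```python
from collections import deque

# States are tuples (pc0, pc1, lock), where each pc is
# "idle" | "waiting" | "critical" and lock is a boolean.

def step(pcs, i, new_pc, lock):
    nxt = list(pcs)
    nxt[i] = new_pc
    return (nxt[0], nxt[1], lock)

def successors(state):
    pcs, lock = state[:2], state[2]
    for i in (0, 1):
        if pcs[i] == "idle":
            yield step(pcs, i, "waiting", lock)
        elif pcs[i] == "waiting" and not lock:
            yield step(pcs, i, "critical", True)   # acquire the lock
        elif pcs[i] == "critical":
            yield step(pcs, i, "idle", False)      # release the lock

def check_mutual_exclusion(initial):
    # Breadth-first search over the reachable state space.
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if state[0] == "critical" and state[1] == "critical":
            return False                           # invariant violated
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True                                    # holds in every reachable state
```

Calling `check_mutual_exclusion(("idle", "idle", False))` returns `True`: the invariant holds in every reachable state. Unlike a test suite, which samples a few interleavings, the search covers all of them, which is why model checking counts as verification rather than testing for finite models like this one.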
Why is formal verification used to replace some kinds of testing?
Now the question is: under what conditions can formal verification replace testing?
- The formal verification tool is considered trustworthy.
- The specification is considered trustworthy.
- The specification is considered to capture all the properties of interest.
- The layers of the system not being verified (compiler, OS) are considered trustworthy.
- The application is safety- or security-critical.
- More precise assurance than testing provides, such as deductive verification, is required.
- Reuse and automation are possible.
- Documentation and maintenance are possible.
The main reason is cost. Testing techniques are improving, but not at a rapid rate. The basic problem with testing is that we cannot do an awfully good job of it in any realistic amount of time. The great hope of formal verification is that, after the massive initial investment needed to get going, incremental re-verification will be relatively inexpensive, permitting top-quality software to be delivered very rapidly. In the future, reduction in cost, not reduction in defects, may be the killer application of formal methods. Testing and formal methods converge: as we expand the one, we restrict the need for the other.