Publication:
Selective Neuron Re-Computation (SNRC) for Error-Tolerant Neural Networks

Publication date
2022-03-01
Publisher
IEEE
Abstract
Artificial Neural Networks (ANNs) are widely used to solve classification problems in many machine learning applications. When errors occur in the computational units of an ANN implementation, due to, for example, radiation effects, the result of an arithmetic operation can be changed and, therefore, the predicted classification class may be erroneously affected. This is not acceptable when ANNs are used in safety-critical applications, because an incorrect classification may result in a system failure. Existing error-tolerant techniques usually rely on physically replicating parts of the ANN implementation or incur a significant computation overhead. Therefore, efficient protection schemes are needed for ANNs that run on a processor in resource-limited platforms. A technique referred to as Selective Neuron Re-Computation (SNRC) is proposed in this paper. Based on the ANN structure and its algorithmic properties, SNRC identifies the cases in which errors have no impact on the outcome; errors therefore need to be handled by re-computation only when the classification result is detected as unreliable. Compared with existing temporal redundancy-based protection schemes, SNRC saves more than 60 percent of the re-computation overhead (more than 90 percent in many cases) while achieving complete error protection, as assessed over a wide range of datasets. Different activation functions are also evaluated.
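As a rough illustration of the selective re-computation idea summarized in the abstract, the sketch below shows one possible trigger condition in Python. The margin-based reliability check, the threshold value, and the re-execution of the full forward pass (the paper re-computes selected neurons rather than the whole network) are assumptions made here for illustration only; this is not the authors' implementation.

    import numpy as np

    def forward(x, weights, biases):
        # Simple fully connected forward pass with sigmoid activations
        # (one of the activation functions evaluated in the paper).
        a = x
        for W, b in zip(weights, biases):
            a = 1.0 / (1.0 + np.exp(-(W @ a + b)))
        return a

    def classify_with_selective_recomputation(x, weights, biases, margin=0.1):
        # First evaluation, which may be corrupted by a transient error.
        scores = forward(x, weights, biases)
        top_two = np.sort(scores)[-2:]
        # Assumed reliability check: if the best and second-best output
        # scores are close, a corrupted arithmetic result could have
        # flipped the winning class, so the computation is repeated;
        # otherwise the first result is accepted with no extra cost.
        if top_two[1] - top_two[0] < margin:
            scores = forward(x, weights, biases)  # selective re-computation
        return int(np.argmax(scores))

Because most inputs are classified with a comfortable margin, the re-computation branch is taken only rarely, which is the source of the overhead savings reported in the abstract.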
Keywords
Neural networks, Machine learning, Sigmoid, Error-tolerance
Bibliographic citation
IEEE Transactions on Computers, vol. 71, no. 3, pp. 684-695, 2022.