A Self-Calibrated Activation Neuron Topology for Efficient Resistive-Based In-Memory Computing

In-Memory Computing (IMC) accelerators based on resistive crossbars are emerging as a promising pathway toward improved energy efficiency in artificial neural networks. While significant research efforts are directed toward designing advanced resistive memory devices, the nonidealities associated with practical device implementation are often overlooked. Existing solutions typically compensate for these nonidealities during off-chip training, which introduces additional complexity and fails to account for random errors such as noise, device failures, and cycle-to-cycle variability. To tackle this challenge, this work proposes a self-calibrated activation neuron topology that provides fully online nonlinearity compensation for IMC accelerators. The neuron merges multiply-accumulate operations with the Rectified Linear Unit (ReLU) activation function in the analog domain for increased efficiency. Self-calibration is integrated into the data conversion process to minimize overhead and operate fully online. The proposed activation neuron is designed and simulated in 22 nm FDSOI CMOS technology. The design demonstrates robustness across a wide temperature range (-40°C to 80°C) and under various process corners, with a maximum accuracy loss of 1 LSB for an 8-bit activation accuracy.
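The merged MAC-plus-ReLU behavior described above can be summarized functionally: a crossbar column accumulates weighted inputs as summed currents, and the neuron clamps negative results to zero before producing an 8-bit activation code. The following is a minimal software sketch of that functional behavior, not a model of the analog circuit itself; the function name, the unit-scale mapping, and the `full_scale` parameter are illustrative assumptions.

```python
# Illustrative behavioral model (hypothetical, not the analog circuit):
# one crossbar column performs a multiply-accumulate (MAC), and the
# activation neuron applies ReLU before 8-bit quantization.

def mac_relu_8bit(inputs, conductances, full_scale):
    """Functional sketch of one column followed by the activation neuron.

    inputs       -- input values driving the crossbar rows (arbitrary units)
    conductances -- programmed device weights for this column
    full_scale   -- analog value mapped to the maximum 8-bit code (255)
    """
    # MAC: row contributions sum on the column line
    acc = sum(v * g for v, g in zip(inputs, conductances))
    # ReLU merged into the conversion path: negative results clamp to zero
    acc = max(acc, 0.0)
    # 8-bit quantization with saturation at the converter output
    code = round(acc / full_scale * 255)
    return min(max(code, 0), 255)

print(mac_relu_8bit([0.5, -1.0, 0.25], [1.0, 0.5, 2.0], 1.0))
```

Here the weighted sum is 0.5, so the output code is half of full scale; a column whose accumulated result is negative yields code 0, reflecting the ReLU clamp.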