September 30, 2025
Master Thesis
Title
Design of a Simply Sufficient Leaky Integrate-and-Fire Neuron in 22nm FDSOI
Abstract
In this thesis, the design process of a simply sufficient leaky integrate-and-fire (LIF) neuron is discussed in order to examine how much precision is needed in a hardware design. To this end, variations of the hardware parameters are analyzed, and the impact of these variations is then tested in the snnTorch framework.
Initially, different hardware approaches are compared, and ultimately the circuit by Eslahi is chosen as the basis for a circuit implementation in 22 nm FDSOI technology. Monte Carlo simulations are conducted to observe the hardware neuron and synapse parameters under the impact of process variations and mismatch. The results show that the hardware neuron design is robust with respect to variations of these parameters.
After the variation values are acquired from the hardware modules, a custom software model is designed in the snnTorch framework to simulate and test the impact of the parameter variations on the performance of the spiking neural network (SNN). During training of the SNN with 500 neurons in the hidden layer, an accuracy of 88.34 % is reached. The software test results show that the synapse parameters have the largest influence on the accuracy of the network, whereas the influence of the neuron parameters is negligible.
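A minimal sketch of how such a feed-forward SNN with 500 LIF neurons in the hidden layer could be expressed in snnTorch is given below. Only the hidden-layer size is taken from the abstract; the input and output dimensions, the membrane decay constant beta, and the number of simulation time steps are illustrative assumptions, not values from the thesis.

```python
import torch
import torch.nn as nn
import snntorch as snn

num_inputs = 784     # assumption: flattened image input (e.g. 28x28 pixels)
num_hidden = 500     # hidden-layer size stated in the abstract
num_outputs = 10     # assumption: number of output classes
beta = 0.9           # assumption: membrane potential decay constant of the LIF neurons
num_steps = 25       # assumption: number of simulation time steps per sample

class SNN(nn.Module):
    def __init__(self):
        super().__init__()
        # synapse (linear) layers followed by leaky integrate-and-fire neurons
        self.fc1 = nn.Linear(num_inputs, num_hidden)
        self.lif1 = snn.Leaky(beta=beta)
        self.fc2 = nn.Linear(num_hidden, num_outputs)
        self.lif2 = snn.Leaky(beta=beta)

    def forward(self, x):
        # reset membrane potentials at the start of each forward pass
        mem1 = self.lif1.init_leaky()
        mem2 = self.lif2.init_leaky()
        spk_out = []
        for _ in range(num_steps):
            cur1 = self.fc1(x)
            spk1, mem1 = self.lif1(cur1, mem1)
            cur2 = self.fc2(spk1)
            spk2, mem2 = self.lif2(cur2, mem2)
            spk_out.append(spk2)
        # output spike trains over time, e.g. for rate-coded classification
        return torch.stack(spk_out)
```

Parameter variations extracted from the hardware (e.g. spread of synapse weights or neuron decay constants) could then be injected by perturbing the corresponding model parameters before evaluation; the concrete variation model used in the thesis is not reproduced here.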
Thesis Note
München, TU, Master Thesis, 2025
Author(s)
File(s)
Rights
Use according to copyright law
Language
English