Physics Overview
Paper Read
Physics for Neuromorphic Computing 2020
Human brain:
- large inter-connectivity and high dimensionality (the white matter has a high fractal dimension).
- memory and computation are not separated; synchronous (clocked) communication is avoided.
- large fan-in/fan-out & low energy consumption.
- Compared to artificial neural networks:
- neurons are more than non-linear functions: they spike, leak, are stochastic, oscillate, synchronize, etc.
- synapses are more than analog weights: they leak, have their own time scales and parameter patterns, are stochastic, etc.
Mapping AI onto physical systems:
- Neuromorphic chips using memristors (memory-resistors).
- learning on these devices is hard.
- Photonic Neural networks.
- neurons - optical resonators; synapses - interferometers & optically active phase change materials.
- large size; energy cost of lasers.
Materials and physics used - oxide electronics.
- Conductive bridge devices can emulate short- and long-term memory.
- Materials exhibiting phase transitions (e.g. Mott insulators) can emulate spiking neurons.
- Chalcogenide-based phase-change memories.
- Organic materials.
- Flux quantization in superconductive Josephson junctions.
Unsupervised learning with Spike Timing Dependent Plasticity (STDP) - weights are updated depending on the timing of spikes occurring on both sides of a synapse.
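A common pair-based form of the STDP window (my notation; the overview does not commit to a specific curve) - the weight is potentiated when the pre-synaptic spike precedes the post-synaptic one, and depressed otherwise:

$$
\Delta w =
\begin{cases}
A_+ \, e^{-\Delta t/\tau_+}, & \Delta t > 0 \quad \text{(pre before post: potentiation)} \\
-A_- \, e^{\Delta t/\tau_-}, & \Delta t < 0 \quad \text{(post before pre: depression)}
\end{cases}
\qquad \Delta t = t_{\text{post}} - t_{\text{pre}}
$$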
| | CMOS synapses and neurons | Resistive switching synapses with CMOS neurons | Photonic synapses and neurons | Spintronic synapses and neurons | Superconductive synapses and neurons |
| --- | --- | --- | --- | --- | --- |
| Connections | wires | wires | light | microwaves | wires or microwaves |
| Min neuron lateral size | 10 µm | 10 µm | 100 µm | 10 nm | 20 nm |
| Min synapse lateral size | 10 µm | 10 nm | 1 µm | 10 nm | 20 nm |
| Advantages | commercial | nanoscale synapses, technology-ready | wavelength multiplexing, can be totally passive (zero energy consumption) | nanoscale synapses and neurons, almost commercial technology | low energy consumption (aside from cryogenic requirements), all identical spikes |
| Disadvantages | size of neurons and synapses, no in-memory computing | size of neurons, complex wiring | size of neurons and synapses, dissipation due to lasers | scalability to be demonstrated | scalability to be demonstrated |
| Chips | inference | inference coming soon | no | no | no |
Spiking-Neural-Network
Paper Read
Spiking Neural Networks and Their Applications: A Review 2022
- Introduction of: biological neurons (dendrites, soma, axon, synapse, neurotransmitters), artificial neural networks (r = f(Wu + b)), and spiking neural networks (spike times).
- Spiking Neuron Models (see wiki - Biological neuron model for more):
- Hodgkin-Huxley Model. (efficiency-, plausibility+) models the K and Na ion channels.
- Leaky Integrate and Fire Model. (efficiency+, plausibility-) ignores ion channels (a minimal sketch follows this list).
- Izhikevich Model. (efficiency+, plausibility+) uses a 2D system (membrane potential & a recovery variable for ionic currents).
- Adaptive Exponential Integrate-and-Fire Model. (efficiency+, plausibility=) uses a 2D system (membrane potential & a slow adaptation variable).
- Synaptic Models: decay and rise of the PSC (post-synaptic current).
- SNN Learning:
- Spike-Based Backpropagation.
- Spike Timing Dependent Plasticity (STDP).
- ANN-to-SNN Conversion. converts ReLU activations into IF neurons.
- Spike Encoding: encoding information into spikes and decoding it back - rate encoding & pulse (temporal) encoding.
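A minimal NumPy sketch tying rate encoding and the LIF neuron together (my own simplification; the constants tau, v_th and w are illustrative, not values from the review):

```python
import numpy as np

rng = np.random.default_rng(0)

# Rate encoding: an intensity in [0, 1] becomes a Bernoulli (Poisson-like) spike train.
def rate_encode(intensity, n_steps, max_prob=0.5):
    return rng.random(n_steps) < intensity * max_prob

# Leaky integrate-and-fire neuron driven by a single input spike train.
def lif(spikes, tau=20.0, v_th=1.0, v_reset=0.0, w=0.3, dt=1.0):
    v, out = v_reset, []
    for s in spikes:
        v += dt * (-v / tau) + w * s      # leak plus weighted synaptic input
        if v >= v_th:                     # threshold crossing -> emit a spike
            out.append(1)
            v = v_reset
        else:
            out.append(0)
    return np.array(out)

in_spikes = rate_encode(0.8, n_steps=200)
out_spikes = lif(in_spikes)
print("input rate:", in_spikes.mean(), "output rate:", out_spikes.mean())
```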
Spike-FlowNet: Event-based Optical Flow Estimation with Energy-Efficient Hybrid Neural Networks 2020, github. ANN+SNN optical flow for event camera.
- ANNs for frame-based (pixel) images rely on photo-consistency constraints; SNNs fit event cameras (bio-inspired silicon retinas).
- SNN problem: the number of spikes drastically vanishes at deeper layers.
- Make a simple implementation of an IF (integrate-and-fire) SNN in Python.
Intel Loihi (wikichip) based works:
- Reinforcement co-Learning of Deep and Spiking Neural Networks for Energy-Efficient Mapless Navigation with Neuromorphic Hardware 2020: a hybrid SNN + DNN framework.
- SNN (LIF, leaky-integrate-and-fire): state-to-action (actor) network, trained by backpropagation using a pseudo-gradient function (a minimal surrogate-gradient sketch follows this block).
- transformed into an end-to-end SNN compared to their previous work.
- DNN: action-value (critic) network.
- Spiking Neural Network on Neuromorphic Hardware for Energy-Efficient Unidimensional SLAM 2019.
- inspired by the spatial representation in mammalian brains.
- head direction network, reference frame transformation network, distance mapping network, observation likelihood network, Bayesian inference network.
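The pseudo-gradient training mentioned above replaces the non-differentiable spike with a smooth surrogate in the backward pass; a minimal PyTorch sketch (the fast-sigmoid surrogate and the constant 10.0 are my assumptions, not the papers' exact choice):

```python
import torch

class SpikeFn(torch.autograd.Function):
    """Heaviside spike in the forward pass, surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, v):                 # v: membrane potential minus threshold
        ctx.save_for_backward(v)
        return (v > 0).float()           # non-differentiable hard threshold

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        surrogate = 1.0 / (1.0 + 10.0 * v.abs()) ** 2   # derivative of a fast sigmoid
        return grad_output * surrogate

v = torch.randn(4, requires_grad=True)
SpikeFn.apply(v).sum().backward()
print(v.grad)    # non-zero gradients flow despite the hard threshold
```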
Python implementation 2018, SpykeTorch 2021, Brian2 2008.
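For orientation, a minimal Brian2 LIF neuron could look like this (the constant drive 1.2 and the 10 ms time constant are arbitrary illustration values):

```python
from brian2 import NeuronGroup, SpikeMonitor, run, ms

# One leaky integrate-and-fire neuron with a constant supra-threshold drive.
eqs = 'dv/dt = (1.2 - v) / (10*ms) : 1'
group = NeuronGroup(1, eqs, threshold='v > 1', reset='v = 0', method='exact')
monitor = SpikeMonitor(group)
run(100 * ms)
print(monitor.t)   # spike times
```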
First-Spike-Based Visual Categorization Using Reward-Modulated STDP 2017: supervised learning via Reward-Modulated STDP (R-STDP, using RL). R-STDP can change the behavior of a neuron. Implementation using SpykeTorch.
- Layer 1: converts the image to spike latencies based on the saliency of its oriented edges (a toy latency-encoding sketch follows this list).
- Layer 2: local pooling.
- Layer 3: integrate-and-fire neurons. (trainable)
- Layer 4: decision making.
- RL supervised update (github):
- My test playground - SNN Heading Estimation.
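A toy version of the Layer-1 latency encoding (stronger pixel -> earlier spike). This is my own simplification: it skips the oriented-edge (DoG) filtering step and is not the encoding utility SpykeTorch itself ships:

```python
import numpy as np

def intensity_to_latency(img, n_steps=20):
    """Map pixel intensities of a 2D image to spike times: brighter pixels fire earlier."""
    img = (img - img.min()) / (img.max() - img.min() + 1e-9)    # normalize to [0, 1]
    t = np.round((1.0 - img) * (n_steps - 1)).astype(int)       # bright -> early time step
    steps = np.arange(n_steps).reshape(-1, 1, 1)                # time axis for broadcasting
    return (steps == t).astype(np.uint8)                        # (n_steps, H, W) spike tensor

img = np.random.rand(4, 4)
print(intensity_to_latency(img).sum(axis=0))   # exactly one spike per pixel over time
```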
Bioinspired Programming of Memory Devices for Implementing an Inference Engine 2015.
- problem: the von Neumann bottleneck - a large energy budget is required because memory and computing are separated (a neuron only performs basic operations, but each depends on a high number of memory accesses).
- long-term memory comes from synaptic plasticity; synapses tend to reinforce causal links -> spike-timing-dependent plasticity (STDP), which is local and unsupervised.
- Any synapse that contributes to the firing of a post-synaptic neuron should be made strong, i.e. its value should be increased.
- Synapses that do not contribute to the firing of a post-synaptic neuron should be diminished, i.e. their values should be decreased.
- Memory devices:
- Multilevel Memory: (a) cumulative memristive device; (b) phase-change memory.
- Stochastic Synapse: (c) conductive bridge memory; (d) STT-MTJ (the basic cell of STT-MRAM). (a toy stochastic-synapse sketch follows below)
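A toy sketch of the stochastic-synapse idea: a binary device that is probabilistically SET on causal pre->post spike pairs and RESET otherwise (the switching probabilities are illustrative, not device-measured values from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def stochastic_binary_synapse(w, pre_before_post, p_set=0.1, p_reset=0.05):
    """Binary weight w in {0, 1}: probabilistic SET on causal spike pairs,
    probabilistic RESET on non-causal ones (simplified STDP for binary devices)."""
    if pre_before_post:
        return 1 if rng.random() < p_set else w     # strengthen the causal link
    return 0 if rng.random() < p_reset else w       # weaken the non-causal link

# Over many causal pairings the synapse almost surely ends up SET.
w = 0
for _ in range(100):
    w = stochastic_binary_synapse(w, pre_before_post=True)
print(w)
```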