
Speaking of analog computation:

A single artificial neuron could be implemented as:

Weighted Sum

Using a summing amplifier:

net = -Σ_i (Rf/Ri) * xi

Where each resistor ratio Rf/Ri sets a synaptic weight. Note that the standard summing amplifier inverts, so a second inverting stage restores the sign if needed.
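A minimal sketch of that weighted-sum stage, modeling an ideal inverting summing amplifier. The specific resistor values (10k feedback, per-input resistors) are illustrative assumptions, not from the comment above:

```python
def summing_amp(inputs, r_in, r_f=10e3):
    """Output voltage of an ideal inverting summing amplifier.

    Each input voltage x_i sees weight Rf/Ri; the op-amp inverts the sum.
    inputs: list of input voltages
    r_in:   list of input resistors Ri (ohms), one per input
    r_f:    feedback resistor Rf (ohms)
    """
    return -sum((r_f / ri) * xi for xi, ri in zip(inputs, r_in))

# Example: resistor ratios of 1.0, 0.5, and 2.0 act as the weights.
out = summing_amp([0.2, 0.4, -0.1], [10e3, 20e3, 5e3])
# out = -(1.0*0.2 + 0.5*0.4 + 2.0*(-0.1)) = -0.2
```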

Activation Function

Common op-amp activation circuits:

Saturating function: via op-amp with clipping diodes → approximated sigmoid

Hard limiter: comparator behavior for step activation

Tanh-like response: differential pair circuits
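The three activation shapes above can be sketched as idealized transfer functions. The clipping voltage and thermal-voltage values are assumed for illustration; a real circuit's curve depends on its components:

```python
import math

def hard_limiter(v, threshold=0.0, v_high=1.0, v_low=-1.0):
    # Comparator behavior: output rails high or low around the threshold.
    return v_high if v > threshold else v_low

def diode_clipper(v, v_clip=0.6):
    # Op-amp with clipping diodes: linear in the middle, saturating near
    # +/- v_clip -- a crude piecewise approximation of a sigmoid shape.
    return max(-v_clip, min(v_clip, v))

def diff_pair(v, v_t=0.026):
    # Bipolar differential pair: the output current difference follows
    # tanh(v / (2*Vt)), with Vt the thermal voltage (~26 mV at room temp).
    return math.tanh(v / (2 * v_t))
```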

Learning

Early analog systems often lacked on-device learning; weights were manually set with potentiometers or stored using:

Memristive elements (recent)

Floating-gate MOSFETs

Programmable resistor networks
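For the programmable-resistor case, setting a weight means picking the closest realizable conductance. A hedged sketch, assuming a hypothetical binary-weighted ladder and a 10 kOhm feedback resistor (both illustrative, not from the original):

```python
R_F = 10e3  # assumed feedback resistor (ohms)

# Assumed 15-step ladder: Ri = Rf/n realizes integer weight n = Rf/Ri.
LADDER = [R_F / n for n in range(1, 16)]

def program_weight(target):
    """Pick the ladder resistor whose Rf/Ri ratio is closest to target.

    Returns (chosen Ri in ohms, weight actually realized).
    """
    best = min(LADDER, key=lambda ri: abs(R_F / ri - target))
    return best, R_F / best

ri, realized = program_weight(3.3)
# realized = 3.0: the nearest weight this coarse network can represent
```

The quantization error (0.3 here) is exactly the kind of precision limit that made trimming with potentiometers, or moving to floating-gate and memristive storage, attractive.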


