Understanding Neurons and Artificial Neural Networks

From Biological Foundations to Early Models

The human brain is composed of approximately 100 billion neurons. Each neuron consists of three main parts: dendrites (which receive input), the soma, or cell body (which integrates the incoming signals), and the axon (which sends output). Information is conveyed between neurons through synapses, small gaps between the axon of one neuron and the dendrites of the next. If the total signal the soma receives surpasses a certain threshold, the neuron "fires," sending a signal down its axon.

This structure is mirrored by the perceptron, a basic mathematical model of a single neuron. A perceptron takes several inputs, multiplies each by a weight, sums the results, and applies a threshold activation function to produce a single binary output. The concept of the artificial neuron was proposed by neuroscientist Warren McCulloch and logician Walter Pitts in 1943. Building on their work, psychologist Frank Rosenblatt developed and implemented the perceptron, one of the first artificial neural networks, in 1958.
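
To make the weighted-sum-and-threshold idea concrete, here is a minimal Python sketch of a perceptron. The weights and bias are hand-picked to implement a logical AND and are purely illustrative; Rosenblatt's perceptron also included a rule for learning the weights from examples, which is omitted here.

```python
# Minimal perceptron sketch: weighted sum of inputs plus a bias,
# passed through a step (threshold) activation to give a binary output.
# The weights and bias are hand-picked to implement a logical AND;
# they are illustrative values, not part of Rosenblatt's original setup.

def step(z: float) -> int:
    """Threshold activation: fire (1) if the weighted sum is non-negative."""
    return 1 if z >= 0 else 0

def perceptron(inputs: list[float], weights: list[float], bias: float) -> int:
    """Weight each input, sum the results, add the bias, then apply the step."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return step(z)

# Chosen so the output is 1 only when both inputs are 1 (logical AND).
weights = [1.0, 1.0]
bias = -1.5

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", perceptron(list(x), weights, bias))
```

Running the loop prints 0 for the first three input pairs and 1 for (1, 1): only when both inputs are active does the weighted sum clear the threshold, just as a biological neuron fires only when its combined input is strong enough.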
