The McCulloch-Pitts neuron was the first mathematical model of how neurons work. It was created by Warren McCulloch (a neuroscientist) and Walter Pitts (a logician) in 1943. The model is deliberately simplified: it takes its inputs (the dendrites) as binary values of 0 or 1, sums them, and, based on a threshold, outputs a 0 or 1. Notice that the model only processes scalar values of 0 or 1, with 0 representing not firing and 1 representing firing. Every input contributes equally to whether the neuron fires.
A link to their paper: https://scholar.google.com/scholar?q=A+Logical+Calculus+of+Ideas+Immanent+in+Nervous+Activity
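The sum-and-threshold behavior described above can be sketched in a few lines of Python (the function name and the AND-gate threshold are illustrative choices, not from the paper):

```python
def mcp_neuron(inputs, threshold):
    """McCulloch-Pitts neuron: binary inputs contribute equally;
    the neuron fires (returns 1) when their sum reaches the threshold."""
    return 1 if sum(inputs) >= threshold else 0

# With a threshold of 2, two binary inputs behave like an AND gate:
print(mcp_neuron([1, 1], threshold=2))  # fires -> 1
print(mcp_neuron([1, 0], threshold=2))  # does not fire -> 0
```

Because every input carries the same weight, the only knob is the threshold, which is exactly the limitation Rosenblatt later addressed.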
In 1957, Frank Rosenblatt modified their model to accept real-valued (non-binary) inputs and added a synaptic weight to each input, allowing each input to contribute a different amount to the output. This model is called the perceptron.
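A minimal sketch of the perceptron's forward pass, assuming hypothetical weight and threshold values chosen for illustration:

```python
def perceptron(inputs, weights, threshold):
    """Perceptron forward pass: a weighted sum of real-valued inputs,
    thresholded to a binary output."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# The per-input weights let some inputs matter more than others:
# 0.9*0.8 + 0.2*0.1 = 0.74 >= 0.5, so the neuron fires.
print(perceptron([0.9, 0.2], weights=[0.8, 0.1], threshold=0.5))  # -> 1
```

Compared with the McCulloch-Pitts neuron, the weights are now parameters that can be adjusted, which is what makes the perceptron trainable.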
One thing to note is that these neuron models can only correctly classify linearly separable data, i.e. data that a single straight line (or hyperplane) can split into the two output classes.
To increase the computational power of these artificial neurons, we put them into networks.
Modern neural networks are still based on this model from the 1940s!