Tag Archives: Neurocomputation

IEEEGor Knows What You’re Thinking…

Last month, I, EEGor, took part in the Brain-Computer Interface Designers Hackathon (BR41N.IO), the opening event of the IEEE Systems, Man and Cybernetics Conference in Bari, Italy. Brain-Computer Interfaces (BCIs) are a class of technologies designed to translate brain activity into machine actions, to assist (currently in clinical trials) and, one day, enhance human beings. BCIs are receiving more and more media attention, most recently with the launch of Elon Musk’s newest company, Neuralink, which aims to set up a two-way communication channel between man and machine using a tiny chip embedded in the brain, with the further aim of one day, perhaps, making our wildest transhumanist dreams come true…

Continue reading

Neuronal Complexity: A Little Goes a Long Way…To Clean My Apartment

The classical model of signal integration in both biological and Artificial Neural Networks looks something like this:

f(\mathbf{s})=g\left(\sum\limits_i\alpha_is_i\right)

where g is some linear or non-linear output function whose parameters \alpha_i adapt to feedback from the outside world through changes to protein-dense structures near the point of signal input, namely the Post-Synaptic Density (PSD). In this simple model, integration is implied to occur at the soma (cell body), where the input signals s_i are combined and the result is broadcast to other neurons through downstream synapses via the axon. Generally speaking, neurons (both artificial and otherwise) exist in multilayer networks, composing the inputs of one neuron with the outputs of others to create cross-linked chains of computation that have been shown to be universal in their ability to approximate any desired input-output behaviour.

See more at Khan Academy
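
To make the formula above concrete, here is a minimal sketch in Python (NumPy) of the classical point-neuron model, f(s) = g(sum_i alpha_i s_i); the choice of tanh for the output function g and the example numbers are purely illustrative assumptions, not something prescribed in the post.

import numpy as np

def point_neuron(s, alpha, g=np.tanh):
    # Classical point-neuron model: f(s) = g(sum_i alpha_i * s_i)
    # s     : input signals arriving at the synapses
    # alpha : synaptic weights, the parameters that adapt with learning
    # g     : output function; tanh is an illustrative choice
    return g(np.dot(alpha, s))

# Example: three inputs integrated at the "soma"
s = np.array([0.5, -1.0, 2.0])
alpha = np.array([0.8, 0.1, 0.3])
print(point_neuron(s, alpha))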

Models of learning and memory have relied heavily on modifications to the PSD to explain changes in behaviour. Physically, these changes result from alterations in the concentration and density of the neurotransmitter receptors and ion channels that occur in abundance at the PSD; in actuality, however, these channels occur all along the membrane of the dendrite on which the PSD is located. Dendrites are something like a multi-branched mass of input connections belonging to each neuron. This raises the question of whether learning might in fact occur all along the length of each densely branched dendritic tree.
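
As a purely illustrative sketch of that idea (an assumption for this post, not a model it presents), one could let each dendritic branch integrate its own synapses through a local nonlinearity before the soma combines the branch outputs; plasticity could then, in principle, act on the per-branch weights as well as at the soma. The branch structure, the sigmoid nonlinearity, and the numbers below are all hypothetical.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dendritic_neuron(branch_inputs, branch_weights, soma_weights, g=np.tanh):
    # Each dendritic branch integrates its own synapses through a local
    # nonlinearity (sigmoid here, an assumption for illustration); the soma
    # then combines the branch outputs with its own weights.
    branch_outputs = np.array([
        sigmoid(np.dot(w, s)) for w, s in zip(branch_weights, branch_inputs)
    ])
    return g(np.dot(soma_weights, branch_outputs))

# Two hypothetical branches, each with its own synaptic inputs and weights
branch_inputs = [np.array([0.5, -1.0]), np.array([2.0, 0.1, -0.3])]
branch_weights = [np.array([0.8, 0.1]), np.array([0.3, -0.2, 0.5])]
soma_weights = np.array([1.0, 0.7])
print(dendritic_neuron(branch_inputs, branch_weights, soma_weights))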

Continue reading