allelization and other training optimizations. Weight
modification must take into account the impact of the
dendritic tree on learning signals. Our trainable im-
plementation of the AD uses an RBF metric based
on spike propagation time. We are uncertain how
well compartmental separation can be applied to rate-
based models. The model has not been tested using a
large number of inputs (> 100) or in networks with hidden layers of AD neurons.
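To make the RBF metric mentioned above concrete, here is a minimal sketch of a Gaussian radial basis function over spike propagation delays. The function name, the Gaussian kernel, and the per-input delay centers are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def rbf_response(delays, centers, sigma=1.0):
    """Gaussian RBF over spike propagation times (illustrative sketch).

    delays  : observed propagation times of input spikes (ms)
    centers : learned reference delays, one per input (ms)
    sigma   : kernel width controlling temporal selectivity
    """
    d2 = np.sum((np.asarray(delays) - np.asarray(centers)) ** 2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Response is maximal (1.0) when observed delays match the learned
# pattern exactly, and falls off smoothly with temporal mismatch.
print(rbf_response([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
```

Under this reading, training the `centers` (and possibly `sigma`) would tune the neuron's selectivity to particular spatiotemporal input patterns.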
There are many open questions related to dendritic computation. The AI community faces a size and energy bottleneck on the networks we can create. We need tools that allow us to do more with less, which may require a return to basics and to biology. For the neuroscientific community, there remains a gap in our understanding of how micro-level phenomena construct meso-level information processing, which in turn contributes to macro-level behaviors. We suspect that utilitarian neuron models of increased complexity can contribute to both.
REFERENCES
Beniaguev, D., Segev, I., and London, M. (2021). Sin-
gle cortical neurons as deep artificial neural networks.
Neuron, 109(17):2727–2739.
Bower, J. M. (2015). The 40-year history of modeling active
dendrites in cerebellar purkinje cells: emergence of
the first single cell “community model”. Frontiers in
computational neuroscience, 9:129.
Cuntz, H., Remme, M. W. H., and Torben-Nielsen, B.
(2014). The computing dendrite: from structure to
function, volume 11. Springer.
Elias, J. G. (1992). Spatial-temporal properties of artificial
dendritic trees. In [Proceedings 1992] IJCNN Inter-
national Joint Conference on Neural Networks, vol-
ume 2, pages 19–26. IEEE.
Elias, J. G., Chu, H.-H., and Meshreki, S. M. (1992). Sil-
icon implementation of an artificial dendritic tree. In
[Proceedings 1992] IJCNN International Joint Con-
ference on Neural Networks, volume 1, pages 154–
159. IEEE.
Fox, C. A. and Barnard, J. W. (1957). A quantitative study
of the purkinje cell dendritic branchlets and their rela-
tionship to afferent fibres. Journal of Anatomy, 91(Pt
3):299.
Häusser, M. and Mel, B. (2003). Dendrites: bug or feature? Current opinion in neurobiology, 13(3):372–383.
Jones, I. S. and Kording, K. P. (2020). Can single neurons solve MNIST? The computational power of biological dendritic trees. arXiv preprint arXiv:2009.01269.
McCulloch, W. S. and Pitts, W. (1943). A logical calculus
of the ideas immanent in nervous activity. The bulletin
of mathematical biophysics, 5(4):115–133.
Megías, M., Emri, Z., Freund, T., and Gulyás, A. (2001). Total number and distribution of inhibitory and excitatory synapses on hippocampal CA1 pyramidal cells. Neuroscience, 102(3):527–540.
Mel, B. W. (2016). Toward a simplified model of an active dendritic tree. Dendrites, pages 465–486.
Papoutsi, A., Kastellakis, G., Psarrou, M., Anastasakis, S.,
and Poirazi, P. (2014). Coding and decoding with den-
drites. Journal of Physiology-Paris, 108(1):18–27.
Poirazi, P. and Papoutsi, A. (2020). Illuminating dendritic
function with computational models. Nature Reviews
Neuroscience, 21(6):303–321.
Polsky, A., Mel, B. W., and Schiller, J. (2004). Compu-
tational subunits in thin dendrites of pyramidal cells.
Nature neuroscience, 7(6):621–627.
Rall, W. (1964). Theoretical significance of dendritic trees for neuronal input-output relations. In Reiss, R. F., editor, Neural theory and modeling: proceedings of the 1962 Ojai symposium, Stanford, Calif. Stanford University Press.
Teng, F. and Todo, Y. (2019). Dendritic neuron model and
its capability of approximation. In 2019 6th Inter-
national Conference on Systems and Informatics (IC-
SAI), pages 542–546. IEEE.
Wu, X., Liu, X., Li, W., and Wu, Q. (2018). Improved
expressivity through dendritic neural networks. In
Advances in neural information processing systems,
pages 8057–8068.
An Artificial Dendritic Neuron Model Using Radial Basis Functions