SciCombinator

Discover the latest and most talked-about scientific content & concepts.

Concept: Computational neuroscience

517

Some birds achieve primate-like levels of cognition, even though their brains tend to be much smaller in absolute size. This poses a fundamental problem in comparative and computational neuroscience, because small brains are expected to have a lower information-processing capacity. Using the isotropic fractionator to determine numbers of neurons in specific brain regions, here we show that the brains of parrots and songbirds contain on average twice as many neurons as primate brains of the same mass, indicating that avian brains have higher neuron packing densities than mammalian brains. Additionally, corvids and parrots have much higher proportions of brain neurons located in the pallial telencephalon compared with primates or other mammals and birds. Thus, large-brained parrots and corvids have forebrain neuron counts equal to or greater than those of primates with much larger brains. We suggest that the large numbers of neurons concentrated in high densities in the telencephalon substantially contribute to the neural basis of avian intelligence.

Concepts: Nervous system, Neuron, Brain, Human brain, Bird, Cerebrum, Mammal, Computational neuroscience

168

Spike pattern classification is a key topic in machine learning, computational neuroscience, and electronic device design. Here, we offer a new supervised learning rule based on Support Vector Machines (SVM) to determine the synaptic weights of a leaky integrate-and-fire (LIF) neuron model for spike pattern classification. We compare classification performance between this algorithm and other methods sharing the same conceptual framework. We consider the effect of postsynaptic potential (PSP) kernel dynamics on pattern separability, and we propose an extension of the method to decrease computational load. The algorithm performs well in generalization tasks. We show that the peak value of spike pattern separability depends on a relation between PSP dynamics and spike pattern duration, and we propose a particular kernel that is well-suited for fast computations and electronic implementations.
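
To make the idea concrete, here is a minimal sketch of the general approach: each afferent's spike train is convolved with a PSP kernel, the filtered values serve as features for a linear SVM, and the fitted SVM weights are read off as the LIF neuron's synaptic weights. The exponential kernel, the random data, and all parameters below are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch: synaptic weights for a LIF neuron obtained by fitting a
# linear SVM to PSP-filtered spike patterns.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_afferents, T, tau = 50, 200, 20.0           # afferents, time steps, PSP decay (ms)

def psp_features(spikes):
    """Convolve each afferent's spike train with an exponential PSP kernel
    and read out the trace value at the end of the pattern (one feature each)."""
    kernel = np.exp(-np.arange(T) / tau)
    traces = np.array([np.convolve(s, kernel)[:T] for s in spikes])
    return traces[:, -1]                       # PSP value at decision time

# Two classes of random spike patterns differing in firing rate (hypothetical data).
X = np.array([psp_features(rng.random((n_afferents, T)) < (0.02 if y else 0.05))
              for y in (0, 1) for _ in range(100)])
y = np.repeat([0, 1], 100)

clf = LinearSVC(C=1.0).fit(X, y)
weights = clf.coef_.ravel()                    # interpret as LIF synaptic weights
print("train accuracy:", clf.score(X, y))
```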

Concepts: Action potential, Machine learning, Computer, Computer science, Support vector machine, Pattern recognition, Computational neuroscience, Supervised learning

86

Neuromorphic computing promises to markedly improve the efficiency of certain computational tasks, such as perception and decision-making. Although software and specialized hardware implementations of neural networks have made tremendous strides, both implementations are still many orders of magnitude less energy efficient than the human brain. We demonstrate a new form of artificial synapse based on dynamically reconfigurable superconducting Josephson junctions with magnetic nanoclusters in the barrier. The spiking energy per pulse varies with the magnetic configuration, but in our demonstration devices, the spiking energy is always less than 1 aJ. This compares very favorably with the roughly 10 fJ per synaptic event in the human brain. Each artificial synapse is composed of a Si barrier containing Mn nanoclusters with superconducting Nb electrodes. The critical current of each synapse junction, which is analogous to the synaptic weight, can be tuned using input voltage spikes that change the spin alignment of Mn nanoclusters. We demonstrate synaptic weight training with electrical pulses as small as 3 aJ. Further, the Josephson plasma frequencies of the devices, which determine the dynamical time scales, all exceed 100 GHz. These new artificial synapses provide a significant step toward a neuromorphic platform that is faster and more energy-efficient, and that can thus attain far greater complexity than has been demonstrated with other technologies.
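
The energy and timescale figures quoted above can be put side by side with a line or two of arithmetic (the values are taken directly from the abstract):

```python
# Back-of-the-envelope comparison of the energy scales quoted above.
spike_energy_jj = 1e-18        # < 1 aJ per spike for the Josephson synapse
synapse_energy_brain = 10e-15  # ~10 fJ per synaptic event in the human brain
print(f"energy advantage: ~{synapse_energy_brain / spike_energy_jj:,.0f}x")  # ~10,000x

plasma_freq = 100e9            # > 100 GHz dynamical time scale
print(f"characteristic period: {1 / plasma_freq * 1e12:.0f} ps")  # 10 ps
```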

Concepts: Nervous system, Neuron, Brain, Human brain, Demonstration, Neural network, Volt, Computational neuroscience

61

An enduring and richly elaborated dichotomy in cognitive neuroscience is that of reflective versus reflexive decision making and choice. Other literatures refer to the two ends of what is likely to be a spectrum with terms such as goal-directed versus habitual, model-based versus model-free or prospective versus retrospective. One of the most rigorous traditions of experimental work in the field started with studies in rodents and graduated via human versions and enrichments of those experiments to a current state in which new paradigms are probing and challenging the very heart of the distinction. We review four generations of work in this tradition and provide pointers to the forefront of the field’s fifth generation.
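
The dichotomy is commonly formalized in reinforcement-learning terms; the sketch below contrasts a model-free cached-value update with a model-based one-step lookahead. This is a generic textbook formalization of the distinction, not the specific paradigms the review covers.

```python
# Schematic contrast between the two ends of the spectrum discussed above:
# model-free (cached values, updated retrospectively) vs. model-based
# (prospective planning over a learned model). Illustrative only.
import numpy as np

n_states, n_actions, alpha, gamma = 4, 2, 0.1, 0.9
Q = np.zeros((n_states, n_actions))                      # model-free value cache
T = np.ones((n_states, n_actions, n_states)) / n_states  # learned transition model
R = np.zeros((n_states, n_actions))                      # learned reward model

def model_free_update(s, a, r, s_next):
    """Retrospective TD update: nudge the cached value toward the sample."""
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])

def model_based_value(s, a):
    """Prospective one-step lookahead through the learned model."""
    return R[s, a] + gamma * (T[s, a] @ Q.max(axis=1))

# One-step demo on a hypothetical transition (s=0, a=1, r=1.0, s'=2).
model_free_update(0, 1, 1.0, 2)
print(Q[0, 1], model_based_value(0, 1))
```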

Concepts: Psychology, Brain, Decision making, Cognition, Memory, Decision theory, Decision making software, Computational neuroscience

49

The predominant focus in the neurobiological study of memory has been on remembering (persistence). However, recent studies have considered the neurobiology of forgetting (transience). Here we draw parallels between neurobiological and computational mechanisms underlying transience. We propose that it is the interaction between persistence and transience that allows for intelligent decision-making in dynamic, noisy environments. Specifically, we argue that transience (1) enhances flexibility, by reducing the influence of outdated information on memory-guided decision-making, and (2) prevents overfitting to specific past events, thereby promoting generalization. According to this view, the goal of memory is not the transmission of information through time, per se. Rather, the goal of memory is to optimize decision-making. As such, transience is as important as persistence in mnemonic systems.
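
A toy simulation makes the flexibility argument concrete: in an environment that changes midway, an estimator that forgets old evidence tracks the new state, while one that weights all past events equally does not. Everything here (the decay rate, the data) is an illustrative assumption.

```python
# Toy illustration of the flexibility argument: transience (exponential decay
# of old evidence) reduces the influence of outdated information.
import numpy as np

rng = np.random.default_rng(1)
signal = np.concatenate([np.full(500, 0.0), np.full(500, 5.0)])  # environment shifts
obs = signal + rng.normal(0, 1, size=signal.size)

persistent = np.cumsum(obs) / np.arange(1, obs.size + 1)  # never forgets
transient = np.zeros_like(obs)                            # forgets at rate 1 - decay
decay = 0.95
for t in range(1, obs.size):
    transient[t] = decay * transient[t - 1] + (1 - decay) * obs[t]

print("final error, no forgetting:  ", abs(persistent[-1] - 5.0))
print("final error, with forgetting:", abs(transient[-1] - 5.0))
```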

Concepts: Psychology, Sleep, Neuroscience, Memory, Cognitive science, Cognitive neuroscience, Neurobiology, Computational neuroscience

36

At the centenary of D’Arcy Thompson’s seminal work ‘On Growth and Form’, which pioneered the description of principles of morphological change during development and evolution, recent experimental advances allow us to study change in anatomical brain networks. Here, we outline potential principles for connectome development. We describe recent results on how spatial and temporal factors shape connectome development in health and disease. Understanding the developmental origins of brain diseases in individuals will be crucial for deciding on personalized treatment options. We argue that longitudinal studies, experimentally derived parameters for connection formation, and biologically realistic computational models are needed to better understand the link between brain network development, network structure, and network function.
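
As one concrete example of the kind of computational model meant here, the sketch below grows a network under a spatial wiring-cost rule, with connection probability decaying exponentially with distance. The rule and its parameters are hypothetical stand-ins, not experimentally derived values.

```python
# Sketch of one common class of connectome growth model: connection
# probability decays with spatial (wiring-cost) distance.
import numpy as np

rng = np.random.default_rng(2)
n_nodes, eta = 100, 0.5                     # eta: distance penalty (hypothetical)
pos = rng.random((n_nodes, 3))              # node positions, arbitrary units

dist = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
p_connect = np.exp(-eta * dist)             # exponential distance rule
np.fill_diagonal(p_connect, 0)

adjacency = rng.random((n_nodes, n_nodes)) < p_connect
adjacency = np.triu(adjacency, 1)           # undirected: keep upper triangle
print("edges formed:", adjacency.sum())
```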

Concepts: Medicine, Epidemiology, Disease, Infectious disease, Causality, Biology, Sociology, Computational neuroscience

33

In neuroscience, as in many other scientific domains, the primary form of knowledge dissemination is through published articles. One challenge for modern neuroinformatics is finding methods to make the knowledge from the tremendous backlog of publications accessible for search, analysis and the integration of such data into computational models. A key example of this is metascale brain connectivity, where results are not reported in a normalised repository. Instead, these experimental results are published in natural language, scattered among individual scientific publications. This lack of normalisation and centralisation hinders the large-scale integration of brain connectivity results. In this paper, we present text-mining models to extract and aggregate brain connectivity results from 13.2 million PubMed abstracts and 630,216 full-text publications related to neuroscience. The brain regions are identified with three different named entity recognisers and then normalised against two atlases: the Allen Brain Atlas (ABA) and the atlas from the Brain Architecture Management System (BAMS). We then use three different extractors to assess inter-region connectivity.
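
The overall pipeline shape can be sketched in a few lines: recognize brain-region mentions, normalise them against an atlas vocabulary, and count co-mentions as candidate connectivity evidence. The dictionary matcher below is a deliberately naive stand-in for the trained recognisers and relation extractors the paper actually uses, and the mini-atlas is hypothetical.

```python
# Minimal sketch of the pipeline shape: region NER -> atlas normalisation ->
# co-mention counting as candidate connectivity evidence.
import re
from collections import Counter
from itertools import combinations

# Hypothetical mini-atlas: surface form -> canonical atlas region.
atlas = {"hippocampus": "Hippocampus", "entorhinal cortex": "Entorhinal cortex",
         "prefrontal cortex": "Prefrontal cortex"}
pattern = re.compile("|".join(map(re.escape, atlas)), re.IGNORECASE)

def extract_pairs(abstract):
    """Return all normalised region pairs co-mentioned in one abstract."""
    regions = {atlas[m.group(0).lower()] for m in pattern.finditer(abstract)}
    return list(combinations(sorted(regions), 2))

abstracts = ["Projections from the entorhinal cortex to the hippocampus ...",
             "We recorded in prefrontal cortex and hippocampus ..."]
connectivity = Counter(p for a in abstracts for p in extract_pairs(a))
print(connectivity.most_common())
```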

Concepts: Neuroanatomy, Neuroscience, Human brain, Academic publishing, Science, Publishing, Computational neuroscience

26

Computational neuroscience has uncovered a number of computational principles used by nervous systems. At the same time, neuromorphic hardware has matured to a state where fast silicon implementations of complex neural networks have become feasible. En route to future technical applications of neuromorphic computing, the current challenge lies in the identification and implementation of functional brain algorithms. Taking inspiration from the olfactory system of insects, we constructed a spiking neural network for the classification of multivariate data, a common problem in signal and data analysis. In this model, real-valued multivariate data are converted into spike trains using “virtual receptors” (VRs). Their output is processed by lateral inhibition and drives a winner-take-all circuit that supports supervised learning. VRs are conveniently implemented in software, whereas the lateral inhibition and classification stages run on accelerated neuromorphic hardware. When trained and tested on real-world datasets, we find that the classification performance is on par with a naïve Bayes classifier. An analysis of the network dynamics shows that stable decisions in output neuron populations are reached within less than 100 ms of biological time, matching the time-to-decision reported for the insect nervous system. By leveraging a population code, the network tolerates the variability of neuronal transfer functions and the trial-to-trial variation that is inevitably present on the hardware system. Our work provides a proof of principle for the successful implementation of a functional spiking neural network on a configurable neuromorphic hardware system that can readily be applied to real-world computing problems.
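
A rate-based sketch of the data flow through the three stages (virtual receptors, lateral inhibition, winner-take-all) is given below; the spiking dynamics and the neuromorphic hardware are abstracted away, and all parameters are illustrative.

```python
# Rate-based sketch of the network stages described above (the paper uses
# spiking neurons on neuromorphic hardware; this only shows the data flow).
import numpy as np

rng = np.random.default_rng(3)

def virtual_receptors(x, centers, width=0.5):
    """Encode a real-valued sample by its similarity to 'virtual receptor' points."""
    return np.exp(-np.linalg.norm(x - centers, axis=1) ** 2 / width ** 2)

def lateral_inhibition(r, strength=0.5):
    """Each unit is suppressed by the mean activity of the other units."""
    return np.clip(r - strength * (r.sum() - r) / (r.size - 1), 0, None)

def winner_take_all(r, W):
    """The output population with the largest drive wins (predicted class)."""
    return int(np.argmax(W @ r))

x = rng.random(4)                         # one multivariate sample
centers = rng.random((10, 4))             # 10 virtual receptors (hypothetical)
W = rng.random((3, 10))                   # trained weights would go here
r = lateral_inhibition(virtual_receptors(x, centers))
print("predicted class:", winner_take_all(r, W))
```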

Concepts: Nervous system, Neuron, Brain, Insect, Neuroscience, Axon, Synapse, Computational neuroscience

22

We show that deep convolutional neural networks combined with nonlinear dimension reduction enable reconstructing biological processes based on raw image data. We demonstrate this by reconstructing the cell cycle of Jurkat cells and disease progression in diabetic retinopathy. In further analysis of Jurkat cells, we detect and separate a subpopulation of dead cells in an unsupervised manner and, in classifying discrete cell cycle stages, we reach a sixfold reduction in error rate compared to a recent approach based on boosting on image features. In contrast to previous methods, deep learning based predictions are fast enough for on-the-fly analysis in an imaging flow cytometer.

The interpretation of information-rich, high-throughput single-cell data is a challenge requiring sophisticated computational tools. Here the authors demonstrate a deep convolutional neural network that can classify cell cycle status on-the-fly.
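
The two-stage idea can be sketched as a small feature extractor followed by nonlinear dimension reduction; here an untrained toy CNN and t-SNE stand in for the trained network and the reduction method applied to real imaging-flow-cytometry data.

```python
# Sketch of the two-stage idea: a CNN turns raw cell images into features,
# and nonlinear dimension reduction lays those features out along a
# continuous axis (e.g. cell-cycle progression). Toy network, random "images".
import torch
import torch.nn as nn
from sklearn.manifold import TSNE

cnn = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),                      # 16-dimensional feature vector per image
)

images = torch.randn(200, 1, 32, 32)   # stand-in for single-cell images
with torch.no_grad():
    features = cnn(images).numpy()

embedding = TSNE(n_components=2, perplexity=30).fit_transform(features)
print(embedding.shape)                 # (200, 2) layout of the cell population
```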

Concepts: DNA, Cell nucleus, Cell division, Chromosome, Neuroscience, Neural network, Computational neuroscience, Unsupervised learning

17

The success of fMRI places constraints on the nature of the neural code. The fact that researchers can infer similarities between neural representations, despite fMRI’s limitations, implies that certain neural coding schemes are more likely than others. For fMRI to succeed given its low temporal and spatial resolution, the neural code must be smooth at the voxel and functional level such that similar stimuli engender similar internal representations. Through proof and simulation, we determine which coding schemes are plausible given both fMRI’s successes and its limitations in measuring neural activity. Deep neural network approaches, which have been forwarded as computational accounts of the ventral stream, are consistent with the success of fMRI, though functional smoothness breaks down in the later network layers. These results have implications for the nature of the neural code and ventral stream, as well as what can be successfully investigated with fMRI.
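
The smoothness argument can be checked with a toy simulation: average many model neurons into coarse voxels and ask whether stimulus-driven signal survives, measured here as the variance of each voxel's response across stimuli. It does when nearby neurons have similar tuning, and largely does not when the same tuning curves are scattered at random. All population sizes and tuning widths below are arbitrary choices.

```python
# Toy version of the simulation logic: voxel averaging preserves stimulus
# information for a smooth code, but washes it out for a non-smooth one.
import numpy as np

rng = np.random.default_rng(4)
n_stim, n_neurons, n_voxels = 20, 1000, 10
stim = np.linspace(0, 1, n_stim)

# Each neuron responds with a Gaussian bump around its preferred stimulus.
pref_smooth = np.sort(rng.random(n_neurons))     # tuning varies smoothly across space
pref_scrambled = rng.permutation(pref_smooth)    # same tunings, scattered at random

def voxel_responses(pref):
    """Neuron responses averaged in contiguous groups, as a coarse voxel would."""
    code = np.exp(-(stim[:, None] - pref[None, :]) ** 2 / 0.02)
    return code.reshape(n_stim, n_voxels, -1).mean(axis=2)

for name, pref in [("smooth code", pref_smooth), ("non-smooth code", pref_scrambled)]:
    v = voxel_responses(pref)
    print(name, "-> stimulus-driven voxel variance:",
          round(float(v.var(axis=0).mean()), 4))
```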

Concepts: DNA, Brain, Neuroimaging, Neuroscience, Logic, Neural network, Neural networks, Computational neuroscience