RESEARCH ARTICLE
Decreasing-Rate Pruning Optimizes the Construction of Efficient and Robust Distributed Networks
Saket Navlakha1, Alison L. Barth2*, Ziv Bar-Joseph3*
1 Center for Integrative Biology, The Salk Institute for Biological Studies, La Jolla, California, United States of
America, 2 Department of Biological Sciences, Center for the Neural Basis of Cognition, Carnegie Mellon
University, Pittsburgh, Pennsylvania, United States of America, 3 Lane Center for Computational Biology,
Machine Learning Department, Carnegie Mellon University, Pittsburgh, Pennsylvania, United States of
America
* barth@cmu.edu (ALB); zivbj@cs.cmu.edu (ZBJ)
Abstract
Robust, efficient, and low-cost networks are advantageous in both biological and engi-
neered systems. During neural network development in the brain, synapses are massively
over-produced and then pruned back over time. This strategy is not commonly used when
designing engineered networks, since adding connections that will soon be removed is con-
sidered wasteful. Here, we show that for large distributed routing networks, network function
is markedly enhanced by hyper-connectivity followed by aggressive pruning and that the
global rate of pruning, a developmental parameter not previously studied by experimental-
ists, plays a critical role in optimizing network structure. We first used high-throughput
image analysis techniques to quantify the rate of pruning in the mammalian neocortex
across a broad developmental time window and found that the rate is decreasing over time.
Based on these results, we analyzed a model of computational routing networks and show
using both theoretical analysis and simulations that decreasing rates lead to more robust
and efficient networks compared to other rates. We also present an application of this strat-
egy to improve the distributed design of airline networks. Thus, inspiration from neural net-
work formation suggests effective ways to design distributed networks across several
domains.
Author Summary
During development of neural circuits in the brain, synapses are massively over-produced
and then pruned back over time. This is a fundamental process that occurs in many brain
regions and organisms, yet, despite decades of study of this process, the rate of synapse
elimination, and how such rates affect the function and structure of networks, have not
been studied. We performed large-scale brain imaging experiments to quantify synapse
elimination rates in the developing mouse cortex and found that the rate is decreasing
over time (i.e. aggressive elimination occurs early, followed by a longer phase of slow
elimination). We show that such rates optimize the efficiency and robustness of distrib-
uted routing networks under several models. We also present an application of this strat-
egy to improve the design of airline networks.
Introduction
Neural networks in the brain are formed during development using a pruning process that
includes expansive growth of synapses followed by activity-dependent elimination. In humans,
synaptic density peaks around age 2 and subsequently declines by 50–60% in adulthood [1–4].
It has been hypothesized that synaptic pruning is important for experience-dependent selection
of the most appropriate subset of connections [1, 5], and it occurs in many brain regions and
species [6–9]. This strategy substantially reduces the amount of genetic information required
to code for the trillions of connections made in the human brain [10]. Instead of instructing
precise connections, more general rules can be applied, which are then fine-tuned by activity-
dependent selection. Although the molecular and cellular mechanisms driving activity-depen-
dent pruning have been extensively investigated [1, 3, 4], global aspects of this highly-distrib-
uted process, including the rate at which synapses are pruned, the impact of these rates on
network function, and the contrast of pruning- versus growth-based strategies commonly used
in engineering to construct networks, have not been studied.
While the specific computations performed within neural and engineered networks may be
very different, at a broad level, both types of networks share many goals and constraints [11].
First, networks must propagate signals efficiently while also being robust to malfunctions (e.g.
spike propagation failures in neural networks [12–14]; computer or link failures in communi-
cation networks [15]). Second, both types of networks must adapt connections based on pat-
terns of input activity [16]. Third, these factors must be optimized under the constraint of
distributed processing (without a centralized coordinator) [17, 18], and using low-cost solu-
tions that conserve important metabolic or physical resources (e.g. number of synapses or wir-
ing length in biological networks; energy consumption or battery-life in engineered networks)
[19–21]. For example, on the Internet or power grid, requests can be highly dynamic and vari-
able over many time-scales and can lead to network congestion and failures if networks are
unable to adapt to such conditions [22, 23]. In wireless or mobile networks, broadcast ranges
(which determine network topology) need to be inferred in real-time based on the physical dis-
tribution of devices in order to optimize energy efficiency [24]. Although optimizing network
design is critical for such engineered systems across a wide range of applications, existing algo-
rithms used for this problem are not, to our knowledge, based on experience-based pruning, in
part because adding connections that will soon be eliminated is considered wasteful.
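As a concrete illustration of these shared goals, the short sketch below (Python with the networkx library; not part of the original study) shows one way to quantify two of the properties discussed above on a toy routing graph: efficiency as the shortest-path routing distance between a source and a target, and robustness as the number of alternative routes between them, formalized here, as one reasonable reading, by edge-disjoint paths. The graph, endpoint labels, and helper names are illustrative assumptions.

import networkx as nx

def routing_distance(G, source, target):
    # Efficiency proxy: hop count of the shortest source-target route (lower is better).
    return nx.shortest_path_length(G, source, target)

def alternative_routes(G, source, target):
    # Robustness proxy: maximum number of edge-disjoint source-target routes.
    return nx.edge_connectivity(G, source, target)

# Toy example: a sparse 20-node random network with hypothetical endpoints 0 and 19.
G = nx.gnm_random_graph(20, 40, seed=1)
if nx.has_path(G, 0, 19):
    print("routing distance:", routing_distance(G, 0, 19))
    print("edge-disjoint routes:", alternative_routes(G, 0, 19))

Cost, the remaining constraint above, could be tracked simply as the number of edges (or total wiring length) in the graph.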
Here, we develop a computational approach informed by experimental data to show that
pruning-inspired algorithms can enhance the design of distributed routing networks. First, we
experimentally examined developmental pruning rates in the mouse somatosensory cortex, a
well-characterized anatomical structure in the mouse brain [25]. Using electron microscopy
imaging across 41 animals and 16 developmental time-points, coupled with unbiased and
high-throughput image analysis [26], we counted over 20,000 synapses and determined that
pruning rates are decreasing over time (i.e. early, rapid synapse elimination is followed by a
period of slower, gradual elimination). Next, to translate these observations to the computa-
tional domain, we developed a simulated environment for comparing algorithms for distrib-
uted network construction. We find that over-connection followed by pruning leads to
significant improvements in efficiency (routing distance in the network) and robustness
(number of alternative routes between two nodes) compared to commonly-used methods that
add connections to initially-sparse networks. To determine if these results hold more generally,
we analyzed the theoretical basis of network construction by pruning and found that decreas-
ing rates led to networks with near-optimal connectivity compared to other rates (increasing,
constant, etc.), which we also confirmed using simulations. Finally, we adapted a pruning-
based strategy to improve the design of airline networks using real traffic pattern data.
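To make the notion of a global pruning rate concrete, the following minimal simulation sketch, built on our own simplifying assumptions rather than on the paper's full model, prunes a hyper-connected random graph down to a fixed edge budget under a decreasing, constant, or increasing schedule. Edges are removed at random purely to keep the sketch short; in the paper the choice of which edge to remove is activity-dependent, so only the schedule itself is illustrated. Function names and parameter values are hypothetical.

import random
import networkx as nx

def prune_schedule(n_remove, n_steps, kind="decreasing"):
    # Split n_remove edge deletions across n_steps according to a rate schedule.
    if kind == "constant":
        weights = [1.0] * n_steps
    elif kind == "decreasing":              # aggressive early, gentle later
        weights = [2.0 ** -t for t in range(n_steps)]
    else:                                   # "increasing": gentle early, aggressive later
        weights = [2.0 ** t for t in range(n_steps)]
    total = sum(weights)
    counts = [int(round(n_remove * w / total)) for w in weights]
    counts[-1] += n_remove - sum(counts)    # absorb rounding drift in the last step
    return counts

def build_by_pruning(n=100, start_edges=2000, final_edges=300,
                     n_steps=10, kind="decreasing", seed=0):
    # Start over-connected, then prune back to final_edges under the given schedule.
    rng = random.Random(seed)
    G = nx.gnm_random_graph(n, start_edges, seed=seed)        # hyper-connected start
    for k in prune_schedule(start_edges - final_edges, n_steps, kind):
        G.remove_edges_from(rng.sample(list(G.edges()), k))   # random stand-in rule
    return G

G = build_by_pruning(kind="decreasing")
print(G.number_of_edges(), "edges remain")

Under the decreasing schedule, roughly half of the surplus edges disappear in the first step and progressively fewer thereafter, mirroring the early-aggressive, later-gradual pattern measured in the cortex; swapping kind lets the same scaffold compare schedules against the efficiency and robustness measures sketched earlier.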
The novelty of our approach is two-fold. First, while synaptic pruning has been studied for
decades, previous analyses have determined that synaptic density peaks during early develop-
ment and is reduced by late adolescence and adulthood [6–9]. However, fine-scale measure-
ments to statistically establish the rate of synapse elimination have not been made. Second,
while substantial prior work linking neural and computational networks has focused on the
computation performed by neural networks [27, 28], our work focuses on the construction of
networks and provides a quantitative platform to compare different network construction pro-
cesses based on their cost, efficiency, and robustness. Our goals here are to model pruning
from an abstract, graph-theoretic perspective; we do not intend to capture all the requirements
of information processing in the brain, and instead focus on using pruning-inspired algorithms
for improving routing in distributed networks. Overall, our results suggest that computational
thinking can simultaneously lead to novel, testable biological hypotheses and new distributed
computing algorithms for designing better networks.
Results
Neural networks employ decreasing rates of synapse elimination
Many generative models have been proposed to understand how networks evolve and develop
over time (e.g. preferential attachment [29], small-world models [30], duplication-divergence
[31, 32]), yet most of these models assume that the number of nodes and edges strictly grows
over time. Synaptic pruning, however, diverges from this strategy. To better understand how
pruning is implemented and whether it can be used to construct networks for broad routing
problems, we sought to measure this process experimentally. Although pruning is a well-estab-
lished neurodevelopmental phenomenon, previous experimental studies have primarily
focused on identifying the time period over which pruning begins and ends but have largely
ignored the dynamics in between these end-points [6, 9, 33], lacking crucial pruning-rate infor-
mation that could guide pruning-based strategies for building distributed networks.
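For contrast, a brief sketch using standard networkx generators (again not the paper's model): the growth-based models cited above only ever add nodes and edges as they run, whereas a pruning process starts over-connected and monotonically sheds edges, with random removal once more standing in for the activity-dependent rule.

import networkx as nx

# Growth: preferential attachment; the edge count rises as nodes are added.
growth_counts = [nx.barabasi_albert_graph(n, 3, seed=0).number_of_edges()
                 for n in (10, 50, 100, 200)]

# Pruning: start dense and repeatedly strip half of the remaining edges.
G = nx.gnm_random_graph(100, 2000, seed=0)
prune_counts = []
for _ in range(4):
    G.remove_edges_from(list(G.edges())[: G.number_of_edges() // 2])
    prune_counts.append(G.number_of_edges())

print("growth trajectory (strictly increasing):", growth_counts)
print("pruning trajectory (strictly decreasing):", prune_counts)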
To determine the rate of synapse loss in developing neural networks, we focused on a well-
characterized region of the neocortex, layer 4 of somatosensory cortex representing the D1
whisker (Fig 1A), where both thalamic inputs and recurrent circuitry are established in the first
two postnatal weeks [34–36]. Because this region of primary sensory cortex does not receive
significant input from other cortical layers [37], measurements of synaptic pruning reflect the
maturation of an extant network, uncontaminated by the addition of synapses over the analysis
window. In addition, the somatotopic anatomy of the whisker (barrel) cortex ensured that com-
parisons across different animals and time-points could be made for the identical small cortical
region (Fig 1B).
Changes in synaptic density over time were obtained from sampling 41 animals over 16
developmental time-points ranging from postnatal day 14 (P14) to P40 (Table 1). Over 20,000
synapses in nearly 10,000 images were identified using a synapse-enhancing reaction that spe-
cifically highlights synaptic contacts for electron microscopy [38, 39], coupled with unbiased
machine learning algorithms (Fig 1C; Materials and Methods) [26]. Consistent with prior esti-
mates that sampled only the peak and the end-point [9, 33], peak synaptic density occurred at