
Commit b32a3a3

Merge pull request #3279 from jessica-mitchell/fix-gapjunctions
Update documentation on Gap Junctions
2 parents a343246 + b418db6 commit b32a3a3

File tree

1 file changed: +49 -30 lines changed


doc/htmldoc/synapses/simulations_with_gap_junctions.rst

Lines changed: 49 additions & 30 deletions
@@ -3,14 +3,6 @@
 Simulations with gap junctions
 ==============================
 
-**Note:** This documentation describes the usage of gap junctions in
-NEST 2.12. A documentation for NEST 2.10 can be found in `Hahne et al.
-2016 <http://link.springer.com/chapter/10.1007/978-3-319-50862-7_4>`__.
-It is however recommended to use NEST 2.12 (or later), due to several
-improvements in terms of usability.
-
-Introduction
-------------
 
 Simulations with gap junctions are supported by the Hodgkin-Huxley
 neuron model ``hh_psc_alpha_gap``. The synapse model to create a
@@ -27,17 +19,19 @@ possibility to create both connections with a single call to
 
 import nest
 
-a = nest.Create('hh_psc_alpha_gap')
-b = nest.Create('hh_psc_alpha_gap')
+a = nest.Create("hh_psc_alpha_gap")
+b = nest.Create("hh_psc_alpha_gap")
+gap_weight = 0.5
+syn_dict = {"synapse_model": "gap_junction", "weight": gap_weight}
+conn_dict = {"rule": "one_to_one", "make_symmetric": True}
 # Create gap junction between neurons a and b
-nest.Connect(a, b, {'rule': 'one_to_one', 'make_symmetric': True},
-             {'model': 'gap_junction', 'weight': 0.5})
+nest.Connect(a, b, conn_dict, syn_dict)
 
 In this case the reverse connection is created internally. In order to
 prevent the creation of incomplete or non-symmetrical gap junctions the
 creation of gap junctions is restricted to
 
-- ``one_to_one`` connections with ``'make_symmetric': True``
+- ``one_to_one`` connections with ``"make_symmetric": True``
 - ``all_to_all`` connections with equal source and target populations
   and default or scalar parameters
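As a quick check of the symmetric creation described above, the gap-junction connections can be listed in both directions. The sketch below reuses the ``a`` and ``b`` nodes and the weight ``0.5`` from the snippet; it only relies on the standard ``nest.GetConnections`` call and is meant as an illustration, not part of the changed file:

    import nest

    a = nest.Create("hh_psc_alpha_gap")
    b = nest.Create("hh_psc_alpha_gap")
    nest.Connect(a, b,
                 {"rule": "one_to_one", "make_symmetric": True},
                 {"synapse_model": "gap_junction", "weight": 0.5})

    # Both directions (a -> b and b -> a) are expected to be present
    conns = nest.GetConnections(synapse_model="gap_junction")
    print(conns.get("source"), conns.get("target"))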

@@ -61,25 +55,29 @@ level with e.g. the ``random`` module of the Python Standard Library:
 # total number of gap junctions
 n_gap_junction = 3000
 
-n = nest.Create('hh_psc_alpha_gap', n_neuron)
+gap_weight = 0.5
+n = nest.Create("hh_psc_alpha_gap", n_neuron)
+n_list = n.tolist()
 
 random.seed(0)
 
-# draw n_gap_junction pairs of random samples from the list of all
-# neurons and reshaped data into two corresponding lists of neurons
-m = np.transpose(
-    [random.sample(n, 2) for _ in range(n_gap_junction)])
+# draw n_gap_junction pairs of random samples
+connections = np.random.choice(n_list, [n_gap_junction, 2])
+
+for source_node_id, target_node_id in connections:
+    nest.Connect(
+        nest.NodeCollection([source_node_id]),
+        nest.NodeCollection([target_node_id]),
+        {"rule": "one_to_one", "make_symmetric": True},
+        {"synapse_model": "gap_junction", "weight": gap_weight},
+    )
 
-# connect obtained lists of neurons both ways
-nest.Connect(m[0], m[1],
-             {'rule': 'one_to_one', 'make_symmetric': True},
-             {'model': 'gap_junction', 'weight': 0.5})
 
 As each gap junction contributes to the total number of gap-junction
 connections of two neurons, it is hardly possible to create networks
 with a fixed number of gap junctions per neuron. With the above script
 it is however possible to control the approximate number of gap
-junctions per neuron. E.g. if one desires ``gap_per_neuron = 60`` the
+junctions per neuron. For example, if one desires ``gap_per_neuron = 60`` the
 total number of gap junctions should be chosen as
 ``n_gap_junction = n_neuron * gap_per_neuron / 2``.
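To make the quoted rule concrete, here is a small arithmetic sketch; the values of ``n_neuron`` and ``gap_per_neuron`` are hypothetical illustration choices, not taken from the file:

    # Hypothetical values, for illustration of the rule above
    n_neuron = 100
    gap_per_neuron = 60

    # Each gap junction is shared by two neurons, hence the division by two
    n_gap_junction = int(n_neuron * gap_per_neuron / 2)
    print(n_gap_junction)  # 3000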

@@ -92,18 +90,16 @@ total number of gap junctions should be chosen as
 full set of random numbers and temporarily represent the total
 connectivity in variable ``m``. Therefore it is advisable to use the
 internal random connection rules of NEST for the creation of connections
-whenever possible. For more details see `Hahne et al.
-2016 <http://link.springer.com/chapter/10.1007/978-3-319-50862-7_4>`__.
+whenever possible. For more details see Hahne et al. [1]_.
 
 Adjust settings of iterative solution scheme
 --------------------------------------------
 
-For simulations with gap junctions NEST uses an iterative solution
+For simulations with gap junctions, NEST uses an iterative solution
 scheme based on a numerical method called Jacobi waveform relaxation.
 The default settings of the iterative method are based on numerical
-results, benchmarks and previous experience with gap-junction
-simulations (see `Hahne et al.
-2015 <http://journal.frontiersin.org/article/10.3389/fninf.2015.00022/full>`__)
+results, benchmarks, and previous experience with gap-junction
+simulations [2]_,
 and should only be changed with proper knowledge of the method. In
 general the following parameters can be set via kernel parameters:

@@ -116,4 +112,27 @@ general the following parameters can be set via kernel parameters:
 nest.wfr_interpolation_order = 3
 
 For a detailed description of the parameters and their function see
-(`Hahne et al. 2016 <https://arxiv.org/abs/1610.09990>`__, Table 2).
+[3]_, Table 2.
+
+.. seealso::
+
+   * :doc:`/auto_examples/gap_junctions_inhibitory_network`
+   * :doc:`/auto_examples/gap_junctions_two_neurons`
+
+References
+----------
+
+.. [1] Hahne J, et al. 2016. Including Gap Junctions into Distributed Neuronal Network Simulations.
+       In: Amunts K, Grandinetti L, Lippert T, Petkov N (eds) Brain-Inspired Computing. BrainComp 2015.
+       Lecture Notes in Computer Science, vol 10087. Springer, Cham.
+       https://doi.org/10.1007/978-3-319-50862-7_4
+
+.. [2] Hahne J, Helias M, Kunkel S, Igarashi J, Bolten M, Frommer A, Diesmann M. 2015.
+       A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations.
+       Frontiers in Neuroinformatics 9:22.
+       https://www.frontiersin.org/journals/neuroinformatics/articles/10.3389/fninf.2015.00022
+
+.. [3] Hahne J, Dahmen D, Schuecker J, Frommer A, Bolten M, Helias M, Diesmann M. 2017.
+       Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator.
+       Frontiers in Neuroinformatics 11:34.
+       https://www.frontiersin.org/journals/neuroinformatics/articles/10.3389/fninf.2017.00034
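For orientation, here is a sketch of how such settings are typically applied from PyNEST. Apart from ``wfr_interpolation_order``, which appears in the diff, the attribute names ``use_wfr``, ``wfr_comm_interval``, ``wfr_tol`` and ``wfr_max_iterations`` are assumed from the NEST kernel attribute documentation and should be checked against the installed version; the values shown are placeholders only:

    import nest

    nest.ResetKernel()

    # Waveform-relaxation settings; change only with knowledge of the method
    nest.use_wfr = True                # enable the iterative (waveform relaxation) scheme
    nest.wfr_comm_interval = 1.0       # communication interval between iterations, in ms
    nest.wfr_tol = 1e-4                # convergence tolerance of the iteration
    nest.wfr_max_iterations = 15       # upper bound on iterations per interval
    nest.wfr_interpolation_order = 3   # as in the diff above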
