Simulations with gap junctions
==============================

Simulations with gap junctions are supported by the Hodgkin-Huxley
neuron model ``hh_psc_alpha_gap``. The synapse model to create a
gap-junction connection is ``gap_junction``. Because gap junctions are
bidirectional, each gap junction consists of two connections, one in each
direction. NEST provides the
possibility to create both connections with a single call to
``nest.Connect``:

.. code-block:: python

    import nest

    a = nest.Create("hh_psc_alpha_gap")
    b = nest.Create("hh_psc_alpha_gap")
    gap_weight = 0.5
    syn_dict = {"synapse_model": "gap_junction", "weight": gap_weight}
    conn_dict = {"rule": "one_to_one", "make_symmetric": True}

    # Create gap junction between neurons a and b
    nest.Connect(a, b, conn_dict, syn_dict)

In this case the reverse connection is created internally. In order to
prevent the creation of incomplete or non-symmetrical gap junctions, the
creation of gap junctions is restricted to

- ``one_to_one`` connections with ``"make_symmetric": True``
- ``all_to_all`` connections with equal source and target populations
  and default or scalar parameters

level with e.g. NumPy's ``random`` module:

.. code-block:: python

    # total number of gap junctions
    n_gap_junction = 3000

    gap_weight = 0.5
    n = nest.Create("hh_psc_alpha_gap", n_neuron)
    n_list = n.tolist()

    np.random.seed(0)

    # draw n_gap_junction pairs of random samples
    connections = np.random.choice(n_list, [n_gap_junction, 2])

    for source_node_id, target_node_id in connections:
        nest.Connect(
            nest.NodeCollection([source_node_id]),
            nest.NodeCollection([target_node_id]),
            {"rule": "one_to_one", "make_symmetric": True},
            {"synapse_model": "gap_junction", "weight": gap_weight},
        )

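The pair-drawing step itself does not depend on NEST and can be tried in
isolation. The following sketch uses made-up values for ``n_neuron`` and a
plain list of integers as a stand-in for the node IDs returned by
``n.tolist()``:

```python
import numpy as np

# hypothetical values; in the script above these come from the network
n_neuron = 100
n_gap_junction = 3000

np.random.seed(0)
# stand-in for n.tolist(): NEST assigns consecutive integer node IDs
node_ids = list(range(1, n_neuron + 1))

# draw n_gap_junction (source, target) pairs, as in the script above
connections = np.random.choice(node_ids, [n_gap_junction, 2])

print(connections.shape)  # (3000, 2)
```

Note that ``np.random.choice`` samples with replacement, so a drawn pair can
occasionally repeat or pair a neuron with itself; for large populations this
is rare, but such pairs can be filtered out if an exact pair set is required.
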
As each gap junction contributes to the total number of gap-junction
connections of two neurons, it is hardly possible to create networks
with a fixed number of gap junctions per neuron. With the above script
it is however possible to control the approximate number of gap
junctions per neuron. For example, if one desires ``gap_per_neuron = 60``,
the total number of gap junctions should be chosen as
``n_gap_junction = n_neuron * gap_per_neuron / 2``.

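This relation can be checked with a few lines of arithmetic;
``gap_per_neuron = 60`` is the example value from the text, while
``n_neuron = 100`` is an assumed population size:

```python
# example values; gap_per_neuron = 60 is taken from the text,
# n_neuron = 100 is an assumed population size
n_neuron = 100
gap_per_neuron = 60

# each gap junction is shared by two neurons, hence the division by 2
n_gap_junction = n_neuron * gap_per_neuron // 2
print(n_gap_junction)  # 3000
```
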
Note that this approach requires one to draw the
full set of random numbers and to temporarily represent the total
connectivity in the variable ``connections``. Therefore it is advisable
to use the internal random connection rules of NEST for the creation of
connections whenever possible. For more details see Hahne et al. [1]_.

Adjust settings of iterative solution scheme
--------------------------------------------

For simulations with gap junctions, NEST uses an iterative solution
scheme based on a numerical method called Jacobi waveform relaxation.
The default settings of the iterative method are based on numerical
results, benchmarks, and previous experience with gap-junction
simulations [2]_, and should only be changed with proper knowledge of
the method. In general, the following parameters can be set via kernel
parameters:

.. code-block:: python

    nest.wfr_interpolation_order = 3

For a detailed description of the parameters and their function, see
[3]_, Table 2.

.. seealso::

   * :doc:`/auto_examples/gap_junctions_inhibitory_network`
   * :doc:`/auto_examples/gap_junctions_two_neurons`

References
----------

.. [1] Hahne J, et al. 2016. Including gap junctions into distributed neuronal network simulations.
   In: Amunts K, Grandinetti L, Lippert T, Petkov N (eds) Brain-Inspired Computing.
   BrainComp 2015. Lecture Notes in Computer Science, vol 10087. Springer, Cham.
   https://doi.org/10.1007/978-3-319-50862-7_4

.. [2] Hahne J, Helias M, Kunkel S, Igarashi J, Bolten M, Frommer A, Diesmann M. 2015.
   A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations.
   Frontiers in Neuroinformatics. 9.
   https://www.frontiersin.org/journals/neuroinformatics/articles/10.3389/fninf.2015.00022

.. [3] Hahne J, Dahmen D, Schuecker J, Frommer A, Bolten M, Helias M, Diesmann M. 2017.
   Integration of continuous-time dynamics in a spiking neural network simulator.
   Frontiers in Neuroinformatics. 11.
   https://www.frontiersin.org/journals/neuroinformatics/articles/10.3389/fninf.2017.00034