This code computes the influence scores of a neuron or a group of neurons (the seed) on all downstream neurons in the connectome, based on a linear dynamical model of neural signal propagation.
Download the repository, using either the current main branch or one of the past releases. Then, either directly install the zipped version using
python3 -m pip install ConnectomeInfluenceCalculator-main.zip
or unzip the files into a folder and run
python3 -m pip install .
in that folder.
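To check that the installation succeeded, you can try importing the package from the command line (the import name below is the one used in the example later in this document); note that this will only work once PETSc and SLEPc are set up correctly, as described next:
python3 -c "from InfluenceCalculator import InfluenceCalculator"
If this exits without an error, the package and its dependencies were found.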
The package relies on the PETSc and SLEPc libraries to perform sparse matrix computations. Each of these consists of a core library and an associated Python wrapper. The core libraries might not install correctly through pip. If this happens, a possible workaround is to install them through other means (e.g., using Homebrew on macOS) and then tell the Python wrappers petsc4py and slepc4py where to find them by calling
export PETSC_DIR=/path/to/PETSc/installation
export SLEPC_DIR=/path/to/SLEPc/installation
before running the above pip commands. Please make sure that the installed core libraries have the same version numbers as the Python wrappers that will be installed.
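As a quick way to verify that the wrapper and core-library versions match, you can query the versions reported by the installed wrappers (this assumes the wrappers import successfully):
python3 -c "import petsc4py; print(petsc4py.__version__)"
python3 -c "import slepc4py; print(slepc4py.__version__)"
The printed versions should match those of the PETSc and SLEPc builds pointed to by PETSC_DIR and SLEPC_DIR.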
Alternatively, both libraries and their Python wrappers can be installed using conda. In this case, we highly recommend creating a virtual environment with a specific Python version (version 3.13.1 worked for us):
conda create -n ic-venv python=3.13.1
After activating the virtual environment ic-venv, the packages can be installed by executing the following commands:
conda install -c conda-forge petsc petsc4py
conda install -c conda-forge slepc slepc4py
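With PETSc and SLEPc available inside the environment, the package itself can then be installed into that same environment, for example using the zipped download from above:
conda activate ic-venv
python3 -m pip install ConnectomeInfluenceCalculator-main.zip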
This code computes the influence scores of a neuron or a group of neurons, as specified through a seed vector whose nonzero entries mark the seed neurons. Signal propagation is modelled as a linear dynamical system driven by the connectivity matrix, with the seed vector acting as the input. To ensure stable neural dynamics, we rescale the connectivity matrix so that the dynamics converge to a steady state. All matrix computations are performed using the parallel computing libraries PETSc and SLEPc, which are well suited to problems involving large, sparse matrices such as neural connectivity matrices, and allow fast computation of the steady-state solution. The influence of any seed is defined as the magnitude of neural activity at steady state.
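For readers who want a concrete picture, the sketch below shows one standard way such a linear propagation model can be written; the symbols, the exact rescaling, and the form of the equations are assumptions for illustration and are not taken from the package's documentation:
% Assumed notation (illustrative only):
% x(t) : vector of activities of all neurons
% W    : (rescaled) synaptic connectivity matrix
% s    : seed vector, nonzero only for seed neurons
\begin{align}
  \frac{d\mathbf{x}}{dt} &= -\mathbf{x}(t) + W\,\mathbf{x}(t) + \mathbf{s} \\
  \mathbf{x}^{*} &= (I - W)^{-1}\,\mathbf{s}
\end{align}
In a model of this form, rescaling W so that its spectral radius is below 1 guarantees that (I - W) is invertible and that the activity converges to the steady state x*; each neuron's influence score would then be the magnitude of its entry in x*.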
To run a test example, start by importing the InfluenceCalculator package:
from InfluenceCalculator import InfluenceCalculator
Then instantiate a class object ic using the filepath to the BANC connectome dataset (this should be an SQLite file):
# Build InfluenceCalculator object
ic = InfluenceCalculator('BANC_dataset.sqlite')
By default, the program simulates neural signal propagation based on an unsigned version of the connectivity matrix. However, it is possible to use a signed version, whereby synaptic weights of inhibitory neurons are assigned negative values. Moreover, users can specify a minimum threshold count for the number of postsynaptic connections that will be considered in the analysis (the default is 5). To do so, run the following command instead:
# Build InfluenceCalculator object
ic = InfluenceCalculator('BANC_dataset.sqlite', signed=True, count_thresh=5)
Let us now define the seed group as all 'olfactory' neurons and calculate the influence of this seed on all downstream neurons, while making sure to inhibit all non-seed sensory neurons:
# Define seed category (depending on how neurons are labelled in metadata)
meta_column = 'seed_01'
seed_category = 'olfactory'
# Get seed neuron ids
seed_ids = ic.meta[ic.meta[meta_column] == seed_category].root_id
# Get neuron ids to inhibit (sensory neurons in this case)
silenced_neurons = ic.meta[
    ic.meta['super_class'].isin(['sensory',
                                 'ascending_sensory'])].root_id
# Calculate influence scores and store them in a Pandas dataframe
influence_df = ic.calculate_influence(seed_ids, silenced_neurons)
Executing this script returns a dataframe with a column of neuron IDs, a column of Boolean entries indicating whether the corresponding neuron is part of the seed group, and a column of influence scores relative to the seed neurons.
Note that even though we selected all sensory neurons to be inhibited, the calculate_influence method ensures that no seed neurons are being inhibited.
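Since the result is a regular Pandas dataframe, it can be inspected and saved with standard Pandas operations. The column names used below ('influence' and 'is_seed') are illustrative assumptions; check influence_df.columns for the actual names returned by your version of the package:
# Inspect the most strongly influenced non-seed neurons
# (column names here are assumed; check influence_df.columns)
top_hits = (influence_df[~influence_df['is_seed']]
            .sort_values('influence', ascending=False)
            .head(20))
print(top_hits)
# Save the full table for downstream analysis
influence_df.to_csv('olfactory_influence_scores.csv', index=False)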
For contributions and bug reports, please see the contribution guidelines.