alleninstitute / bmtk
Brain Modeling Toolkit
Home Page: https://alleninstitute.github.io/bmtk/
License: BSD 3-Clause "New" or "Revised" License
Hey Kael et al.! Firstly, thanks for doing an incredible amount of work on developing BMTK and the SONATA standard!
I'm trying to test-run BMTK on simple, single-compartment neurons, but after some digging into the code of both BMTK and BBP's libSONATA, it seems like this model_type (single_compartment) is either not yet implemented or will not be. Are there concrete plans to implement the single_compartment model_type, or is there an official roadmap for the future of BMTK in general? I also noticed that there may be plans to implement a model_type called point_soma, but I'm not sure if that's only meant to work with PointNet/NEST instead of BioNet/NEURON.
I also tried using the default model_type of biophysical with a different NEURON/hoc model_template, but I couldn't get this to work. From reading the code, it seems like the allowed model_template values are hardcoded, rather than it being possible to provide your own cell.hoc template. Am I right that the model_templates are hardcoded for each model_type?
Thanks,
Austin
When trying to modify and re-run a network interactively in ipython/jupyter, NetworkBuilder.save_nodes fails with the message (from h5py): IOError: Unable to create file (unable to truncate a file which is already open)
It seems like the file is left open on loading the network when BioNetwork.from_config is called, maybe in SonataNodes.load()? Not sure if that's intentional, or could be fixed?
There is an assertion error occurring when I run 14cells_nsyns in the bionet examples. The simulation runs and outputs correctly, but the spike output is not identical to the expected/spikes.txt file, which causes the error. Please see the error message and my pandas version below:
step:115000 t_sim:2875.000 ms -- t_wall: 118.63 s
step:120000 t_sim:3000.000 ms -- t_wall: 123.42 s
Name: pandas
Version: 0.14.1
Location: /shared/utils.x86_64/python-2.7/lib/python2.7/site-packages
Requires:
Hi, I also posted this somewhere more obscure but thought it was worth bringing to your attention in a separate issue: all the Travis Python 3.7 pipelines are failing. The errors listed seem quite familiar and, iirc, are caused by a Cython version that is too old. You should try upgrading to the latest Cython before the step that errors out.
Error when running docker build -t alleninstitute/bmtk .:
...
...
Step 11/31 : RUN conda install -y -c kaeldai neuron
---> Running in d5436094c58c
Collecting package metadata (current_repodata.json): ...working... done
Solving environment: ...working... failed with initial frozen solve. Retrying with flexible solve.
Solving environment: ...working... failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... failed with initial frozen solve. Retrying with flexible solve.
Found conflicts! Looking for incompatible packages.
This can take several minutes. Press CTRL-C to abort.
failed
UnsatisfiableError: The following specifications were found
to be incompatible with the existing python installation in your environment:
Specifications:
- neuron -> python[version='>=2.7,<2.8.0a0']
Your python: python=3.7
If python is on the left-most side of the chain, that's the version you've asked for.
When python appears to the right, that indicates that the thing on the left is somehow
not available for the python version you are constrained to. Note that conda will not
change your python version to a different minor version unless you explicitly specify
that.
The following specifications were found to be incompatible with each other:
Package pip conflicts for:
python=3.7 -> pip
neuron -> python[version='>=2.7,<2.8.0a0'] -> pip
Package certifi conflicts for:
neuron -> python[version='>=2.7,<2.8.0a0'] -> pip -> setuptools -> certifi[version='>=2016.09|>=2016.9.26']
python=3.7 -> pip -> setuptools -> certifi[version='>=2016.09|>=2016.9.26']
Package wheel conflicts for:
neuron -> python[version='>=2.7,<2.8.0a0'] -> pip -> wheel
python=3.7 -> pip -> wheel
Package ca-certificates conflicts for:
python=3.7 -> openssl[version='>=1.1.1b,<1.1.2a'] -> ca-certificates
neuron -> python[version='>=2.7,<2.8.0a0'] -> ca-certificates
Package setuptools conflicts for:
python=3.7 -> pip -> setuptools
neuron -> python[version='>=2.7,<2.8.0a0'] -> pip -> setuptools
The command '/bin/sh -c conda install -y -c kaeldai neuron' returned a non-zero code: 1
According to the Sonata specification, the "mechanisms_dir" field of the circuit config is optional.
However, due to (I think) this line:
https://github.com/AllenInstitute/bmtk/blob/develop/bmtk/simulator/bionet/config.py#L68
bmtk fails with an exception when I try to load a config that doesn't have a mechanisms specification.
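A small sketch of the defensive lookup suggested here, assuming the circuit config behaves like a plain dict with an optional "components"/"mechanisms_dir" entry (the exact key layout is my assumption, not BMTK's actual internals):

```python
def get_mechanisms_dir(circuit_config):
    # "mechanisms_dir" is optional per the SONATA spec, so look it up
    # with .get() instead of indexing, and return None when absent.
    # NOTE: the "components" key layout is assumed for illustration.
    components = circuit_config.get('components', {})
    return components.get('mechanisms_dir')

# With the field present, the path comes through; with it absent,
# the loader simply sees None instead of raising an exception.
present = get_mechanisms_dir({'components': {'mechanisms_dir': './mechanisms'}})
absent = get_mechanisms_dir({})
```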
Would it be possible to specify more precisely where external nodes synapse, so that they target the segment in which a synapse is already present, thus creating a triadic synapse?
Since the LGN model is non-recurrent, we should be able to implement parallelization in FilterNet. The main places this should happen are filternetwork.build_nodes() and filtersimulator.run().
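Because each LGN cell is independent of the others, the per-cell work can in principle be farmed out to a worker pool. A minimal sketch of the idea (run_cell is a hypothetical stand-in for one cell's filter computation, not the actual FilterNet API):

```python
from multiprocessing import Pool

def run_cell(node_id):
    # Hypothetical stand-in for one LGN filter cell's computation;
    # in a non-recurrent model each cell depends only on the stimulus.
    return node_id * node_id

def build_nodes_parallel(node_ids, processes=4):
    # Map the independent per-cell work across a process pool.
    with Pool(processes) as pool:
        return pool.map(run_cell, list(node_ids))
```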
Some more documentation in the readme file would be great. What levels of resolution are described? What is its logic for building and running models and storing results, and what need does the package fill? Is there a jupyter notebook tutorial or something to help people get started?
I kept running into the two following errors until I downgraded h5py from 2.8.0 to 2.7.0 and pandas from 0.24.1 to 0.20.3:
pip install h5py==2.7.0 pandas==0.23.0
(base) C:\Users\Tyler\Desktop\git_stage\project>python run_bionet.py
2019-03-12 22:19:58,928 [INFO] Created log file
2019-03-12 22:19:59,038 [INFO] Building cells.
2019-03-12 22:19:59,773 [INFO] Building recurrent connections
2019-03-12 22:20:38,338 [INFO] Building virtual cell stimulations for exc_spikes
Traceback (most recent call last):
File "C:\Users\Tyler\Anaconda3\lib\site-packages\bmtk-0.0.7-py3.7.egg\bmtk\utils\sonata\population.py", line 586, in _get_index
raise StopIteration
StopIteration
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "run_bionet.py", line 34, in <module>
run('simulation_config.json')
File "run_bionet.py", line 22, in run
sim = bionet.BioSimulator.from_config(conf, network=graph)
File "C:\Users\Tyler\Anaconda3\lib\site-packages\bmtk-0.0.7-py3.7.egg\bmtk\simulator\bionet\biosimulator.py", line 291, in from_config
network.add_spike_trains(spikes, node_set)
File "C:\Users\Tyler\Anaconda3\lib\site-packages\bmtk-0.0.7-py3.7.egg\bmtk\simulator\bionet\bionetwork.py", line 260, in add_spike_trains
for edge in edge_pop.get_target(trg_nid):
File "C:\Users\Tyler\Anaconda3\lib\site-packages\bmtk-0.0.7-py3.7.egg\bmtk\simulator\core\sonata_reader\network_reader.py", line 174, in get_target
for edge in self._edge_pop.get_target(node_id):
RuntimeError: generator raised StopIteration
Pandas error:
(base) C:\Users\Tyler\Desktop\git_stage\project>python run_bionet.py
C:\Users\Tyler\Anaconda3\lib\site-packages\h5py\__init__.py:34: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
from ._conv import register_converters as _register_converters
2019-03-12 23:17:37,341 [INFO] Created log file
Traceback (most recent call last):
File "run_bionet.py", line 34, in <module>
run('simulation_config.json')
File "run_bionet.py", line 21, in run
graph = bionet.BioNetwork.from_config(conf)
File "C:\Users\Tyler\Anaconda3\lib\site-packages\bmtk-0.0.7-py3.6.egg\bmtk\simulator\core\simulator_network.py", line 169, in from_config
network.add_nodes(node_pop)
File "C:\Users\Tyler\Anaconda3\lib\site-packages\bmtk-0.0.7-py3.6.egg\bmtk\simulator\core\simulator_network.py", line 87, in add_nodes
node_population.initialize(self)
File "C:\Users\Tyler\Anaconda3\lib\site-packages\bmtk-0.0.7-py3.6.egg\bmtk\simulator\core\sonata_reader\network_reader.py", line 44, in initialize
model_types.update(set(np.unique(grp.get_values(self._adaptor.COL_MODEL_TYPE))))
File "C:\Users\Tyler\Anaconda3\lib\site-packages\bmtk-0.0.7-py3.6.egg\bmtk\utils\sonata\group.py", line 182, in get_values
self.build_indicies()
File "C:\Users\Tyler\Anaconda3\lib\site-packages\bmtk-0.0.7-py3.6.egg\bmtk\utils\sonata\group.py", line 178, in build_indicies
self._parent_indicies = self._parent.group_indicies(self.group_id, build_cache=True)
File "C:\Users\Tyler\Anaconda3\lib\site-packages\bmtk-0.0.7-py3.6.egg\bmtk\utils\sonata\population.py", line 125, in group_indicies
tmp_index['grp_id'] = pd.Series(self._group_id_ds, dtype=self._group_id_ds.dtype)
File "C:\Users\Tyler\Anaconda3\lib\site-packages\pandas\core\series.py", line 262, in __init__
raise_cast_failure=True)
File "C:\Users\Tyler\Anaconda3\lib\site-packages\pandas\core\internals\construction.py", line 625, in sanitize_array
subarr = _try_cast(data, False, dtype, copy, raise_cast_failure)
File "C:\Users\Tyler\Anaconda3\lib\site-packages\pandas\core\internals\construction.py", line 695, in _try_cast
subarr = maybe_cast_to_integer_array(arr, dtype)
File "C:\Users\Tyler\Anaconda3\lib\site-packages\pandas\core\dtypes\cast.py", line 1307, in maybe_cast_to_integer_array
casted = arr.astype(dtype, copy=copy)
TypeError: astype() got an unexpected keyword argument 'copy'
In bionet, running a network multiple times (specifically, the build_env step) can give the error OSError: [Errno 39] Directory not empty: './output'. It should be reproducible by simply opening an example ipynb and executing the "run" cell twice.
It seems to be an issue with Python's shutil.rmtree, which should be able to delete non-empty directories but is failing in certain cases.
This is not hard to work around (just delete manually), but slightly annoying. I'm curious whether it's specific to my installation or if others have seen it too (I'm using Anaconda Python 2.7.15).
Note that this doesn't occur from simply running run_bionet.py twice, but it's also not restricted to jupyter or ipython (I can reproduce it by executing the relevant lines directly in a python console). Strange!
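One hedged workaround (a sketch, not what bmtk actually does): retry the removal a few times, since an ENOTEMPTY here can come from another process or a lagging filesystem briefly holding directory entries.

```python
import errno
import shutil
import time

def rmtree_retry(path, attempts=3, delay=0.5):
    # shutil.rmtree can intermittently fail with "Directory not empty"
    # when entries linger; retry briefly before giving up for real.
    for i in range(attempts):
        try:
            shutil.rmtree(path)
            return
        except OSError as e:
            if e.errno != errno.ENOTEMPTY or i == attempts - 1:
                raise
            time.sleep(delay)
```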
Hi. I noticed in some of my simulations of a large-scale network set up in bmtk that a substantial amount of the overall simulation time for a job is spent reading in and converting temporary spike files from each MPI rank (with MPI pool size 1536). The issue at hand seems to be the serial function _merge_files at https://github.com/AllenInstitute/bmtk/blob/develop/bmtk/utils/io/spike_trains.py#L116, where csv.reader(...) returns an iterable over each line in the input files. Reading the files in as typed arrays, then sorting and dumping everything at the end, is many times faster, as it avoids the double for loop containing a per-line write call; something like:
import os
from glob import glob

import h5py
import numpy as np

outputdir = 'output'
dtype = [('timestamps', 'f8'), ('gids', 'u8')]
spikes = np.empty(shape=(0,), dtype=dtype)  # start with a zero-length record array
for fname in glob(os.path.join(outputdir, '_bmtk_tmp_spikes_*.csv')):
    spikes = np.r_[spikes, np.loadtxt(fname, delimiter=' ', dtype=dtype)]
sort = np.argsort(spikes['timestamps'])
spikes = spikes[sort]
np.savetxt(os.path.join(outputdir, 'spikes.csv'), spikes)
with h5py.File(os.path.join(outputdir, 'spikes.h5'), 'w') as f:
    grp = f.create_group('spikes')
    grp.attrs['sorting'] = 'time'
    grp['gids'] = spikes['gids']
    grp['timestamps'] = spikes['timestamps']
Any plans on addressing this issue? In my case, with somewhat long simulation durations (20 s of biological time), the utilization of resources is poor, as 1 process converts files while the other 1535 processes sit dormant for 1-2 hours waiting for it to finish.
Just a suggestion, maybe you can also tell users they can install bmtk using:
pip install git+https://github.com/AllenInstitute/bmtk.git
Maybe you could also upload it to PyPI? (I saw the bmtk name is still available; it could be useful to register it at least so that no other python project can claim it.)
Currently, the only way I can find to create a cell without a morphology file is to use morphology=None in add_nodes. This does not cause a visible issue when running most simulations, but a bug is uncovered when reporting ecp.
The bug is in make_morphologies:
bmtk/bmtk/simulator/bionet/bionetwork.py
Lines 188 to 217 in 85e8aa2
The issue is that there is a morphology cache that uses morphology_file as a key. Therefore, if multiple types of cells were created without a morphology file (meaning they all used morphology=None), they will all be assigned the same morphology (from whichever cell type was added first).
This bug is uncovered on line 206 of ecp.py:
bmtk/bmtk/simulator/bionet/modules/ecp.py
Line 206 in 85e8aa2
It is caused by the fact that tr gets its second dimension from the number of segments in the morphology object assigned to the cell, whereas im gets its length from the actual counted number of segments in the cell's hobj.
I am not sure of the implications this bug has on the overall simulation results, but it at least stops ecp reporting from working if a cell is added without a morphology file.
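A sketch of one possible fix, assuming the cache is a plain dict (the function and argument names here are illustrative, not bmtk's actual internals): include the node type in the cache key, so that distinct cell types that all pass morphology=None no longer collide on the shared None key.

```python
def get_morphology(cache, morphology_file, node_type_id, build_fn):
    # Key on (file, node_type_id) instead of the file alone, so two
    # cell types created with morphology=None get separate entries.
    key = (morphology_file, node_type_id)
    if key not in cache:
        cache[key] = build_fn()  # stand-in for building a Morphology
    return cache[key]

cache = {}
m1 = get_morphology(cache, None, 100, lambda: object())
m2 = get_morphology(cache, None, 101, lambda: object())
# m1 and m2 are distinct objects, unlike with a file-only cache key.
```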
Currently the user gets an obscure error message from mtrand.RandomState.choice(): ValueError: a must be non-empty.
This should maybe be caught as a warning, with the result that no synapses are inserted?
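A minimal sketch of the suggested behavior (the function name and signature are hypothetical, not bmtk's connector API): guard the call to numpy's choice, warn, and insert zero synapses when there are no candidates.

```python
import warnings

import numpy as np

def choose_targets(candidates, nsyns):
    # Guard np.random.choice: instead of the obscure
    # "ValueError: a must be non-empty", emit a warning and
    # insert no synapses when the candidate list is empty.
    if len(candidates) == 0:
        warnings.warn('No candidate targets; no synapses inserted.')
        return []
    return list(np.random.choice(candidates, size=nsyns))
```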
Has anyone made a function to calculate the local field potential (LFP)? I will write it if not. We use the line-source approximation from
Schomburg, E. W., Anastassiou, C. A., Buzsaki, G., & Koch, C. (2012). The Spiking Component of Oscillatory Extracellular Potentials in the Rat Hippocampus. Journal of Neuroscience, 32(34), 11798-11811.
Minor and may not be worth the time to fix. Feel free to close.
When plotting a single variable for a gid other than 0 using bmtk/analyzer/cell_vars.plot_report(), I get the following error.
Traceback (most recent call last):
File "/home/tbg28/anaconda3/envs/py36/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/tbg28/anaconda3/envs/py36/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/home/tbg28/git_stage/bmtools/bmtools/plot.py", line 621, in <module>
handling_func(**v)
File "/home/tbg28/git_stage/bmtools/bmtools/plot.py", line 375, in plot_report_default
plot_report(config_file=config, report_file=report_file, report_name=report_name, variables=variables, gids=gids);
File "/home/tbg28/git_stage/bmtk/bmtk/analyzer/cell_vars.py", line 71, in plot_report
plt.plot(time_steps, var_report.data(gid=0, var_name=variables[0]), label='gid {}'.format(gid))
File "/home/tbg28/git_stage/bmtk/bmtk/utils/cell_vars/var_reader.py", line 121, in data
gid_slice = self._gid2data_table[gid][0]
KeyError: 0
It appears there's a hard-coded 0 that should be replaced with gid here:
https://github.com/AllenInstitute/bmtk/blob/develop/bmtk/analyzer/cell_vars.py#L71
for gid in gids:
plt.plot(time_steps, var_report.data(gid=0, var_name=variables[0]), label='gid {}'.format(gid))
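A toy illustration of the fix, with a stand-in report object (FakeReport is not bmtk's class, just a mock): each iteration should query the report with its own gid rather than the hard-coded 0.

```python
class FakeReport:
    # Stand-in for var_report: returns a distinct dummy trace per gid.
    def data(self, gid, var_name):
        return [gid, gid + 1]

var_report = FakeReport()
gids = [0, 2, 5]
# Corrected loop: pass the loop's own gid through to data().
traces = {gid: var_report.data(gid=gid, var_name='v') for gid in gids}
```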
Should support setting dynamics_params for individual nodes, both at network building (saving to h5) and simulation instantiation (loading from h5).
Somehow https://libraries.io/pypi/nest/dependents thinks BMTK depends on nest, but nest on PyPI is this piece of junk source code:
def print_list(the_list, level=0):
for each_item in the_list:
if isinstance(each_item, indent = False ,list):
print_list(each_item, indent, level +1)
else:
if indent:
for tab_stop in range(level):
pritn("/t", end='')
print(each_item)
(which contains abundant typos and syntax errors)
Could you look into where this link comes from? Do you specify it in any setup.py or requirements files? Would you be willing, if necessary (see nest/nest-simulator#2007), to clarify the misunderstanding to PyPA during the removal procedure as well?
Sorry to bother again. I am studying tutorial chapter 3: Multi-cell, single population network (with BioNet). At the sim.run() step, a ValueError is raised: "mtrand.pyx in numpy.random.mtrand.RandomState.choice(), ValueError: 'a' cannot be empty unless no samples are taken", and I cannot move on to the next step.
Looking forward to your reply, thank you very much!
Hello, I'm trying to use BMTK to build a one-compartment cell. I deleted all lines in the morphology file except the first line for the soma:
1 1 356.5848 666.8376 28.56 6.4681 -1
This gives the error:
NEURON: x : object prefix is NULL
near line 0
^
Import3d_SWC_read[0].mksections()
Import3d_SWC_read[0].input("./biophys_...")
Biophys1[0].init("./biophys_...")
If I instead leave two lines in the .swc file to build a two-compartment cell:
1 1 356.5848 666.8376 28.56 6.4681 -1
2 3 354.1858 672.2133 28.4684 0.1608 1
I get another error:
NEURON: section in the object was deleted
near line 0
{soma { pt3dadd(363.053, 666.838, 28.56, 12.9362) }}
^
*** longjmp causes uninitialized stack frame ***:
This is my method to build the network:
net = NetworkBuilder('BL')
net.add_nodes(cell_name='BL',
              potental='exc',
              model_type='biophysical',
              model_template='ctdb:Biophys1.hoc',
              model_processing='aibs_perisomatic',
              dynamics_params='fit_parameters.json',
              morphology='reconstruction.swc')
net.build()
net.save_nodes(output_dir='network')
How can I solve these problems, and is there a correct way to build a cell with one (or a few) compartments?
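One workaround worth trying, though this is my assumption rather than an official bmtk recommendation: write the soma using the common "three-point soma" SWC convention (a center sample plus two samples offset by the radius), which some importers handle better than a single soma line. A sketch, reusing the coordinates from the single soma line above:

```python
# Write a minimal SWC using the three-point soma convention. The
# coordinates and radius mirror the single soma line from the issue.
x, y, z, r = 356.5848, 666.8376, 28.56, 6.4681
lines = [
    '1 1 {:.4f} {:.4f} {:.4f} {:.4f} -1'.format(x, y, z, r),
    '2 1 {:.4f} {:.4f} {:.4f} {:.4f} 1'.format(x, y - r, z, r),
    '3 1 {:.4f} {:.4f} {:.4f} {:.4f} 1'.format(x, y + r, z, r),
]
with open('one_compartment.swc', 'w') as f:
    f.write('\n'.join(lines) + '\n')
```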
Hi,
There seems to be an issue with the calculation of the ECP when using the latest version of Neuron (7.8.2). Running, for example, the bio_450cells_exact example provided with BMTK I get the following warning:
/Users/atleeskelandrimehaug/sources/bmtk_test_py3_6/bmtk/bmtk/simulator/bionet/modules/ecp.py:283: RuntimeWarning: invalid value encountered in true_divide
rll = abs(rlldl / dlmag) # component of r parallel to the segment axis it must be always positive
And in the ecp.h5 output file all values are NaN.
There are no problems with earlier versions of Neuron (I haven't tested all of them, of course, but 7.8.0, for example, works fine), so I'm merely posting this to bring it to your attention.
When running the tutorials, users on Python 3 will be met with the following error after executing a section of code:
from bmtk.utils.spike_trains import SpikesGenerator
sg = SpikesGenerator(nodes='network/mthalamus_nodes.h5', t_max=3.0)
sg.set_rate(10.0)
sg.save_csv('thalamus_spikes.csv', in_ms=True)
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-5-c6b681b962ce> in <module>
3 sg = SpikesGenerator(nodes='network/mthalamus_nodes.h5', t_max=3.0)
4 sg.set_rate(10.0)
----> 5 sg.save_csv('thalamus_spikes.csv', in_ms=True)
~\AppData\Local\Continuum\anaconda3\lib\site-packages\bmtk-0.0.7-py3.7.egg\bmtk\utils\spike_trains\spikes_csv.py in save_csv(self, csv_file_name, in_ms)
91 csv_writer.writerow(['gid', 'spike-times'])
92 for gid, rate_gen in self._nodes.items():
---> 93 csv_writer.writerow([gid, ','.join(str(r*conv) for r in rate_gen)])
94
TypeError: iter() returned non-iterator of type 'NormalRates'
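This error is the classic Python 2-to-3 iterator-protocol gap: a class that defines next() but not __next__() is rejected by iter() on Python 3. A minimal sketch of the fix (the real NormalRates generates spike rates; a stored iterable stands in here):

```python
class NormalRates:
    # Sketch only: the real class generates rate values. The fix is to
    # define __next__ (Python 3's iterator protocol) and alias the old
    # next name to it for Python 2 compatibility.
    def __init__(self, values):
        self._it = iter(values)

    def __iter__(self):
        return self

    def __next__(self):
        return next(self._it)

    next = __next__  # Python 2 spelling of the same method
```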
Hi,
I'm sorry if this is the wrong place to post this problem.
I'm trying to reproduce the Model Response of this cell:
http://celltypes.brain-map.org/experiment/electrophysiology/490387590
I have downloaded the model, and I try to run it in BMTK, using the code below.
("fit_parameters.json", and "reconstruction.swc" were put in the folders where BMTK seemed to expect to find them)
In general, the results look good, and the simulated cell-behavior seems reasonable, except that the resting potential is very different. In my attempted recreation, the resting potential is -93 mV (which seems very low), while on the web page it seems it should be about -79 mV? I noticed that this difference, -14 mV, exactly equals the given 'junction_potential' in 'fit_parameters.json', but presumably I should not have to correct for the liquid junction potential in the simulations?
Any idea about what could be causing this?
'''
from bmtk.builder.networks import NetworkBuilder
net = NetworkBuilder("mcortex")
net.add_nodes(cell_name="491623973",
              potental="exc",
              model_type="biophysical",
              model_template="ctdb:Biophys1.hoc",
              model_processing="aibs_perisomatic",
              dynamics_params="fit_parameters.json",
              morphology="reconstruction.swc")
net.build()
net.save_nodes(output_dir="network")
for node in net.nodes():
    print(node)
from bmtk.utils.sim_setup import build_env_bionet
build_env_bionet(base_dir="neuronal_model",  # Where to save the scripts and config files
                 network_dir="network",      # Location of directory containing network files
                 tstop=1200.0, dt=0.1,       # Run a simulation for 1200 ms at 0.1 ms intervals
                 report_vars=["v"],          # Tells simulator we want to record membrane potential
                 current_clamp={             # Creates a step current from 100.0 ms to 1100.0 ms
                     "amp": 0.61,
                     "delay": 100.0,
                     "duration": 1000.0
                 },
                 include_examples=False,     # Copies components files
                 compile_mechanisms=False    # Will try to compile NEURON mechanisms
                 )
from bmtk.simulator import bionet
conf = bionet.Config.from_json("neuronal_model/simulation_config.json")
conf.build_env()
net = bionet.BioNetwork.from_config(conf)
sim = bionet.BioSimulator.from_config(conf, network=net)
sim.run()
from bmtk.analyzer.spike_trains import to_dataframe
to_dataframe(config_file="neuronal_model/simulation_config.json")
from bmtk.analyzer.cell_vars import plot_report
plot_report(config_file="neuronal_model/simulation_config.json")
'''
I'm attempting to install bmtk for the first time by cloning the git repo. When I do git clone https://github.com/AllenInstitute/bmtk.git, I get the following error:
fatal: cannot create directory at 'bmtk/builder/aux': Invalid argument warning: Clone succeeded, but checkout failed.
I even tried downloading the zip and extracting it, and the same thing happens. I checked around, and apparently that folder name isn't allowed on Windows ('aux' is a reserved device name). I'm going to try to work around it, but I thought I would mention it for others who use Windows.
Depending on the spike train buffer, the spikes may be a list instead of a numpy array. This causes the line assert(np.all(spikes >= 0)) to fail, because you cannot compare a list with an integer.
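A one-line fix sketch (the surrounding function is hypothetical): coerce to an ndarray before the comparison, so the assert works whether the buffer hands back a list or an array.

```python
import numpy as np

def validate_spikes(spikes):
    # np.asarray is a no-op for arrays and converts lists, so the
    # elementwise comparison below works for both input types.
    spikes = np.asarray(spikes)
    assert np.all(spikes >= 0)
    return spikes
```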
Tutorials 1-4 run fine but it appears that no spikes are being recorded by the simulator in tutorial 05. I'm familiar with NEST but not Sonata and bmtk so I haven't had much luck troubleshooting. This was on a clean conda environment on Linux x86-64 with Python 3.6.12, pandas 0.23.0 and h5py 2.7.0:
from bmtk.analyzer.spike_trains import plot_raster, plot_rates
plot_raster(config_file='sim_ch05/simulation_config.json', group_by='pop_name')
IndexError Traceback (most recent call last)
in
1 from bmtk.analyzer.spike_trains import plot_raster, plot_rates
2
----> 3 plot_raster(config_file='sim_ch05/simulation_config.json', group_by='pop_name')
~/anaconda3/envs/newenv/lib/python3.6/site-packages/bmtk/analyzer/spike_trains.py in plot_raster(config_file, population, with_histogram, times, title, show, group_by, group_excludes, spikes_file, nodes_file, node_types_file)
185 config_file=config_file, population=population, times=times, title=title, show=show,
186 group_by=group_by, group_excludes=group_excludes,
--> 187 spikes_file=spikes_file, nodes_file=nodes_file, node_types_file=node_types_file
188 )
189
~/anaconda3/envs/newenv/lib/python3.6/site-packages/bmtk/analyzer/spike_trains.py in _plot_helper(plot_fnc, config_file, population, times, title, show, group_by, group_excludes, spikes_file, nodes_file, node_types_file)
99 spikes_file=None, nodes_file=None, node_types_file=None):
100 sonata_config = SonataConfig.from_json(config_file) if config_file else None
--> 101 pop, spike_trains = _find_spikes(config_file=config_file, spikes_file=spikes_file, population=population)
102
103 # Create the title
~/anaconda3/envs/newenv/lib/python3.6/site-packages/bmtk/analyzer/spike_trains.py in _find_spikes(spikes_file, config_file, population)
76 raise ValueError('Spikes file {} contains more than one node population.'.format(spikes_f))
77 else:
---> 78 return spikes_obj.populations[0], spikes_obj
79
80
IndexError: list index out of range
The default value of None for the kwarg config_file to bmtk.utils.sim_setup.build_env_filternet is invalid and causes an error, since a string or Path is expected. I suggest only passing config_file to env_builder.build if it is not None, so that env_builder can specify its own correct default.
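A sketch of the suggested change (the signature and the builder call are assumptions for illustration): forward config_file only when the caller actually supplied one.

```python
def build_env_filternet(base_dir='.', config_file=None, **kwargs):
    # Only forward config_file when it was actually given, so the
    # underlying builder can apply its own (string) default.
    opts = dict(kwargs)
    if config_file is not None:
        opts['config_file'] = config_file
    # Stand-in for: env_builder.build(base_dir=base_dir, **opts)
    return opts
```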
On Windows, NEURON doesn't like the use of backslashes (\\) in paths, and we get the following error.
(clean) C:\Users\Tyler\Desktop\BMTK Morpho Work\hippocampus-bmtk\ca3>python run_bionet.py simulation_config.json
2018-12-29 00:38:09,774 [INFO] Created log file
loading stuff
C:\Users\Tyler\Anaconda3\envs\clean\lib\site-packages\bmtk-0.0.7-py3.7.egg\bmtk\simulator\bionet
NEURON: Can't open C:UsersTylerAnaconda3envscleanlibsite-packagemtk-0.0.7-py3.7.egmtksimulatoionetimport3d.hoc
near line 1
character \010 at position 54 is not printable
{xopen("C:UsersTylerAnaconda3envscleanlibsite-packagemtk-0.0.7-py3.7.egmtksimulatoionetimport3d.hoc")}
^
xopen("C:UsersTyl...")
execute1("{xopen("C:...")
load_file("C:\Users\T...")
C:\Users\Tyler\Anaconda3\envs\clean\lib\site-packages\bmtk-0.0.7-py3.7.egg\bmtk\simulator\bionet\import3d.hoc
NEURON: Can't open C:UsersTylerAnaconda3envscleanlibsite-packagemtk-0.0.7-py3.7.egmtksimulatoionetdefault_templatesadvance.hoc
near line 1
character \010 at position 54 is not printable
{xopen("C:UsersTylerAnaconda3envscleanlibsite-packagemtk-0.0.7-py3.7.egmtksimulatoionetdefault_templatesadvance.hoc")}
^
xopen("C:UsersTyl...")
execute1("{xopen("C:...")
load_file("C:\Users\T...")
NEURON: Import3d_SWC_read is not a template
in BioAxonStub.hoc near line 25
nl = new Import3d_SWC_read()
^
xopen("BioAxonStub.hoc")
execute1("{xopen("Bi...")
load_file("BioAxonStub.hoc")
NEURON: Import3d_SWC_read is not a template
in Biophys1.hoc near line 24
nl = new Import3d_SWC_read()
^
xopen("Biophys1.hoc")
execute1("{xopen("Bi...")
load_file("Biophys1.hoc")
I will follow up with a pull request for a possible solution.
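The fix can plausibly be as simple as normalizing separators before handing paths to hoc, since NEURON's load_file/xopen swallow the backslashes on Windows. A hedged sketch of the idea (not the actual pull request):

```python
import os

def hoc_path(*parts):
    # Join with the platform separator, then normalize to forward
    # slashes, which hoc handles on Windows as well as Unix.
    return os.path.join(*parts).replace('\\', '/')
```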
Is there already a solution to specify the model_processing method when using nml files? As mentioned in the comment here:
https://github.com/AllenInstitute/bmtk/blob/develop/bmtk/simulator/bionet/default_setters/cell_models.py#L399
The problem is that when one uses a single-compartment cell, the aibs_perisomatic model processing crashes because there is no axon. When specifying 'fullaxon' in the biophysical_node_types, it seems to be ignored (which is logical if one looks at the comment above).
e.g. if this value for dt:
https://github.com/AllenInstitute/sonata/blob/ad1d46a97029749aea835f9f016bfa1ccba1f258/examples/300_pointneurons/simulation_config.json#L10
is changed to a more realistic 0.01 ms, the following error occurs:
...
2019-04-29 19:27:47,332 [INFO] Setting up output directory
2019-04-29 19:27:47,332 [INFO] Building cells.
2019-04-29 19:27:47,343 [INFO] Building recurrent connections
2019-04-29 19:27:47,468 [INFO] Build virtual cell stimulations for external_spike_trains
Traceback (most recent call last):
File "run_pointnet.py", line 18, in <module>
main('config.json')
File "run_pointnet.py", line 11, in main
sim = pointnet.PointSimulator.from_config(configure, graph)
File "/home/padraig/anaconda2/lib/python2.7/site-packages/bmtk-0.0.7-py2.7.egg/bmtk/simulator/pointnet/pointsimulator.py", line 230, in from_config
graph.add_spike_trains(spikes, node_set)
File "/home/padraig/anaconda2/lib/python2.7/site-packages/bmtk-0.0.7-py2.7.egg/bmtk/simulator/pointnet/pointnetwork.py", line 155, in add_spike_trains
nest.SetStatus([nest_id], {'spike_times': np.array(spike_trains.get_spikes(node_id))})
File "/home/padraig/opt/nest/lib/python2.7/site-packages/nest/lib/hl_api_helper.py", line 230, in stack_checker_func
return f(*args, **kwargs)
File "/home/padraig/opt/nest/lib/python2.7/site-packages/nest/lib/hl_api_info.py", line 199, in SetStatus
sr('Transpose { arrayload pop SetStatus } forall')
File "/home/padraig/opt/nest/lib/python2.7/site-packages/nest/__init__.py", line 117, in catching_sli_run
raise _kernel.NESTError(encode(errorstring))
pynestkernel.NESTError: BadProperty in SetStatus_id: Setting status of a 'spike_generator' with GID 301: spike_generator: Time point 0.384322 is not representable in current resolution.
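A workaround sketch (not part of bmtk): round each spike time onto the simulation-resolution grid before handing it to NEST, and drop nonpositive times, so every time point is representable at the kernel resolution.

```python
import numpy as np

def snap_to_resolution(spike_times, dt):
    # NEST only accepts spike times representable at the kernel
    # resolution; snap each time to the nearest multiple of dt and
    # discard anything at or before t = 0.
    times = np.round(np.asarray(spike_times) / dt) * dt
    return times[times > 0]
```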
The most recent feature merge for recording synapse properties works great. Initial testing went fine until I had multiple synapse types on the postsynaptic cell. I started running into this issue:
(base) C:\Users\Tyler\Desktop\git_stage\hippocampus-bmtk\HummosBanks-bmtk>python run_bionet.py
C:\Users\Tyler\Anaconda3\lib\site-packages\h5py\__init__.py:34: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
from ._conv import register_converters as _register_converters
2019-03-19 17:57:25,900 [INFO] Created log file
2019-03-19 17:57:26,291 [INFO] Building cells.
2019-03-19 17:57:27,072 [INFO] Building recurrent connections
2019-03-19 17:57:36,431 [INFO] Building virtual cell stimulations for exc_spikes
2019-03-19 17:57:36,778 [INFO] Running simulation for 100.000 ms with the time step 0.100 ms
2019-03-19 17:57:36,778 [INFO] Starting timestep: 0 at t_sim: 0.000 ms
2019-03-19 17:57:36,778 [INFO] Block save every 1000 steps
Traceback (most recent call last):
File "C:\Users\Tyler\Anaconda3\lib\site-packages\bmtk-0.0.7-py3.6.egg\bmtk\simulator\bionet\biosimulator.py", line 253, in post_fadvance
mod.step(self, self.tstep)
File "C:\Users\Tyler\Anaconda3\lib\site-packages\bmtk-0.0.7-py3.6.egg\bmtk\simulator\bionet\modules\record_netcons.py", line 129, in step
self._var_recorder.record_cell(gid, var_name, syn_values, tstep)
File "C:\Users\Tyler\Anaconda3\lib\site-packages\bmtk-0.0.7-py3.6.egg\bmtk\utils\io\cell_vars.py", line 205, in record_cell
buffer_block[update_index, gid_beg:gid_end] = seg_vals
ValueError: cannot copy sequence with size 22 to array axis with dimension 25
NEURON: PyObject method call failed: post_fadvance
near line 0
{DA_IZH = 2}
^
advance()
step()
continuerun(100)
run(100)
Traceback (most recent call last):
File "run_bionet.py", line 94, in <module>
run('simulation_config.json')
File "run_bionet.py", line 86, in run
sim.run()
File "C:\Users\Tyler\Anaconda3\lib\site-packages\bmtk-0.0.7-py3.6.egg\bmtk\simulator\bionet\biosimulator.py", line 224, in run
h.run(h.tstop) # <- runs simuation: works in parallel
RuntimeError: hoc error
It appears there's an array mismatch in cell_vars.py brought on when registering cells in record_netcons.py.
I've got a followup pull request to fix it with your approval. (#69)
Update bmtk to support python3.
Hi,
I'm a new user of bmtk, and I want to know if it's possible to use an external function as the iterator when creating a new edge, so that we can have: net.add_edge(N=1, iterator=external_function, ...). If it's possible, what are the steps to follow?
Thank you
Could the ability to disable the log file be added to bmtk when specifying the 'output' block?
SONATA doesn't require generating such a log file; the only 'default' behavior defined in SONATA is that a spike output file is written.
The tutorial ipynb has some tutorials for PointNet and PopNet (Ch. 5 and Ch. 6) that I was interested in reading, but the links are broken.
Broken Links:
Part 1: Complete workflow tutorials
...
Chapter 5: Point-neuron simulation (with PointNet)
Chapter 6: Population-level simulation (with PopNet)
The tests and pycaches are unconventionally placed inside the source directory, and as such are packaged and installed with every pip install bmtk:
PS C:\Users\robin> pip show -f bmtk
Name: bmtk
Version: 0.0.7
Summary: Brain Modeling Toolkit
Home-page: https://github.com/AllenInstitute/bmtk
Author: Kael Dai
Author-email: [email protected]
License: UNKNOWN
Location: c:\users\robin\appdata\local\programs\python\python39\lib\site-packages
Requires: matplotlib, h5py, jsonschema, six, numpy, pandas
Required-by:
Files:
bmtk-0.0.7.dist-info\INSTALLER
bmtk-0.0.7.dist-info\LICENSE.txt
bmtk-0.0.7.dist-info\METADATA
bmtk-0.0.7.dist-info\RECORD
bmtk-0.0.7.dist-info\REQUESTED
bmtk-0.0.7.dist-info\WHEEL
bmtk-0.0.7.dist-info\top_level.txt
bmtk\__init__.py
bmtk\__pycache__\__init__.cpython-36.pyc
bmtk\__pycache__\__init__.cpython-39.pyc
bmtk\analyzer\__init__.py
bmtk\analyzer\__pycache__\__init__.cpython-39.pyc
bmtk\analyzer\__pycache__\cell_vars.cpython-39.pyc
bmtk\analyzer\__pycache__\firing_rates.cpython-39.pyc
bmtk\analyzer\__pycache__\io_tools.cpython-39.pyc
bmtk\analyzer\__pycache__\spike_trains.cpython-39.pyc
bmtk\analyzer\__pycache__\spikes_analyzer.cpython-39.pyc
bmtk\analyzer\__pycache__\spikes_loader.cpython-39.pyc
bmtk\analyzer\__pycache__\utils.cpython-39.pyc
bmtk\analyzer\cell_vars.py
bmtk\analyzer\firing_rates.py
bmtk\analyzer\io_tools.py
bmtk\analyzer\spike_trains.py
bmtk\analyzer\spikes_analyzer.py
bmtk\analyzer\spikes_loader.py
bmtk\analyzer\utils.py
bmtk\analyzer\visualization\__init__.py
bmtk\analyzer\visualization\__pycache__\__init__.cpython-39.pyc
bmtk\analyzer\visualization\__pycache__\rasters.cpython-39.pyc
bmtk\analyzer\visualization\__pycache__\spikes.cpython-39.pyc
bmtk\analyzer\visualization\__pycache__\widgets.cpython-39.pyc
bmtk\analyzer\visualization\rasters.py
bmtk\analyzer\visualization\spikes.py
bmtk\analyzer\visualization\widgets.py
bmtk\builder\__init__.py
bmtk\builder\__pycache__\__init__.cpython-39.pyc
bmtk\builder\__pycache__\connection_map.cpython-39.pyc
bmtk\builder\__pycache__\connector.cpython-39.pyc
bmtk\builder\__pycache__\edge.cpython-39.pyc
bmtk\builder\__pycache__\functor_cache.cpython-39.pyc
bmtk\builder\__pycache__\id_generator.cpython-39.pyc
bmtk\builder\__pycache__\iterator.cpython-39.pyc
bmtk\builder\__pycache__\network.cpython-39.pyc
bmtk\builder\__pycache__\node.cpython-39.pyc
bmtk\builder\__pycache__\node_pool.cpython-39.pyc
bmtk\builder\__pycache__\node_set.cpython-39.pyc
bmtk\builder\auxi\__init__.py
bmtk\builder\auxi\__pycache__\__init__.cpython-39.pyc
bmtk\builder\auxi\__pycache__\edge_connectors.cpython-39.pyc
bmtk\builder\auxi\__pycache__\node_params.cpython-39.pyc
bmtk\builder\auxi\edge_connectors.py
bmtk\builder\auxi\node_params.py
bmtk\builder\bionet\__init__.py
bmtk\builder\bionet\__pycache__\__init__.cpython-39.pyc
bmtk\builder\bionet\__pycache__\swc_reader.cpython-39.pyc
bmtk\builder\bionet\swc_reader.py
bmtk\builder\connection_map.py
bmtk\builder\connector.py
bmtk\builder\edge.py
bmtk\builder\formats\__init__.py
bmtk\builder\formats\__pycache__\__init__.cpython-39.pyc
bmtk\builder\formats\__pycache__\hdf5_format.cpython-39.pyc
bmtk\builder\formats\__pycache__\iformats.cpython-39.pyc
bmtk\builder\formats\hdf5_format.py
bmtk\builder\formats\iformats.py
bmtk\builder\functor_cache.py
bmtk\builder\id_generator.py
bmtk\builder\io\__init__.py
bmtk\builder\io\__pycache__\__init__.cpython-39.pyc
bmtk\builder\iterator.py
bmtk\builder\network.py
bmtk\builder\networks\__init__.py
bmtk\builder\networks\__pycache__\__init__.cpython-39.pyc
bmtk\builder\networks\__pycache__\dm_network.cpython-39.pyc
bmtk\builder\networks\__pycache__\input_network.cpython-39.pyc
bmtk\builder\networks\__pycache__\mpi_network.cpython-39.pyc
bmtk\builder\networks\__pycache__\nxnetwork.cpython-39.pyc
bmtk\builder\networks\__pycache__\sparse_network.cpython-39.pyc
bmtk\builder\networks\dm_network.py
bmtk\builder\networks\input_network.py
bmtk\builder\networks\mpi_network.py
bmtk\builder\networks\nxnetwork.py
bmtk\builder\networks\sparse_network.py
bmtk\builder\node.py
bmtk\builder\node_pool.py
bmtk\builder\node_set.py
bmtk\simulator\__init__.py
bmtk\simulator\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\bionet\README.md
bmtk\simulator\bionet\__init__.py
bmtk\simulator\bionet\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\biocell.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\bionetwork.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\biosimulator.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\cell.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\config.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\iclamp.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\io_tools.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\morphology.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\nml_reader.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\nrn.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\pointprocesscell.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\pointsomacell.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\pyfunction_cache.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\sonata_adaptors.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\utils.cpython-39.pyc
bmtk\simulator\bionet\__pycache__\virtualcell.cpython-39.pyc
bmtk\simulator\bionet\biocell.py
bmtk\simulator\bionet\bionetwork.py
bmtk\simulator\bionet\biosimulator.py
bmtk\simulator\bionet\cell.py
bmtk\simulator\bionet\config.py
bmtk\simulator\bionet\default_setters\__init__.py
bmtk\simulator\bionet\default_setters\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\bionet\default_setters\__pycache__\cell_models.cpython-39.pyc
bmtk\simulator\bionet\default_setters\__pycache__\synapse_models.cpython-39.pyc
bmtk\simulator\bionet\default_setters\__pycache__\synaptic_weights.cpython-39.pyc
bmtk\simulator\bionet\default_setters\cell_models.py
bmtk\simulator\bionet\default_setters\synapse_models.py
bmtk\simulator\bionet\default_setters\synaptic_weights.py
bmtk\simulator\bionet\default_templates\BioAxonStub.hoc
bmtk\simulator\bionet\default_templates\Biophys1.hoc
bmtk\simulator\bionet\default_templates\advance.hoc
bmtk\simulator\bionet\iclamp.py
bmtk\simulator\bionet\import3d.hoc
bmtk\simulator\bionet\import3d\import3d_gui.hoc
bmtk\simulator\bionet\import3d\import3d_sec.hoc
bmtk\simulator\bionet\import3d\read_morphml.hoc
bmtk\simulator\bionet\import3d\read_nlcda.hoc
bmtk\simulator\bionet\import3d\read_nlcda3.hoc
bmtk\simulator\bionet\import3d\read_nts.hoc
bmtk\simulator\bionet\import3d\read_swc.hoc
bmtk\simulator\bionet\io_tools.py
bmtk\simulator\bionet\modules\__init__.py
bmtk\simulator\bionet\modules\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\bionet\modules\__pycache__\ecp.cpython-39.pyc
bmtk\simulator\bionet\modules\__pycache__\record_cellvars.cpython-39.pyc
bmtk\simulator\bionet\modules\__pycache__\record_netcons.cpython-39.pyc
bmtk\simulator\bionet\modules\__pycache__\record_spikes.cpython-39.pyc
bmtk\simulator\bionet\modules\__pycache__\save_synapses.cpython-39.pyc
bmtk\simulator\bionet\modules\__pycache__\sim_module.cpython-39.pyc
bmtk\simulator\bionet\modules\__pycache__\xstim.cpython-39.pyc
bmtk\simulator\bionet\modules\__pycache__\xstim_waveforms.cpython-39.pyc
bmtk\simulator\bionet\modules\ecp.py
bmtk\simulator\bionet\modules\record_cellvars.py
bmtk\simulator\bionet\modules\record_netcons.py
bmtk\simulator\bionet\modules\record_spikes.py
bmtk\simulator\bionet\modules\save_synapses.py
bmtk\simulator\bionet\modules\sim_module.py
bmtk\simulator\bionet\modules\xstim.py
bmtk\simulator\bionet\modules\xstim_waveforms.py
bmtk\simulator\bionet\morphology.py
bmtk\simulator\bionet\nml_reader.py
bmtk\simulator\bionet\nrn.py
bmtk\simulator\bionet\pointprocesscell.py
bmtk\simulator\bionet\pointsomacell.py
bmtk\simulator\bionet\pyfunction_cache.py
bmtk\simulator\bionet\schemas\config_schema.json
bmtk\simulator\bionet\schemas\csv_edge_types.json
bmtk\simulator\bionet\schemas\csv_node_types_external.json
bmtk\simulator\bionet\schemas\csv_node_types_internal.json
bmtk\simulator\bionet\schemas\csv_nodes_external.json
bmtk\simulator\bionet\schemas\csv_nodes_internal.json
bmtk\simulator\bionet\sonata_adaptors.py
bmtk\simulator\bionet\utils.py
bmtk\simulator\bionet\virtualcell.py
bmtk\simulator\core\__init__.py
bmtk\simulator\core\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\core\__pycache__\config.cpython-39.pyc
bmtk\simulator\core\__pycache__\edge_population.cpython-39.pyc
bmtk\simulator\core\__pycache__\graph.cpython-39.pyc
bmtk\simulator\core\__pycache__\io_tools.cpython-39.pyc
bmtk\simulator\core\__pycache__\network_reader.cpython-39.pyc
bmtk\simulator\core\__pycache__\node_population.cpython-39.pyc
bmtk\simulator\core\__pycache__\node_sets.cpython-39.pyc
bmtk\simulator\core\__pycache__\simulator.cpython-39.pyc
bmtk\simulator\core\__pycache__\simulator_network.cpython-39.pyc
bmtk\simulator\core\config.py
bmtk\simulator\core\edge_population.py
bmtk\simulator\core\graph.py
bmtk\simulator\core\io_tools.py
bmtk\simulator\core\network_reader.py
bmtk\simulator\core\node_population.py
bmtk\simulator\core\node_sets.py
bmtk\simulator\core\simulator.py
bmtk\simulator\core\simulator_network.py
bmtk\simulator\core\sonata_reader\__init__.py
bmtk\simulator\core\sonata_reader\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\core\sonata_reader\__pycache__\edge_adaptor.cpython-39.pyc
bmtk\simulator\core\sonata_reader\__pycache__\network_reader.cpython-39.pyc
bmtk\simulator\core\sonata_reader\__pycache__\node_adaptor.cpython-39.pyc
bmtk\simulator\core\sonata_reader\edge_adaptor.py
bmtk\simulator\core\sonata_reader\network_reader.py
bmtk\simulator\core\sonata_reader\node_adaptor.py
bmtk\simulator\filternet\__init__.py
bmtk\simulator\filternet\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\filternet\__pycache__\cell.cpython-39.pyc
bmtk\simulator\filternet\__pycache__\cell_models.cpython-39.pyc
bmtk\simulator\filternet\__pycache__\config.cpython-39.pyc
bmtk\simulator\filternet\__pycache__\filternetwork.cpython-39.pyc
bmtk\simulator\filternet\__pycache__\filters.cpython-39.pyc
bmtk\simulator\filternet\__pycache__\filtersimulator.cpython-39.pyc
bmtk\simulator\filternet\__pycache__\io_tools.cpython-39.pyc
bmtk\simulator\filternet\__pycache__\pyfunction_cache.cpython-39.pyc
bmtk\simulator\filternet\__pycache__\sonata_adaptors.cpython-39.pyc
bmtk\simulator\filternet\__pycache__\transfer_functions.cpython-39.pyc
bmtk\simulator\filternet\__pycache__\utils.cpython-39.pyc
bmtk\simulator\filternet\cell.py
bmtk\simulator\filternet\cell_models.py
bmtk\simulator\filternet\config.py
bmtk\simulator\filternet\default_setters\__init__.py
bmtk\simulator\filternet\default_setters\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\filternet\default_setters\__pycache__\cell_loaders.cpython-39.pyc
bmtk\simulator\filternet\default_setters\cell_loaders.py
bmtk\simulator\filternet\filternetwork.py
bmtk\simulator\filternet\filters.py
bmtk\simulator\filternet\filtersimulator.py
bmtk\simulator\filternet\io_tools.py
bmtk\simulator\filternet\lgnmodel\__init__.py
bmtk\simulator\filternet\lgnmodel\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\cellmodel.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\cursor.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\fitfuns.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\kernel.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\lattice_unit_constructor.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\lgnmodel1.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\linearfilter.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\lnunit.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\make_cell_list.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\movie.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\poissongeneration.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\singleunitcell.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\spatialfilter.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\temporalfilter.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\transferfunction.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\util_fns.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\__pycache__\utilities.cpython-39.pyc
bmtk\simulator\filternet\lgnmodel\cell_metrics\sOFF_cell_data.csv
bmtk\simulator\filternet\lgnmodel\cell_metrics\sON_cell_data.csv
bmtk\simulator\filternet\lgnmodel\cell_metrics\sus_sus_cells_v3.csv
bmtk\simulator\filternet\lgnmodel\cell_metrics\tOFF_cell_data.csv
bmtk\simulator\filternet\lgnmodel\cell_metrics\tON_cell_data.csv
bmtk\simulator\filternet\lgnmodel\cell_metrics\trans_sus_cells_v3.csv
bmtk\simulator\filternet\lgnmodel\cellmodel.py
bmtk\simulator\filternet\lgnmodel\cursor.py
bmtk\simulator\filternet\lgnmodel\fitfuns.py
bmtk\simulator\filternet\lgnmodel\kernel.py
bmtk\simulator\filternet\lgnmodel\lattice_unit_constructor.py
bmtk\simulator\filternet\lgnmodel\lgnmodel1.py
bmtk\simulator\filternet\lgnmodel\linearfilter.py
bmtk\simulator\filternet\lgnmodel\lnunit.py
bmtk\simulator\filternet\lgnmodel\make_cell_list.py
bmtk\simulator\filternet\lgnmodel\movie.py
bmtk\simulator\filternet\lgnmodel\poissongeneration.py
bmtk\simulator\filternet\lgnmodel\singleunitcell.py
bmtk\simulator\filternet\lgnmodel\spatialfilter.py
bmtk\simulator\filternet\lgnmodel\temporalfilter.py
bmtk\simulator\filternet\lgnmodel\transferfunction.py
bmtk\simulator\filternet\lgnmodel\util_fns.py
bmtk\simulator\filternet\lgnmodel\utilities.py
bmtk\simulator\filternet\modules\__init__.py
bmtk\simulator\filternet\modules\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\filternet\modules\__pycache__\base.cpython-39.pyc
bmtk\simulator\filternet\modules\__pycache__\create_spikes.cpython-39.pyc
bmtk\simulator\filternet\modules\__pycache__\record_rates.cpython-39.pyc
bmtk\simulator\filternet\modules\base.py
bmtk\simulator\filternet\modules\create_spikes.py
bmtk\simulator\filternet\modules\record_rates.py
bmtk\simulator\filternet\pyfunction_cache.py
bmtk\simulator\filternet\sonata_adaptors.py
bmtk\simulator\filternet\transfer_functions.py
bmtk\simulator\filternet\utils.py
bmtk\simulator\mintnet\Image_Library.py
bmtk\simulator\mintnet\Image_Library_Supervised.py
bmtk\simulator\mintnet\__init__.py
bmtk\simulator\mintnet\__pycache__\Image_Library.cpython-39.pyc
bmtk\simulator\mintnet\__pycache__\Image_Library_Supervised.cpython-39.pyc
bmtk\simulator\mintnet\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\mintnet\analysis\LocallySparseNoise.py
bmtk\simulator\mintnet\analysis\StaticGratings.py
bmtk\simulator\mintnet\analysis\__init__.py
bmtk\simulator\mintnet\analysis\__pycache__\LocallySparseNoise.cpython-39.pyc
bmtk\simulator\mintnet\analysis\__pycache__\StaticGratings.cpython-39.pyc
bmtk\simulator\mintnet\analysis\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\mintnet\hmax\C_Layer.py
bmtk\simulator\mintnet\hmax\Readout_Layer.py
bmtk\simulator\mintnet\hmax\S1_Layer.py
bmtk\simulator\mintnet\hmax\S_Layer.py
bmtk\simulator\mintnet\hmax\Sb_Layer.py
bmtk\simulator\mintnet\hmax\ViewTunedLayer.py
bmtk\simulator\mintnet\hmax\__init__.py
bmtk\simulator\mintnet\hmax\__pycache__\C_Layer.cpython-39.pyc
bmtk\simulator\mintnet\hmax\__pycache__\Readout_Layer.cpython-39.pyc
bmtk\simulator\mintnet\hmax\__pycache__\S1_Layer.cpython-39.pyc
bmtk\simulator\mintnet\hmax\__pycache__\S_Layer.cpython-39.pyc
bmtk\simulator\mintnet\hmax\__pycache__\Sb_Layer.cpython-39.pyc
bmtk\simulator\mintnet\hmax\__pycache__\ViewTunedLayer.cpython-39.pyc
bmtk\simulator\mintnet\hmax\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\mintnet\hmax\__pycache__\hmax.cpython-39.pyc
bmtk\simulator\mintnet\hmax\hmax.py
bmtk\simulator\pointnet\__init__.py
bmtk\simulator\pointnet\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\pointnet\__pycache__\config.cpython-39.pyc
bmtk\simulator\pointnet\__pycache__\glif_utils.cpython-39.pyc
bmtk\simulator\pointnet\__pycache__\io_tools.cpython-39.pyc
bmtk\simulator\pointnet\__pycache__\pointnetwork.cpython-39.pyc
bmtk\simulator\pointnet\__pycache__\pointsimulator.cpython-39.pyc
bmtk\simulator\pointnet\__pycache__\property_map.cpython-39.pyc
bmtk\simulator\pointnet\__pycache__\pyfunction_cache.cpython-39.pyc
bmtk\simulator\pointnet\__pycache__\sonata_adaptors.cpython-39.pyc
bmtk\simulator\pointnet\__pycache__\utils.cpython-39.pyc
bmtk\simulator\pointnet\config.py
bmtk\simulator\pointnet\default_setters\__init__.py
bmtk\simulator\pointnet\default_setters\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\pointnet\default_setters\__pycache__\synapse_models.cpython-39.pyc
bmtk\simulator\pointnet\default_setters\__pycache__\synaptic_weights.cpython-39.pyc
bmtk\simulator\pointnet\default_setters\synapse_models.py
bmtk\simulator\pointnet\default_setters\synaptic_weights.py
bmtk\simulator\pointnet\glif_utils.py
bmtk\simulator\pointnet\io_tools.py
bmtk\simulator\pointnet\modules\__init__.py
bmtk\simulator\pointnet\modules\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\pointnet\modules\__pycache__\multimeter_reporter.cpython-39.pyc
bmtk\simulator\pointnet\modules\__pycache__\record_spikes.cpython-39.pyc
bmtk\simulator\pointnet\modules\multimeter_reporter.py
bmtk\simulator\pointnet\modules\record_spikes.py
bmtk\simulator\pointnet\pointnetwork.py
bmtk\simulator\pointnet\pointsimulator.py
bmtk\simulator\pointnet\property_map.py
bmtk\simulator\pointnet\pyfunction_cache.py
bmtk\simulator\pointnet\sonata_adaptors.py
bmtk\simulator\pointnet\utils.py
bmtk\simulator\popnet\__init__.py
bmtk\simulator\popnet\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\popnet\__pycache__\config.cpython-39.pyc
bmtk\simulator\popnet\__pycache__\popedge.cpython-39.pyc
bmtk\simulator\popnet\__pycache__\popnetwork.cpython-39.pyc
bmtk\simulator\popnet\__pycache__\popnetwork_OLD.cpython-39.pyc
bmtk\simulator\popnet\__pycache__\popnode.cpython-39.pyc
bmtk\simulator\popnet\__pycache__\popsimulator.cpython-39.pyc
bmtk\simulator\popnet\__pycache__\sonata_adaptors.cpython-39.pyc
bmtk\simulator\popnet\__pycache__\utils.cpython-39.pyc
bmtk\simulator\popnet\config.py
bmtk\simulator\popnet\popedge.py
bmtk\simulator\popnet\popnetwork.py
bmtk\simulator\popnet\popnetwork_OLD.py
bmtk\simulator\popnet\popnode.py
bmtk\simulator\popnet\popsimulator.py
bmtk\simulator\popnet\property_schemas\__init__.py
bmtk\simulator\popnet\property_schemas\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\popnet\property_schemas\__pycache__\base_schema.cpython-39.pyc
bmtk\simulator\popnet\property_schemas\__pycache__\property_schema_ver0.cpython-39.pyc
bmtk\simulator\popnet\property_schemas\__pycache__\property_schema_ver1.cpython-39.pyc
bmtk\simulator\popnet\property_schemas\base_schema.py
bmtk\simulator\popnet\property_schemas\property_schema_ver0.py
bmtk\simulator\popnet\property_schemas\property_schema_ver1.py
bmtk\simulator\popnet\sonata_adaptors.py
bmtk\simulator\popnet\utils.py
bmtk\simulator\utils\__init__.py
bmtk\simulator\utils\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\utils\__pycache__\config.cpython-39.pyc
bmtk\simulator\utils\__pycache__\graph.cpython-39.pyc
bmtk\simulator\utils\__pycache__\io.cpython-39.pyc
bmtk\simulator\utils\__pycache__\load_spikes.cpython-39.pyc
bmtk\simulator\utils\__pycache__\nwb.cpython-39.pyc
bmtk\simulator\utils\__pycache__\property_maps.cpython-39.pyc
bmtk\simulator\utils\__pycache__\sim_validator.cpython-39.pyc
bmtk\simulator\utils\__pycache__\simulation_inputs.cpython-39.pyc
bmtk\simulator\utils\__pycache__\simulation_reports.cpython-39.pyc
bmtk\simulator\utils\config.py
bmtk\simulator\utils\graph.py
bmtk\simulator\utils\io.py
bmtk\simulator\utils\load_spikes.py
bmtk\simulator\utils\nwb.py
bmtk\simulator\utils\property_maps.py
bmtk\simulator\utils\scripts\convert_filters.py
bmtk\simulator\utils\sim_validator.py
bmtk\simulator\utils\simulation_inputs.py
bmtk\simulator\utils\simulation_reports.py
bmtk\simulator\utils\stimulus\LocallySparseNoise.py
bmtk\simulator\utils\stimulus\NaturalScenes.py
bmtk\simulator\utils\stimulus\StaticGratings.py
bmtk\simulator\utils\stimulus\__init__.py
bmtk\simulator\utils\stimulus\__pycache__\LocallySparseNoise.cpython-39.pyc
bmtk\simulator\utils\stimulus\__pycache__\NaturalScenes.cpython-39.pyc
bmtk\simulator\utils\stimulus\__pycache__\StaticGratings.cpython-39.pyc
bmtk\simulator\utils\stimulus\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\utils\stimulus\lsn.npy
bmtk\simulator\utils\tools\__init__.py
bmtk\simulator\utils\tools\__pycache__\__init__.cpython-39.pyc
bmtk\simulator\utils\tools\__pycache__\process_spikes.cpython-39.pyc
bmtk\simulator\utils\tools\__pycache__\spatial.cpython-39.pyc
bmtk\simulator\utils\tools\process_spikes.py
bmtk\simulator\utils\tools\spatial.py
bmtk\tests\builder\__pycache__\test_connection_map.cpython-39.pyc
bmtk\tests\builder\__pycache__\test_connector.cpython-39.pyc
bmtk\tests\builder\__pycache__\test_densenetwork.cpython-39.pyc
bmtk\tests\builder\__pycache__\test_edge_iterator.cpython-39.pyc
bmtk\tests\builder\__pycache__\test_id_generator.cpython-39.pyc
bmtk\tests\builder\__pycache__\test_iterator.cpython-39.pyc
bmtk\tests\builder\__pycache__\test_node_pool.cpython-39.pyc
bmtk\tests\builder\__pycache__\test_node_set.cpython-39.pyc
bmtk\tests\builder\test_connection_map.py
bmtk\tests\builder\test_connector.py
bmtk\tests\builder\test_densenetwork.py
bmtk\tests\builder\test_edge_iterator.py
bmtk\tests\builder\test_id_generator.py
bmtk\tests\builder\test_iterator.py
bmtk\tests\builder\test_node_pool.py
bmtk\tests\builder\test_node_set.py
bmtk\tests\simulator\bionet\__pycache__\bionet_virtual_files.cpython-39.pyc
bmtk\tests\simulator\bionet\__pycache__\set_cell_params.cpython-39.pyc
bmtk\tests\simulator\bionet\__pycache__\set_syn_params.cpython-39.pyc
bmtk\tests\simulator\bionet\__pycache__\set_weights.cpython-39.pyc
bmtk\tests\simulator\bionet\__pycache__\test_biograph.cpython-39.pyc
bmtk\tests\simulator\bionet\__pycache__\test_nrn.cpython-39.pyc
bmtk\tests\simulator\bionet\bionet_virtual_files.py
bmtk\tests\simulator\bionet\set_cell_params.py
bmtk\tests\simulator\bionet\set_syn_params.py
bmtk\tests\simulator\bionet\set_weights.py
bmtk\tests\simulator\bionet\test_biograph.py
bmtk\tests\simulator\bionet\test_nrn.py
bmtk\tests\simulator\pointnet\__pycache__\pointnet_virtual_files.cpython-39.pyc
bmtk\tests\simulator\pointnet\__pycache__\test_pointgraph.cpython-39.pyc
bmtk\tests\simulator\pointnet\pointnet_virtual_files.py
bmtk\tests\simulator\pointnet\test_pointgraph.py
bmtk\tests\simulator\popnet\__pycache__\popnet_virtual_files.cpython-39.pyc
bmtk\tests\simulator\popnet\__pycache__\test_popgraph.cpython-39.pyc
bmtk\tests\simulator\popnet\popnet_virtual_files.py
bmtk\tests\simulator\popnet\test_popgraph.py
bmtk\tests\simulator\utils\__pycache__\test_config.cpython-39.pyc
bmtk\tests\simulator\utils\__pycache__\test_nwb.cpython-39.pyc
bmtk\tests\simulator\utils\files\circuit_config.json
bmtk\tests\simulator\utils\files\config.json
bmtk\tests\simulator\utils\files\simulator_config.json
bmtk\tests\simulator\utils\test_config.py
bmtk\tests\simulator\utils\test_nwb.py
bmtk\utils\__init__.py
bmtk\utils\__pycache__\__init__.cpython-39.pyc
bmtk\utils\__pycache__\property_schema.cpython-39.pyc
bmtk\utils\__pycache__\sim_setup.cpython-39.pyc
bmtk\utils\cell_vars\__init__.py
bmtk\utils\cell_vars\__pycache__\__init__.cpython-39.pyc
bmtk\utils\cell_vars\__pycache__\var_reader.cpython-39.pyc
bmtk\utils\cell_vars\var_reader.py
bmtk\utils\converters\__init__.py
bmtk\utils\converters\__pycache__\__init__.cpython-39.pyc
bmtk\utils\converters\__pycache__\hoc_converter.cpython-39.pyc
bmtk\utils\converters\hoc_converter.py
bmtk\utils\converters\sonata\__init__.py
bmtk\utils\converters\sonata\__pycache__\__init__.cpython-39.pyc
bmtk\utils\converters\sonata\__pycache__\edge_converters.cpython-39.pyc
bmtk\utils\converters\sonata\__pycache__\node_converters.cpython-39.pyc
bmtk\utils\converters\sonata\edge_converters.py
bmtk\utils\converters\sonata\node_converters.py
bmtk\utils\io\__init__.py
bmtk\utils\io\__pycache__\__init__.cpython-39.pyc
bmtk\utils\io\__pycache__\cell_vars.cpython-39.pyc
bmtk\utils\io\__pycache__\firing_rates.cpython-39.pyc
bmtk\utils\io\__pycache__\spike_trains.cpython-39.pyc
bmtk\utils\io\__pycache__\tabular_network.cpython-39.pyc
bmtk\utils\io\__pycache__\tabular_network_v0.cpython-39.pyc
bmtk\utils\io\__pycache__\tabular_network_v1.cpython-39.pyc
bmtk\utils\io\cell_vars.py
bmtk\utils\io\firing_rates.py
bmtk\utils\io\spike_trains.py
bmtk\utils\io\tabular_network.py
bmtk\utils\io\tabular_network_v0.py
bmtk\utils\io\tabular_network_v1.py
bmtk\utils\property_schema.py
bmtk\utils\scripts\bionet\__pycache__\run_bionet.cpython-39.pyc
bmtk\utils\scripts\bionet\default_config.json
bmtk\utils\scripts\bionet\hoc_templates\BioAllen_old.hoc
bmtk\utils\scripts\bionet\hoc_templates\BioAxonStub.hoc
bmtk\utils\scripts\bionet\hoc_templates\Biophys1.hoc
bmtk\utils\scripts\bionet\intfire\IntFire1_exc_1.json
bmtk\utils\scripts\bionet\intfire\IntFire1_inh_1.json
bmtk\utils\scripts\bionet\mechanisms\modfiles\CaDynamics.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\Ca_HVA.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\Ca_LVA.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\Ih.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\Im.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\Im_v2.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\K_P.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\K_T.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\Kd.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\Kv2like.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\Kv3_1.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\NaTa.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\NaTs.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\NaV.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\Nap.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\SK.mod
bmtk\utils\scripts\bionet\mechanisms\modfiles\vecevent.mod
bmtk\utils\scripts\bionet\point_neuron_templates\IntFire1_exc_1.json
bmtk\utils\scripts\bionet\point_neuron_templates\IntFire1_inh_1.json
bmtk\utils\scripts\bionet\run_bionet.py
bmtk\utils\scripts\bionet\synaptic_models\AMPA_ExcToExc.json
bmtk\utils\scripts\bionet\synaptic_models\AMPA_ExcToInh.json
bmtk\utils\scripts\bionet\synaptic_models\GABA_InhToExc.json
bmtk\utils\scripts\bionet\synaptic_models\GABA_InhToInh.json
bmtk\utils\scripts\bionet\synaptic_models\instanteneousExc.json
bmtk\utils\scripts\bionet\synaptic_models\instanteneousInh.json
bmtk\utils\scripts\pointnet\__pycache__\run_pointnet.cpython-39.pyc
bmtk\utils\scripts\pointnet\point_neuron_templates\472363762_point.json
bmtk\utils\scripts\pointnet\point_neuron_templates\472912177_point.json
bmtk\utils\scripts\pointnet\point_neuron_templates\473862421_point.json
bmtk\utils\scripts\pointnet\point_neuron_templates\473863035_point.json
bmtk\utils\scripts\pointnet\point_neuron_templates\473863510_point.json
bmtk\utils\scripts\pointnet\point_neuron_templates\IntFire1_exc_point.json
bmtk\utils\scripts\pointnet\point_neuron_templates\IntFire1_inh_point.json
bmtk\utils\scripts\pointnet\point_neuron_templates\filter_point.json
bmtk\utils\scripts\pointnet\run_pointnet.py
bmtk\utils\scripts\pointnet\synaptic_models\ExcToExc.json
bmtk\utils\scripts\pointnet\synaptic_models\ExcToInh.json
bmtk\utils\scripts\pointnet\synaptic_models\InhToExc.json
bmtk\utils\scripts\pointnet\synaptic_models\InhToInh.json
bmtk\utils\scripts\pointnet\synaptic_models\instanteneousExc.json
bmtk\utils\scripts\pointnet\synaptic_models\instanteneousInh.json
bmtk\utils\scripts\popnet\__pycache__\run_popnet.cpython-39.pyc
bmtk\utils\scripts\popnet\population_models\exc_model.json
bmtk\utils\scripts\popnet\population_models\inh_model.json
bmtk\utils\scripts\popnet\run_popnet.py
bmtk\utils\scripts\popnet\synaptic_models\ExcToExc.json
bmtk\utils\scripts\popnet\synaptic_models\ExcToInh.json
bmtk\utils\scripts\popnet\synaptic_models\InhToExc.json
bmtk\utils\scripts\popnet\synaptic_models\InhToInh.json
bmtk\utils\scripts\popnet\synaptic_models\input_ExcToExc.json
bmtk\utils\scripts\popnet\synaptic_models\input_ExcToInh.json
bmtk\utils\scripts\sonata.circuit_config.json
bmtk\utils\scripts\sonata.simulation_config.json
bmtk\utils\sim_setup.py
bmtk\utils\sonata\__init__.py
bmtk\utils\sonata\__pycache__\__init__.cpython-39.pyc
bmtk\utils\sonata\__pycache__\column_property.cpython-39.pyc
bmtk\utils\sonata\__pycache__\config.cpython-39.pyc
bmtk\utils\sonata\__pycache__\edge.cpython-39.pyc
bmtk\utils\sonata\__pycache__\file.cpython-39.pyc
bmtk\utils\sonata\__pycache__\file_root.cpython-39.pyc
bmtk\utils\sonata\__pycache__\group.cpython-39.pyc
bmtk\utils\sonata\__pycache__\node.cpython-39.pyc
bmtk\utils\sonata\__pycache__\population.cpython-39.pyc
bmtk\utils\sonata\__pycache__\types_table.cpython-39.pyc
bmtk\utils\sonata\__pycache__\utils.cpython-39.pyc
bmtk\utils\sonata\column_property.py
bmtk\utils\sonata\config.py
bmtk\utils\sonata\edge.py
bmtk\utils\sonata\file.py
bmtk\utils\sonata\file_root.py
bmtk\utils\sonata\group.py
bmtk\utils\sonata\node.py
bmtk\utils\sonata\population.py
bmtk\utils\sonata\types_table.py
bmtk\utils\sonata\utils.py
bmtk\utils\spike_trains\__init__.py
bmtk\utils\spike_trains\__pycache__\__init__.cpython-39.pyc
bmtk\utils\spike_trains\__pycache__\spikes_csv.cpython-39.pyc
bmtk\utils\spike_trains\__pycache__\spikes_file.cpython-39.pyc
bmtk\utils\spike_trains\spikes_csv.py
bmtk\utils\spike_trains\spikes_file.py
When I run the code in the tutorials, I keep running into the error: astype() got an unexpected keyword argument 'copy'. I have no idea how to solve it. I read some reports that h5py and pandas should be downgraded to 2.8 and 0.23 respectively, but when I tried that I encountered another problem: "Error: Fail to build wheel for pandas"
When the PoissonSpikeGenerator is used in both tutorials, the numbers passed in for times and firing_rate are in the wrong units. The problem is that when the spikes are eventually played into the vecstim in the virtual cells, the times are interpreted as ms instead of seconds. Therefore, the start and stop passed into times should be in ms, and the firing_rate should be in kHz instead of Hz.
For example, in tutorial 2, the spike train is supposed to be 3 seconds long with spikes firing at 10 Hz. In the tutorial they use firing_rate=10.0 and times=(0.0, 3.0), but what they really want is firing_rate=0.01 and times=(0.0, 3000.0).
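A minimal helper expressing the conversion described above (an illustrative sketch, not part of bmtk): take the values the tutorial intends, in Hz and seconds, and return what the simulation actually interprets, kHz and ms.

```python
def to_neuron_units(firing_rate_hz, times_s):
    """Convert (Hz, seconds) into the (kHz, ms) the vecstim expects."""
    return firing_rate_hz / 1000.0, tuple(t * 1000.0 for t in times_s)

# Tutorial 2's intended 10 Hz over (0, 3) seconds becomes:
rate_khz, times_ms = to_neuron_units(10.0, (0.0, 3.0))
```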
Error received:
>>> raster_plot(cells_file='network/hippocampus_nodes.h5', cell_models_file='network/hippocampus_node_types.csv', spikes_file='output/spikes.h5')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Users\user\AppData\Local\Continuum\anaconda3\lib\site-packages\bmtk-0.0.7-py3.6.egg\bmtk\analyzer\visualization\spikes.py", line 148, in plot_spikes
population = list(cells_h5['/nodes'].keys())
TypeError: 'KeysView' object does not support indexing
The root of the problem seems to be that bmtk.analyzer.visualization.spikes.plot_spikes is not using Python 3-friendly h5py syntax.
Changing line 148 from
population = cells_h5['/nodes'].keys()[0]
to
population = list(cells_h5['/nodes'])[0]
will prevent this error, but I'm not sure this will hold in Python 2.
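For context, the same failure can be reproduced with any Python 3 mapping; a minimal stdlib sketch (independent of h5py, whose groups behave like dicts for this purpose):

```python
d = {'pop_a': 1, 'pop_b': 2}

# Python 3: .keys() returns a view object, which cannot be indexed.
try:
    d.keys()[0]
except TypeError:
    pass  # 'dict_keys' object is not subscriptable

# Iterating the mapping directly works on both Python 2 and Python 3:
first = list(d)[0]
```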
As mentioned here: #107 (comment) there is still an issue with pointsimulator.py with the new code for adding currents. Example error (running https://github.com/AllenInstitute/bmtk/blob/develop/docs/examples/point_iclamp/run_pointnet.py):
2019-12-12 16:20:21,171 [INFO] Building recurrent connections
Traceback (most recent call last):
File "run_pointnet.py", line 14, in <module>
main('config.json')
File "run_pointnet.py", line 9, in main
sim = pointnet.PointSimulator.from_config(configure, network)
File "/home/padraig/git/bmtk_ai/bmtk/simulator/pointnet/pointsimulator.py", line 266, in from_config
network.add_step_currents(amp_times, amp_values, node_set, sim_input.name)
File "/home/padraig/git/bmtk_ai/bmtk/simulator/pointnet/pointsimulator.py", line 153, in add_step_currents
nest_ids = [self.net._gid2nestid[gid] for gid in node_set.gids()]
It's just a suggestion, but wouldn't it make sense to install run_bionet.py, run_popnet.py etc. as command-line tools when pip-installing bmtk? (Or perhaps wrap everything in one command-line tool, e.g. `bmtk`, with `popnet`/`bionet` etc. as arguments.)
As far as I can see, these scripts get reused a lot without modification and always have to be present in the project directories. Having them available immediately would make things easier.
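One way the suggestion could look: a single `bmtk` console command exposed via a setuptools entry point, dispatching on the simulator name. This is a hypothetical sketch, not an existing bmtk CLI; the subcommand names and the `bmtk.cli:main` entry point are assumptions.

```python
import argparse

# Hypothetical dispatcher for a unified `bmtk` command. setup.py would
# declare: entry_points={'console_scripts': ['bmtk=bmtk.cli:main']}
def main(argv=None):
    parser = argparse.ArgumentParser(prog='bmtk')
    parser.add_argument('simulator',
                        choices=['bionet', 'pointnet', 'popnet', 'filternet'],
                        help='which simulator backend to run')
    parser.add_argument('config',
                        help='path to the SONATA simulation config, e.g. config.json')
    args = parser.parse_args(argv)
    # A real implementation would import the backend and call its
    # from_config()/run() pipeline here; this sketch just echoes the choice.
    return args.simulator, args.config

print(main(['bionet', 'config.json']))
```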
Hello, it would be great to include the possibility of specifying different sections of the "source" internal nodes (e.g. dendrites as opposed to axons), just as target_sections does when establishing connectivity rules. This would allow for dendrodendritic synapses.
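The request could mirror the existing postsynaptic option. In the edge-type parameter dict below, `target_sections` is the parameter bmtk's NetworkBuilder.add_edges accepts today; `source_sections` is the *proposed*, hypothetical counterpart for the presynaptic side:

```python
# Sketch of edge-type parameters for a dendrodendritic connection rule.
# 'source_sections' is hypothetical -- it is the feature being requested,
# not an existing bmtk parameter.
edge_params = {
    'source': {'ei': 'e'},
    'target': {'ei': 'i'},
    'connection_rule': 5,                    # e.g. fixed number of synapses
    'syn_weight': 5e-05,
    'target_sections': ['basal', 'apical'],  # existing: postsynaptic sections
    'source_sections': ['apical'],           # proposed: presynaptic sections
    'distance_range': [0.0, 150.0],
}
print(sorted(edge_params))
```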
FilterNet tutorial is incomplete
For BioNet, implement voltage-clamping mechanisms. See the SEClamp input stimulus in the SONATA documentation for examples of how it should be instantiated.
Currently BMTK will delete all files & subfolders in the output_dir specified in a SONATA config file:
"output":{
"output_dir": "$OUTPUT_DIR",
...
}
including presumably the user's home directory if they specify that...
https://github.com/AllenInstitute/bmtk/blob/develop/bmtk/simulator/core/io_tools.py#L73
It would be better to delete only the files BMTK expects to generate (as the spec suggests: "An optional attribute named "overwrite_output_dir" provides a hint to simulators to let them know if already existing output files must be overwritten."), or better still, to warn the user that the directory contains files BMTK didn't create.
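A safer cleanup along those lines might remove only the files the simulator expects to regenerate and report anything else back to the caller. This is a sketch; the expected-output file list is an assumption based on typical bmtk output names, not something the SONATA spec enumerates.

```python
import os

# Files the simulator is expected to (re)generate -- an assumption for
# this sketch, not an exhaustive list from the SONATA spec.
EXPECTED_OUTPUTS = {'log.txt', 'spikes.h5', 'spikes.csv', 'ecp.h5'}

def clean_output_dir(output_dir, overwrite=True):
    """Delete only known simulator outputs in output_dir.

    Returns the names of files/dirs that were left untouched, so the
    caller can warn the user instead of silently wiping their data.
    """
    if not os.path.isdir(output_dir):
        os.makedirs(output_dir)
        return []
    leftovers = []
    for name in os.listdir(output_dir):
        path = os.path.join(output_dir, name)
        if overwrite and name in EXPECTED_OUTPUTS and os.path.isfile(path):
            os.remove(path)
        else:
            leftovers.append(name)
    return leftovers
```

With this approach, pointing output_dir at a home directory would leave the user's files in place and merely report them, instead of deleting the whole tree.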
Hello,
I am currently packaging bmtk for Debian Med. It seems the last release tarball was on 5 May 2019. We prefer to always package a release tarball rather than a snapshot straight from the git repository. Unfortunately, with the latest release tarball now close to a year old and a couple of hundred commits behind the current tree, there is more value in packaging the latest version than the year-old tarball.
Could you please consider prioritising a new release tarball?
Kind regards,
Shayan Doust
There is no file called "318331342_fit.json", and even if I change it to an existing file, I still get the error TypeError: __init__() got multiple values for keyword argument 'network'
at line 24: sim = Simulation(conf, network=net)  # initialize a simulation
I asked this question as a followup from AllenInstitute/AllenSDK#107
I'm running the bmtk chapter 4 tutorial and using a data reporting method from the bmtk-howto from @tjbanks. However, what I need is an output of all the dynamic variables at each time step. I don't see how to get access to more than just a few variables or spike times. Is there a way to get a complete dump of the ODE output?
For reference, here is my configuration file:
{
  "target_simulator": "NEURON",
  "run": {
    "tstop": 3000.0,
    "dt": 0.1,
    "dL": 20.0,
    "spike_threshold": -15.0,
    "nsteps_block": 5000
  },
  "conditions": {
    "celsius": 34.0,
    "v_init": -80.0
  },
  "inputs": {
    "LGN_spikes": {
      "input_type": "spikes",
      "module": "nwb",
      "input_file": "./lgn_spikes.nwb",
      "node_set": "LGN",
      "trial": "trial_0"
    }
  },
  "output": {
    "log_file": "log.txt",
    "output_dir": "./output",
    "spikes_file": "spikes.h5",
    "spikes_file_csv": "spikes.csv",
    "overwrite_output_dir": true,
    "cell_vars_dir": "./output/cellvars"
  },
  "reports": {
    "membrane_report": {
      "cells": [10, 80],
      "sections": "soma",
      "module": "membrane_report",
      "variable_name": ["v", "cai"]
    },
    "ecp": {
      "cells": [10, 80],
      "variable_name": "v",
      "module": "extracellular",
      "electrode_positions": "../components/recXelectrodes/linear_electrode.csv",
      "file_name": "ecp.h5",
      "contributions_dir": "ecp_contributions"
    }
  },
  "recXelectrode": {
    "positions": "$COMPONENT_DIR/recXelectrodes/linear_electrode.csv"
  },
  "network": "./circuit_config.json",
  "config_path": "/Users/chris/Documents/Brown/Research/v1corticalcol/sim_ch04/simulation_config.json",
  "config_dir": "/Users/chris/Documents/Brown/Research/v1corticalcol/sim_ch04",
  "components": {
    "morphologies_dir": "./biophys_components/morphologies",
    "synaptic_models_dir": "./biophys_components/synaptic_models",
    "mechanisms_dir": "./biophys_components/mechanisms",
    "biophysical_neuron_models_dir": "./biophys_components/biophysical_neuron_templates",
    "point_neuron_models_dir": "./biophys_components/point_neuron_templates"
  },
  "networks": {
    "nodes": [
      {
        "node_types_file": "./network/LGN_node_types.csv",
        "nodes_file": "./network/LGN_nodes.h5"
      },
      {
        "node_types_file": "./network/V1_node_types.csv",
        "nodes_file": "./network/V1_nodes.h5"
      }
    ],
    "edges": [
      {
        "edge_types_file": "./network/V1_V1_edge_types.csv",
        "edges_file": "./network/V1_V1_edges.h5"
      },
      {
        "edge_types_file": "./network/LGN_V1_edge_types.csv",
        "edges_file": "./network/LGN_V1_edges.h5"
      }
    ]
  }
}
The NEURON VecStim definition, from vecevent.mod, is required to run a network with artificial cells created by bmtk. It would be great if it could be compiled and added to the path automatically, either on installation or when running build_env_bionet or similar (i.e. compile not only the modfiles in the local network directory but also some core modfiles in a central directory).
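Automatic compilation could be a small helper run at environment-setup time. The function below is a hypothetical sketch (not part of bmtk): it invokes NEURON's nrnivmodl in a directory of .mod files if the tool is on the PATH, and degrades gracefully if NEURON is not installed.

```python
import shutil
import subprocess

def compile_mechanisms(mod_dir):
    """Compile NEURON .mod files (e.g. vecevent.mod) in mod_dir with nrnivmodl.

    Hypothetical helper sketching how core modfiles could be built
    automatically at setup time. Returns True on success, False if
    nrnivmodl is not on the PATH or compilation fails.
    """
    nrnivmodl = shutil.which('nrnivmodl')
    if nrnivmodl is None:
        # NEURON not installed (or not on PATH); leave it to the user.
        return False
    result = subprocess.run([nrnivmodl], cwd=mod_dir)
    return result.returncode == 0
```

A build_env_bionet-style script could call this once for the central modfile directory and once for the project-local one, so VecStim is always available without manual compilation.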