Comments (9)
I thought the Gaussian falloff was for local network connection weights, but that synchronous evoked inputs targeted all relevant cells (according to their type) at the same time and same strength. Is that not correct?
The Gaussian fall-off is controlled by the "lamtha" parameter. For rhythmic inputs it seems to be set at 100, and for evoked inputs it's set at 3. So what @stephanie-r-jones is saying seems correct. We need to make this stuff more transparent, but we're getting there slowly!
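To make those numbers concrete, here is a minimal sketch of the effect being described. It assumes the common HNN convention that a drive's synaptic weight is scaled by w(d) = exp(-(d / lamtha)^2), where d is the distance from the drive's target and lamtha is the space constant; the helper name is hypothetical and the units (grid spacing between cells) are an assumption, not hnn-core's actual implementation.

```python
import math

def gaussian_weight(distance, lamtha):
    """Scale a synaptic weight by distance, assuming the Gaussian
    fall-off w(d) = exp(-(d / lamtha) ** 2).

    Hypothetical helper for illustration; `lamtha` is the space
    constant in the same units as `distance`.
    """
    return math.exp(-(distance / lamtha) ** 2)

# With lamtha=100 (rhythmic inputs), even cells several grid units away
# receive nearly the full weight; with lamtha=3 (evoked inputs), the
# weight decays sharply with distance.
for d in (0, 3, 10):
    print(d, gaussian_weight(d, 100), gaussian_weight(d, 3))
```

This is why lamtha=100 behaves almost like "same strength everywhere" while lamtha=3 concentrates the drive near its target.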
Sure enough. Maybe we can add lamtha as an optional argument in our updated feed-creation API.
from hnn-core.
@rythorpe I'm closing this issue. It seems to be resolved now that we have a new feed-creation API, in which lamtha is exposed under a different name, space_constant. Please feel free to reopen if you disagree.
I just observed this too with the default parameter set as a starting point, where rhythmic inputs appear to have more of an effect. The code for generating inputs could use a lot of improvement. There aren't any methods for inspecting the generated rhythmic feeds in feed.py right now. If you wanted to add a method that allows comparison at the Python level, we could definitely reuse that code as a test once integrated with hnn-core.
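A Python-level comparison of the kind suggested above could be sketched roughly as follows. This is a hypothetical inspection helper, not hnn-core's feed code: it assumes a square cell grid, a drive centered on the network, and the Gaussian fall-off w = exp(-(d / lamtha)^2).

```python
import math

def relative_drive_profile(n_cells_per_side, lamtha):
    """Per-cell weight scaling for a drive centered on an n x n grid.

    Hypothetical sketch assuming w(d) = exp(-(d / lamtha) ** 2);
    hnn-core's actual feed-generation code may differ.
    """
    center = (n_cells_per_side - 1) / 2
    profile = {}
    for x in range(n_cells_per_side):
        for y in range(n_cells_per_side):
            d = math.hypot(x - center, y - center)
            profile[(x, y)] = math.exp(-(d / lamtha) ** 2)
    return profile

evoked = relative_drive_profile(10, lamtha=3)
rhythmic = relative_drive_profile(10, lamtha=100)

# The rhythmic drive hits edge cells almost as hard as central ones
# (min/max ratio near 1); the evoked drive is concentrated near the
# center of the network (min/max ratio much smaller).
print(min(rhythmic.values()) / max(rhythmic.values()))
print(min(evoked.values()) / max(evoked.values()))
```

Comparing these profiles side by side would make the rhythmic-vs-evoked spatial difference easy to verify in a test.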
One important difference between the evoked and rhythmic inputs is the spatial spread of the input in the network. If I recall correctly, the evoked input targets the center of the network with a Gaussian fall-off, while the rhythmic input has a much wider fall-off and contacts all of the cells with nearly equal strength. There is also the issue of input synchrony: evoked inputs have the option of being delivered synchronously or asynchronously to the cells, while this is not an option for rhythmic inputs. We need to clarify this on the website.
@rythorpe I believe you are the best person to say what work remains to be done with this or if it will be part of hnn-core. Could you update this issue?
@blakecaldwell I think it will be much more intuitive to address this issue in hnn-core. Most of it can be addressed as we update the API.
Closed in favor of #114
Related Issues (20)
- BUG: MPI simulations break for `dt` of certain size
- Upload data error - HNN gui
- Clean up optimization API
- Improve error messages for adding drives
- rename simulate_dipole -> simulate
- Proposed Enhancements for the Current GUI: A Refined Feature List
- change name from calcium model to Kohl_2022
- [JOSS Review] HNN-core: A Python software for cellular and circuit-level interpretation of human MEG/EEG
- [JOSS] Documentation
- Optimization example error
- Problems using non-'soma' values for record_isec argument in simulate_dipole()
- [JOSS] Software Paper
- pre-allocate arrays for storing continuous simulation data in network_builder.py
- issue with GUI install
- GUI does not show dipole plots
- installation on m2 mac
- [BUG] `plot_dipole` not showing in GUI with `matplotlib>=3.8.0`
- tests: add axis data checks for all plots available in GUI
- BUG: Deleting drives in GUI after file upload prevents loading of the same file
- GUI callbacks error messages are not logged