
Comments (14)

golosio commented on August 19, 2024

That's exactly what I was looking for, thank you! I see now that that piece of code checks whether the npy files exist and, if not, reads the gdf files and converts them into npy files. Sorry, from your previous answer I thought the exception was just there to give an error message; I did not realize that it could also create the npy files.

from multi-area-model.

AlexVanMeegen commented on August 19, 2024

Apologies for the late answer; I hope it is still relevant. Yes, you are right: one first has to instantiate the class with analysis=False, run the simulation, and then instantiate the class with analysis=True. If you are running the simulation on HPC, I think the most sensible option is to use two scripts. Alternatively, you could adjust run_simulation.py and add a line at the end that instantiates the class with analysis=True. However, this has the drawback that the analysis gets executed within the same job as the simulation, although its hardware requirements are completely different.

All handling of labels and paths should be done under the hood. Thus, after running the simulation, you can instantiate

M = MultiAreaModel(network_label, simulation=True, sim_spec=simulation_label, analysis=True)

in the 'second script'. Put differently, the intended way to handle labels is to pass them to the MultiAreaModel class.
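As a generic sketch of that two-script hand-off (the file name and helper functions here are hypothetical, not part of the model's API), the simulation script can persist its labels to a small file that the analysis script reads back:

```python
import json
from pathlib import Path

# Hypothetical hand-off file shared between the two jobs.
LABEL_FILE = Path("last_simulation_labels.json")

def save_labels(network_label, simulation_label, path=LABEL_FILE):
    """Script 1 (simulation job): record the labels after the run."""
    path.write_text(json.dumps({"network": network_label,
                                "simulation": simulation_label}))

def load_labels(path=LABEL_FILE):
    """Script 2 (analysis job): recover the labels of the finished run."""
    labels = json.loads(path.read_text())
    return labels["network"], labels["simulation"]

# In the analysis script one would then instantiate, e.g.:
# network_label, simulation_label = load_labels()
# M = MultiAreaModel(network_label, simulation=True,
#                    sim_spec=simulation_label, analysis=True)
```

Any mechanism that carries the two label strings from the first job to the second works equally well; the point is only that the analysis script passes them to MultiAreaModel rather than reconstructing paths by hand.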

AlexVanMeegen commented on August 19, 2024

For a minimal example (in one script), see e.g. test_analysis.py.

golosio commented on August 19, 2024

Thank you for your answer @AlexVanMeegen. We were able to run the simulations on the JUSUF cluster. I'll try to run the analysis following your instructions.

AlexVanMeegen commented on August 19, 2024

Thanks again for reporting this! Your first fix looks good to me, I'll correct this with the default parameter of K_stable (of course you are also most welcome to make a PR if you want to!).

Regarding the follow-up issue: did you run the required simulations as specified in the README? Naively, it looks to me like the simulation output is missing. Just to prevent problems: for the simulations, you need access to HPC resources; in particular, you need a combined ~1.5 TB of RAM to instantiate the connectivity in memory. As an alternative, the original simulation data is available.

golosio commented on August 19, 2024

Thank you @AlexVanMeegen! I ran the downscaled simulation (run_example_downscaled.py) on a workstation. I am of course aware that it will not produce realistic distributions, but isn't it still suitable for producing the plots, even if they are not realistic?
I just started running full-scale simulations on the JUSUF cluster in Juelich; I will let you know whether those simulations go well. However, there is something that I still do not understand. The simulations seem to write the spikes to gdf files, e.g.
a263375d575756279cc32ea0e049e1ae-spikes-00001-06.gdf
where 06 is the local thread index, while the scripts in the figures/Schmidt2018_dyn folder look for spike recordings in npy format, such as spikes_V1_6E.npy.
I could not find any piece of code that extracts the spikes from the gdf files and converts them into the spikes_{area}_{pop}.npy format...

AlexVanMeegen commented on August 19, 2024

Ah, I see. In principle, I agree it could also work with a downscaled simulation, but I haven't tested that. I would have to check in detail how much is baked into the snakemake workflow.

Regarding your question: this exception is raised if no npy files are found. First, this is probably not documented properly; can you pinpoint a place where you would have expected this information? Second, and I assume this was your point, it does not create recordings/spikes_{area}_{pop}.npy files but recordings/{hash}-spikes-{area}-{pop}.npy files. Do you have those for the downscaled simulation?

golosio commented on August 19, 2024

Thank you again @AlexVanMeegen. I will give you my suggestions about the documentation as soon as I have some results from the full-scale simulation; first I need to understand the differences between the output in the two cases, downscaled and full-scale.
I do not have the recordings/{hash}-spikes-{area}-{pop}.npy files for the downscaled simulation, just the gdf files.
My question is: which piece of code produces those recordings/{hash}-spikes-{area}-{pop}.npy files? I was able to find scripts that try to read this kind of file, but not the code that should produce them.

AlexVanMeegen commented on August 19, 2024

The exception that I linked above should produce the npy files; here is the final np.save. Or am I missing your point?
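For illustration, such a conversion can be sketched in a few lines. This is a generic example, not the repository's implementation: it assumes each gdf line holds a neuron GID and a spike time (NEST's plain-text spike format), and the file names are made up.

```python
import glob

import numpy as np

def gdf_to_npy(gdf_pattern, npy_path):
    """Merge per-thread gdf spike files (columns: gid, time)
    into a single npy array, sorted by spike time."""
    parts = [np.loadtxt(f, ndmin=2) for f in sorted(glob.glob(gdf_pattern))]
    spikes = np.concatenate(parts) if parts else np.empty((0, 2))
    spikes = spikes[np.argsort(spikes[:, 1])]  # sort by spike time
    np.save(npy_path, spikes)
    return spikes

# Hypothetical usage, merging all per-thread files of one run:
# gdf_to_npy("a263375d…-spikes-00001-*.gdf", "spikes_V1_6E.npy")
```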

AlexVanMeegen commented on August 19, 2024

Yep, precisely. Sorry, in hindsight my comment was pretty ambiguous.

Maybe one more point: since this is in the analysis class, it has to be initialized to trigger the gdf -> npy conversion. Thus, you probably want to pass analysis=True to the MultiAreaModel class.

golosio commented on August 19, 2024

Sorry again. If I pass analysis=True to the MultiAreaModel class in the script run_example_fullscale.py, I receive the error:
FileNotFoundError: [Errno 2] No such file or directory: '/p/scratch/icei-hbp-2020-0007/mam/aa6370000042084c4725111b24007734/recordings/network_gids.txt'
Apparently, the script attempts the gdf -> npy conversion before the job is completed and the network_gids.txt and gdf files are created.
Should I use two separate scripts, the first for running the full-scale simulation and the second, run after the simulation is finished, for doing the analysis? In that case, do you have a template and/or recommendations for this second script? What is a clean way to make this script use the label/path of the previous simulation?
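One generic way to avoid starting the analysis too early is to check for the simulation output first. This is a sketch, not part of the model's code; it only tests for the network_gids.txt file named in the error above.

```python
import os

def simulation_finished(recordings_dir):
    """The analysis step needs network_gids.txt, which the simulation
    writes; treat its presence as a readiness check."""
    return os.path.isfile(os.path.join(recordings_dir, "network_gids.txt"))

# In the analysis script, before instantiating with analysis=True:
# if not simulation_finished("recordings"):
#     raise SystemExit("Simulation output missing; wait for the job to finish.")
```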

AlexVanMeegen commented on August 19, 2024

Great to hear! Let me know how it works out.

Apologies for all these hiccups; I did the code review for this project when I was still young and naive, so I am at least partly responsible ^^

golosio commented on August 19, 2024

Thank you @AlexVanMeegen.
I think this issue can be closed. The first part has been solved by the last commits, while the second part is more a matter of documentation. I will open a new issue with suggestions for improving the documentation.

AlexVanMeegen commented on August 19, 2024

Thanks for reporting this, @golosio. Suggestions for the documentation would be highly appreciated!
