Comments (7)
Well, Python doesn't do tail-call optimisation, and PyPet doesn't give us a way to write things as a loop. So if you are allocating large variables in postprocess, I would suggest using del to clean up the memory manually before starting the next iteration.
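For example, something along these lines (the array and the selection step are only illustrative, not the actual optimizer code):

```python
import numpy as np

def post_process_sketch(traj, fitnesses_results):
    # Large temporary built from the per-run results
    # (here assumed to be a list of (run_index, fitness) pairs).
    scores = np.array([fitness for _, fitness in fitnesses_results])
    survivors = np.argsort(scores)[-10:]  # keep the 10 best, say

    # Free the big intermediates explicitly before the next generation
    # starts, instead of waiting for them to go out of scope.
    del scores
    del fitnesses_results

    return survivors
```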
from l2l.
Although I'm open to suggestions for a fix if there is a better way to do it.
from l2l.
How exactly is the postprocess function called recursively? I just checked the PyPet source code and couldn't find anything that suggests such an occurrence. Could you give some context for this, e.g. the line of code or debug trace that makes you suspect it?
The way I think things work now is in the following iterative fashion:
1. Initialization specifies the individuals for generation 0.
2. All individuals are run.
3. Results are processed (this is where the postprocess function is called); this function calls f_expand_trajectory to expand the trajectory with the new individuals to evaluate.
4. If the trajectory was expanded, continue from step 2.
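In pseudocode, roughly (the attribute names here are just stand-ins, not the real PyPet API):

```python
def run_loop(traj, simulate, postprocess):
    # Step 1 has already queued generation 0 in the trajectory.
    while True:
        # Step 2: run every individual currently queued.
        results = [simulate(ind) for ind in traj.pending_individuals]
        n_before = traj.n_explored_points

        # Step 3: postprocess may queue new individuals via f_expand_trajectory.
        postprocess(traj, results)

        # Step 4: if nothing was added, the simulation is finished.
        if traj.n_explored_points == n_before:
            break
```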
from l2l.
@maharjun: From my understanding, the issue already lies within the postprocess function.
This function is called whenever all individuals have been evaluated. The point is that expand trajectory is also called at the end of this function, causing postprocess to run again after the new individuals are evaluated.
Variables created in the post_process function are not deleted this way.
The flow is illustrated below:
Initial population
-> expand trajectory
   postprocess
   -> expand trajectory
      postprocess
      -> expand trajectory
         postprocess
         -> expand trajectory
@anandtrex:
I also considered that option and will have a closer look at it.
from l2l.
What if we embed the env.run method in a loop like:

while optimizer.running():
    env.run(optimizee.simulate)

and at the same time delete the _expand_trajectory statement in post_process, instead setting a flag that the trajectory should be expanded; optimizer.running returns this flag.

Or even shorter, have a loop like:

while len(optimizer.eval_pop) > 0:
    env.run(optimizee.simulate)

It would just be a minor change, and the post_process method could return every time.
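Roughly like this (optimizer.running() and the expansion flag are proposed additions, not the current code, and the next-generation helper is just a placeholder):

```python
class Optimizer:
    def __init__(self, initial_population):
        self.eval_pop = initial_population       # generation 0
        self._needs_expansion = True

    def running(self):
        # proposed flag: True while a new generation is waiting to run
        return self._needs_expansion

    def post_process(self, traj, fitnesses_results):
        # build the next generation, but do NOT expand the trajectory here;
        # only record whether an expansion is still needed
        self.eval_pop = self._next_generation(fitnesses_results)  # placeholder
        self._needs_expansion = len(self.eval_pop) > 0


# Run script: the loop replaces the single env.run(...) call, and the
# expansion happens outside post_process, once per iteration.
while optimizer.running():
    optimizer._expand_trajectory(traj)
    env.run(optimizee.simulate)
```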
from l2l.
@bohnstingl
I think you have a misconception. If you look at the code of the Optimizer::expand_trajectory function, you will notice a call to traj.f_expand(). What this does is add the new parameters to be explored to the trajectory. NOTE that this function is NOT responsible for running these new individuals.
After expand_trajectory is executed, the post_process function terminates, and control is passed back to the env.run(...) function. There, env.run looks at the trajectory to see whether it has been expanded. If it has, it loops, runs the new individuals, calls post_process again ... and so on. (NOTE again that all of this happens after the previous post_process call has terminated.) If it finds that the trajectory has not been expanded, then no more individuals are run, the simulation stops, and env.run(...) ends.
[In all of the above, one needs to remember that the env has access to the traj object that is passed to the post_process function.]
<Having read this, look at my previous comment; it should make much more sense now.>
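Schematically, the expansion step boils down to something like this (simplified sketch, not the exact L2L code):

```python
def expand_trajectory(self, traj):
    # Collect the parameters of the new individuals into the dict-of-lists
    # form that pypet's f_expand expects (one list entry per individual).
    params_to_explore = {}
    for individual in self.eval_pop:
        for name, value in individual.items():
            params_to_explore.setdefault(name, []).append(value)

    # This only QUEUES the new points in the trajectory; nothing is run here.
    traj.f_expand(params_to_explore)
    # Control then returns to post_process, which returns to env.run(...),
    # which decides whether to start another batch of runs.
```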
from l2l.
Closing issue since it looks like it may not be a bug. Feel free to reopen if that's not the case.
from l2l.