icb-dcm / parpe
Parameter estimation for dynamical models using high-performance computing, batch and mini-batch optimizers, and dynamic load balancing.
License: MIT License
Use h5copy for now, e.g.:
h5copy -i amici/examples/steadystate/data.h5 -s / -o steadystate_2017-06-28_rank00000.h5 -d /inputData
NOTE: https://support.hdfgroup.org/HDF5/doc/RM/RM_H5O.html#Object-Copy
HDF5 file: measurements/y(sigma): condition x t x y
-> should enable chunked storage / compression
-> need to label timepoints: /measurements/[attr]timepoints double; [infinity] for steady-state data
So far only happening for ipopt
Need to reorganize training data
class MinibatchDataProvider : public MultiConditionDataProvider
Unlike MultiConditionDataProvider, need one instance per optimization to keep track of batches
Add class Optimizer, class LocalOptimizer : public Optimizer, class MiniBatchOptimizer : public LocalOptimizer
Add Optimizer::optimize(OptimizationProblem*)
-> need to refactor Ceres and Ipopt wrappers
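A minimal sketch of the class hierarchy proposed above, with `Optimizer::optimize(OptimizationProblem*)` as the common entry point. The `evaluate` member, the toy `QuadraticProblem`, and the return-code convention are assumptions for illustration; parPE's real `OptimizationProblem` interface is richer.

```cpp
#include <cassert>

// Hypothetical minimal interface; the real OptimizationProblem has more
// members (objective/gradient evaluation, bounds, starting points, ...).
class OptimizationProblem {
public:
    virtual ~OptimizationProblem() = default;
    virtual double evaluate(double x) const = 0;
};

class Optimizer {
public:
    virtual ~Optimizer() = default;
    // Common entry point for all optimizers; returns 0 on success.
    virtual int optimize(OptimizationProblem *problem) = 0;
};

class LocalOptimizer : public Optimizer {
public:
    int optimize(OptimizationProblem *problem) override {
        // A real implementation would dispatch to the wrapped Ipopt or
        // Ceres solver; here we only record a single evaluation.
        lastObjective = problem->evaluate(0.0);
        return 0;
    }
    double lastObjective = 0.0;
};

class MiniBatchOptimizer : public LocalOptimizer {
    // Would additionally hold a MinibatchDataProvider to iterate batches.
};

// Toy problem for illustration only.
class QuadraticProblem : public OptimizationProblem {
public:
    double evaluate(double x) const override { return x * x + 1.0; }
};
```

Refactoring the Ceres and Ipopt wrappers behind this interface would let callers pick an optimizer without knowing the backend.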
Need to run simulation in separate thread on worker.
This will also facilitate sending cancel requests from master.
Exception of type: OPTION_INVALID in file "IpAlgBuilder.cpp" at line 271:
Exception message: Selected linear solver MA27 not available.
Tried to obtain MA27 from shared library "libhsl.so", but the following error occured:
libhsl.so: cannot open shared object file: No such file or directory
EXIT: Invalid option encountered.
Update: Should terminate if any unknown options are set
To allow for negative scaling parameters.
Should only be necessary for offset parameters?!
Specify amici options in hdf5 file
See /fixedParameters/referenceConditions
To speed up tests
Check versions
see amiciUserDataSerialization
same for ExpData, so worker can simply run simulations based on received messages.
Will simplify #10
Make as generic as possible
add to ParallelOptimizationProblem?
For IpOpt and Ceres
For Ipopt, mind the difference between ApplicationReturnStatus and SolverReturn.
Will make it easier to add minibatch methods later on
add factory method to OptimizationOptions
Subclass OptimizationOptions for each Optimizer to account for unique options?
Don't wait for all jobs to finish.
IpOpt may call Eval_F and Eval_Grad_F with identical parameters. Save previous results and check for new_x to avoid recomputation.
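A sketch of the caching suggested above: when Ipopt passes `new_x == false`, the values for the current point were already computed, so the stored results are returned instead of re-running the expensive model simulation. The class name, `evalCount`, and the stand-in objective f(x) = Σ x_i² are assumptions; the real evaluator would call the AMICI simulation.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

class CachedEvaluator {
public:
    bool evalF(bool newX, const std::vector<double> &x, double &obj) {
        if (newX)
            recompute(x); // only recompute when Ipopt reports a new point
        obj = cachedObjective;
        return true;
    }

    bool evalGradF(bool newX, const std::vector<double> &x,
                   std::vector<double> &grad) {
        if (newX)
            recompute(x);
        grad = cachedGradient;
        return true;
    }

    int evalCount = 0; // counts actual (expensive) evaluations

private:
    void recompute(const std::vector<double> &x) {
        ++evalCount;
        // Stand-in for the expensive model simulation: f(x) = sum x_i^2.
        cachedObjective = 0.0;
        cachedGradient.assign(x.size(), 0.0);
        for (std::size_t i = 0; i < x.size(); ++i) {
            cachedObjective += x[i] * x[i];
            cachedGradient[i] = 2.0 * x[i];
        }
    }

    double cachedObjective = 0.0;
    std::vector<double> cachedGradient;
};
```

With this, a gradient request following an objective request at the same point costs nothing extra.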
Currently disabled because __node / STL problem.
To be able to use same starting points for different optimizers
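One way to get identical starting points across optimizers is to draw them from a seeded RNG, a minimal sketch; the function name, the uniform sampling within [lower, upper], and the seed convention are assumptions, and parPE may instead store the points in the result file.

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Deterministic starting points: the same seed yields the same point,
// so different optimizers can be compared from identical starts.
std::vector<double> getStartingPoint(unsigned seed, std::size_t numParameters,
                                     double lower, double upper) {
    std::mt19937 gen(seed);
    std::uniform_real_distribution<double> dist(lower, upper);
    std::vector<double> x(numParameters);
    for (auto &xi : x)
        xi = dist(gen);
    return x;
}
```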
Because execute_process is not run before every build.
TODO: Add dummy target as in https://stackoverflow.com/questions/13920072/how-to-always-run-command-when-building-regardless-of-any-dependency
e.g. https://github.com/robotology/icub-main/blob/master/conf/FindIPOPT.cmake https://github.com/cdcseacave/openMVS/blob/master/build/Modules/FindCERES.cmake
should be usable like ./optimizationOptions bla.h5 -s ceres/maxiter 1
of model, suitesparse, ipopt, ceres, ...
HDF5 generation script still looks for _genotypespecific_ in parameter names. Get rid of that.
Currently only integer values are used, which is hard to read and error-prone. Need string-to-enum. Available for ceres in types.cc.
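A sketch of the proposed string-to-enum mapping, in the spirit of what Ceres provides in types.cc. The enum and option names below are illustrative, not parPE's actual options.

```cpp
#include <map>
#include <stdexcept>
#include <string>

// Illustrative option enum; real option names would come from the
// respective optimizer's documentation.
enum class LineSearchType { Wolfe, Armijo };

LineSearchType lineSearchTypeFromString(const std::string &name) {
    static const std::map<std::string, LineSearchType> mapping = {
        {"wolfe", LineSearchType::Wolfe},
        {"armijo", LineSearchType::Armijo},
    };
    auto it = mapping.find(name);
    if (it == mapping.end())
        throw std::invalid_argument("Unknown line search type: " + name);
    return it->second;
}
```

Throwing on unknown strings also addresses the "terminate if any unknown options are set" point above.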
Currently all dependencies are scanned as well.
Add signal handler to cancel all tasks and flush output.
To avoid recomputation in case of crash or termination due to exceeding wall-time limits.
EDIT: warm-start probably not possible with all optimizers
Find problem and reenable mock support.
http://people.sc.fsu.edu/~jburkardt/f_src/toms611/toms611.html
https://github.com/pyushkevich/cmrep/blob/master/extras/toms611/toms611.c
entry functions:
sumsl_  /* minimize general unconstrained objective function using analytic gradient and hessian approx. from secant update */
smsno_  /* minimize general unconstrained objective function using finite-difference gradients and secant hessian approximations */
In intermediate function, evaluate model on test set and see if prediction likelihood improves
-> need to add validation set to dataprovider
Should be usable for both batch and minibatch optimizers
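The validation check described above could live in the optimizer's intermediate callback: evaluate the prediction likelihood on the held-out set and stop once it stops improving. A minimal sketch; the class name, the patience-based stopping rule, and working with the negative log-likelihood are assumptions.

```cpp
#include <limits>

// Early stopping on a validation set: intermediate() is meant to be
// called from the optimizer's per-iteration callback and returns true
// while optimization should continue.
class ValidationMonitor {
public:
    explicit ValidationMonitor(int patience) : patience(patience) {}

    bool intermediate(double validationNegLogLikelihood) {
        if (validationNegLogLikelihood < best) {
            best = validationNegLogLikelihood; // improvement: reset counter
            stale = 0;
        } else {
            ++stale; // no improvement this iteration
        }
        return stale < patience;
    }

private:
    double best = std::numeric_limits<double>::infinity();
    int patience;
    int stale = 0;
};
```

Since both batch and minibatch optimizers expose per-iteration callbacks, the same monitor could serve both, as the note requests; this requires the data provider to expose a validation split.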
Ensure proper naming in result files
E.g. _offset_, scaling, sigma