hips / hypergrad
Exploring differentiation with respect to hyperparameters
Hi,
I followed the steps in https://github.com/HIPS/hypergrad/blob/master/readme.md and got the experiment to start. The log shows:
========== Starting omap ==========
Submitting 20 tasks (output in /tmp/jobdir_744870565721)
but nothing is actually running. How can I get the experiment to run?
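"Submitting 20 tasks" means omap handed the work off to the Odyssey/Slurm scheduler and returned; if you have no such cluster, the tasks will sit in a queue forever. One workaround (an assumption on my part, not documented in the repo) is to bypass omap in the experiment script and run the tasks in-process with the built-in map. `run_task` and `inputs` below are placeholders for whatever the script actually passes to omap:

```python
def run_task(x):
    # stand-in for the real per-task function the script hands to omap
    return x * x

inputs = [1, 2, 3]
# instead of: results = omap(run_task, inputs)
results = list(map(run_task, inputs))
print(results)  # [1, 4, 9]
```

If you do have a Slurm cluster, checking the queue (e.g. with squeue) should show whether the 20 submitted tasks are pending.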
Hi,
I'm trying to run some of the hyper-gradient code and have been running into a problem with some dependencies. I installed the autograd package using your setup.py file.
In particular, some of the experiment.py files try to import the funkyyak module. I changed the module name to autograd, but the references to kylist and getval still appear to be undefined.
Thanks,
I'm using the latest code from master branch.
Running python experiments/Feb_1_learning_alphabet_corr/1/experiment.py
results in the following exception:
Traceback (most recent call last):
File "experiments/Feb_1_learning_alphabet_corr/1/experiment.py", line 136, in <module>
results = map(run, all_script_corr)
File "experiments/Feb_1_learning_alphabet_corr/1/experiment.py", line 44, in run
[11, 2, 2], RS, num_alphabets=N_scripts)
File "/Users/Sida/Projects/pilots experiments/hypergrad/hypergrad/omniglot.py", line 42, in load_data_split
raw_data = load_data(np.sort(alphabets_to_load))
File "/Users/Sida/Projects/pilots experiments/hypergrad/hypergrad/omniglot.py", line 28, in load_data
with open(datapath("omniglot_data.pkl")) as f:
IOError: [Errno 2] No such file or directory: '/Users/Sida/repos/hypergrad/data/omniglot/omniglot_data.pkl'
I looked into the code and found that in the datapath function of omniglot.py (line 15), the path is hard-coded as datadir = os.path.expanduser('~/repos/hypergrad/data/omniglot').
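One way to work around the hard-coded path is to make the data directory overridable. This is a minimal sketch, not the repo's code; HYPERGRAD_DATA_DIR is a hypothetical environment variable I'm introducing for illustration:

```python
import os

# Fall back to the original hard-coded location, but let an environment
# variable (hypothetical, not defined by the repo) override it.
datadir = os.environ.get(
    "HYPERGRAD_DATA_DIR",
    os.path.expanduser("~/repos/hypergrad/data/omniglot"))

def datapath(fname):
    # mirrors the role of the repo's datapath helper
    return os.path.join(datadir, fname)

print(datapath("omniglot_data.pkl"))
```

Alternatively, simply editing the expanduser argument to wherever you downloaded omniglot_data.pkl avoids the IOError.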
Hi All --
In the paper [1], Algorithm 1 gives the SGD-with-momentum update. Should line 6 actually say
w[t+1] = w[t] + alpha[t] * v[t + 1]
? Otherwise you're updating the weights with a velocity that doesn't incorporate the most recent gradient. AFAICT the hypergrad code does use the refreshed velocity v[t+1], but I wanted to verify whether this is indeed a typo in the paper.
(The reason is that I'm trying to derive Algorithm 2, so I was looking very closely at the notation. If anyone could offer insight into Algorithm 2, specifically lines 9-13 where the gradients w.r.t. the hyperparameters are calculated, that would be much appreciated.)
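For concreteness, the two orderings in question can be sketched side by side in plain NumPy (this is an illustration, not the repo's code; gamma is the momentum decay and alpha the learning rate, following the paper's v[t+1] = gamma*v[t] - (1-gamma)*grad form):

```python
import numpy as np

def step_as_printed(w, v, g, alpha, gamma):
    # update w with the *old* velocity v[t], then refresh the velocity
    w = w + alpha * v
    v = gamma * v - (1 - gamma) * g
    return w, v

def step_as_proposed(w, v, g, alpha, gamma):
    # refresh the velocity first, so w[t+1] sees the most recent gradient
    v = gamma * v - (1 - gamma) * g
    w = w + alpha * v
    return w, v

w0, v0, g = np.array([1.0]), np.array([0.0]), np.array([0.5])
print(step_as_printed(w0, v0, g, alpha=0.1, gamma=0.9))
print(step_as_proposed(w0, v0, g, alpha=0.1, gamma=0.9))
```

Starting from zero velocity, the first ordering leaves w unchanged on the first step while the second already moves against the gradient; the velocities agree, only the weight update differs.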
Hi,
I have done everything the readme.md says, and that works well. But when I run one of the experiments in /hypergrad/experiments, an error occurs:
nina@nina-virtual-machine:~/code/hypergrad/experiments/Feb_1_learning_alphabet_corr/1$ python experiment.py
Traceback (most recent call last):
File "experiment.py", line 13, in <module>
from hypergrad.odyssey import omap
File "/home/nina/anaconda2/lib/python2.7/site-packages/hypergrad-1.0-py2.7.egg/hypergrad/odyssey.py", line 25, in <module>
from odyssey_config import root_working_dir, slurm_options
ImportError: No module named odyssey_config
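The final import line in the traceback names exactly what's missing: odyssey.py expects a module called odyssey_config that defines root_working_dir and slurm_options. Here is a minimal sketch of such a file; the two names come from the traceback, but the values are pure assumptions on my part, so copy odyssey_config_example.py from the repo for the real format:

```python
# odyssey_config.py -- minimal sketch, NOT the repo's example file.
# odyssey.py does: from odyssey_config import root_working_dir, slurm_options
root_working_dir = "/tmp/hypergrad_jobs"  # assumed: where omap writes job dirs
slurm_options = "--time=60 --mem=4000"    # assumed: options passed to Slurm
```

Placing such a file where odyssey.py can import it (e.g. on your PYTHONPATH) should clear the ImportError, though the jobs will only run if you actually have a Slurm cluster to submit to.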
I don't know how odyssey and odyssey_config work, and I can't make out the meaning of the following text in odyssey.py.
Can you tell me how to make the code work, step by step?
Thank you very much!
""" Provides omap, which works like Python's map, but farms out to
Odyssey. Here's how to use it: [...] odyssey_config.py that points to it
(look at odyssey_config_example.py) """

Hi All --
I'm playing with implementing some of the ideas here in PyTorch. I was looking at the ExactRep code and (maybe?) found an issue:
>>> np.__version__
'1.13.1'
>>> x = ExactRep(np.array([2.0]))
>>> a = np.array([0.5])
>>> i = 11
>>>
>>> for _ in range(i):
... _ = x.div(a)
...
>>> for _ in range(i):
... _ = x.mul(a)
...
>>> x.val
array([-0.00067149])
Presumably, ExactRep should be able to handle this situation and correctly return 2.0. Any ideas what might be going on? (If i <= 10, the code works as expected.)
Note that to get ExactRep to run I had to change
class BitStore(object):
def __init__(self, length):
...
self.store = np.array([0L] * length, dtype=object)
to
class BitStore(object):
def __init__(self, length):
...
self.store = np.array([0L] * length, dtype=long)
otherwise I get the error
TypeError: ufunc 'add' output (typecode 'O') could not be coerced to provided output parameter (typecode 'l') according to the casting rule ''same_kind''
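As a sanity check of what "exact" should give here, plain rational arithmetic survives the eleven round trips. This is Python's fractions module, not the repo's fixed-point BitStore scheme, but it shows the behavior ExactRep is meant to emulate for floats:

```python
from fractions import Fraction

# Divide by 0.5 eleven times, then multiply back eleven times; with exact
# rationals the round trip recovers 2 exactly, unlike the float drift above.
x = Fraction(2)
a = Fraction(1, 2)
for _ in range(11):
    x = x / a
for _ in range(11):
    x = x * a
print(x)  # prints 2
```

This also hints at where to look in the failing case: switching BitStore's array from dtype=object (arbitrary-precision Python ints) to dtype=long (fixed-width machine ints) caps the precision the bit store can carry, which is exactly the kind of change that could start losing bits after enough division steps.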