agnezio / agnez
Analytics for Deep Learning projects.
License: BSD 3-Clause "New" or "Revised" License
Hi Eder, amazing project; I really like where you are going with this. Thanks for putting in the work.
When I run the example provided in MNIST.ipynb, following the installation instructions as provided, on OS X 10.10.4, I get the following error after the first epoch:
/my/path/agnez_app/node_modules/feathers/node_modules/feathers-commons/lib/sockets/helpers.js:61
socket[info.method](eventName, dispatchData);
^
TypeError: undefined is not a function
at /my/path/agnez_app/node_modules/feathers/node_modules/feathers-commons/lib/sockets/helpers.js:61:32
at Object.defaultDispatcher (/my/path/agnez_app/node_modules/feathers/node_modules/feathers-commons/lib/sockets/helpers.js:39:3)
at /my/path/agnez_app/node_modules/feathers/node_modules/feathers-commons/lib/sockets/helpers.js:56:20
at Array.forEach (native)
at Object.<anonymous> (/my/path/agnez_app/node_modules/feathers/node_modules/feathers-commons/lib/sockets/helpers.js:55:35)
at emitTwo (events.js:87:13)
at Object.emit (events.js:169:7)
at /my/path/agnez_app/node_modules/feathers/lib/mixins/event.js:49:21
at Array.forEach (native)
at EventEmitter.<anonymous> (/my/path/agnez_app/node_modules/feathers/lib/mixins/event.js:48:16)
at emitMany (events.js:108:13)
at EventEmitter.emit (events.js:179:7)
at Object.exports.emitEvents (/my/path/agnez_app/node_modules/feathers/node_modules/rubberduck/lib/utils.js:32:18)
at callbackWrapper (/my/path/agnez_app/node_modules/feathers/node_modules/rubberduck/lib/rubberduck.js:56:15)
at /my/path/agnez_app/node_modules/feathers-nedb/lib/index.js:137:5
at callback (/my/path/agnez_app/node_modules/feathers-nedb/node_modules/nedb/lib/executor.js:30:17)
at Cursor.execFn (/my/path/agnez_app/node_modules/feathers-nedb/node_modules/nedb/lib/datastore.js:462:14)
at Cursor._exec (/my/path/agnez_app/node_modules/feathers-nedb/node_modules/nedb/lib/cursor.js:172:17)
at /my/path/agnez_app/node_modules/feathers-nedb/node_modules/nedb/lib/executor.js:40:13
at Object.q.process (/my/path/agnez_app/node_modules/feathers-nedb/node_modules/nedb/node_modules/async/lib/async.js:731:21)
at next (/my/path/agnez_app/node_modules/feathers-nedb/node_modules/nedb/node_modules/async/lib/async.js:728:27)
at Immediate._onImmediate (/my/path/agnez_app/node_modules/feathers-nedb/node_modules/nedb/node_modules/async/lib/async.js:24:16)
at processImmediate [as _immediateCallback] (timers.js:368:17)
Any idea how this comes about? Thanks.
When running the IPython notebook example, I get the following error:
AttributeError Traceback (most recent call last)
<ipython-input-6-9386fd8b8064> in <module>()
1 model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=nb_epoch,
2 show_accuracy=True, verbose=1, validation_data=(X_test, Y_test),
----> 3 callbacks=[loss_plot, conv_weights])
/usr/local/lib/python2.7/dist-packages/keras/models.pyc in fit(self, x, y, batch_size, nb_epoch, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, **kwargs)
406 shuffle=shuffle,
407 class_weight=class_weight,
--> 408 sample_weight=sample_weight)
409
410 def evaluate(self, x, y, batch_size=32, verbose=1,
/usr/local/lib/python2.7/dist-packages/keras/engine/training.pyc in fit(self, x, y, batch_size, nb_epoch, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight)
1051 verbose=verbose, callbacks=callbacks,
1052 val_f=val_f, val_ins=val_ins, shuffle=shuffle,
-> 1053 callback_metrics=callback_metrics)
1054
1055 def evaluate(self, x, y, batch_size=32, verbose=1, sample_weight=None):
/usr/local/lib/python2.7/dist-packages/keras/engine/training.pyc in _fit_loop(self, f, ins, out_labels, batch_size, nb_epoch, verbose, callbacks, val_f, val_ins, shuffle, callback_metrics)
810 for l, o in zip(out_labels, val_outs):
811 epoch_logs['val_' + l] = o
--> 812 callbacks.on_epoch_end(epoch, epoch_logs)
813 if callback_model.stop_training:
814 break
/usr/local/lib/python2.7/dist-packages/keras/callbacks.pyc in on_epoch_end(self, epoch, logs)
37 def on_epoch_end(self, epoch, logs={}):
38 for callback in self.callbacks:
---> 39 callback.on_epoch_end(epoch, logs)
40
41 def on_batch_begin(self, batch, logs={}):
/usr/local/lib/python2.7/dist-packages/agnez-0.1.0-py2.7.egg/agnez/app_callbacks.pyc in on_epoch_end(self, epoch, logs)
75
76 def on_epoch_end(self, epoch, logs={}):
---> 77 self.train_values.append(self.totals['loss']/self.seen)
78 self.valid_values.append(logs['val_loss'])
79
AttributeError: 'LossPlot' object has no attribute 'totals'
...and I don't entirely know how to fix this. Any help appreciated.
I checked, and there is a self.totals in agnez/keras_callbacks.py (in the Plot class), but it doesn't seem to be used here. There is also one in keras/callbacks.py (in the BaseLogger class), though I'm not sure whether that one is used here either.
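For what it's worth, the traceback suggests that `totals` and `seen` are only ever populated by a batch-level hook that never ran. A minimal stand-in sketch (not agnez's actual code; the class name `LossTracker` and its hooks are hypothetical) of how a Keras-style callback would have to accumulate them, mirroring what `keras.callbacks.BaseLogger` does, looks like this:

```python
# Hypothetical sketch of the accumulation a LossPlot-like callback needs.
# `totals`/`seen` must be (re)initialized at epoch start and updated per
# batch, otherwise on_epoch_end hits exactly the AttributeError above.
class LossTracker:
    def __init__(self):
        self.train_values = []

    def on_epoch_begin(self, epoch, logs=None):
        # Without this reset, self.totals never exists at epoch end.
        self.totals = {}
        self.seen = 0

    def on_batch_end(self, batch, logs=None):
        logs = logs or {}
        batch_size = logs.get('size', 1)
        self.seen += batch_size
        for key, value in logs.items():
            if key == 'size':
                continue
            # Weight each batch metric by its batch size.
            self.totals[key] = self.totals.get(key, 0.0) + value * batch_size

    def on_epoch_end(self, epoch, logs=None):
        # Average training loss over all samples seen this epoch.
        self.train_values.append(self.totals['loss'] / self.seen)


# Example: two batches of 2 samples with losses 1.0 and 3.0
tracker = LossTracker()
tracker.on_epoch_begin(0)
tracker.on_batch_end(0, {'size': 2, 'loss': 1.0})
tracker.on_batch_end(1, {'size': 2, 'loss': 3.0})
tracker.on_epoch_end(0)
```

So one plausible cause is that the callback's batch-level hooks are not being invoked (or were renamed) in the Keras version in use, leaving `totals` unset by the time `on_epoch_end` fires.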
My Keras version is 1.0.6, Theano version 0.9.0dev1.dev-a668c6c5b6d055b233aa5bc50b22800d996ffce1, on Python 2.7, Ubuntu 15.10 64-bit.
Both installation methods, pip and easy_install, fail. Are there other ways to install?
Thanks,
Satya
First of all, thanks for putting your effort into making deep-learning visualizations available to a broader audience.
Due to the lack of other options, I would like to ask you here about your DeepPref class.
If my memory serves, visualizing normalized first-layer weights corresponds to a closed-form solution of the optimization problem "which input maximizes the first-layer activations?". I can see that something similar makes a sort of sense between the 3rd and 1st layers too, as done in the pref_grid function. However, I fail to see any interpretation for deeper layers. For instance, DeepPref between the 4th and 1st layers returns K weights between the first two layers, where K is sorted according to the activations of the 4th layer. What does that mean?
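To make the closed-form part of the question concrete: for a linear first-layer unit with activation a(x) = w·x, the unit-norm input that maximizes the activation is simply x* = w/||w||, by Cauchy-Schwarz. That is why plotting normalized first-layer weights is itself the "preferred input" visualization. A small sketch (illustrative only, not agnez's implementation):

```python
import numpy as np

# For a linear unit a(x) = w.x, the maximizer over the unit sphere
# ||x|| = 1 is x* = w / ||w|| (Cauchy-Schwarz), so normalized
# first-layer weights ARE the closed-form preferred inputs.
rng = np.random.default_rng(0)
w = rng.normal(size=16)            # one first-layer filter
x_star = w / np.linalg.norm(w)     # closed-form preferred input

# Any other unit-norm input activates the unit no more than x*:
x_other = rng.normal(size=16)
x_other /= np.linalg.norm(x_other)
assert w @ x_star >= w @ x_other
```

For deeper layers the composed mapping is nonlinear, so no such closed form exists, which is presumably the crux of the question above.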