tonio73 / dnnviewer

Deep Neural Network viewer
Home Page: https://tonio73.github.io/dnnviewer/
License: MIT License
Create a panel on the right of the main network view, size md3
In this panel:
model information
-- loss (losses) description (type) (.loss, .loss_function, .loss_weights)
-- metrics description
-- optimizer description (.optimizer)
configuration of the view
-- show top-n connections parameter
Implementation: categorize as in #19
Often used in GANs and VAEs
Beware of the stride parameters (decimal)
Within the convolutional filter view of weights (tiled), sort the tiles to show the most active first.
TBD: add a UI control to switch between views
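A minimal sketch of the sorting step, assuming the Keras Conv2D kernel layout `(kh, kw, n_in, n_filters)` and a per-sample activation map; the function and variable names are illustrative, not the project's API:

```python
import numpy as np

def sort_filter_tiles(weights, activations):
    """Order convolution filter tiles by activity, most active first.

    weights:     array of shape (kh, kw, n_in, n_filters), Keras Conv2D layout
    activations: array of shape (h, w, n_filters), the layer output for a sample
    """
    # Mean absolute activation per filter as the "activity" score
    activity = np.abs(activations).mean(axis=(0, 1))
    order = np.argsort(activity)[::-1]          # descending activity
    # Reorder the last axis, i.e. one tile per filter
    return weights[..., order], order

# Toy example: 8 random 3x3 filters over 1 input channel
rng = np.random.default_rng(0)
w = rng.normal(size=(3, 3, 1, 8))
a = rng.normal(size=(5, 5, 8))
sorted_w, order = sort_filter_tiles(w, a)
```

Returning `order` alongside the reordered tiles lets the UI keep the original unit indices for selection, which matters once a UI control switches between sorted and unsorted views.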
Three panes structure for the application : top, center, bottom
Separate modules containing for each pane:
Provided that the application is stateless on the server side (see #7), solve the remaining code issues to be able to deploy online on a free or quasi-free host
Create and submit a PyPI package (automate if possible)
Given a neural unit of a Dense layer, given an input sample (image), display the bar chart of the products of the weight times the input as a new tab in the unit quadrant
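The data behind that bar chart is a simple element-wise product; a sketch assuming the Keras Dense kernel layout `(n_inputs, n_units)` and a flattened input sample (names are illustrative):

```python
import numpy as np

def unit_contributions(kernel, sample, unit_index):
    """Per-input contributions w_i * x_i feeding one Dense unit.

    kernel: Dense layer weights, shape (n_inputs, n_units)
    sample: flattened input sample, shape (n_inputs,)
    """
    # One bar per input; the unit's pre-activation is the sum (plus bias)
    return kernel[:, unit_index] * sample

rng = np.random.default_rng(1)
kernel = rng.normal(size=(784, 10))   # e.g. MNIST flattened into a 10-unit layer
sample = rng.normal(size=784)
bars = unit_contributions(kernel, sample, unit_index=3)
```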
Work around Dash double firing of routes : plotly/dash#1049 that is causing the DNN Keras model to be loaded twice
Missing layer information in the viewer:
Missing unit information in the viewer:
Later:
Implementation:
Within the layer detail, the "minimax" graph displays the min and max weight amplitude of each unit.
Enhance this plot to allow for unit selection when clicking on a unit min or max bar.
Shall update the selection in:
Main challenge: the layer object is required by most callbacks handling the selection. We may first need to save this layer within the layer detail widget
New widget at the top of the window to display the model loss and metrics history.
Stride and pooling parameters are not handled in the top-n weights computation when the output is flattened: when propagating the weight computation backward along layers, from a Dense layer to a convolutional layer through a Flatten, the indices are wrapped modulo the number of units; the stride/pooling parameters should be taken into account.
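The correct backward mapping through a Flatten layer is an unravel of the flat index into the `(height, width, channel)` shape of the convolutional output, rather than a modulo by the unit count; stride/pooling factors then further rescale the spatial coordinates. A sketch of the first step with NumPy (shapes are illustrative):

```python
import numpy as np

# Convolutional output shape before Flatten: (h, w, channels)
conv_output_shape = (7, 7, 32)

def flat_index_to_unit(flat_index, output_shape):
    """Map an index in the flattened vector back to (row, col, channel).

    Wrapping modulo the channel count alone would lose the spatial
    position; unravel_index accounts for the full output shape.
    """
    return np.unravel_index(flat_index, output_shape)

row, col, channel = flat_index_to_unit(100, conv_output_shape)
```

Mapping `(row, col)` further back to the previous layer's grid would then multiply by the stride (and pooling size) of the intervening layers, which is the part the issue says is currently missing.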
--model-directories
value is a comma-separated list of directory paths
Ensure backward compatibility with existing options to select models (single or sequence)
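A possible parsing sketch with `argparse` (the option name comes from the issue; the split logic and default are assumptions):

```python
import argparse

parser = argparse.ArgumentParser(prog='dnnviewer')
# New option: comma-separated list of directories holding models
parser.add_argument('--model-directories',
                    type=lambda s: [p for p in s.split(',') if p],
                    default=[],
                    help='Comma-separated list of directory paths')

args = parser.parse_args(['--model-directories', 'models/a,models/b'])
```

Keeping the single-path options untouched and only adding this new one is the simplest way to preserve backward compatibility.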
Currently the tensor shape of the test data is checked before computing the gradients, while the model is being loaded, and when computing an activation map.
Refactor this to:
Following #35, a dirty workaround has been installed on main_network_view to skip 1/2 page loads.
Once plotly/dash#1049 is fixed in Dash, we may remove this workaround
-1- B&W images like in MNIST are often described by 2D arrays (number of channels is 1), but Keras requires 3D tensors
=> Detect and expand dimensions
-2- Some networks require pre-padding of input images to cope with convolution margins.
=> Detect dimension lag and pad image
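Both adaptations can be sketched with NumPy (the helper name and symmetric-padding choice are assumptions, not the project's code):

```python
import numpy as np

def prepare_image(img, target_hw):
    """Adapt a B&W sample to the input shape a Keras model expects.

    img: 2D array (h, w); target_hw: (height, width) required by the model.
    """
    # -1- Expand a 2D grayscale image to a 3D tensor with 1 channel
    if img.ndim == 2:
        img = np.expand_dims(img, axis=-1)
    # -2- Detect the dimension lag and zero-pad symmetrically
    lag_h = max(0, target_hw[0] - img.shape[0])
    lag_w = max(0, target_hw[1] - img.shape[1])
    return np.pad(img, ((lag_h // 2, lag_h - lag_h // 2),
                        (lag_w // 2, lag_w - lag_w // 2),
                        (0, 0)))

# MNIST-like 28x28 sample pre-padded to a 32x32 network input
x = prepare_image(np.ones((28, 28)), target_hw=(32, 32))
```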
Replace the few Keras Datasets with the many TensorFlow Datasets (https://www.tensorflow.org/datasets/catalog/overview) for image classification
Provide basic and quick saliency map based on gradient ascent with Ridge regularization
Parameters (to wire to the UI):
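The update rule itself is framework-agnostic: gradient ascent on a unit score with an L2 (Ridge) penalty. The real implementation would take gradients through the network with TensorFlow; the sketch below uses a toy quadratic score as a stand-in, and all names are illustrative:

```python
import numpy as np

def gradient_ascent_ridge(score_grad, x0, lr=0.1, l2=0.01, steps=100):
    """Maximize score(x) - l2 * ||x||^2 by gradient ascent.

    score_grad(x) returns the gradient of the score w.r.t. the input x.
    The Ridge term keeps the synthesized input from blowing up.
    """
    x = x0.copy()
    for _ in range(steps):
        # Ascent step: score gradient minus the Ridge penalty gradient
        x += lr * (score_grad(x) - 2.0 * l2 * x)
    return x

# Toy score: negative squared distance to a target pattern
target = np.array([1.0, -2.0, 3.0])
grad = lambda x: -2.0 * (x - target)
x = gradient_ascent_ridge(grad, np.zeros(3), lr=0.05, l2=0.01, steps=500)
```

`lr`, `l2` and `steps` are exactly the kind of parameters the issue suggests wiring to the UI.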
Activation is either a parameter on a layer or a specific layer of type "Activation"
Add support for the latter: set the activation structural property on previous layer.
Note: the case in which the previous layer already has an activation (not the linear activation) is not handled
There is currently no colorbar for the network view. The color bar is within the Convo unit details only (filter heatmaps).
Add a colorbar on the left of the view
t-SNE or UMAP could be used to display the cloud of activation maps of a test batch at a given layer output.
Example : https://github.com/tonio73/data-science/blob/master/cnn/CnnVsDense-Part2-Visualization.ipynb
To solve:
Show the network evolution across training epochs
Keras checkpoints documentation:
https://www.tensorflow.org/guide/checkpoint
Dash recommends being stateless on the server... we are definitely not, since the server keeps:
To do:
Create and submit a Conda package (automate if possible)
=> zoom, selection... are lost
A solution might be to return only the updated property in a dictionary as in: https://stackoverflow.com/questions/46075960/live-updating-only-the-data-in-dash-plotly
It also has a performance impact as the full figure is redrawn
When the number of layers increases and the number of units per layer is large, it takes 10 to 20 seconds to update the view when selecting a new number of displayed connections
Two cases:
For the 2nd case, ideally, insert a step in the model selection to check model loading AND compatibility with test data
Create an icon for the DNN Viewer.
To be displayed in:
Windows file and directory path handling is quite different from Unix and macOS.
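Python's standard `pathlib` normalizes most of these differences; a sketch of the kind of handling that avoids manual separator logic (paths are illustrative):

```python
from pathlib import Path, PurePosixPath, PureWindowsPath

# Path handles the OS-specific separator; avoid manual string splits on '/'
model_dir = Path('models') / 'mnist' / 'model.h5'

# Pure flavors make the Windows vs Unix behavior testable on any platform
win = PureWindowsPath(r'models\mnist\model.h5')
posix = PurePosixPath('models/mnist/model.h5')
```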
As an alternative to displaying weights, display gradients :
Graphical design:
If there is no test data, then:
Check all code and UI locations impacted
Give possibility to load own test data
Issues:
Handle as the Activation layer: append an output property to the previous layer
Upsampling layer(s) are not displayed as layers on the network representation but as a feature of the output of the previous layer.
Also, the sampling factor is set on the previous layer through append_sampling_factor() method.
When selecting some unit on the main view, the previously selected unit's connections (top-n weights) are not always cleared.
It might be related to the React handling of animated dcc.Graph()
Currently the Plotly figures use default Plotly height. There is not enough height on a laptop screen (15") to see at the same time the main and detail figures.
Set the height of the figures to be adaptive as function of screen height, with a minimum TBD
Introduce the selection of the task at hand, leading to the specific view.
Currently, the task is image classification.
Verify that generic classification works fine.
Verify also that no test dataset selection works fine.
Study and initial development to support the PyTorch equivalent of the TensorFlow Keras Sequential model and common layers
In current implementation, the selected unit is described on the bottom panel, but there is no clear indication on the central view. One may only guess it is the unit of the layer with single unit connected.
Proposal for a better visual of the selected layer and unit:
To be used for:
Currently:
This creates issues if the task is not classification, or if the classification is not over object classes. Example: the GAN discriminator is not looking for the object class but performs a binary fake/genuine classification.
To Do:
As an alternative the current image classification task, provide support for time series as input, and a classification or regression at the output.
Adjust the font sizes to minimize the vertical height allocated to this section
Refactor the UI of the layer detail subpanel to use tabs from Dash Bootstrap Components.
Sub-panel structure:
Tabs are:
Tab content should be computed when the tab is displayed (not when the page is loaded, as in https://dash-bootstrap-components.opensource.faculty.ai/examples/graphs-in-tabs/). But that is a challenge since the weight graph ID is required to bind the Dash callback.
To provide more information about each unit "strength" within the layer detail "minimax" plot, add a line plot of the mean of absolute values of the layer weights.
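A sketch of the per-unit statistics behind the enhanced plot, assuming the Keras Dense kernel layout `(n_inputs, n_units)` (names are illustrative):

```python
import numpy as np

def unit_weight_stats(kernel):
    """Per-unit min, max and mean-absolute weight for a Dense kernel.

    kernel: shape (n_inputs, n_units); one column per unit.
    Returns the data for the minimax bars plus the mean-|w| line plot.
    """
    return (kernel.min(axis=0),
            kernel.max(axis=0),
            np.abs(kernel).mean(axis=0))

rng = np.random.default_rng(2)
kernel = rng.normal(size=(64, 10))
w_min, w_max, w_mean_abs = unit_weight_stats(kernel)
```

The mean of absolute values is bounded by the min/max amplitudes, so it overlays naturally on the existing "minimax" axes.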
Currently, the initial selection of a neuron unit is "mocked" by drawing the top n weight connections of the output unit corresponding to the label of the selected test sample (#0).
This is improper as:
To do: find a means to set the selected data of the main view, and automatically call linked callbacks
Install a central application logger to handle all current print outputs (model loading...)
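A minimal sketch with Python's standard `logging` module (logger name and format are assumptions):

```python
import logging

def init_logger(level=logging.INFO):
    """Central application logger replacing scattered print() calls."""
    logger = logging.getLogger('dnnviewer')
    if not logger.handlers:                # avoid duplicate handlers on reload
        handler = logging.StreamHandler()
        handler.setFormatter(logging.Formatter(
            '%(asctime)s %(name)s %(levelname)s %(message)s'))
        logger.addHandler(handler)
    logger.setLevel(level)
    return logger

logger = init_logger()
logger.info('Loading model...')            # instead of print('Loading model...')
```

Module-level loggers obtained via `logging.getLogger('dnnviewer.<module>')` would then inherit this configuration.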
Current assumed DNN task is image classification (image in, class probabilities out)
Generative Adversarial networks could be also supported for simple architectures: image in, image out, OR alternate random in of the latent space to generate an image or a map of images.
Refactor the activation widget to handle all layer types, for the time being: Dense and Convo2D