notes, demos, interactive visuals, art projects, references, further readings, etc.
more documentation to be added soon.
machine learning for artists
Home Page: http://ml4a.github.io
License: GNU General Public License v2.0
working on it here:
https://github.com/micuat/ml4a.github.io
http://naotohieda.com/ml4a.github.io/ml4a/jp/neural_networks/
Hi @genekogan, in this line of chapter 4:
https://github.com/ml4a/ml4a.github.io/blame/master/_chapters/how_neural_networks_are_trained.md#L62
(sorry for using blame mode, it's just because github does not allow users to point to a line of rendered markdown)
This principle is closely related to what we call in machine learning “the curse of dimensionality.” Each dimension we add into a search space exponentially blows up the number of samples we require to get good generalization for any model learned from it. The curse of dimensionality is more often applied to datasets; simply put, the more columns or variables a dataset is represented with, the exponentially more samples in that dataset we need to understand it. In our case, we are thinking about the weights rather than the inputs, but the principle remains the same; high-dimensional space is enormous!
the first 'samples' refers to the sampling of the space, and the second one refers to elements of the dataset (if I understand correctly). Since in this chapter the former usage is more common, I suggest that the latter should be replaced by 'data' / 'element' etc to avoid confusion.
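As a side note, the exponential blow-up the quoted passage describes can be illustrated with a toy calculation (my own sketch, not from the chapter): sampling a space at a fixed resolution of 10 points per axis requires 10^d samples in d dimensions.

```python
# Rough illustration of the curse of dimensionality: covering a unit
# hypercube on a grid with k points per axis needs k**d samples total,
# so the count blows up exponentially as dimensions are added.
def grid_samples(points_per_axis, dims):
    return points_per_axis ** dims

for d in (1, 2, 3, 10):
    print(d, grid_samples(10, d))
# 1 -> 10, 2 -> 100, 3 -> 1000, 10 -> 10000000000
```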
We are missing a TOC in other languages - and I heard this would be useful for non-English speakers. Should we translate this page
http://ml4a.github.io/ml4a/
to URLs like
http://ml4a.github.io/ml4a/cn/
http://ml4a.github.io/ml4a/es/
...
well, perhaps we should translate the front page as well.
I am a 28-year-old artist. I taught myself ML for a year and made a lot of progress, but then I got stuck. I decided this is my passion, so I went back to school to study CS in a more formal way. I am about to finish my first year in college, and I don't feel like I am being challenged, so I am looking for internship opportunities. The problem is I don't know whether any well-known companies or startups are doing 2D/3D animation + machine learning/deep learning on the East Coast. A quick Google search shows none. I know this is probably not the kind of post I should make in issues, but I would like to try my luck. Does anyone know of any ML/art internship opportunities?
it could be useful to have space for people to ask for clarifications and have other discussions; probably better to have it per page, but not sure yet.
Hey @genekogan!
After working with this, I thought it would be useful to add German translations since they are still missing. Is there some kind of workflow for how to do this?
I thought of throwing the chapters into https://www.deepl.com/translator and then proofreading the result. I mean… the translation for this should be done with the help of machine learning, right? :)
All the demos are broken because I removed the MNIST/CIFAR images from the repo to make it smaller; CIFAR alone is around 120 MB. Would it make sense to host the images somewhere else to keep the repo smaller (and likewise for other large content)? If so, where's a reliable place to do that? We would save the space, but the demos couldn't run on localhost without an internet connection.
I started to translate _chapters_ko/neural_networks.md
If somebody is already working in the _chapters_ko folder, let me know about it.
Thanks. :)
"are simply the the image classes"
https://github.com/ml4a/ml4a.github.io/blame/master/_chapters/convnets.md#L115
from #37 (comment)
Currently the demo JavaScript files are hardcoded with English text, but they should be translated too. I tested i18next and it seems to work well; you can find it in my branch:
master...micuat:js-multilng
I hope to finish it by tomorrow. ;-)
https://ml4a.github.io/ml4a/how_neural_networks_are_trained/
What 10^80 atoms in the universe are you talking about?
There are 5 x 10^21 atoms in a single drop of water already. How many atoms would there be in the Earth? The Earth is 10^6 times smaller than the Sun! How many Suns are there in the universe? How many planets? 10^80 atoms in the universe? Ahahahha! I can only laugh at this. And what if dark matter has atoms too? There is no way the universe is so small that it only has 10^80 atoms!
Brute force and random search will be the algorithms of the future, once we build our computing devices with some technology other than moving electrons around. Nanotechnology will help. And then computers will be so cheap that we will brute-force everything.
https://ml4a.github.io/ml4a/machine-learning throws two 404 errors for missing videos (https://ml4a.github.io/images/video.mp4 and https://ml4a.github.io/images/video.webm). This video is the one to illustrate linear classifiers in 3 dimensions (I'm guessing)
Apparently the README link should point to http://ml4a.github.io/
the section in the neural nets chapter which says "Random set of dat, 3 cols. 1 regression value" needs a good example.
this would just describe what regression is and how it works. no code included -- it would not need to say much about how the solution is found (except maybe to link to the "training neural nets" chapter). it's a "magic trick" that shows what happens if a neural net has the correct weights for regression.
this leads into the classification example which is more complete already.
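To make the "magic trick" concrete, here is a minimal sketch (my own toy data, weights, and bias, not anything from the chapter) of a single linear neuron that, given the correct weights, maps a random 3-column dataset straight to its regression value:

```python
import random

# Hypothetical sketch of the "magic trick": a single linear neuron whose
# weights have already been set to the correct values, so it reproduces
# the regression target exactly. The dataset (random, 3 columns), the
# weights, and the bias are all made up for illustration.
random.seed(0)
data = [[random.random() for _ in range(3)] for _ in range(5)]

def true_value(a, b, c):
    # the hidden rule that generated the regression column
    return 2.0 * a - 1.0 * b + 0.5 * c + 3.0

targets = [true_value(*row) for row in data]

weights = [2.0, -1.0, 0.5]  # the "correct" weights the net would need
bias = 3.0

def neuron(row):
    # weighted sum plus bias, no activation: plain linear regression
    return sum(w * x for w, x in zip(weights, row)) + bias

predictions = [neuron(row) for row in data]
print(all(abs(p - t) < 1e-9 for p, t in zip(predictions, targets)))  # True
```

How those weights would be found is exactly what the "training neural nets" chapter covers; here they are simply given.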
@genekogan, I found redundant 'left' in this sentence (probably the second one can be omitted):
https://github.com/ml4a/ml4a.github.io/blame/master/_chapters/how_neural_networks_are_trained.md#L169
... and so solving the above linear regression will be left as an exercise left to the reader.
(sorry for pointing out small things, but when I translate I find these things)
@TiborUdvari @weakish @irealva @tornoteli @rickiepark
hi ml4a translators! right now, your names do not appear anywhere except github contributors. in general, none of the pages, including the book, guides, or demos have names right now. i'm trying to figure out a good way to list contributors. do you have any comments about how you might like to be credited? one option would be to put it at the top of the actual chapters (example: "translated by __" or "translated by __ and __") or we can possibly make a new page which lists the translations and the authors next to them. do you have any preferences?
right now, MathJax is externally hosted via CDN; hosting it locally would make it easier to work offline. additionally, it's not clear how to change some of the style parameters via CSS, like the size at which equations and figures should appear. centralizing this in main.css would be helpful for styling.
Just keep going.