Comments (3)
I'm trying to predict a float value from its cosine and sine. More precisely, the input is [(sin(y)+1.0)/2.0, (cos(y)+1.0)/2.0].
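For reference, here is a minimal plain-Rust sketch (no rust-autograd, function name is mine) of that input encoding. Each component of sin/cos is shifted from [-1, 1] into [0, 1]:

```rust
// Illustrative sketch of the input encoding described above:
// maps an angle y to two features in [0, 1]. The pair (sin, cos)
// identifies y unambiguously even though each function alone is periodic.
fn encode(y: f64) -> [f64; 2] {
    [(y.sin() + 1.0) / 2.0, (y.cos() + 1.0) / 2.0]
}

fn main() {
    // y = 0: sin(0) = 0 -> 0.5, cos(0) = 1 -> 1.0
    println!("{:?}", encode(0.0));
}
```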
I changed (also in the gist)
let z = g.matmul(xww2, w3) + b;
to
let z = g.sigmoid(g.matmul(xww2, w3) + b);
and
let mean_loss = g.reduce_mean(g.sigmoid_cross_entropy(z, &y), &[0,1], false);
to
let mean_loss = g.reduce_mean(g.square(g.sub(z, &y)), &[0,1], false);
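To make the switch concrete, here is a plain-Rust sketch (helper name is mine, no autograd involved) of what that MSE expression computes numerically, i.e. the mean of squared differences between predictions and targets. Unlike sigmoid_cross_entropy, which assumes binary class labels, this directly penalizes distance on a continuous target in [0, 1]:

```rust
// Illustrative mean-squared-error over two equal-length slices,
// mirroring reduce_mean(square(sub(z, y))) over all elements.
fn mean_squared_error(z: &[f64], y: &[f64]) -> f64 {
    assert_eq!(z.len(), y.len());
    z.iter()
        .zip(y)
        .map(|(zi, yi)| (zi - yi).powi(2))
        .sum::<f64>()
        / z.len() as f64
}

fn main() {
    let z = [0.1, 0.9, 0.5]; // predictions
    let y = [0.0, 1.0, 0.5]; // targets
    println!("{}", mean_squared_error(&z, &y));
}
```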
Now the error is near 0.03, which is much better (but still not perfect). Thank you for the help.
(Although, it would be very nice to have an example for such cases, with an MSE loss function, many layers, etc., for those migrating from fann.)
from rust-autograd.
I'm sorry, I can't figure out the reason why that is, but
let values = z.eval(&[x.given(x_test.view()), y.given(y_test.view())]).unwrap();
should be as below, because z is not normalized to be between 0 and 1.
let values = g.sigmoid(z).eval(&[x.given(x_test.view()), y.given(y_test.view())]).unwrap();
> stuck on error ~ 0.585, which is very large for task detecting value from 0.0 to 1.0.
In your code, acc seems to be (binary) cross entropy. What is the desired value in the dataset?
Sorry about the poor example code.
I know that the number of use cases is important for this kind of tool, but I don't have enough time to do it...