A collection of fun toy examples.
The neural net code snippets were written by Alec Radford.
- Getting error bars on neural network predictions via some hacky methods. Note that the predictions are overconfident away from the training data, and I would strongly argue that these are not the correct way to get predictive error bars:
  - For reference, the prediction made using a single set of weights after training.
  - Using dropout: after training, make predictions as usual but with dropout still active, then average over the predictions.
  - Using weights from the last few SGD runs: after training, run SGD for a few more iterations, make a prediction after each iteration, then average over the predictions.
  - Using weights from the last few SGD runs with gaps in between: after training, run SGD for a few more iterations, make a prediction after every 10 iterations, then average over the predictions.
- Getting error bars using an alternative method: Gaussian processes (GPs). Here I use a GP with zero mean and an SE (squared-exponential) kernel -- the predictions look far more sensible and much better than the previous tricks.
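For completeness, a zero-mean GP regression with an SE kernel can be sketched as below. This is a generic textbook implementation, not the repo's code; hyperparameters (lengthscale, signal variance, noise) are placeholder values:

```python
import numpy as np

def se_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between row-vector inputs A (n,d), B (m,d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(X, y, Xs, noise=1e-2):
    """Zero-mean GP posterior mean and variance at test points Xs."""
    K = se_kernel(X, X) + noise * np.eye(len(X))
    Ks = se_kernel(X, Xs)
    Kss = se_kernel(Xs, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss - v.T @ v)
    return mu, var

# Toy data: the posterior variance grows away from the training inputs,
# which is exactly the behaviour the hacky tricks above fail to give.
X = np.array([[-1.0], [0.0], [1.0]])
y = np.sin(X).ravel()
Xs = np.array([[0.0], [5.0]])
mu, var = gp_predict(X, y, Xs)
```

Unlike the dropout/SGD tricks, the GP posterior reverts to the prior (zero mean, full prior variance) far from the data, which is why its error bars look more trustworthy.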