lukesonnet / krls
R package for machine learning technique to fit flexible, interpretable functional forms for continuous and binary outcomes.
License: Other
Allow passing a model object instead of a summary object, but warn and then compute the inference.
The optimization for choosing lambda works well, but with larger samples it seems to spend a long time refining the 5th or 6th significant digit, so it is worth considering a smarter tolerance choice or a cap on the number of iterations.
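One cheap mitigation is to pass a looser `tol` to `optimize()` (its default is `.Machine$double.eps^0.25`). A sketch with a stand-in objective (any smooth unimodal function; this is not the actual KRLS lambda criterion), counting how many evaluations each tolerance costs:

```r
# Count objective evaluations at a fine vs. a coarse tolerance.
# The objective is a placeholder for the KRLS lambda criterion.
evals <- 0
f <- function(lambda) {
  evals <<- evals + 1
  (log(lambda) - 1)^2  # minimized at lambda = e
}

evals <- 0
fine <- optimize(f, interval = c(1e-4, 1e4), tol = .Machine$double.eps^0.25)
fine_evals <- evals

evals <- 0
coarse <- optimize(f, interval = c(1e-4, 1e4), tol = 1e-3)
coarse_evals <- evals

c(fine = fine_evals, coarse = coarse_evals)
```

The coarse run stops earlier while still landing within `tol` of the minimizer, which is plenty of precision for a regularization parameter.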
Things that should be checked for and moved to Rcpp:
- sum(diag(x)) to trace_mat (~5x faster)
- mult_diag
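Before porting, note that both operations can already avoid the full matrix products in base R, so the Rcpp versions would mainly save allocation. A sketch (the names `trace_prod` and `mult_diag` are stand-ins for the planned helpers, not the package's current internals):

```r
# tr(A %*% B) without forming A %*% B: sum over the elementwise product,
# since tr(AB) = sum_ij A[i,j] * B[j,i].
trace_prod <- function(A, B) sum(A * t(B))

# A %*% diag(d) scales column j of A by d[j]; do it without
# materializing the diagonal matrix.
mult_diag <- function(A, d) A * rep(d, each = nrow(A))

set.seed(1)
A <- matrix(rnorm(9), 3, 3)
B <- matrix(rnorm(9), 3, 3)
d <- rnorm(3)

all.equal(trace_prod(A, B), sum(diag(A %*% B)))
all.equal(mult_diag(A, d), A %*% diag(d))
```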
There are some cases where R returns an eigenvalue of exactly 0 in the generateK() function. This causes all sorts of problems downstream. It seems to happen only without truncation, and is thus caused by R's default eigen() function. Should I prevent this by adding the minimum double R can handle to zero eigenvalues? I run into it very rarely, and only when lambda is very large and there is no truncation.
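If we do go that route, the floor is a one-liner after the decomposition. A sketch on a deliberately rank-deficient matrix (a stand-in for the kernel matrix from generateK()):

```r
# A rank-1 symmetric matrix, so one eigenvalue is (numerically) zero.
eigs <- eigen(matrix(1, 2, 2), symmetric = TRUE)

# Floor at the smallest positive double so downstream divisions
# by eigenvalues stay finite instead of producing Inf.
vals <- pmax(eigs$values, .Machine$double.xmin)
1 / vals  # finite for every eigenvalue now
```

pmax also catches eigenvalues that come back as tiny negatives due to rounding.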
Great. I wonder if it's worth throwing an error (or at least a warning) if the optimization hits the edge of the interval, e.g.:
f <- function(x) (x - 0.5)^2
tryoptimize <- function(f, interval) {
  xmin <- optimize(f, interval = interval)
  # interval is c(lower, upper), so this is a length-2 logical vector
  tooclose <- abs(xmin$minimum - interval) < .Machine$double.eps^0.25
  if (tooclose[1]) stop("Too close to lower bound of lambda interval; please decrease lower bound.")
  if (tooclose[2]) stop("Too close to upper bound of lambda interval; please raise upper bound.")
  xmin
}
tryoptimize(f, interval = c(0, 1))  # fine: interior minimum at 0.5
tryoptimize(f, interval = c(1, 2))  # errors: minimum pinned at the lower bound
Currently we only get pwmfx for the Gaussian kernel. Adding this for the linear and polynomial kernels is not difficult: it's simply a matter of taking the partial derivative of the kernel function, deriving the new equation for the pwmfx, and implementing it in code for each kernel.
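For the linear kernel k(x, z) = x'z the partial is immediate: dk/dx_j = z_j, so the pointwise effect sum_i c_i * dk(x, x_i)/dx_j collapses to t(X) %*% c and is constant across points. A sketch with hypothetical coefficients c_hat (not the package's internals):

```r
# Pointwise marginal effects under a linear kernel:
# yhat(x) = sum_i c_i * (x' x_i), so d yhat / dx_j = sum_i c_i * X[i, j].
set.seed(2)
n <- 50; p <- 3
X <- matrix(rnorm(n * p), n, p)
c_hat <- rnorm(n)  # hypothetical KRLS coefficients

pwmfx_linear <- drop(crossprod(X, c_hat))  # length-p, same at every point
pwmfx_linear
```

The polynomial kernel works the same way, just with a chain-rule factor from the power.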
Try to install KRLS on a Windows machine without RSpectra installed; the build will fail silently. Install RSpectra, then install KRLS, and it works. Presumably the intended behaviour is to resolve dependencies automatically, or else to fail loudly and immediately, rather than reaching the end of the build process and failing silently.
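The durable fix is declaring RSpectra in the package's DESCRIPTION Imports, but until then a loud runtime check at the entry point would at least fail immediately. A sketch (the function name is hypothetical):

```r
# Fail fast and loudly if the dependency is absent, instead of
# failing silently at the end of the build.
require_rspectra <- function() {
  if (!requireNamespace("RSpectra", quietly = TRUE)) {
    stop("Package 'RSpectra' is required but not installed; ",
         "run install.packages('RSpectra') first.", call. = FALSE)
  }
}
```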
Currently, if someone tries to get standard errors for KRLogit without truncation, we throw a warning but still return the pwmfx and the avg pwmfx. So if you do:
summary(krls(X=X, y=y, loss='logistic'))
you get a warning and no SEs, but you still get the pwmfx. We could instead error and force them to write:
summary(krls(X=X, y=y, loss='logistic'), vcov = FALSE)
or we could change the default so it doesn't try to get SEs without truncation, but I think that's worse than throwing the warning. So I think we currently have the right behavior, but I wanted to check.
If someone tries to get robust or clustered standard errors for KRLS without truncation, we throw a warning but still return the pwmfx and the avg pwmfx. So if you do:
summary(krls(X=X, y=y), robust = TRUE)
you will get a warning telling you that you have to truncate, but the pwmfx is still returned. The alternative is to error out, since this isn't default behavior (the user explicitly specified robust = TRUE), but I think the warning is sufficient. So again I think we have the right behavior, though this one I'm less sure about.
Compare the above on (a) MSE of predicted probabilities against the latent Y* (which has noise), (b) NLL of predicted probabilities against the observed Y, and (c) MSE of the average marginal effects. Do this for the following settings:
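The first two criteria could be computed as below. All inputs here are simulated placeholders, and "MSE against latent Y*" is read as MSE against the latent-scale probability plogis(ystar); adjust if a different reading was intended:

```r
# Comparison metrics for the simulation (placeholder data).
set.seed(3)
n <- 200
ystar <- rnorm(n)                        # latent Y* (with noise)
y <- as.integer(ystar > 0)               # observed binary outcome
p <- plogis(ystar + rnorm(n, sd = 0.5))  # some model's predicted probabilities

mse_latent <- mean((p - plogis(ystar))^2)               # MSE on the latent scale
nll <- -mean(y * log(p) + (1 - y) * log(1 - p))         # NLL on observed Y
c(mse_latent = mse_latent, nll = nll)
```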
All of the CPP is in one file. I'm going to split it into different files:
I am sure Luke will just looooove to get this since I nagged him all day at this point, but KRLS' compile is broken on my Mac due to a missing fortran dependency of some kind.
After a few steps of clang++ compiling, I get:
ld: warning: directory not found for option '-L/usr/local/lib/gcc/x86_64-appple-darwin13.0.0/4.8.2'
ld: library not found for -lgfortran
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make: *** [KRLS2.so] Error 1
ERROR: compilation failed for package 'KRLS2'
Works fine in Windows 10.
Set this in numvectors
As per discussion in #17
Ideally we'd handle missingness much as lm does -- i.e., rows are na.omitted internally, but the predictions etc. are then put back into the full-length vector with the NAs intermixed.
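lm does this via the na.action attribute: it fits on the reduced data and napredict()/naresid() re-expand the results, with na.exclude doing the NA padding automatically. A sketch of the same pattern outside lm (the doubling step is a stand-in for model predictions):

```r
# Mimic lm's NA handling: drop incomplete rows for fitting,
# re-expand the predictions to full length afterwards.
df <- data.frame(y = c(1, 2, NA, 4), x = c(1, NA, 3, 4))
omitted <- na.exclude(df)              # rows 2 and 3 dropped, action recorded
fit_vals <- omitted$x * 2              # stand-in for model predictions
full <- napredict(attr(omitted, "na.action"), fit_vals)
full  # length 4, with NA where rows were dropped
```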
First, compare golden search against optimize() for KRLS. If golden search works considerably better, build it for KRLogit too. If golden search is no better than optimize(), make optimize() the default but keep golden search as a legacy option.
Build a test suite covering the different configurations of KRLS and KRLogit. These will be fast and simple tests that just make sure each call runs, not necessarily that the answers are all correct. The suite should test the interaction of:
Top priority:
Lower priority:
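A dependency-free sketch of the shape such a smoke-test suite could take. Here `fake_krls` stands in for `krls()` and the argument grid is illustrative, not the final list of options:

```r
# Smoke-test grid: assert that each combination runs,
# not that the estimates are correct.
grid <- expand.grid(loss = c("leastsquares", "logistic"),
                    truncate = c(TRUE, FALSE),
                    stringsAsFactors = FALSE)

fake_krls <- function(loss, truncate) {
  # placeholder for krls(X, y, loss = loss, ...)
  list(loss = loss, truncate = truncate)
}

results <- lapply(seq_len(nrow(grid)), function(i) {
  fake_krls(grid$loss[i], grid$truncate[i])
})
stopifnot(length(results) == nrow(grid))
```

With testthat available, each grid row would become an `expect_error(..., NA)`-style check instead of a bare stopifnot.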