fabsig / KTBoost
A Python package which implements several boosting algorithms with different combinations of base learners, optimization algorithms, and loss functions.
License: Other
I think the if condition in the update_terminal_regions method of LossFunction may need to be changed. We might need to change it to

if update_step == 'hybrid' or update_step == 'gradient':

followed by the line search of the weights.
Please let me know if I am wrong.
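If I read the proposal correctly, the line search over the leaf weights would run for both the 'hybrid' and the 'gradient' update steps. A minimal sketch of that dispatch (a simplified stand-in, not KTBoost's actual update_terminal_regions code):

```python
def update_terminal_regions(update_step):
    # Simplified stand-in for LossFunction.update_terminal_regions,
    # illustrating only the proposed condition (not KTBoost's real logic).
    if update_step == 'hybrid' or update_step == 'gradient':
        return 'line search of weights'   # placeholder for the real line search
    return 'newton leaf update'           # placeholder for the Newton update
```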
Hello!
When I assess my CPU usage using top, I see only 1 CPU being used. Is it possible to enable or implement multiprocessing?
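As far as I can tell, the tree fitting itself runs in a single process (as in the scikit-learn gradient boosting code it builds on), but outer loops such as cross-validation can still be parallelized. A sketch of that workaround, using scikit-learn's GradientBoostingRegressor as a stand-in for KTBoost.BoostingRegressor:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor  # stand-in for KTBoost.BoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.RandomState(0)
X = rng.rand(300, 4)
y = X[:, 0] + 0.1 * rng.randn(300)

model = GradientBoostingRegressor(n_estimators=20, random_state=0)
# Each cross-validation fold is fitted in its own worker process
scores = cross_val_score(model, X, y, cv=5, n_jobs=2)
```

This does not speed up a single fit, but it keeps all cores busy when several independent fits (CV folds, hyperparameter candidates) are needed anyway.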
I get the following error when trying to apply a Grabit model with KTBoost.BoostingRegressor(loss='tobit', yl=0, yu=1).fit(X, Y):

Traceback (most recent call last):
  File "<stdin>", line 2, in <module>
  File "...\KTBoost\KTBoost.py", line 1745, in fit
    begin_at_stage, monitor, X_idx_sorted)
  File "...\KTBoost\KTBoost.py", line 1810, in _fit_stages
    nTreeKernel, X_csc, X_csr)
  File "...\KTBoost\KTBoost.py", line 1354, in _fit_stage
    presort=self.presort)
TypeError: __init__() got an unexpected keyword argument 'min_weight_leaf'
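Errors like this usually mean the installed scikit-learn's internal tree API has drifted away from the version KTBoost was written against. A quick diagnostic sketch (the compatible version range below is an assumption; check the KTBoost release notes for the exact pin):

```python
import sklearn

# Parse the major/minor scikit-learn version
major, minor = (int(p) for p in sklearn.__version__.split('.')[:2])

# Assumption: KTBoost 0.1.x tracked pre-1.0 scikit-learn tree internals,
# so a newer install can break the private tree-builder signatures.
if (major, minor) >= (1, 0):
    print("scikit-learn %s may be too new for this KTBoost release; "
          "consider pinning an older version" % sklearn.__version__)
```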
Hi, KTBoost seems to have a problem with a wrong scikit-learn dependency again? Both the Regressor and the Classifier are affected.
Hi Mr. Sigrist:
Thank you for implementing such a cool algorithm! I am wondering whether it is possible to add monotone constraints to the main function. This is crucial for problems such as credit scoring, where domain knowledge is important, and it is supported by most major boosting implementations such as XGBoost and LightGBM.
Happy boosting!
Sincerely,
Yu Cao
In the negative gradient of the tobit loss function, the residual already accounts for each observation's sample_weight. In addition, the sample weight is accounted for again in the leaf update step, so sample weights may be double-counted in the tobit update step, whereas all the other loss functions account for the sample weight only in the leaf update step.
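A toy illustration of the suspected issue (plain Python, not KTBoost's code): if the residuals already include the sample weights, weighting them again in the leaf update effectively squares the weights.

```python
w = [1.0, 2.0]   # sample weights
r = [3.0, 5.0]   # raw (unweighted) negative gradients

# Intended weighted leaf value: sum(w * r) / sum(w)
correct = sum(wi * ri for wi, ri in zip(w, r)) / sum(w)

# If the residual was already pre-multiplied by w, the leaf update
# computes sum(w * (w * r)) / sum(w) = sum(w**2 * r) / sum(w) instead
pre_weighted = [wi * ri for wi, ri in zip(w, r)]
double = sum(wi * pi for wi, pi in zip(w, pre_weighted)) / sum(w)

print(correct, double)   # 13/3 vs 23/3: the two leaf values differ
```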
Hello,
I ran some benchmarks with the three criteria available for classification, and mae is at least one order of magnitude slower than the others. Is there a reason for this?
model = KTBoost.BoostingClassifier(loss='deviance', update_step='hybrid', criterion='mae',
                                   n_estimators=50, random_state=seed)
model.fit(X_train, y_train)
probas = model.predict_proba(X_test)
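A likely explanation (an assumption about the inherited scikit-learn splitter, not verified against KTBoost's internals): the mse/friedman_mse impurity can be updated in O(1) with running sums as samples move across each split candidate, while mae needs a median, which cannot be maintained with the same constant-time trick. A toy contrast:

```python
import statistics

data = [5.0, 1.0, 4.0, 2.0, 3.0]

# mse-style criterion: the mean of a growing partition is maintained
# incrementally with running sums -- O(1) work per moved sample
total, n = 0.0, 0
means = []
for x in data:
    total += x
    n += 1
    means.append(total / n)

# mae-style criterion: the median has to be recomputed on each prefix,
# which involves sorting -- roughly O(n log n) per evaluation
medians = [statistics.median(data[:i + 1]) for i in range(len(data))]
```

Multiplied over every split candidate in every node of every tree, that per-evaluation gap easily becomes the order-of-magnitude slowdown you observed.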
Could it be interesting to have decaying or even oscillating learning rates, as is common in deep learning? I have never seen this for boosted trees, but it might be worth exploring. See also https://github.com/lorenzwalthert/KerasMisc.
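One way to prototype a decaying schedule without touching the library is warm-started stage-wise fitting, changing learning_rate between refits so later trees use a smaller step. A sketch with scikit-learn's GradientBoostingRegressor as a stand-in (the schedule values are arbitrary assumptions):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.RandomState(0)
X = rng.rand(200, 3)
y = X[:, 0] + 0.1 * rng.randn(200)

# First 10 trees at the initial learning rate
model = GradientBoostingRegressor(n_estimators=10, learning_rate=0.1,
                                  warm_start=True, random_state=0)
model.fit(X, y)

# Add 10 more trees per stage, each stage with a decayed learning rate
for lr in (0.05, 0.025):   # hypothetical decay schedule
    model.set_params(n_estimators=model.n_estimators + 10, learning_rate=lr)
    model.fit(X, y)        # warm_start=True: only the new trees are fitted
```

An oscillating (cyclical) schedule would just replace the decreasing tuple with a cycled sequence of rates.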
Thanks for this repo!
I wonder if it is possible to extend the BoostingRegressor to work with data where the censoring points yl and yu vary by observation.
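For concreteness, here is a sketch of what the Tobit negative log-likelihood with per-observation bounds might look like. This is a hypothetical extension written from the standard Tobit formulas, not KTBoost's implementation (which currently takes scalar yl/yu):

```python
import numpy as np
from scipy.stats import norm

def tobit_nll(y, f, yl, yu, sigma=1.0):
    # Hypothetical Tobit negative log-likelihood with per-observation
    # censoring bounds yl[i], yu[i]; f is the current boosting prediction.
    y, f, yl, yu = (np.asarray(a, dtype=float) for a in (y, f, yl, yu))
    left = y <= yl             # left-censored observations
    right = y >= yu            # right-censored observations
    mid = ~(left | right)      # uncensored observations

    nll = np.empty_like(y)
    # Uncensored: Gaussian density of the residual
    nll[mid] = -norm.logpdf((y[mid] - f[mid]) / sigma) + np.log(sigma)
    # Left-censored: probability mass below the (observation-specific) bound
    nll[left] = -norm.logcdf((yl[left] - f[left]) / sigma)
    # Right-censored: probability mass above the bound
    nll[right] = -norm.logsf((yu[right] - f[right]) / sigma)
    return nll.sum()
```

The gradient and Hessian per observation would follow by differentiating the same three branches with respect to f[i], so the existing scalar-bound code path should generalize mechanically.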
I am using KTBoost 0.1.19 and got the following error:
Traceback (most recent call last):
  File "train_model.py", line 135, in <module>
    ktb_regressor.fit(X_train_cv, y_train_cv)
  File "C:\ProgramData\Anaconda3\lib\site-packages\KTBoost\KTBoost.py", line 1755, in fit
    n_stages = self._fit_stages(X, y, y_pred, sample_weight, self._rng,
  File "C:\ProgramData\Anaconda3\lib\site-packages\KTBoost\KTBoost.py", line 1820, in _fit_stages
    y_pred = self._fit_stage(i, X, y, y_pred, sample_weight,
  File "C:\ProgramData\Anaconda3\lib\site-packages\KTBoost\KTBoost.py", line 1361, in _fit_stage
    tree.fit(X, residual, sample_weight=weights,
  File "C:\ProgramData\Anaconda3\lib\site-packages\KTBoost\tree.py", line 1130, in fit
    super(DecisionTreeRegressor, self).fit(
  File "C:\ProgramData\Anaconda3\lib\site-packages\KTBoost\tree.py", line 368, in fit
    builder.build(self.tree_, X, y, sample_weight, X_idx_sorted)
  File "sklearn\tree\_tree.pyx", line 136, in sklearn.tree._tree.DepthFirstTreeBuilder.build
TypeError: build() takes at most 4 positional arguments (5 given)
Thanks in advance!