dclambert / python-elm
Extreme Learning Machine implementation in Python
License: Other
The README says that the centers and radius are chosen as follows:
centers are taken uniformly from the bounding hyperrectangle of the inputs, and
radius = max(||x-c||)/sqrt(n_centers*2)
but citation [2] only discusses ELM, and [3] discusses RBF networks, where the centers and radius are chosen in a different way. Is the scheme in this implementation @dclambert's own idea, or is a citation missing?
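For concreteness, here is how I read that description, as a rough numpy sketch (the sampling and scaling are my interpretation of the README wording, not necessarily what random_layer.py actually does):

```python
import numpy as np

rng = np.random.RandomState(0)
X = rng.randn(200, 5)          # toy inputs
n_centers = 10

# centers: drawn uniformly from the bounding hyperrectangle of X
mins, maxs = X.min(axis=0), X.max(axis=0)
centers = rng.uniform(mins, maxs, size=(n_centers, X.shape[1]))

# radius = max(||x - c||) / sqrt(n_centers * 2), over all (x, c) pairs
dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
radius = dists.max() / np.sqrt(n_centers * 2)
```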
Hi
Thanks for the code.
The function atleast2d_or_csr() used in random_layer.py no longer seems to be available in the new scikit-learn 0.17.0.
I managed to work around this temporarily by defining a function by that name which simply returns the matrix without performing any checks:

def atleast2d_or_csr(X):
    return X

I'm not sure what checks should be in there, or what the equivalent function in the new scikit-learn is, though.
Hope this helps
Cheers
Srimal.
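If it helps, scikit-learn's check_array looks like the closest modern replacement; this is a guess at the intended checks (a 2-D array or CSR sparse input), not a verified equivalent of the removed helper:

```python
from sklearn.utils import check_array

def atleast2d_or_csr(X):
    # Validate X as a 2-D array, allowing CSR sparse input; this is my
    # best guess at what the removed scikit-learn helper enforced.
    return check_array(X, accept_sparse='csr')
```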
from elm import ELMClassifier
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets('MNIST_data', one_hot=True)
train = mnist.train.next_batch(100)
elmc = ELMClassifier(n_hidden=1000, activation_func='gaussian', alpha=0.0, random_state=0)
elmc.fit(train[0], train[1])
test = mnist.test.next_batch(100)
print('train acc is %g, test acc is %g ' %( elmc.score(train[0], train[1]), elmc.score(test[0], test[1])))
Running this, I get:
train acc is 0, test acc is 0
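One possible cause, offered as a guess: with one_hot=True the MNIST targets are one-hot matrices, while a scikit-learn-style classifier such as ELMClassifier usually expects 1-D integer class labels. Converting with argmax before fitting may be worth trying:

```python
import numpy as np

# Toy stand-in for one-hot MNIST labels
y_onehot = np.eye(10)[[3, 1, 4]]
y_labels = y_onehot.argmax(axis=1)   # back to integer labels 3, 1, 4

# Untested suggestion for the snippet above:
# elmc.fit(train[0], train[1].argmax(axis=1))
# elmc.score(test[0], test[1].argmax(axis=1))
```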
elmc.fit(x_train,y_train)
Traceback (most recent call last):
File "", line 1, in
File "/home/analytics/anaconda3/lib/python3.6/site-packages/sklearn_extensions/extreme_learning_machines/elm.py", line 596, in fit
super(ELMClassifier, self).fit(X, y_bin)
File "/home/analytics/anaconda3/lib/python3.6/site-packages/sklearn_extensions/extreme_learning_machines/elm.py", line 464, in fit
self.genelm_regressor.fit(X, y)
File "/home/analytics/anaconda3/lib/python3.6/site-packages/sklearn_extensions/extreme_learning_machines/elm.py", line 183, in fit
self.hidden_activations = self.hidden_layer.fit_transform(X)
File "/home/analytics/anaconda3/lib/python3.6/site-packages/sklearn/base.py", line 517, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/home/analytics/anaconda3/lib/python3.6/site-packages/sklearn_extensions/extreme_learning_machines/random_layer.py", line 108, in fit
self._generate_components(X)
File "/home/analytics/anaconda3/lib/python3.6/site-packages/sklearn_extensions/extreme_learning_machines/random_layer.py", line 360, in _generate_components
self._compute_centers(X, sp.issparse(X), rs)
File "/home/analytics/anaconda3/lib/python3.6/site-packages/sklearn_extensions/extreme_learning_machines/random_layer.py", line 322, in _compute_centers
spans = max_Xs - min_Xs
TypeError: unsupported operand type(s) for -: 'map' and 'map'
I get this error when fitting and am not able to solve it. My x_train is a CSR matrix and y_train is an ndarray. Any suggestions on this?
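This looks like a Python 3 issue: map() returns an iterator there, so the subtraction in _compute_centers fails. A sketch of the kind of local patch that may work, wrapping each map in np.asarray(list(...)) (I have not verified this against the installed package):

```python
import numpy as np

X = np.array([[1.0, 2.0], [3.0, 0.0]])

# Materialize the map objects before doing array arithmetic:
min_Xs = np.asarray(list(map(min, X.T)))
max_Xs = np.asarray(list(map(max, X.T)))
spans = max_Xs - min_Xs   # no longer a 'map' - 'map' TypeError
```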
Please tell me how I can use it for my datasets. What steps are required to implement it?
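The usual scikit-learn workflow should apply: load your data, split it, then fit and score. A hedged sketch using the Iris dataset, with the ELM-specific lines commented out because the constructor arguments may vary across versions:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# from elm import ELMClassifier
# clf = ELMClassifier(n_hidden=100, random_state=0)
# clf.fit(X_tr, y_tr)            # train on the training split
# print(clf.score(X_te, y_te))   # accuracy on the held-out split
```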
The “extreme learning machines (ELM)” are indeed worth working on, but they just shouldn’t be called “ELM”. With annotated PDF files at http://elmorigin.wix.com/originofelm , you can easily verify the following facts within 10 to 20 minutes:
Please forward this message to your contacts so that others can also study the materials presented at this website and take appropriate actions, if necessary.
ELM: The Sociological Phenomenon
Since the invention of the name “extreme learning machines (ELM)” in 2004, the number of papers and citations on the ELM has been increasing exponentially. How can this be imaginable for the ELM, which comprises decades-old algorithms published by authors other than the ELM inventor? This phenomenon would not have been possible without the support and participation of researchers on the fringes of machine learning. Some (unknowingly, and a few knowingly) love the ELM for various reasons:
• Some authors love the ELM, because it is always easy to publish ELM papers in an ELM conference or an ELM special issue. For example, one can simply take a decade-old paper on a variant of RVFL, RBF or kernel ridge regression and re-publish it as a variant of the ELM, after paying a small price of adding 10s of citations on Huang’s “classic ELM papers”.
• A couple of editors-in-chief (EiCs) love the ELM and offer multiple special issues/invited papers, because the ELM conference and special issues will bring a flood of papers, many citations, and therefore high impact factors to their low-quality journals. The EiCs can claim to have faithfully worked within the peer-review system, i.e. the ELM submissions are all rigorously reviewed by ELM experts.
• A few technical leaders, e.g. some IEEE society officers, love the ELM, because it rejuvenates the community by bringing in more activities and subscriptions.
• A couple of funding agencies love the ELM, because they would rather fund a new sexy name, than any genuine research.
One may ask: how can something loved by so many be wrong?
A leading cause of the current Greek economic crisis was that a previous government showered its constituents with jobs and lucrative compensations in order to gain their votes, thereby raising the debt to an unsustainable level. At the time, the government's behavior was welcomed by many, but it led to severe consequences. Another example of popularity leading to a massive disaster can be found in WW II, as Hitler was elected by popular vote.
The seemingly small price to pay in the case of the ELM is diminished publishing ethics, which, in the long run, will fill the research literature with renamed junk, thereby making the research community and respected names such as IEEE, Thomson Reuters, Springer and Elsevier laughing stocks. Similar to that previous Greek government and its supporting constituents, the ELM inventor and his supporters are “borrowing” from the future of the entire research community for their present enjoyment! It is time to wake up to your conscience.
Our beloved peer-review system was grossly abused and failed spectacularly in the case of the ELM. It is time for the machine learning experts and leaders to investigate the allegations presented here and to take corrective actions soon.
5 Easy but Proven Steps to Academic Fame
One can call the above steps “IP” (Intelligent Plagiarism), as opposed to the stupid (verbatim) plagiarism specified by the IEEE in its “5 levels”. The machine learning community should feel embarrassed if “IP” (Intelligent Plagiarism) was originally developed and/or grandiosely promoted by this community, while the community is supposed to create other (more ethical) intelligent algorithms to benefit mankind.
In mid-July 2015, G.-B. Huang posted an email on his [email protected] emailing list. This email was forwarded to [email protected] for our responses. As usual, this email was meaningless and our remarks are available at http://elmorigin.wix.com/originofelm .
Email for feedback: [email protected]
Hi, I have read the classifier code. A comment at the top of the code says that it can learn different kernels.
How about a Normalized Radial Basis Function? Any example code?
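I don't believe a normalized RBF is built in, but as a sketch of what the activation itself would look like (my own formulation, not python-elm code): each Gaussian response is divided by the sum of all responses for that sample, so each row of the hidden activations sums to 1.

```python
import numpy as np

def normalized_rbf(X, centers, radius):
    # Squared distances from every sample to every center
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    G = np.exp(-d2 / radius ** 2)
    # Normalize so each sample's activations sum to 1
    return G / G.sum(axis=1, keepdims=True)
```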
Hi
I recently needed to predict the class probabilities instead of the class labels.
So I wrote a predict_proba() method, sticking to the convention used in other scikit classifiers.
I added the following method, which simply considers the exponential ratios of the decision-function values, to the class GenELMClassifier in the module elm.py:
"""Predict probability values using the model
Considers exponent of decision_function values
Parameters
X : {array-like, sparse matrix} of shape [n_samples, n_features]
Returns
C : numpy array of shape [n_samples, n_outputs]
Predicted values.
"""
raw_predictions = np.exp(self.decision_function(X))
probabilities = np.zeros(raw_predictions.shape)
rows, cols = raw_predictions.shape
for row in range(0, rows):
total = sum(raw_predictions[row,:])
probabilities[row,:] = raw_predictions[row,:] / total
return probabilities
I'm not overly familiar with ELMs, but if you think the above is correct, feel free to add it. Alternatively, I would be happy to contribute the code to the project.
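If this gets merged, a vectorized and numerically stable variant might be preferable: subtracting the row-wise maximum before exponentiating avoids overflow when decision_function returns large values (a sketch of the same softmax idea as above):

```python
import numpy as np

def softmax_rows(D):
    # Shift each row by its max so np.exp never overflows
    Z = np.exp(D - D.max(axis=1, keepdims=True))
    return Z / Z.sum(axis=1, keepdims=True)
```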
Hello,
I'm using your implementation of ELMClassifier to run some experiments. I see it was implemented following the sklearn coding interface, so I was trying to run your algorithm on a dataset and measure its performance with cross_val_score(), but I am getting this error:
ValueError: output_type='binary', but y.shape = (30, 3)
It has something to do with line 614, where you set:
class_predictions = self.binarizer.inverse_transform(raw_predictions)
Do you have any clue how I can fix this problem? I am using the Iris dataset (3 classes). Please let me know if I can help, or even push a pull request for this.
Thanks in advance.
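Until the binarizer issue is fixed, one workaround is to run the folds by hand instead of through cross_val_score(). In this sketch LogisticRegression stands in for ELMClassifier so the snippet is self-contained; the loop is identical for any estimator with fit/score:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold

X, y = load_iris(return_X_y=True)
scores = []
for tr, te in StratifiedKFold(n_splits=5, shuffle=True, random_state=0).split(X, y):
    clf = LogisticRegression(max_iter=1000)   # stand-in for ELMClassifier
    clf.fit(X[tr], y[tr])
    scores.append(clf.score(X[te], y[te]))
```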
Is anyone aware of any GPU implementation of ELM? This package is based on scikit-learn, so it will probably never support GPU. I am looking for a way to speed up the computation.
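Not that I know of, but the ELM fit itself is small: a fixed random hidden layer plus a least-squares solve. Written against the NumPy API as below, the same code should run on GPU by swapping np for a GPU array module such as CuPy (hedged: elm_fit is my own sketch, not part of this package, and I have not benchmarked the GPU path):

```python
import numpy as np

def elm_fit(X, Y, n_hidden=50, seed=0):
    rng = np.random.RandomState(seed)
    W = rng.randn(X.shape[1], n_hidden)            # random input weights
    b = rng.randn(n_hidden)                        # random biases
    H = np.tanh(X @ W + b)                         # hidden activations
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)   # output weights
    return W, b, beta
```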