
waikato / moa

MOA is an open source framework for Big Data stream mining. It includes a collection of machine learning algorithms (classification, regression, clustering, outlier detection, concept drift detection and recommender systems) and tools for evaluation.

Home Page: http://moa.cms.waikato.ac.nz/

License: GNU General Public License v3.0

Java 82.96% TeX 16.93% Shell 0.09% Batchfile 0.01% Makefile 0.01% Dockerfile 0.02%
java machine-learning machine-learning-algorithms streaming-algorithms data-stream-mining clustering moa

moa's Introduction

MOA (Massive Online Analysis)



MOA is the most popular open source framework for data stream mining, with a very active growing community (blog). It includes a collection of machine learning algorithms (classification, regression, clustering, outlier detection, concept drift detection and recommender systems) and tools for evaluation. Related to the WEKA project, MOA is also written in Java, while scaling to more demanding problems.

http://moa.cms.waikato.ac.nz/

Using MOA

MOA performs big data stream mining in real time, as well as large-scale machine learning. MOA can be extended with new mining algorithms, and with new stream generators or evaluation measures. The goal is to provide a benchmark suite for the stream mining community.

Mailing lists

Citing MOA

If you want to refer to MOA in a publication, please cite the following JMLR paper:

Albert Bifet, Geoff Holmes, Richard Kirkby, Bernhard Pfahringer (2010); MOA: Massive Online Analysis; Journal of Machine Learning Research 11: 1601-1604

moa's People

Contributors

abifet, alessiobernardo, aosojnik, canoalberto, celikmustafa89, corneliusboehm, csterling, dabrze, dakot, dependabot[bot], fracpete, garawalid, henrygouk, hfichtenberger, hmgomes, isvani, janvanrijn, jmread, joaomaiaduarte, jpbarddal, nuwangunasekara, pmgj, richard-moulton, rjtsousa, silasgtcs, tpham93, truongtd6285, tsabsch, vlosing, wldbdsyb


moa's Issues

"java.lang.UnsupportedOperationException: Not yet implemented" in InstanceInformation.java

Hello,

I am a Master's student in Computer Science at the University of Ottawa, and I am trying to use the moa.streams.clustering.FileStream task in the Clustering tab of MOA as part of my thesis work. Specifically, I am trying to read in the covtypeNorm.arff file that I downloaded from the MOA website, to experiment with the task's functionality before continuing on.

Unfortunately when I try to run this I get the following exception:

Exception in thread "AWT-EventQueue-0"
java.lang.UnsupportedOperationException: Not yet implemented
at com.yahoo.labs.samoa.instances.InstanceInformation.deleteAttributeAt (InstanceInformation.java:144)

A quick check of the relevant line of code shows that the body of the deleteAttributeAt method in InstanceInformation does nothing but throw this exception, leading me to conclude that this is not a problem with the data set, but that this capability is not fully implemented. My two-part question is:

A) Is there an earlier version in which the moa.streams.clustering.FileStream task was functional? and
B) If not, is this something that I could work on and contribute to the project?

Thank you in advance,

Richard

Cannot select result directory in experimenter

When I click the Browse button to select a results directory in the new Experimenter tab, I get the following error in the terminal:

java.lang.NullPointerException
        at com.github.fracpete.jshell.JShellPanel.updateButtons(JShellPanel.java:238)
        at com.github.fracpete.jshell.JShellPanel.finishInit(JShellPanel.java:217)
        at nz.ac.waikato.cms.gui.core.BasePanel.<init>(BasePanel.java:52)
        at nz.ac.waikato.cms.gui.core.BasePanel.<init>(BasePanel.java:40)
        at com.github.fracpete.jshell.JShellPanel.<init>(JShellPanel.java:67)
        at moa.gui.ScriptingTabPanel.<init>(ScriptingTabPanel.java:47)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
        at java.base/java.lang.Class.newInstance(Class.java:584)
        at moa.gui.GUI.initGUI(GUI.java:64)
        at moa.gui.GUI.<init>(GUI.java:46)
        at moa.gui.GUI$1.run(GUI.java:97)
        at java.desktop/java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:313)
        at java.desktop/java.awt.EventQueue.dispatchEventImpl(EventQueue.java:770)
        at java.desktop/java.awt.EventQueue$4.run(EventQueue.java:721)
        at java.desktop/java.awt.EventQueue$4.run(EventQueue.java:715)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at java.base/java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:85)
        at java.desktop/java.awt.EventQueue.dispatchEvent(EventQueue.java:740)
        at java.desktop/java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:203)
        at java.desktop/java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:124)
        at java.desktop/java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:113)
        at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:109)
        at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
        at java.desktop/java.awt.EventDispatchThread.run(EventDispatchThread.java:90)
Exception in thread "AWT-EventQueue-0" java.lang.NoClassDefFoundError: com/sun/java/swing/plaf/windows/WindowsLookAndFeel
        at com.jidesoft.plaf.LookAndFeelFactory.installJideExtension(LookAndFeelFactory.java:848)
        at com.jidesoft.plaf.LookAndFeelFactory.installJideExtension(LookAndFeelFactory.java:633)
        at com.jidesoft.plaf.LookAndFeelFactory.installJideExtension(LookAndFeelFactory.java:598)
        at com.jidesoft.swing.FolderChooser.updateUI(FolderChooser.java:128)
        at java.desktop/javax.swing.JFileChooser.setup(JFileChooser.java:382)
        at java.desktop/javax.swing.JFileChooser.<init>(JFileChooser.java:348)
        at java.desktop/javax.swing.JFileChooser.<init>(JFileChooser.java:295)
        at com.jidesoft.swing.FolderChooser.<init>(FolderChooser.java:63)
        at nz.ac.waikato.cms.gui.core.BaseDirectoryChooser.<init>(BaseDirectoryChooser.java:61)
        at moa.gui.experimentertab.TaskManagerTabPanel.openDirectory(TaskManagerTabPanel.java:605)
        at moa.gui.experimentertab.TaskManagerTabPanel.jButtonDirActionPerformed(TaskManagerTabPanel.java:624)
        at java.desktop/javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1967)
        at java.desktop/javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2308)
        at java.desktop/javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:405)
        at java.desktop/javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:262)
        at java.desktop/javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:279)
        at java.desktop/java.awt.Component.processMouseEvent(Component.java:6632)
        at java.desktop/javax.swing.JComponent.processMouseEvent(JComponent.java:3342)
        at java.desktop/java.awt.Component.processEvent(Component.java:6397)
        at java.desktop/java.awt.Container.processEvent(Container.java:2263)
        at java.desktop/java.awt.Component.dispatchEventImpl(Component.java:5008)
        at java.desktop/java.awt.Container.dispatchEventImpl(Container.java:2321)
        at java.desktop/java.awt.Component.dispatchEvent(Component.java:4840)
        at java.desktop/java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4918)
        at java.desktop/java.awt.LightweightDispatcher.processMouseEvent(Container.java:4547)
        at java.desktop/java.awt.LightweightDispatcher.dispatchEvent(Container.java:4488)
        at java.desktop/java.awt.Container.dispatchEventImpl(Container.java:2307)
        at java.desktop/java.awt.Window.dispatchEventImpl(Window.java:2772)
        at java.desktop/java.awt.Component.dispatchEvent(Component.java:4840)
        at java.desktop/java.awt.EventQueue.dispatchEventImpl(EventQueue.java:772)
        at java.desktop/java.awt.EventQueue$4.run(EventQueue.java:721)
        at java.desktop/java.awt.EventQueue$4.run(EventQueue.java:715)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at java.base/java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:85)
        at java.base/java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:95)
        at java.desktop/java.awt.EventQueue$5.run(EventQueue.java:745)
        at java.desktop/java.awt.EventQueue$5.run(EventQueue.java:743)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at java.base/java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:85)
        at java.desktop/java.awt.EventQueue.dispatchEvent(EventQueue.java:742)
        at java.desktop/java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:203)
        at java.desktop/java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:124)
        at java.desktop/java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:113)
        at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:109)
        at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
        at java.desktop/java.awt.EventDispatchThread.run(EventDispatchThread.java:90)
Caused by: java.lang.ClassNotFoundException: com.sun.java.swing.plaf.windows.WindowsLookAndFeel
        at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
        at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
        ... 46 more

This is on Ubuntu 18.04.2 LTS.

Clustream WithKmeans: null center and (radius, weight = 0) for BOTH micro and macro clusters

First of all: I am using moa-release-2019.05.0-bin/moa-release-2019.05.0/lib/moa.jar (obtained from https://moa.cms.waikato.ac.nz/downloads/).

Now, let's get to the point: I am trying to use the moa.clusterers.clustream.WithKmeans stream clustering algorithm, and I have no idea why this is happening ...

  • My code:
import com.yahoo.labs.samoa.instances.DenseInstance;
import moa.cluster.Clustering;
import moa.clusterers.clustream.WithKmeans;

public class TestingClustream {
    static DenseInstance randomInstance(int size) {
        DenseInstance instance = new DenseInstance(size);
        for (int idx = 0; idx < size; idx++) {
            instance.setValue(idx, Math.random());
        }
        return instance;
    }

    public static void main(String[] args) {
        WithKmeans wkm = new WithKmeans();
        wkm.kOption.setValue(5);
        wkm.maxNumKernelsOption.setValue(300);
        wkm.resetLearningImpl();
        for (int i = 0; i < 10000; i++) {
            wkm.trainOnInstanceImpl(randomInstance(2));
        }
        Clustering clusteringResult = wkm.getClusteringResult();
        Clustering microClusteringResult = wkm.getMicroClusteringResult();
    }
}
  • Info from the debugger:

[debugger screenshots: for both the micro and macro clusterings, center is null and radius and weight are 0]

I have read the source code many times, and it seems to me that I am using the correct functions in the correct order ... I do not know what I am missing ... any feedback is welcome!


EDIT:
Thanks to Anony-Mousse on Stack Overflow, I noticed the fields are unused, likely coming from some parent class with a different purpose. Using the getter methods getCenter(), getWeight(), and getRadius(), I could get the values.

Now, are the values I got "reliable"?

Moreover, what is the purpose of the weight field? It seemed to me that it represented the number of 'elements' in each cluster, but sometimes I get a non-integer value ... And even if the weights are integers, the micro-cluster weights do not sum to the total number of samples, and the macro-cluster weights do not sum to the number of micro clusters ... thanks in advance!

ClusTree serialization bug fix

In order to serialize a ClusTree model after training for future use, the fields inside it also need to be Serializable. But the Node and Entry classes were not serializable, and hence an error is thrown when trying to serialize a ClusTree. Fixed and sent a PR: #111
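As a minimal illustration of the underlying rule (hypothetical class names, not the actual ClusTree sources): every nested node/entry type reachable from a model must itself implement Serializable, otherwise ObjectOutputStream throws a NotSerializableException for the enclosing object.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializableTree implements Serializable {
    // Without "implements Serializable" on this nested type,
    // writeObject on the enclosing model would fail.
    static class Node implements Serializable {
        int id = 1;
    }

    Node root = new Node();

    // Returns the serialized size in bytes, or -1 if serialization fails
    // (NotSerializableException is a subclass of IOException).
    static int serializedSize(Object model) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            ObjectOutputStream out = new ObjectOutputStream(bytes);
            out.writeObject(model);
            out.close();
            return bytes.size();
        } catch (IOException e) {
            return -1;
        }
    }

    public static void main(String[] args) {
        System.out.println(serializedSize(new SerializableTree()) > 0); // true
    }
}
```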

samoa

Is it possible to apply a perturbation technique to data stream mining using the SAMOA framework? Please reply.

How to Use Evaluation Metrics in Meka

I am a beginner with MOA. I am studying multi-label data streams, but I find that MOA has very few evaluation metrics for multi-label problems. I would like to try to use the evaluation metrics in MEKA. Is there any convenient way to do this? I would be very grateful if you could provide me with some suggestions.

Crash on startup on Windows due to StringIndexOutOfBoundsException

The following error happens on startup on Windows 10.

java.lang.StringIndexOutOfBoundsException: String index out of range: 2
        at java.base/java.lang.StringLatin1.charAt(Unknown Source)
        at java.base/java.lang.String.charAt(Unknown Source)
        at moa.DoTask.isJavaVersionOK(DoTask.java:61)
        at moa.gui.GUI.main(GUI.java:78)

Integer overflow in memory counter

I end up with a mean memory usage of -335.88 when running the following configuration:

EvaluatePrequential -l (meta.LimAttClassifier -n 2) -s (ArffFileStream -f /home/henry/Datasets/higgs/HIGGS.arff -c 1)

Where HIGGS.arff is the popular HIGGS dataset. The problem does not appear to be solely with the GUI, as the results buffer also contains negative values in the model serialized size (bytes) column.
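A negative mean memory usage on a multi-gigabyte dataset is consistent with a 32-bit counter wrapping past Integer.MAX_VALUE; the sketch below (hypothetical helper names, assuming this is indeed the cause) shows the wrap and the usual remedy of accumulating in a long.

```java
public class MemoryCounterOverflow {
    // Accumulating byte counts in an int wraps past 2^31 - 1 into negative values.
    static int addInt(int total, int bytes) {
        return total + bytes;
    }

    // A long accumulator keeps the count correct for realistic model sizes.
    static long addLong(long total, int bytes) {
        return total + bytes;
    }

    public static void main(String[] args) {
        int wrapped = addInt(Integer.MAX_VALUE, 1024);
        System.out.println(wrapped < 0);                      // true: counter went negative
        System.out.println(addLong(Integer.MAX_VALUE, 1024)); // 2147484671: correct total
    }
}
```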

System.getProperty("java.version") format in Java 9.

In Java 9, System.getProperty("java.version") returns simply the string "9". Therefore the call version.charAt(2) throws an exception.

$ java -version
java version "9"
Java(TM) SE Runtime Environment (build 9+181)
Java HotSpot(TM) 64-Bit Server VM (build 9+181, mixed mode)

$ java -cp moa.jar -javaagent:(pwd)/sizeofag-1.0.0.jar moa.gui.GUI
java.lang.StringIndexOutOfBoundsException: String index out of range: 2
at java.base/java.lang.StringLatin1.charAt(StringLatin1.java:44)
at java.base/java.lang.String.charAt(String.java:704)
at moa.DoTask.isJavaVersionOK(DoTask.java:60)
at moa.gui.GUI.main(GUI.java:78)

On macOS 10.13 with brew cask install java.
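A sketch of a more robust check (a hypothetical helper, not the actual DoTask code): extract the major version number instead of indexing into the raw string, so both the pre-9 format ("1.8.0_181") and the post-9 formats ("9", "9-ea", "11.0.2") are handled.

```java
public class JavaVersionCheck {
    // Returns the major Java version for both old- and new-style
    // "java.version" strings.
    static int majorVersion(String version) {
        if (version.startsWith("1.")) {
            version = version.substring(2); // "1.8.0_181" -> "8.0_181"
        }
        int dot = version.indexOf('.');
        if (dot != -1) {
            version = version.substring(0, dot); // "11.0.2" -> "11"
        }
        int dash = version.indexOf('-');
        if (dash != -1) {
            version = version.substring(0, dash); // "9-ea" -> "9"
        }
        return Integer.parseInt(version);
    }

    public static void main(String[] args) {
        System.out.println(majorVersion("1.8.0_181")); // 8
        System.out.println(majorVersion("9"));         // 9
        System.out.println(majorVersion("11.0.2"));    // 11
    }
}
```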

Plots in MOA

Hi,
I use the Adaptive Hoeffding Tree from MOA in my research and I have run into some problems. I would be grateful if you could help me.
I have some questions:

  1. How can I plot the number of active leaves per instance?
  2. Can I show false positives or ROC metrics to evaluate a model?
    And is it possible to compare the accuracy of a streaming method with another machine learning method like KNN (batch learning)?

Thanks

Distributed Evaluation

Distributed Evaluation presented in this paper:

Albert Bifet, Gianmarco De Francisci Morales, Jesse Read, Geoff Holmes, Bernhard Pfahringer:
Efficient Online Evaluation of Big Data Stream Classifiers. KDD 2015: 59-68

Tutorial 2 example code compilation error

Your example code fails to compile. I first compiled the contents of moa-2015.11-sources.jar, then attempted the following:

$ javac -cp "moa-release-2015.11/sources/bin:moa-release-2015.11/lib/*" Experiment.java
Experiment.java:27: error: incompatible types: InstanceExample cannot be converted to Instance
Instance trainInst = stream.nextInstance();
^
Experiment.java:29: error: incompatible types: weka.core.Instance cannot be converted to com.yahoo.labs.samoa.instances.Instance
if (learner.correctlyClassifies(trainInst)){
^
Experiment.java:34: error: no suitable method found for trainOnInstance(weka.core.Instance)
learner.trainOnInstance(trainInst);
^
method Learner.trainOnInstance(Example<com.yahoo.labs.samoa.instances.Instance>) is not applicable
(argument mismatch; weka.core.Instance cannot be converted to Example<com.yahoo.labs.samoa.instances.Instance>)
method Classifier.trainOnInstance(com.yahoo.labs.samoa.instances.Instance) is not applicable
(argument mismatch; weka.core.Instance cannot be converted to com.yahoo.labs.samoa.instances.Instance)
Note: Some messages have been simplified; recompile with -Xdiags:verbose to get full output
3 errors

ConcurrentModificationException in MOA

Hi,

I use MCOD for anomaly detection in a stream with Flink, but I get an error in the code when I pass the instance to MCOD:

Caused by: java.util.ConcurrentModificationException: null
at java.util.ArrayList$Itr.checkForComodification(ArrayList.java:909)
at java.util.ArrayList$Itr.next(ArrayList.java:859)
at moa.clusterers.outliers.MCOD.MCOD.ProcessExpiredNode(MCOD.java:360)
at moa.clusterers.outliers.MCOD.MCOD.ProcessNewStreamObj(MCOD.java:398)
at moa.clusterers.outliers.MyBaseOutlierDetector.processNewInstanceImpl(MyBaseOutlierDetector.java:157)
at etl.samoa.SamoaAnomalyDetection.mapPartition(SamoaAnomalyDetection.java:84)
at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:704)
at java.lang.Thread.run(Thread.java:748)

The line of code that causes this error is:
outlierDetector.processNewInstanceImpl(samoaInstance);

How can we avoid or fix this error?

Thanks in advance!

Cordially,
Anissa
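For context, a ConcurrentModificationException is thrown when a collection is structurally modified while it is being iterated, either from the same thread (as the MCOD trace suggests) or because a non-thread-safe clusterer is called concurrently from several Flink task threads. A self-contained sketch of the general pattern, and the iterator-based removal that avoids it:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class ComodificationDemo {
    // Removing elements through the Iterator itself is safe; calling
    // list.remove(...) inside a for-each loop over the same list would
    // throw ConcurrentModificationException on the next iteration.
    static List<Integer> removeEvens(List<Integer> input) {
        List<Integer> list = new ArrayList<>(input);
        for (Iterator<Integer> it = list.iterator(); it.hasNext(); ) {
            if (it.next() % 2 == 0) {
                it.remove(); // structural change made via the iterator: allowed
            }
        }
        return list;
    }

    public static void main(String[] args) {
        System.out.println(removeEvens(List.of(1, 2, 3, 4))); // [1, 3]
    }
}
```

In the Flink setting, a pragmatic workaround is to confine each MCOD instance to a single task thread (or synchronize all calls into it), since MOA clusterers make no thread-safety guarantees.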

HoeffdingTree selects the wrong branch due to a wrong split attribute index

The commented-out code at lines 44 and 45 of moa.classifiers.core.conditionaltests.NominalAttributeMultiwayTest.branchForInstance(Instance) should be put back in. As it is, the wrong split attribute is used when selecting the branch.
In my case, the class index is 32 and the split attribute index should be 33. But branchForInstance actually returns 32, which is the class index, and thus I get perfect accuracy.
Attached training and test data files. 'label' is the class.
sampledata_class_label.zip

The prequential accuracy drops to 50% steadily after ADWIN detects a drift.

Hi there,

I have found an issue with the ADWIN change detector. The ADWIN change detector in the latest MOA release (MOA 19.05.0) always steadily drops the prequential accuracy to 50%. This issue occurs regardless of the base learner.

You should be able to recreate this issue by simply creating a concept drift stream and using the DriftDetectionMethodClassifier with ADWIN as the change detector. You can pick any base learner you like, say, the most basic one: Naive Bayes. For the evaluator, I used FadingFactor with the default alpha of 0.999. With these settings, you will see the performance chart drop to 50% quickly after the drift detection. The performance then stays at 50% extremely steadily.

The ADWIN change detector was fine in version 18.06, as I have been using that version of MOA in my research.

StreamKM "Width" Parameter

Good afternoon,

I am trying to understand the source code for StreamKM++ in MOA. Simply put, as I understand it, the width parameter plays two conflicting roles in StreamKM++. [All of the following is done using MOA's Clustering tab, with the default RBF stream generator, StreamKM++ as Algorithm 1, and Algorithm 2 cleared]

  1. The width parameter is defined in the help file as the "Size of Window for training learner." This is borne out in the source code, for example in StreamKM.java's trainOnInstanceImpl method:
public void trainOnInstanceImpl(Instance inst) {
    ...
    manager.insertPoint(new Point(inst, this.numberInstances));
    this.numberInstances++;
    if (this.numberInstances % widthOption.getValue() == 0) {
        //compute 5 clusterings of the coreset with kMeans++ and take the best
        ...
    }
}

Clusterings are only computed every width instances, and until width instances have been processed, the clusterings returned by the getClusteringResult() method have only NULL entries in the centresStreamingCoreset array. This means that no measurements can be calculated for StreamKM++'s performance. When I set width accordingly and run StreamKM++, however, I run into:

 java.lang.ArrayIndexOutOfBoundsException: 6
    at moa.clusterers.streamkm.BucketManager.insertPoint(BucketManager.java:92)
    at moa.clusterers.streamkm.StreamKM.trainOnInstanceImpl(StreamKM.java:71)
    ...
  2. This seems to occur because width is also used to calculate the number of buckets, as seen in BucketManager.java's constructor:
public BucketManager(int n,int d,int maxsize, MTRandom random){
        this.clustererRandom = random;
        this.numberOfBuckets = (int) Math.ceil(Math.log((double)n/(double)maxsize) / Math.log(2) )+2;
        this.maxBucketsize = maxsize;
        this.buckets = new Bucket[this.numberOfBuckets];
        for(int i=0; i<this.numberOfBuckets; i++){
            this.buckets[i] = new Bucket(d,maxsize);
        }
        this.treeCoreset = new TreeCoreset();
        //System.out.printf("Created manager with %d buckets of dimension %d \n",this.numberOfBuckets,d);
    }

This constructor is called by StreamKM.java upon initialization in the trainOnInstanceImpl() method: the argument n is, in fact, width (via StreamKM.java's length variable). The problem this causes is that the BucketManager fails once all of its buckets are full: the ArrayIndexOutOfBoundsException occurs because the BucketManager tries to move points from the last bucket to a non-existent bucket beyond it.
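Plugging representative numbers into that formula makes the failure concrete. The helper below is a standalone re-derivation of the bucket count (not the MOA source); with illustrative values n = width = 1000 and maxsize = 100 it yields 6 buckets, so the first merge cascade that needs bucket index 6 overruns the array, matching the shape of the ArrayIndexOutOfBoundsException above.

```java
public class BucketCountDemo {
    // Re-derivation of the constructor's formula:
    // numberOfBuckets = ceil(log2(n / maxsize)) + 2
    static int numberOfBuckets(int n, int maxsize) {
        return (int) Math.ceil(Math.log((double) n / (double) maxsize) / Math.log(2)) + 2;
    }

    public static void main(String[] args) {
        // Illustrative values: width n = 1000, coreset size maxsize = 100.
        System.out.println(numberOfBuckets(1000, 100)); // 6
        // The capacity is fixed at construction, so streams much longer
        // than n eventually need a bucket index that does not exist.
    }
}
```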

Although it would be most intuitive to me to "fix" this behaviour, it is consistent with the algorithm's description in Marcel R. Ackermann, Christiane Lammersen, Marcus Märtens, Christoph Raupach, Christian Sohler, Kamil Swierkot: StreamKM++: A Clustering Algorithm for Data Streams. ALENEX 2010: 173-187 (the paper cited in StreamKM.java's opening comments). The argument n is described therein as the size of the data stream. The paper also describes that coresets should be obtainable at any point during the data stream, which is not currently the case.

My question is: am I missing something in the code, or an assumption made by the developers? Or does it make sense to modify StreamKM.java's getClusteringResult() method in order to provide proper clusterings, as appears to have been envisioned in the original paper?

Richard

How to implement Cramer's test in Java? - Cramer - 'ArrayStoreException'

Submitting the parameters to the method:

Cramer c = new Cramer();
c.cramerTest1(CR1, CR2);
pi.CremerTValue = ?

fails with an ArrayStoreException at the second line, after the parameters are passed.

I tried:

List<List> CR1 = new ArrayList<List>();
List<List> CR2 = new ArrayList<List>();

CR1.add(First);
CR2.add(Second);

--or--

List<List> CR1 = Arrays.asList(First);
List<List> CR2 = Arrays.asList(Second);

Both declarations failed.

First and Second are both ArrayLists with 177 items each.
CR1 and CR2 each have one item that contains 177 values.

How can I implement the two-sample Cramér test?
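Without seeing the Cramer class's signature it is hard to say definitively, but an ArrayStoreException in Java generally means an element of the wrong runtime type was stored into an array, often through a covariant array reference or an Arrays.asList/toArray round-trip. A minimal self-contained demonstration of the mechanism (not the Cramer API itself):

```java
public class ArrayStoreDemo {
    // Java arrays are covariant: an Object[] variable may refer to a
    // String[]. Storing a non-String through it compiles fine but fails
    // with ArrayStoreException at runtime.
    static boolean triggersArrayStore() {
        Object[] arr = new String[2];
        try {
            arr[0] = Integer.valueOf(1); // runtime element-type check fails
            return false;
        } catch (ArrayStoreException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(triggersArrayStore()); // true
    }
}
```

If cramerTest1 expects, say, a double[][] internally, passing lists whose elements have a different runtime type than it assumes could surface the same exception; checking the exact parameter types the method declares would be the first step.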

Cannot handle numeric class

Hi:
I got an error when using MOA to learn the Flags.arff dataset incrementally:
weka.core.UnsupportedAttributeTypeException: weka.classifiers.trees.HoeffdingTree: Cannot handle numeric class!
The command line is:EvaluatePrequentialMultiTarget -l (multilabel.MEKAClassifier -l (meka.classifiers.multilabel.incremental.BRUpdateable -W weka.classifiers.trees.HoeffdingTree -- -L 2 -S 1 -E 1.0E-7 -H 0.05 -M 0.01 -G 200.0 -N 0.0)) -s (MultiTargetArffFileStream -f (E:\Program Files\multi-Label\dataset\Real\Flags.arff) -c -7) -e BasicMultiLabelPerformanceEvaluator
Detailed error information:
{M}assive {O}nline {A}nalysis
Version: 18.06 June 2018
Copyright: (C) 2007-2018 University of Waikato, Hamilton, New Zealand
Web: http://moa.cms.waikato.ac.nz/

[WARNING] Only 0 labels found! (Expecting 7)
(Ignoring this prediction)
Mar 21, 2019 2:43:45 PM com.github.fommil.netlib.ARPACK
WARNING: Failed to load implementation from: com.github.fommil.netlib.NativeSystemARPACK
Mar 21, 2019 2:43:45 PM com.github.fommil.jni.JniLoader liberalLoad
INFO: successfully loaded C:\Users\芝士\AppData\Local\Temp\jniloader7654186575430500792netlib-native_ref-win-x86_64.dll
[ERROR] Failed to build classifier, L=7
weka.core.UnsupportedAttributeTypeException: weka.classifiers.trees.HoeffdingTree: Cannot handle numeric class!
at weka.core.Capabilities.test(Capabilities.java:1067)
at weka.core.Capabilities.test(Capabilities.java:1256)
at weka.core.Capabilities.test(Capabilities.java:1138)
at weka.core.Capabilities.testWithFail(Capabilities.java:1468)
at weka.classifiers.trees.HoeffdingTree.buildClassifier(HoeffdingTree.java:751)
at meka.classifiers.multilabel.BR.buildClassifier(BR.java:75)
at moa.classifiers.multilabel.MEKAClassifier.trainOnInstanceImpl(MEKAClassifier.java:107)
at moa.classifiers.AbstractMultiLabelLearner.trainOnInstanceImpl(AbstractMultiLabelLearner.java:20)
at moa.classifiers.AbstractClassifier.trainOnInstance(AbstractClassifier.java:178)
at moa.classifiers.AbstractClassifier.trainOnInstance(AbstractClassifier.java:249)
at moa.tasks.EvaluatePrequentialMultiTarget.doMainTask(EvaluatePrequentialMultiTarget.java:211)
at moa.tasks.MainTask.doTaskImpl(MainTask.java:50)
at moa.tasks.AbstractTask.doTask(AbstractTask.java:57)
at moa.tasks.TaskThread.run(TaskThread.java:76)
[WARNING] Failed to get votes from multi-label classifier (not trained yet?).
Can you help me?

Generating Multi-Label Synthetic Data Stream gives a NullPointerException

Hey There,

The issue that I will talk about next is discussed here previously: https://groups.google.com/forum/#!topic/moa-development/ho-_Z22k1-E

The task WriteStreamToARFFFile does not work properly. Although some initial statistics on the distribution of the label sets are output to the terminal, the process terminates with a NullPointerException.

The error has been replicated by another user in the MOA Development Google Group as well.

The error is similar to this:

Failure reason: Failed writing to file /home/****/Synth.arff
*** STACK TRACE ***
java.lang.RuntimeException: Failed writing to file /home/****/Synth.arff
        at moa.tasks.WriteStreamToARFFFile.doMainTask(WriteStreamToARFFFile.java:86)
        at moa.tasks.MainTask.doTaskImpl(MainTask.java:50)
        at moa.tasks.AbstractTask.doTask(AbstractTask.java:57)
        at moa.tasks.TaskThread.run(TaskThread.java:76)
Caused by: java.lang.NullPointerException
        at com.yahoo.labs.samoa.instances.SparseInstanceData.locateIndex(SparseInstanceData.java:237)
        at com.yahoo.labs.samoa.instances.SparseInstanceData.setValue(SparseInstanceData.java:220)
        at com.yahoo.labs.samoa.instances.InstanceImpl.setValue(InstanceImpl.java:269)
        at moa.streams.generators.multilabel.MetaMultilabelGenerator.generateMLInstance(MetaMultilabelGenerator.java:274)
        at moa.streams.generators.multilabel.MetaMultilabelGenerator.nextInstance(MetaMultilabelGenerator.java:228)
        at moa.streams.generators.multilabel.MetaMultilabelGenerator.nextInstance(MetaMultilabelGenerator.java:46)
        at moa.tasks.WriteStreamToARFFFile.doMainTask(WriteStreamToARFFFile.java:80)
        ... 3 more

The setting which results in the error is as follows:

  1. Pick 'WriteStreamToARFFFile' task. As its options:
  • stream: generators.multilabel.MetaMultilabelGenerator (with default values. I also tried to change some of the options there, such as NumLabels and LabelCardinality)
  • arffFile: An empty file that I specified with proper read write permissions.
  • maxInstances: 100,000, or any other value
  • taskResultFile: This is left blank, as it is for the results on the generated data (for most common labelset etc.)

Recommender system example throws exception

I get the following error when trying to run the recommender system example using code in master.

*** STACK TRACE ***java.lang.ClassCastException: class java.lang.String cannot be cast to class moa.evaluation.preview.Preview (java.lang.String is in module java.base of loader 'bootstrap'; moa.evaluation.preview.Preview is in unnamed module of loader 'app')
	at moa.gui.PreviewPanel.setLatestPreview(PreviewPanel.java:171)
	at moa.gui.PreviewPanel.setTaskThreadToPreview(PreviewPanel.java:144)
	at moa.gui.PreviewPanel.latestPreviewChanged(PreviewPanel.java:220)
	at moa.tasks.StandardTaskMonitor.setLatestResultPreview(StandardTaskMonitor.java:141)
	at moa.tasks.EvaluateOnlineRecommender.doMainTask(EvaluateOnlineRecommender.java:128)
	at moa.tasks.MainTask.doTaskImpl(MainTask.java:50)
	at moa.tasks.AbstractTask.doTask(AbstractTask.java:57)
	at moa.tasks.TaskThread.run(TaskThread.java:76)

BatchCmd does not recognize EvaluateClustering's encoding of "no instance limit" as -1

The instanceLimitOption in the EvaluateClustering task (line 43) describes its function as

Maximum number of instances to test/train on (-1 = no limit).

If "-1" is passed as an argument, however (for example, if the streamOption is set to a FileStream and the user wants the whole ARFF file processed), then no instances are passed to the learner, and the results produced in the dumpFile are NULL outside of the header.

This is because the BatchCmd run method (line 147) uses the following while loop to determine whether another instance should be passed:

while(m_timestamp < totalInstances && stream.hasMoreInstances())

The totalInstances variable is the local representation of the user's selected value for EvaluateClustering's instanceLimitOption, but it is created with no understanding of the meaning of "-1." Instead, totalInstances is set to -1 in the BatchCmd constructor (line 68), and the condition in the above while loop immediately evaluates to false.
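A minimal sketch of one possible fix (hypothetical names, not the actual BatchCmd code): translate the -1 sentinel into an effectively unbounded limit before it reaches the loop condition.

```java
public class InstanceLimitDemo {
    // -1 means "no limit": map it to the largest representable count so the
    // loop condition is governed by stream.hasMoreInstances() alone.
    static long effectiveLimit(int userLimit) {
        return userLimit < 0 ? Long.MAX_VALUE : userLimit;
    }

    public static void main(String[] args) {
        long timestamp = 0;
        System.out.println(timestamp < -1);                 // false: loop never runs
        System.out.println(timestamp < effectiveLimit(-1)); // true: stream is consumed
    }
}
```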

LearnNSE sbkt close to zero

Dear @ALL,

In the implementation of LearnNSE available in MOA 2017.06, there is a chance for the sbkt variable to get very close to zero, leading to the computation of the log of infinity when calculating the ensemble weights.

this.ensembleWeights.add(Math.log(1.0 / sbkt));

This led to problems on the Gaussian benchmark suggested by the original authors of Learn++.NSE: http://users.rowan.edu/~polikar/research/NSE/

As a workaround, one of the original authors of NSE checks whether sbkt is smaller than 0.01; if so, the value is set to 0.01.
It can be seen in: https://github.com/gditzler/IncrementalLearning/blob/master/src/learn_nse.m

Check the condition:

if net.beta(net.t,net.t)<net.threshold,
net.beta(net.t,net.t) = net.threshold;
end

It seems to solve the problem when implemented in the MOA version of the LearnNSE.
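Translated to the Java side, the workaround is just a clamp before the logarithm. This is only my sketch of it, with the 0.01 threshold taken from the MATLAB reference above:

```java
public class NseWeight {
    static final double THRESHOLD = 0.01; // value used in the MATLAB reference

    // Clamp sbkt away from zero so Math.log(1.0 / sbkt) stays finite.
    static double ensembleWeight(double sbkt) {
        double clamped = Math.max(sbkt, THRESHOLD);
        return Math.log(1.0 / clamped);
    }

    public static void main(String[] args) {
        System.out.println(ensembleWeight(1e-300)); // finite: log(100)
        System.out.println(ensembleWeight(0.5));    // log(2)
    }
}
```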

Best regards.

Issue with WriteStreamToARFFFile

Dear all,

WriteStreamToARFFFile outputs categorical variables incorrectly.
It simply writes each instance's toString() output, so the integer index that represents a nominal value is never converted to the actual string value.
I could send this as a pull request, but I am unsure whether the best fix belongs in the toString() method or in WriteStreamToARFFFile itself.
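For illustration, the missing conversion is roughly the following. This is my own sketch with a hypothetical helper, not MOA's actual code: a nominal attribute's stored double must be mapped through its value list before being written.

```java
public class ArffValueWriter {
    // Render one attribute value for an ARFF data row.
    // nominalValues is null for numeric attributes; for nominal
    // attributes the stored double is an index into that list.
    static String render(double value, String[] nominalValues) {
        if (nominalValues == null) {
            return Double.toString(value);
        }
        return nominalValues[(int) value];
    }

    public static void main(String[] args) {
        String[] colors = {"red", "green", "blue"};
        System.out.println(render(2.0, colors)); // "blue", not "2.0"
        System.out.println(render(3.5, null));   // "3.5"
    }
}
```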

Cheers,

Jean

WriteStreamToARFFFile Issue

As reported by a user on the MOA users group, the WriteStreamToARFFFile class is unable to write streams of the ConceptDriftStream class. This occurs since ConceptDriftStream is now an ExampleStream.

Instance documentation

Is there any documentation on how to convert vectors into Instances? I see that all clusterers expect "Instance" objects rather than arrays of numbers, so I was wondering how to generate the header, and also what classIndex and classValue are; in my case the ground-truth clusters are unknown.

Given that I have some vectors (not from one of the supported stream generators) without ground-truth clusters, could you provide an example of how to use any of the available clusterers?

Thanks

getCoresetFromManager of BucketManager

The function getCoresetFromManager() of the BucketManager class is responsible for retrieving the coreset summarized in the buckets.

Why does the function return only the last bucket, and only when it is full? What about newer objects? The last bucket holds the oldest objects of the stream, and new objects can take a long time to reach it.

Note that once the last bucket is full, the next (2^(L-1))*m objects make no difference to the clustering, since only the last bucket is returned.

Selection of MeasureCollections in EvaluateClustering not enabled

The measureCollectionType option in the EvaluateClustering task (line 46) purports to allow the user to select which measures they want to use in evaluating their learner's performance. This, however, is not properly implemented.

The BatchCmd getMeasureSelection method (line 88) takes an integer argument specifying the measure collections to select, but then adds the same measure collections regardless of the argument. These are EntropyCollection, F1, General, SSQ, SilhouetteCoefficient and StatisticalCollection.

Additionally, when the MeasureCollection returned by BatchCmd getMeasureSelection is passed to the BatchCmd getMeasures method (line 202), there is no check of which measures are enabled; instead each measure collection's default enabled values are used. This makes it impossible to use measure collections whose measures are disabled by default.
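A sketch of what honoring the argument could look like. The bitmask encoding here is my own assumption for illustration, not what BatchCmd actually expects:

```java
import java.util.ArrayList;
import java.util.List;

public class MeasureSelection {
    static final String[] ALL = {
        "EntropyCollection", "F1", "General",
        "SSQ", "SilhouetteCoefficient", "StatisticalCollection"
    };

    // Interpret the integer as a bitmask: bit i enables ALL[i].
    static List<String> select(int mask) {
        List<String> chosen = new ArrayList<>();
        for (int i = 0; i < ALL.length; i++) {
            if ((mask & (1 << i)) != 0) {
                chosen.add(ALL[i]);
            }
        }
        return chosen;
    }

    public static void main(String[] args) {
        System.out.println(select(0b000011)); // [EntropyCollection, F1]
        System.out.println(select(0b111111)); // all six collections
    }
}
```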

java.lang.NullPointerException when trying stream clustering algorithm denstream.WithDBSCAN

I do not know what I am missing... I chose the default parameter configuration, and I do not know why I am getting this error. Any help would be appreciated.

  • My code:
import com.yahoo.labs.samoa.instances.DenseInstance;
import moa.cluster.Clustering;
import moa.clusterers.denstream.WithDBSCAN;

public class TestingDenstream {
    static DenseInstance randomInstance(int size) {
        DenseInstance instance = new DenseInstance(size);
        for (int idx = 0; idx < size; idx++) {
            instance.setValue(idx, Math.random());
        }
        return instance;
    }
    public static void main(String[] args) {
        WithDBSCAN withDBSCAN = new WithDBSCAN();
        withDBSCAN.resetLearningImpl();
        for (int i = 0; i < 1500; i++) {
            DenseInstance d = randomInstance(2);
            withDBSCAN.trainOnInstanceImpl(d);
        }
        Clustering clusteringResult = withDBSCAN.getClusteringResult();
        Clustering microClusteringResult = withDBSCAN.getMicroClusteringResult();

        System.out.println(clusteringResult);

    }
}
  • And here is the error I get:

(screenshot of a java.lang.NullPointerException stack trace)

Sparse ARFF not Supported

I was attempting to use a sparse ARFF file as per the published specification, but it seems sparse files are not read correctly by the moa.streams.clustering.FileStream reader. I discovered this while implementing a custom clustering outlier detector. I have provided the example ARFF files and the output I get from printing the instances below.

Sparse ARFF - This one doesn't work

@RELATION navigationsequences 

@ATTRIBUTE "GET /~scottp/publish.html" NUMERIC
@ATTRIBUTE "GET /~lowey/kevin.gif" NUMERIC
@ATTRIBUTE "GET /~ladd/ostriches.html" NUMERIC
@ATTRIBUTE "GET /~lowey/" NUMERIC

@DATA
{0 1}
{2 1}
{0 1}
{3 1, 1 1}

Sparse ARFF Output

instance: [ ] 
instance: [ ] 
instance: [ ] 
instance: [ 0.00 ]

Normal ARFF - This one works

@RELATION navigationsequences 

@ATTRIBUTE "GET /~ladd/ostriches.html" NUMERIC
@ATTRIBUTE "GET /~scottp/publish.html" NUMERIC
@ATTRIBUTE "GET /~lowey/kevin.gif" NUMERIC
@ATTRIBUTE "GET /~lowey/" NUMERIC

@DATA
0, 1, 0, 0
1, 0, 0, 0
0, 1, 0, 0
0, 0, 1, 1

Normal ARFF Output

instance: [ 0.00 1.00 0.00 ] 
instance: [ 1.00 0.00 0.00 ] 
instance: [ 0.00 1.00 0.00 ] 
instance: [ 0.00 0.00 1.00 ] 
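For comparison, here is a minimal sketch of how I would expect a sparse data row to expand according to the specification (index/value pairs inside braces, unmentioned attributes defaulting to 0). This is my own illustration, not MOA's parser:

```java
import java.util.Arrays;

public class SparseArffRow {
    // Expand a sparse ARFF data line like "{3 1, 1 1}" into a dense row.
    static double[] expand(String line, int numAttributes) {
        double[] row = new double[numAttributes]; // defaults to 0
        String body = line.trim();
        body = body.substring(1, body.length() - 1).trim(); // strip { }
        if (body.isEmpty()) {
            return row;
        }
        for (String pair : body.split(",")) {
            String[] parts = pair.trim().split("\\s+");
            row[Integer.parseInt(parts[0])] = Double.parseDouble(parts[1]);
        }
        return row;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(expand("{0 1}", 4)));      // [1.0, 0.0, 0.0, 0.0]
        System.out.println(Arrays.toString(expand("{3 1, 1 1}", 4))); // [0.0, 1.0, 0.0, 1.0]
    }
}
```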

Error when building with Maven

Hi,

I'm not sure how much of this is due to my lack of experience with Maven...

When attempting to build MOA with the command mvn assembly:assembly I encounter the following error:

[ERROR] Failed to execute goal on project weka-package: Could not resolve dependencies for project nz.ac.waikato.cms.moa:weka-package:jar:2017.10-SNAPSHOT: Failed to collect dependencies at nz.ac.waikato.cms.moa:moa:jar:[2017.10-SNAPSHOT,): No versions available for nz.ac.waikato.cms.moa:moa:jar:[2017.10-SNAPSHOT,) within specified range -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :weka-package

This is occurring just after the LaTeX documents are being compiled. Is this because a dependency is missing from pom.xml, or have I missed a step?

ColorArray Limitation for Denstream Cluster id assignment, [cluster_id 51 problem][POSSIBLE BUG]

I am using the DenStream algorithm for clustering.
The algorithm creates clusters and assigns cluster ids and colors for up to 52 clusters [0,1,2,...,51].
However, after cluster 52 it no longer works properly: it assigns id 51 to all remaining clusters, which should be 52, 53, 54, 55, 56, etc.
I traced the code and found the following:
DenStream uses the setClusterIDs(Clustering clustering) method, implemented in the moa/clusterers/macro/AbstractMacroClusterer.java class.
In this method, the following code block uses the ColorArray.java class to assign ids and colors to clusters.

                // check if there are 2 clusters with the same color (same id, could
		// appear after a split);
		double freeID = 0;
		List<Double> reservedIDs = new Vector<Double>();
		reservedIDs.addAll(countIDs.keySet());
		for (Map.Entry<Double, Integer> entry : countIDs.entrySet()) {
			if (entry.getValue() > 1 || entry.getKey() == -1) {
				// find first free id, search all the clusters which has the
				// same id and replace the ids with free ids. One cluster can
				// keep its id
				int to = entry.getValue();
				if (entry.getKey() != -1)
					to--;

				for (int i = 0; i < to; i++) {
					while (reservedIDs.contains(freeID)
							&& freeID < ColorArray.getNumColors())
						freeID += 1.0;
					for (int c = clustering.size() - 1; c >= 0; c--)
						if (clustering.get(c).getId() == entry.getKey()) {
							clustering.get(c).setId(freeID);
							reservedIDs.add(freeID);
							break;
						}
				}
			}
		}

In this code block, the method uses ColorArray.getNumColors() (the length of ColorArray.mVisibleColors) to bound the id assignment. The array has only 52 distinct colors, which caps the cluster-id assignment:
after 52 clusters, every new cluster receives the same cluster id (51). ColorArray holds the colors shown in code block 2.
I added more colors to this array and it solved the problem. Here is my issue on the Google group, and my pull request.
Can you review it, or let me know about further steps?

// CODE-BLOCK-2
public class ColorArray {
	public static ColorObject[] mVisibleColors = {
			new ColorObject("blue", new Color(0x0000ff)),
			new ColorObject("blueviolet", new Color(0x8a2be2)),
			new ColorObject("brown", new Color(0xa52a2a)),
			new ColorObject("burlywood", new Color(0xdeb887)),
			new ColorObject("cadetblue", new Color(0x5f9ea0)),
			//new ColorObject("chartreuse", new Color(0x7fff00)),
			new ColorObject("chocolate", new Color(0xd2691e)),
			new ColorObject("coral", new Color(0xff7f50)),
			new ColorObject("cornflowerblue", new Color(0x6495ed)),
			new ColorObject("crimson", new Color(0xdc143c)),
			new ColorObject("cyan", new Color(0x00ffff)),
			new ColorObject("darkblue", new Color(0x00008b)),
			new ColorObject("darkcyan", new Color(0x008b8b)),
			new ColorObject("darkgoldenrod", new Color(0xb8860b)),
			new ColorObject("darkgreen", new Color(0x006400)),
			new ColorObject("darkkhaki", new Color(0xbdb76b)),
			new ColorObject("darkmagenta", new Color(0x8b008b)),
			new ColorObject("darkolivegreen", new Color(0x556b2f)),
			new ColorObject("darkorange", new Color(0xff8c00)),
			// new ColorObject("darkorchid", new Color(0x9932cc)),
			new ColorObject("darkred", new Color(0x8b0000)),
			new ColorObject("darksalmon", new Color(0xe9967a)),
			new ColorObject("darkseagreen", new Color(0x8fbc8f)),
			new ColorObject("darkslateblue", new Color(0x483d8b)),
			new ColorObject("darkslategray", new Color(0x2f4f4f)),
			// new ColorObject("darkturquoise", new Color(0x00ced1)),
			new ColorObject("darkviolet", new Color(0x9400d3)),
			new ColorObject("deeppink", new Color(0xff1493)),
			new ColorObject("deepskyblue", new Color(0x00bfff)),
			// new ColorObject("dodgerblue", new Color(0x1e90ff)),
			new ColorObject("firebrick", new Color(0xb22222)),
			new ColorObject("forestgreen", new Color(0x228b22)),
			new ColorObject("fuchsia", new Color(0xff00ff)),
			new ColorObject("gold", new Color(0xffd700)),
			new ColorObject("goldenrod", new Color(0xdaa520)),
			//new ColorObject("green", new Color(0x008000)),
			new ColorObject("greenyellow", new Color(0xadff2f)),
			new ColorObject("hotpink", new Color(0xff69b4)),
			new ColorObject("indianred", new Color(0xcd5c5c)),
			new ColorObject("indigo", new Color(0x4b0082)),
			//new ColorObject("lawngreen", new Color(0x7cfc00)),
			// new ColorObject("lime", new Color(0x00ff00)),
			// new ColorObject("limegreen", new Color(0x32cd32)),
			new ColorObject("magenta", new Color(0xff00ff)),
			new ColorObject("maroon", new Color(0x800000)),
			new ColorObject("olive", new Color(0x808000)),
			new ColorObject("orange", new Color(0xffa500)),
			new ColorObject("orangered", new Color(0xff4500)),
			new ColorObject("pink", new Color(0xffc0cb)),
			new ColorObject("powderblue", new Color(0xb0e0e6)),
			new ColorObject("purple", new Color(0x800080)),
			new ColorObject("red", new Color(0xff0000)),
			new ColorObject("royalblue", new Color(0x4169e1)),
			new ColorObject("saddlebrown", new Color(0x8b4513)),
			new ColorObject("salmon", new Color(0xfa8072)),
			new ColorObject("seagreen", new Color(0x2e8b57)),
			new ColorObject("skyblue", new Color(0x87ceeb)),
			new ColorObject("slateblue", new Color(0x6a5acd)),
			new ColorObject("tomato", new Color(0xff6347)),
			new ColorObject("violet", new Color(0xee82ee)) };

	public static Color getColor(int i) {
		Color res;
		try {
			res = mVisibleColors[i].getColor();
		} catch (ArrayIndexOutOfBoundsException e) {
			return Color.BLACK;
		}
		return res;
	}

	public static String getName(int i) {
		String res;
		try {
			res = mVisibleColors[i].getName();
		} catch (ArrayIndexOutOfBoundsException e) {
			throw e;
		}
		return res;
	}

	public static double getNumColors() {
		return mVisibleColors.length;
	}
}
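Adding colors works; another option I considered would be to decouple ids from the palette entirely: drop the freeID < ColorArray.getNumColors() bound so ids keep growing, and wrap only the color lookup. A self-contained sketch of that idea (my own, not the MOA code):

```java
import java.util.HashSet;
import java.util.Set;

public class ClusterIds {
    // Find the smallest non-reserved id, with no upper bound:
    // ids keep growing past the size of the color palette.
    static double nextFreeId(Set<Double> reservedIDs) {
        double freeID = 0;
        while (reservedIDs.contains(freeID)) {
            freeID += 1.0;
        }
        return freeID;
    }

    // Colors may repeat; only the lookup wraps around the palette.
    static int colorIndex(double clusterId, int numColors) {
        return (int) (clusterId % numColors);
    }

    public static void main(String[] args) {
        Set<Double> reserved = new HashSet<>();
        for (double id = 0; id < 53; id++) {
            reserved.add(id);
        }
        double next = nextFreeId(reserved);
        System.out.println(next);                 // 53.0, not capped at 51
        System.out.println(colorIndex(next, 52)); // palette wraps to index 1
    }
}
```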

rxjava observable to stream

Hi,

I'm a bit stuck.... I have a range of conceptual, implementation and understanding questions.

  1. [EASY] Where can I find the latest Javadoc?
  2. [A bit of history] Why do I see the mix between the Yahoo/Apache SAMOA and MOA classes, and Weka (e.g., InstanceHeader)?
  3. [Impl] A little context: I'm receiving data through a socket.io connection and distributing it directly through a Java rxJava Observable implementation == my stream. As long as I define the InstanceHeader for bootstrapping the algorithms, e.g., myOutlierDetector.setModelContext(stream.getHeader()), and use (Weka) Instances as the algorithm input, I should be fine, right?

Thanks

Reading ARFF file when using MOA’s API with Scala

I would be very grateful if you could help me figure out how data in .arff format can be fed to the adaptive Hoeffding tree, so that model accuracy can be obtained with the prequential method. In the code provided at the "https://moa.cms.waikato.ac.nz/using-moas-api-with-scala/" link, RandomRBFGenerator is used as the input data, but for .arff-formatted data it is not possible to use the functions in that code, such as stream.prepareForUse().

Could you guide me on which functions should be used to feed .arff data as input to the Hoeffding tree model in this code?

Multi-label organisation

I've noticed a few oddities in the multi-label prediction API:

  • There are multiple MultiLabelInstance classes: moa.core.MultiLabelInstance and com.yahoo.labs.samoa.instances.MultiLabelInstance. Should one of these be removed?
  • BasicMultiLabelLearner and BasicMultiLabelClassifier are in the moa.classifier.multitarget package, but the moa.classifiers.multilabel package also exists. Should they be moved? Is BasicMultiLabelClassifier even needed?
  • The classifiers in the moa.classifiers.multilabel package do not implement the MultiLabelClassifier interface, so they do not show up in the GUI. Are these defunct? Is there some reason they don't implement this interface?
  • A number of multi-label methods are found in the moa.classifiers.rules.multilabel package, even though they are not rule-based methods. Should all the multi-label methods be moved to moa.classifiers.multilabel?

ArrayIndexOutOfBoundsException with an ARFF class index different from -1 in Perceptron

An ArrayIndexOutOfBoundsException is thrown when using an ARFF stream with a class index different from the last one (i.e., -1) for training a Perceptron regressor (moa.classifiers.rules.functions.Perceptron) through MOA API. The AdaptiveNodePredictor regressor (moa.classifiers.rules.functions.AdaptiveNodePredictor) exhibits the same issue since it uses the Perceptron code.

Particularly, the bug is inside the method trainOnInstanceImpl(Instance inst) of the Perceptron class. Check that in lines 184-185, the class attribute numericAttributesIndex stores the indexes of the numeric input attributes, that is, without the class index. For example, for a dataset with 5 numeric attributes where the class is set to the 2nd column, the value of numericAttributesIndex is [0, 2, 3, 4] (1 is the index of the class).

However, in line 207, the method modelAttIndexToInstanceAttIndex adds 1 to the value in numericAttributesIndex if the index is greater than the class index. In the above example, this method will return 0, 3, 4, and 5, respectively. Nevertheless, 4 is the maximum index of the array of attribute values of the instances because the dataset has 5 attributes (i.e., [0, 1, 2, 3, 4]). Therefore, the index 5 will throw an ArrayIndexOutOfBoundsException.
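To make the off-by-one concrete, here is a self-contained sketch (modelAttIndexToInstanceAttIndex reconstructed from its described behavior): since numericAttributesIndex already holds instance-space indexes, applying the mapping a second time overshoots the array.

```java
public class IndexMapping {
    // The usual model-space -> instance-space mapping: skip the class index.
    static int modelAttIndexToInstanceAttIndex(int index, int classIndex) {
        return index < classIndex ? index : index + 1;
    }

    public static void main(String[] args) {
        int classIndex = 1;    // class is the 2nd column
        int numAttributes = 5; // valid instance indexes: 0..4
        // Already instance-space indexes of the numeric input attributes:
        int[] numericAttributesIndex = {0, 2, 3, 4};
        for (int idx : numericAttributesIndex) {
            int mapped = modelAttIndexToInstanceAttIndex(idx, classIndex);
            // Mapping 4 a second time yields 5, out of range for 5 attributes.
            System.out.println(idx + " -> " + mapped
                    + (mapped >= numAttributes ? " (out of bounds!)" : ""));
        }
    }
}
```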

Hope this helps!

trying to add new clusterers

Hello,

I'm trying to test whether I can add a new clusterer algorithm to MOA, but I'm having some issues.

I compiled the clusterer with this command in the "lib" directory of the MOA 2019 release:
javac -cp moa.jar MyClusterer.java

The MOA manual says to create a moa directory with a classifiers directory inside it when adding new classifiers, so I did the same for clusterers and pasted my MyClusterer.class into a clusterers directory inside moa. The moa directory I created is in the lib directory; is that right? When I launch moa.sh in the bin directory, I can't find MyClusterer in the GUI.

What did I do wrong? Maybe the lib directory is not the right place for MyClusterer.class? If not, where should it go?

Thanks,

authors & copyright

Hi,

I'm trying to upload the RMOA (https://github.com/jwijffels/RMOA) package to CRAN.
The CRAN maintainers ask me to

  • add a detailed list of all the authors which contributed to the code in moa.jar to the RMOA package
  • as well as a description of who owns the copyright of moa.jar so that I can include this in the RMOA package

Can you provide these, so that I can include them in the RMOA package and put it on CRAN?

thanks

Maven build error on pom.xml

I am trying to build MOA from its source code, but I get an error in the pom.xml file in the top folder. The error is at the <parent> tag.
I Googled it; it seems that the Sonatype OSS Parent POM is no longer available, or may not be compatible with the latest Maven and Java versions. I would very much like to know which Java and Maven versions this project was built with.

Thank you very much.

Integration to Weka from MOA broken?

Error: "Please add weka.jar to the classpath to use the Weka explorer."

I tried to do this

$ java -cp "/Applications/weka-3-8-1-oracle-jvm.app/Contents/Java/weka.jar" -jar /Applications/moa.jar

but I am still getting the error, so is the WEKA integration from MOA broken?

I am testing things in macOS 10.11.6.
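For what it's worth, one thing I noticed while debugging: when java is started with -jar, the -cp option (and the CLASSPATH variable) is ignored; only the jar's own Class-Path manifest entry applies. So a likely fix is to put both jars on the classpath and name the main class explicitly (moa.gui.GUI is MOA's GUI entry point):

```shell
# -jar ignores -cp, so list both jars explicitly and start the GUI class.
# (Use ';' instead of ':' as the path separator on Windows.)
java -cp "/Applications/weka-3-8-1-oracle-jvm.app/Contents/Java/weka.jar:/Applications/moa.jar" moa.gui.GUI
```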

Clearing Algorithm 2 in Outliers

Version: moa-release-2017.06b

If you click the "clear" button to run only one outlier algorithm, the second algorithm does not clear. The console error output is below. This means you are always required to run both outlier algorithms, so it is not a blocking issue.

Exception in thread "AWT-EventQueue-0" java.lang.IllegalArgumentException: Problems with option: Algorithm1
        at moa.options.ClassOption.setValueViaCLIString(ClassOption.java:66)
        at moa.gui.outliertab.OutlierAlgoPanel.actionPerformed(OutlierAlgoPanel.java:142)
        at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:2022)
        at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2348)
        at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:402)
        at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:259)
        at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:252)
        at java.awt.Component.processMouseEvent(Component.java:6533)
        at javax.swing.JComponent.processMouseEvent(JComponent.java:3324)
        at java.awt.Component.processEvent(Component.java:6298)
        at java.awt.Container.processEvent(Container.java:2236)
        at java.awt.Component.dispatchEventImpl(Component.java:4889)
        at java.awt.Container.dispatchEventImpl(Container.java:2294)
        at java.awt.Component.dispatchEvent(Component.java:4711)
        at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4888)
        at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4525)
        at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4466)
        at java.awt.Container.dispatchEventImpl(Container.java:2280)
        at java.awt.Window.dispatchEventImpl(Window.java:2746)
        at java.awt.Component.dispatchEvent(Component.java:4711)
        at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:758)
        at java.awt.EventQueue.access$500(EventQueue.java:97)
        at java.awt.EventQueue$3.run(EventQueue.java:709)
        at java.awt.EventQueue$3.run(EventQueue.java:703)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:80)
        at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:90)
        at java.awt.EventQueue$4.run(EventQueue.java:731)
        at java.awt.EventQueue$4.run(EventQueue.java:729)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:80)
        at java.awt.EventQueue.dispatchEvent(EventQueue.java:728)
        at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:201)
        at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116)
        at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:105)
        at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
        at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:93)
        at java.awt.EventDispatchThread.run(EventDispatchThread.java:82)
Caused by: java.lang.Exception: Class not found: None
        at moa.options.ClassOption.cliStringToObject(ClassOption.java:125)
        at moa.options.ClassOption.setValueViaCLIString(ClassOption.java:63)
        ... 37 more

Problem with Fork

Hi all,

Just forked the current version available, built it, and ran the unit tests; everything was fine.
When opening the GUI and clicking "Configure", this is what I get:


Building MOA: Massive Online Analysis 2016.03-SNAPSHOT

--- exec-maven-plugin:1.2.1:exec (default-cli) @ moa ---
Exception in thread "AWT-EventQueue-0" java.lang.VerifyError: Bad type on operand stack
Exception Details:
Location:
moa/classifiers/trees/FIMTDD$FIMTDDSplitNode.learnFromInstance(Lweka/core/Instance;Lmoa/classifiers/trees/FIMTDD;Z)V @461: invokevirtual
Reason:
Type 'moa/classifiers/trees/FIMTDD' (current frame, stack[2]) is not assignable to 'moa/classifiers/trees/HoeffdingTree'
Current Frame:
bci: @461
flags: { }
locals: { 'moa/classifiers/trees/FIMTDD$FIMTDDSplitNode', 'weka/core/Instance', 'moa/classifiers/trees/FIMTDD', integer, integer, 'moa/classifiers/trees/HoeffdingTree$Node' }
stack: { 'moa/classifiers/trees/FIMTDD$FIMTDDActiveLearningNode', 'weka/core/Instance', 'moa/classifiers/trees/FIMTDD' }
Bytecode:
