logstash-output-cassandra's People

Contributors

otokarev


logstash-output-cassandra's Issues

using the cassandra output fails

Hi there,

I built the gem on a Debian machine, and when using it in Logstash it fails with the following message:

/usr/bin/java -Djava.io.tmpdir=/var/lib/logstash -Xmx2G -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -Djava.awt.headless=true -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -jar /opt/logstash/vendor/jar/jruby-complete-1.7.17.jar -I/opt/logstash/lib /opt/logstash/lib/logstash/runner.rb agent -f /etc/logstash/conf.8 -l /var/log/logstash/logstash8.log --debug --verbose
Sending logstash logs to /var/log/logstash/logstash8.log.
Using milestone 2 input plugin 'redis'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.4/plugin-milestones {:level=>:warn}
Using milestone 1 output plugin 'cassandra'. This plugin should work, but would benefit from use by folks like you. Please let us know if you find bugs or have suggestions on how to improve this plugin. For more information on plugin milestones, see http://logstash.net/docs/1.4.4/plugin-milestones {:level=>:warn}
LoadError: no such file to load -- cassandra_murmur3
require at org/jruby/RubyKernel.java:1071
require at /opt/logstash/vendor/jar/jruby-complete-1.7.17.jar!/META-INF/jruby.home/lib/ruby/shared/rubygems/core_ext/kernel_require.rb:55
require at /opt/logstash/vendor/jar/jruby-complete-1.7.17.jar!/META-INF/jruby.home/lib/ruby/shared/rubygems/core_ext/kernel_require.rb:53
require at /opt/logstash/vendor/bundle/jruby/1.9/gems/polyglot-0.3.4/lib/polyglot.rb:65
(root) at /opt/logstash/vendor/bundle/jruby/1.9/gems/cassandra-driver-2.1.4/lib/cassandra.rb:565
require at org/jruby/RubyKernel.java:1071
require at /opt/logstash/vendor/jar/jruby-complete-1.7.17.jar!/META-INF/jruby.home/lib/ruby/shared/rubygems/core_ext/kernel_require.rb:73
require at /opt/logstash/vendor/jar/jruby-complete-1.7.17.jar!/META-INF/jruby.home/lib/ruby/shared/rubygems/core_ext/kernel_require.rb:71
require at /opt/logstash/vendor/bundle/jruby/1.9/gems/polyglot-0.3.4/lib/polyglot.rb:65
(root) at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-cassandra-0.1.0/lib/logstash/outputs/cassandra.rb:1
each at org/jruby/RubyArray.java:1613
register at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-cassandra-0.1.0/lib/logstash/outputs/cassandra.rb:60
outputworker at /opt/logstash/lib/logstash/pipeline.rb:220

Any ideas why? The cassandra-driver is installed.
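Not the plugin's code, but a minimal Ruby probe (run under the same JRuby Logstash uses) can confirm whether the native cassandra_murmur3 extension is on the load path; the traceback above fails exactly on that require:

```ruby
# Minimal probe: try to require a library and report whether it loads.
# 'json' (stdlib) should load; 'cassandra_murmur3' is the native
# extension the traceback above fails on.
def loadable?(name)
  require name
  true
rescue LoadError
  false
end

puts "json: #{loadable?('json')}"
puts "cassandra_murmur3: #{loadable?('cassandra_murmur3')}"
```

If the extension reports as missing, the cassandra-driver gem was likely installed without its native extensions being built for JRuby.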

Logstash 2.3.1 ... NoMethodError: undefined method `to_iso8601'

Hi Oleg!!
Many thanks for your connector.
I am trying to import CSV data into Cassandra using Logstash 2.3.1.
The CSV files look like this:

APPC001os;20160301050325;71.3944;
APPC001os;20160301050325;71.3944;

Having compatibility problems, first with Logstash 2.3.1 and then with cassandra-driver, I modified logstash-output-cassandra.gemspec:

# s.add_runtime_dependency "logstash-core", '>= 1.4.0', '< 2.0.0'
s.add_runtime_dependency "logstash-core", '>= 1.4.0', '<= 2.3.1'
# s.add_runtime_dependency 'cassandra-driver'
s.add_runtime_dependency 'cassandra-driver', '~> 3.0.0.rc.2'

and finally succeeded with bin/plugin-install --no-verify.

My pipeline is starting, but I still get the error: "NoMethodError: undefined method `to_iso8601' for nil:NilClass".
I found a lot of problems with conversion to date, so my datetime is now in text format, but after trying many, many configurations the error is still there.

My Cassandra table is now very simple:

tag_id text,
tag_datetime text,
tag_value double,
PRIMARY KEY ((tag_id), tag_datetime)

If you have an idea, I'd appreciate it.
I'm not sure what I'm doing (no more than what I'm saying, but that's another topic), having just started with Logstash and Cassandra a few days ago.
And I'm not really sure I understand the source parameter in the cassandra output entry.
Thanks if You can help.

J

Just for information, here is my logStash.conf:

input {
    file {
        path => "xData/src/data/2016-03-02_0000_to_0059_D_20160303003001.txt"
        start_position => "beginning"
        codec => plain {
            charset => "UTF-8"
        }
    }
}
filter {
    csv {
        separator => ";"
        columns => [ "tag_id", "tag_datetime", "tag_value" ]
        remove_field => ["column4"]
    }
#   grok {
#       match => { "message" => "%{WORD:tag_id};%{WORD:tag_datetime};%{NUMBER:tag_value}" }
#       overwrite => [ "message" ]
#   }
    date {
        match => [ "tag_datetime", "yyyyMMddHHmmss" ]
        timezone => "Europe/Paris"
        target => "tag_datetime"
    }
}
output {
    file {
        path => "/home/j/xData/src/data/logStash.txt"
    }
    cassandra {
        username => "cassandra"
        password => "cassandra"
        hosts => ["10.102.0.156", "10.102.0.182", "10.102.0.154"]
        keyspace => "imagindata"
        table => "dwsensor"

        # Options: "any", "one", "two", "three", "quorum", "all",
        #    "local_quorum", "each_quorum", "serial", "local_serial",
        #    "local_one"
        # Default: "one"
        consistency => "all"

        # Which field of the event hash to take the message from
        # source => "message"

        hints => {
            tag_id => "text"
            tag_datetime => "text"
            tag_value => "double"
        }

        ignore_bad_messages => true
        ignore_bad_values => false

        # The Datastax cassandra driver supports batch inserts.
        # By default the batch size is 1.
        batch_size => 100

        # Every batch_processor_thread_period seconds a special thread
        # pushes all collected messages to Cassandra. By default it is 1 (sec.)
        batch_processor_thread_period => 1

        # The plugin will push a failed batch to Cassandra at most
        # max_retries times before giving up. By default it is 3.
        max_retries => 3

        # retry_delay seconds between two sequential tries to push a failed
        # batch to Cassandra. By default it is 3 (secs.)
        retry_delay => 3
    }
    stdout {}
}
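A hedged guess at the failure: the plugin calls to_iso8601 on a value that turns out to be nil, for example a timestamp field it expects but does not find in the event. A defensive conversion sketch in plain Ruby (not the plugin's actual code) shows the pattern behind the error:

```ruby
require 'time'

# Sketch of a nil-safe timestamp conversion; the 'to_iso8601' in the
# error suggests the plugin converts a timestamp without a nil check.
def safe_iso8601(value)
  return nil if value.nil?
  value.respond_to?(:iso8601) ? value.iso8601 : Time.parse(value.to_s).utc.iso8601
end

puts safe_iso8601(nil).inspect                      # nil instead of NoMethodError
puts safe_iso8601(Time.utc(2016, 3, 1, 5, 3, 25))
```

With text hints on tag_datetime as in the config above, the nil is more likely an event-level timestamp the plugin looks up internally than one of the CSV columns.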

CQLTypeParser::IncompleteTypeError

When using the logstash-output-cassandra plugin with Elassandra 2.4.2-5 (https://github.com/strapdata/elassandra), I get the following error when the kibana keyspace defines the following user-defined type.

Pipeline aborted due to error {:exception=>"Cassandra::Cluster::Schema::CQLTypeParser::IncompleteTypeError"
Host 127.0.0.1 closed connection (Cassandra::Errors::IOError: unable to lookup type "\"visualization_kibanaSavedObjectMeta\"") {:level=>:debug, :file=>"cassandra/cluster/connector.rb", :line=>"341", :method=>"disconnected"}

The kibana keyspace is not actually used by Logstash but is automatically discovered by the Cassandra driver. This is probably a cassandra-driver issue. Elassandra 2.4.2-5 relies on Cassandra 3.0.10.

CREATE TYPE kibana."visualization_kibanaSavedObjectMeta" (
    "searchSourceJSON" frozen<list<text>>
);

CREATE TABLE kibana.visualization (
    "_id" text PRIMARY KEY,
    description list<text>,
    "kibanaSavedObjectMeta" list<frozen<"visualization_kibanaSavedObjectMeta">>,
    "savedSearchId" list<text>,
    title list<text>,
    "uiStateJSON" list<text>,
    version list<int>,
    "visState" list<text>
) WITH bloom_filter_fp_chance = 0.01
    AND caching = {'keys': 'ALL', 'rows_per_partition': 'NONE'}
    AND comment = 'Auto-created by Elassandra'
    AND compaction = {'class': 'org.apache.cassandra.db.compaction.SizeTieredCompactionStrategy', 'max_threshold': '32', 'min_threshold': '4'}
    AND compression = {'chunk_length_in_kb': '64', 'class': 'org.apache.cassandra.io.compress.LZ4Compressor'}
    AND crc_check_chance = 1.0
    AND dclocal_read_repair_chance = 0.1
    AND default_time_to_live = 0
    AND gc_grace_seconds = 864000
    AND max_index_interval = 2048
    AND memtable_flush_period_in_ms = 0
    AND min_index_interval = 128
    AND read_repair_chance = 0.0
    AND speculative_retry = '99PERCENTILE';
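The error message shows the driver looking up the type with its CQL double quotes still attached ("\"visualization_kibanaSavedObjectMeta\""). A small illustration (not driver code) of how a quoted CQL identifier would need to be unquoted before lookup:

```ruby
# CQL case-sensitive identifiers are wrapped in double quotes, with an
# embedded " escaped as "". A type lookup keyed on unquoted names fails
# if the quotes are left in place, as in the error above.
def unquote_cql_identifier(name)
  if name.start_with?('"') && name.end_with?('"')
    name[1..-2].gsub('""', '"')
  else
    name
  end
end

puts unquote_cql_identifier('"visualization_kibanaSavedObjectMeta"')
```

If this is indeed where the driver trips, upgrading cassandra-driver to a version with Cassandra 3.x schema support would be the place to look.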

Data type not supported for date time.

I am trying to ship date and time from a log file via Logstash to Cassandra. I have tried using DATESTAMP and TIMESTAMP_ISO8601, but neither of them worked for me. I am using the following filter:

filter {
    grok {
        match => ["message", "%{DATESTAMP:date_time},%{GREEDYDATA:mid},%{GREEDYDATA:updated_by},%{GREEDYDATA:type}"]
    }
}

File entry :
2011-04-19T03:44:01.103Z,2123216638,kalra46,15478125074000

[screenshot from 2019-01-21 14-52-58]
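One thing worth checking (an assumption based on the sample line, which is in ISO 8601 form): the DATESTAMP grok pattern does not match ISO 8601 timestamps, and a date filter is needed to turn the captured string into a real timestamp before a Cassandra timestamp column will accept it. A sketch:

```
filter {
    grok {
        match => ["message", "%{TIMESTAMP_ISO8601:date_time},%{GREEDYDATA:mid},%{GREEDYDATA:updated_by},%{GREEDYDATA:type}"]
    }
    date {
        match => ["date_time", "ISO8601"]
        target => "date_time"
    }
}
```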

Failed to get message from source. Skip it.

Hi.
I've tried to configure log output to Cassandra in parallel with Elasticsearch (which is working perfectly). Here is my output plugin configuration:

    cassandra {
        username => "bla-bla"
        password => "bla-bla-bla"
        hosts => ["192.168.x.y"]
        keyspace => "logs"
        table => "query_log"
        consistency => "all"
        source => "@message"
        ignore_bad_messages => true
    }

When I applied this configuration, my stdout logs started to grow very quickly with messages like "Failed to get message from source. Skip it", followed by a full message dump. The strangest part: the message dump is about 40 times the size of the original message!

For debugging, I enabled the stdout output plugin and recorded plain log messages in parallel. Please take a look at the same log messages received by the two output plugins:
Stdout plugin: http://pastebin.com/nhef8vyC
Cassandra output plugin: http://pastebin.com/mB3wQAte

Could you please help me to understand what's wrong with plugin there? Or did I miss something in configuration?
Thanks.
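A hedged hypothesis: since Logstash 1.3, event fields use plain names (message), with the @ prefix reserved for @timestamp and @version, so source => "@message" likely points at a field that does not exist, which would explain "Failed to get message from source". A possible fix, keeping the rest of the config unchanged:

```
    cassandra {
        ...
        source => "message"    # instead of "@message"
    }
```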

License issue

Hi, when we ran gem build logstash-output-cassandra.gemspec, there was a license issue, as follows. Any advice would be appreciated.

root@199mysqlmove:/home/mysqlmove/download/logstash-output-cassandra-master# gem build logstash-output-cassandra.gemspec
fatal: Not a git repository (or any of the parent directories): .git
WARNING: WARNING: license value 'Apache License (2.0)' is invalid. Use a license identifier from
http://spdx.org/licenses or 'Nonstandard' for a nonstandard license.
WARNING: open-ended dependency on cassandra-driver (>= 0) is not recommended
  if cassandra-driver is semantically versioned, use:
    add_runtime_dependency 'cassandra-driver', '~> 0'
WARNING: open-ended dependency on logstash-devutils (>= 0, development) is not recommended
  if logstash-devutils is semantically versioned, use:
    add_development_dependency 'logstash-devutils', '~> 0'
WARNING: See http://guides.rubygems.org/specification-reference/ for help
Successfully built RubyGem
Name: logstash-output-cassandra
Version: 0.1.1
File: logstash-output-cassandra-0.1.1.gem
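The warnings point at fields fixable in the gemspec itself; a hedged sketch (the summary text and version bounds here are illustrative, not taken from the repository):

```ruby
# Sketch of gemspec fields addressing the warnings: an SPDX license id
# ('Apache-2.0' rather than 'Apache License (2.0)') and bounded
# dependency constraints instead of open-ended (>= 0) ones.
spec = Gem::Specification.new do |s|
  s.name    = 'logstash-output-cassandra'
  s.version = '0.1.1'
  s.summary = 'Logstash output plugin for Cassandra (illustrative summary)'
  s.authors = ['otokarev']
  s.license = 'Apache-2.0'
  s.add_runtime_dependency     'cassandra-driver',  '~> 2.1'
  s.add_development_dependency 'logstash-devutils', '~> 0.0'
end
puts spec.license
```

The "fatal: Not a git repository" line is harmless here; it only means the gemspec's file list is computed via git and the build ran outside a clone.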
