
logstash-input-jdbc's Introduction

Logstash JDBC Input Plugin

This is a plugin for Logstash.

It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.

JDBC Input Plugin Has Moved

This JDBC Input Plugin is now a part of the JDBC Integration Plugin; this project remains open for backports of fixes from that project to the 5.x series where possible, but issues should first be filed on the integration plugin.

Documentation

Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will first be converted into asciidoc and then into html. All plugin documentation is placed in one central location.

Need Help?

Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.

Developing

1. Plugin Development and Testing

Code

  • To get started, you'll need JRuby with the Bundler gem installed.

  • Create a new plugin or clone an existing one from the GitHub logstash-plugins organization.

  • Install dependencies

bundle install

Test

bundle exec rspec

The Logstash code required to run the tests/specs is specified in the Gemfile by the line similar to:

gem "logstash", :github => "elasticsearch/logstash", :branch => "1.5"

To test against another version or a local Logstash, edit the Gemfile to specify an alternative location, for example:

gem "logstash", :github => "elasticsearch/logstash", :ref => "master"
gem "logstash", :path => "/your/local/logstash"

Then update your dependencies and run your tests:

bundle install
bundle exec rspec

2. Running your unpublished Plugin in Logstash

2.1 Run in a local Logstash clone

  • Edit Logstash tools/Gemfile and add the local plugin path, for example:
gem "logstash-input-jdbc", :path => "/your/local/logstash-input-jdbc"
  • Update Logstash dependencies
rake vendor:gems
  • Run Logstash with your plugin
bin/logstash -e 'input {jdbc {..}}'

At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.

2.2 Run in an installed Logstash

  • Build your plugin gem
gem build logstash-input-jdbc.gemspec
  • Install the plugin from the Logstash home
bin/plugin install /your/local/plugin/logstash-input-jdbc.gem
  • Start Logstash and proceed to test the plugin

Example configuration

Reading data from MySQL:

input {
  jdbc {
    jdbc_driver_library => "/path/to/mysql-connector-java-5.1.33-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://host:port/database"
    jdbc_user => "user"
    jdbc_password => "password"
    # or jdbc_password_filepath => "/path/to/my/password_file"
    statement => "SELECT ..."
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
  }
}

filter {
  [some filters here]
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch_http {
    host => "host"
    index => "myindex"
  }
}

Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.

Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.

It is more important to me that you are able to contribute.

For more information about contributing, see the CONTRIBUTING file.

logstash-input-jdbc's People

Contributors

a-k-, andrewvc, bmax, dariko, diije, eriktrom, franklauterwald, growse, guyboertje, hskoro, jakelandis, jeromefroe, jordansissel, jsvd, karenzone, magnusbaeck, neufeldtech, ph, raghavgarg1257, robbavey, suyograo, tac0x2a, talevy, untergeek, yaauie, yangineer, ycombinator, yumweb


logstash-input-jdbc's Issues

Add ability for sub_statements

One large statement is not always the nicest to maintain, so it would help to have a sub_statement that can
be executed alongside the main statement, running once against each result in the result set.

For example, you may wish to annotate all records with content from another table.

This would be a possible syntax for declaring a sub-statement and placing the results in a specific field named TARGET_FIELD:

sub_statement => [TARGET_FIELD, SQL_STATEMENT]

Example: a database with separate contacts and numbers tables for storing contact information and their phone numbers.

jdbc {
   statement => "SELECT contact_id, first_name, last_name from contacts"
   sub_statement => ["phone_numbers", "SELECT number from numbers where id=:contact_id"]
}

This would create an event with the following fields:

{ "contact_id" => ...,
  "first_name" => ...,
  "last_name" => ...,
  "phone_numbers" => [ ..., ...]
}
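A minimal Ruby sketch of the proposed behavior. Everything here is an illustrative stand-in, not plugin code: `apply_sub_statement`, the in-memory `contacts` rows, and the `numbers` lookup simulate the main query and the per-row sub-query.

```ruby
# Hypothetical sketch: run a "sub-statement" once per main-query row and
# store its flattened results under target_field.
def apply_sub_statement(rows, target_field)
  rows.map do |row|
    sub_rows = yield(row)              # stand-in for executing the sub SQL
    row.merge(target_field => sub_rows)
  end
end

# Stand-ins for the contacts and numbers tables from the example above.
contacts = [
  { "contact_id" => 1, "first_name" => "Ada", "last_name" => "Lovelace" }
]
numbers = { 1 => ["8001111111", "8002222222"] }

events = apply_sub_statement(contacts, "phone_numbers") do |row|
  numbers.fetch(row["contact_id"], [])  # ~ "SELECT number ... where id=:contact_id"
end
```

Each resulting event then carries the main row's columns plus a `phone_numbers` array, matching the shape shown above.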

Support for Nested Documents and Arrays

In other implementations of JDBC input tools for ElasticSearch, you can specify Nested Documents or Arrays based on the naming convention of the column. Something like "contact.phone" would create a nested document of type phone. Or contact[phone] would create an array of phone numbers. So, a result set like:

uid, contact[phone]
1, 8001111111
1, 8002222222

would result in a document in ElasticSearch like 1, [8001111111,8002222222].

This issue was also brought up in Elastic Discuss: https://discuss.elastic.co/t/logstash-jdbc-input-for-multi-fields-and-nested-objects/27020

Thanks.
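As an illustration only (this plugin does not implement it), the `column[field]` convention could be folded roughly as below; `fold_rows`, the key column, and the sample rows are all hypothetical.

```ruby
# Group rows by a key column; columns named "parent[child]" accumulate
# their values into an array inside a nested document.
def fold_rows(rows, key)
  rows.group_by { |r| r[key] }.map do |_, group|
    doc = {}
    group.each do |row|
      row.each do |col, val|
        if col =~ /\A(\w+)\[(\w+)\]\z/          # e.g. "contact[phone]"
          ((doc[$1] ||= {})[$2] ||= []) << val   # collect into an array
        else
          doc[col] = val                         # plain column
        end
      end
    end
    doc
  end
end

rows = [
  { "uid" => 1, "contact[phone]" => "8001111111" },
  { "uid" => 1, "contact[phone]" => "8002222222" },
]
docs = fold_rows(rows, "uid")
```

The two result-set rows collapse into a single document whose `contact.phone` field holds both numbers, as described in the request.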

Missing last_run_metadata_path file should result in an error

With this in the config

clean_run => false
record_last_run => true
last_run_metadata_path => "C:\lastrun"

With C:\lastrun existing, this results in the following error:

The error reported is:
Permission denied - C:\lastrun

The folder was created by the same user that is running Logstash, so permissions should be OK. I tried giving Everyone access, but no change. I also tested turning on auditing for that folder, but nothing is logged.

logstash is not reconnecting

I am using this plugin to connect to a Sybase ASE 15.7 database. I am executing a stored procedure, and in that procedure I have a "kill-if-already-running" check. If this is triggered, the JDBC connection is killed by the database server. Logstash never reconnects, and I have to restart Logstash to get things running again.

Docs: Add more info before we publish

We need to add the following sections before we publish the plugin:

  • Storing State (how to restart from a point)
  • Scheduling (syntax)
  • Loading Drivers

Logstash refuses to start if no database-connection can be made

The error reported is:
Java::JavaSql::SQLException: JZ006: Caught IOException: java.net.ConnectException: Connection refused

In my opinion this should not be a complete show-stopper for logstash, as I have several other working inputs. A jdbc connection should be attempted as per defined schedule.
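A hedged sketch of the retry-per-schedule behavior being asked for. `run_with_reconnect` and the failing block are illustrative, not the plugin's internals: the block stands in for opening the JDBC connection and running the statement.

```ruby
# Retry the connection/query instead of treating one failure as fatal.
# max_attempts and backoff are made-up knobs for this sketch.
def run_with_reconnect(max_attempts: 3, backoff: 0)
  attempts = 0
  begin
    attempts += 1
    yield                                   # stand-in for connect + query
  rescue StandardError => e
    if attempts < max_attempts
      sleep(backoff)
      retry                                 # try again on the next pass
    else
      warn "giving up after #{attempts} attempts: #{e.message}"
      nil
    end
  end
end

# Simulate a database that refuses the first two connection attempts.
calls = 0
result = run_with_reconnect(max_attempts: 3) do
  calls += 1
  raise "Connection refused" if calls < 3
  :connected
end
```

With this shape, a refused connection at startup degrades to a warning and a later retry rather than stopping the whole pipeline.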

MsSQL JDBC driver can't find class

More frustration with the input plugin side of things: "Error: net.sourceforge.jtds.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?" Yet /Users/user/jar/jtds-1.3.1.jar exists and contains the class net.sourceforge.jtds.jdbc.Driver, and the same setup works with other DB apps.

input {
    jdbc {
        jdbc_driver_library => "/Users/user/jar/jtds-1.3.1.jar"
        jdbc_driver_class => "net.sourceforge.jtds.jdbc.Driver"
        jdbc_connection_string => "jdbc:jtds:sqlserver://..."
        jdbc_user => "user"
        jdbc_password => "pass"
        statement => "SELECT TOP 1000 ..."
        jdbc_page_size => 10000
        jdbc_paging_enabled => true
    }
}

Unable to install dependencies (Step 1)

Hi,

Every time I try to execute bundle install in the source directory (checked out on tag v1.0.0) I get :

There was a NoMethodError while loading logstash-input-jdbc.gemspec: 
undefined method `metadata=' for #<Gem::Specification name=logstash-input-jdbc version=1.0.0> from
  /opt/logstash/plugin-source/logstash-input-jdbc/logstash-input-jdbc.gemspec:18:in `block in <main>' 

I work on Ubuntu 14.04 LTS, with logstash 1.5.3

Some fields are nil or empty

Hello,

I got this line in the database

mandant = 24 
reference = 408
von_datum = 01.01.15    
bis_datum = 01.01.15    
...
SCHLUESSELWERT_1 = LAS_SND_REP
ID = 716785

And this is the result

/opt/logstash/bin/logstash -f jdbc.conf
Logstash startup completed
{
             "mandant" => 24.0,
            "referenz" => 408.0,
           "von_datum" => 2015-01-01 00:00:00 +0100,
           "bis_datum" => 2015-01-01 00:00:00 +0100,
    "ermittlungsdatum" => 2015-01-01 00:11:26 +0100,
               "monat" => 1.0,
                "jahr" => 2015.0,
                "wert" => 10.0,
    "schluesselwert_1" => "",
    "schluesselwert_2" => nil,
    "schluesselwert_3" => nil,
    "schluesselwert_4" => "",
    "schluesselwert_5" => nil,
           "bemerkung" => nil,
    "schluesselwert_6" => nil,
                  "id" => 716785.0,
            "@version" => "1",
          "@timestamp" => "2015-09-07T15:30:00.640Z"
}

The most important problem is that schluesselwert_1 is nil or empty, even though it is definitely filled in the database. The second problem is that, for instance, mandant comes back as a float instead of a normal integer.

I tried ojdbc6 and ojdbc7.jar. This is my jdbc.conf

input {
  jdbc {
    jdbc_driver_library => "/opt/ojdbc6.jar"
    jdbc_driver_class => "Java::oracle.jdbc.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@//db:1521/db"
    jdbc_user => "MGMT"
    jdbc_password => "pw"
    schedule => "* * * * *"

    statement => "select * from OPM where ID='716785'"
  }
}

filter {
}

output {
    #if ("_grokparsefailure" in [tags]) {
        stdout { codec => rubydebug }
    #}
}

Thanks

seed being updated even when JDBC query fails?

cat seed_swd_consumer_resourcelist

--- 2015-10-22 18:00:38.718000000 Z

/opt/logstash/bin/logstash -f /etc/logstash/conf.d/jdbc-perfdb-consumer-swd.conf

Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::JavaSql::SQLSyntaxErrorException: ORA-00904: "CRA"."CLUSTER_NAME_CRA2": invalid identifier>, :level=>:warn}
Logstash startup completed
Logstash shutdown completed

cat seed_swd_consumer_resourcelist

--- 2015-10-22 19:40:24.348000000 Z

oracle.jdbc.OracleDriver not loaded

Hi,

I'm trying to use this plugin with ojdbc7.jar, with no success. The only error reported, and only in debug mode, is:
oracle.jdbc.OracleDriver not loaded

used config:

input {
  jdbc {
    jdbc_driver_library => '/home//ojdbc7.jar'
    jdbc_driver_class => 'oracle.jdbc.OracleDriver'
    jdbc_connection_string => 'abc'
    statement => 'def'
  }
}

Any idea? Already tried with ojdbc6.jar, same story.

thanks

Handle parallel queries

The PR (#70) sets max_work_threads to 1 to prevent consistency issues - which means that the scheduler will not process parallel queries. There are additional code changes in order to support parallel queries. This ticket tracks parallel queries support as an enhancement request.

parameters values as command line argument

I would like to invoke the same Logstash JDBC conf file with additional command line arguments that would be translated into the "parameters" values. That way I would avoid multiple versions of the same conf file, each with a hard-coded value.
If there is already a way to do this, I would appreciate someone posting it here :)
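Absent plugin support, one workaround is to render the conf file from a template and pass the value in via an environment variable. In this ERB sketch the template, the `REGION` variable, and its default are all made up; nothing here is a Logstash feature.

```ruby
require "erb"

# A single template instead of N hard-coded conf files; the parameter
# value is injected from the environment when the conf is generated.
template = <<~ERB
  input { jdbc { statement => "SELECT * FROM t WHERE region = '<%= region %>'" } }
ERB

region = ENV.fetch("REGION", "emea")   # hypothetical variable and default
conf = ERB.new(template).result(binding)
```

Running the generator with `REGION=apac` (and writing `conf` to disk) yields a per-region conf file from one source.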

Improve error reporting when a timeout occurs

Today, if a timeout occurs, the user is given a rather large dump from the Rufus scheduler, with the real exception buried in the text. Making this error more precise ("An error occurred because [reasons]") would be beneficial to users.
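One way to surface the buried exception is to walk Ruby's standard `Exception#cause` chain down to the root. The error classes and messages below are illustrative, standing in for the Rufus wrapper and the underlying SQL error.

```ruby
# Follow the cause chain to the underlying error.
def root_cause(error)
  error = error.cause while error.cause
  error
end

msg = nil
begin
  begin
    raise IOError, "ORA-00904: invalid identifier"  # stand-in SQL error
  rescue IOError
    raise "scheduler tick failed"                   # stand-in wrapper error
  end
rescue RuntimeError => e
  # Report the root cause instead of the wrapper's text.
  msg = "An error occurred because #{root_cause(e).message}"
end
```

The reported message then names the actual failure ("ORA-00904 ...") rather than the scheduler machinery around it.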

Can't install in logstash 1.5 rc3

I'm trying to install as follows, but I always get an error:

cilos@nowher:/cilos/opt/logstash-1.5.0-rc3/bin> DEBUG=1 ./plugin install logstash-input-jdbc
Validating logstash-input-jdbc
Installing logstash-input-jdbc
Plugin version conflict, aborting
Bundler::VersionConflict: Bundler could not find compatible versions for gem "logstash":
In Gemfile:
logstash-input-jdbc (>= 0) java depends on
logstash (< 2.0.0, >= 1.4.0) java
Could not find gem 'logstash (< 2.0.0, >= 1.4.0) java', which is required by gem 'logstash-input-jdbc (>= 0) java', in any of the sources.
start at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/bundler-1.9.4/lib/bundler/resolver.rb:203
resolve at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/bundler-1.9.4/lib/bundler/resolver.rb:182
resolve at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/bundler-1.9.4/lib/bundler/definition.rb:193
specs at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/bundler-1.9.4/lib/bundler/definition.rb:132
resolve_remotely! at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/bundler-1.9.4/lib/bundler/definition.rb:121
run at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/bundler-1.9.4/lib/bundler/installer.rb:79
install at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/bundler-1.9.4/lib/bundler/installer.rb:18
run at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/bundler-1.9.4/lib/bundler/cli/install.rb:80
install at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/bundler-1.9.4/lib/bundler/cli.rb:157
run at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/bundler-1.9.4/lib/bundler/vendor/thor/lib/thor/command.rb:27
invoke_command at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/bundler-1.9.4/lib/bundler/vendor/thor/lib/thor/invocation.rb:126
dispatch at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/bundler-1.9.4/lib/bundler/vendor/thor/lib/thor.rb:359
start at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/bundler-1.9.4/lib/bundler/vendor/thor/lib/thor/base.rb:440
start at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/bundler-1.9.4/lib/bundler/cli.rb:10
invoke_bundler! at /cilos/opt/logstash-1.5.0-rc3/lib/logstash/bundler.rb:109
loop at org/jruby/RubyKernel.java:1507
invoke_bundler! at /cilos/opt/logstash-1.5.0-rc3/lib/logstash/bundler.rb:106
call at org/jruby/RubyProc.java:271
capture_stdout at /cilos/opt/logstash-1.5.0-rc3/lib/logstash/bundler.rb:61
invoke_bundler! at /cilos/opt/logstash-1.5.0-rc3/lib/logstash/bundler.rb:105
install_gems_list! at /cilos/opt/logstash-1.5.0-rc3/lib/logstash/pluginmanager/install.rb:98
execute at /cilos/opt/logstash-1.5.0-rc3/lib/logstash/pluginmanager/install.rb:34
run at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/clamp-0.6.4/lib/clamp/command.rb:67
execute at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/clamp-0.6.4/lib/clamp/subcommand/execution.rb:11
run at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/clamp-0.6.4/lib/clamp/command.rb:67
run at /cilos/opt/logstash-1.5.0-rc3/vendor/bundle/jruby/1.9/gems/clamp-0.6.4/lib/clamp/command.rb:132
(root) at /cilos/opt/logstash-1.5.0-rc3/lib/logstash/pluginmanager.rb:12

query results do not index nested fields

here is my setup for the logstash.conf

input {
  jdbc {
    jdbc_driver_library => "C:/logstash-1.5.5/ojdbc6.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@..."
    jdbc_user => "es_user"
    jdbc_password => "*************"
    statement => "SELECT e.id_number, e.name,  a.address from customer e left join address a on e.id_number = a.id_number"

  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch_http {
    host => "localhost:9200"
    index => "myindex"
    document_id => "%{id_number}"
  }
}

i am trying to index customer ids with their address information

customers and address have a 1:M relationship

i expect the results to be one document indexed per unique customer id, with each document possibly having a nested amount of addresses

Instead, after my conf file runs, the document is rewritten each time a row is encountered, and whichever row comes last wins; i.e. if a person has 5 addresses, the document contains one address and is at version 5.

It seems it does not understand how to fold query results into normalized JSON objects. Is there a way to fold query results into nested fields?
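Outside the plugin itself, the 1:M rows could be folded into one document per customer before indexing; in practice this is often done with SQL aggregation or a Logstash filter. This plain-Ruby sketch uses made-up rows to show the shape of the fold.

```ruby
# Collapse a customer/address join (1:M) into one document per customer,
# collecting the addresses into an array. Rows are illustrative.
rows = [
  { "id_number" => 7, "name" => "Acme", "address" => "1 Main St" },
  { "id_number" => 7, "name" => "Acme", "address" => "2 Oak Ave" },
]

docs = rows.group_by { |r| r["id_number"] }.map do |id, group|
  {
    "id_number" => id,
    "name"      => group.first["name"],          # same for every joined row
    "addresses" => group.map { |r| r["address"] } # gather the M side
  }
end
```

Indexing `docs` (one document per `id_number`) avoids the last-row-wins overwrites described above.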

Scheduler Is Not Running every minute or any interval

I am still trying to get the scheduler to work. At first I had just the jdbc input running; now I have tried running many tcp inputs and jdbc together. Neither has worked: the query runs initially, but never again afterwards. I really do not understand. Does anyone have any advice on debugging this, or has anyone seen this occur?

Thanks,
M

Show "Installation successful",but not included in "plugin list" output.

Env: Win7 64-bit, JDK 8, JRuby 1.9 (bundled with Logstash 1.5.4)

  1. I built this project first; finally "logstash-input-jdbc-2.0.0.gem" was generated.
  2. I copied logstash-input-jdbc-2.0.0.gem to the bin folder and ran "plugin install logstash-input-jdbc-2.0.0.gem":
>plugin install logstash-input-jdbc-2.0.0.gem
   io/console not supported; tty will not be manipulated
   Validating logstash-input-jdbc-2.0.0.gem
   Installing logstash-input-jdbc-2.2.0.gem
   Installation successful

and I found one line appended to the Gemfile:

gem "logstash-input-jdbc", "2.0.0", :path => "vendor/local_gems/5a5a4930/logstash-input-jdbc-2.0.0"

but there is nothing in the folder "vendor\local_gems\5a5a4930\logstash-input-jdbc-2.0.0".
3. I ran "plugin list", and "logstash-input-jdbc" didn't show up.

PS: I just wonder why a plugin like this is not integrated by default.

Can't load driver

file: contacts-index-logstash.conf

input {
  jdbc {
    jdbc_connection_string => "jdbc:oracle:thin:@//192.168.54.74:1521/od6oracle"
    jdbc_user => "od6oracle"
    jdbc_password => "od6oracle"
    jdbc_validate_connection => true
    jdbc_driver_library => "C:\Users\vikas.saini\Desktop\easy search\ojdbc7.jar"
    jdbc_driver_class => "java::oracle.jdbc.driver.OracleDriver"
    statement => "SELECT * from PDBDOCUMENT"
  }
}
output {
  elasticsearch {
    index => "contacts"
    document_type => "contact"
    document_id => "%{uid}"
    hosts => "127.0.0.1:9200"
  }
}

sql_last_start Different Values

Why are there two different values? This is very confusing. When I run a Logstash test in verbose mode I see that
sql_last_start=>2015-08-13 13:26:47 UTC

but I am also storing this in my .logtstash_since_lastrun which will be have a format of
--- 2015-08-10 18:00:00.000000000 Z

which one is it?
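The two strings are just two serializations of a timestamp: the verbose log prints `Time#to_s`, while the since-file stores a YAML-serialized time. A small Ruby sketch (with made-up values) showing that both forms can denote the same instant:

```ruby
require "time"

log_value  = "2015-08-13 13:26:47 UTC"                # verbose-log format
file_value = "--- 2015-08-13 13:26:47.000000000 Z"    # since-file (YAML) format

t_log  = Time.parse(log_value)
t_file = Time.parse(file_value.sub(/\A---\s*/, ""))   # strip the YAML marker
```

Comparing `t_log` and `t_file` as Time objects shows they are equal, so the difference is purely one of formatting, not of value.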

Error: oracle.jdbc.driver.OracleDriver not loaded.

Hi,

I'm working with the latest Logstash v1.5.3 on a Windows notebook. I installed the logstash-input-jdbc plugin successfully. I tried to perform a SQL connection test, but it is not able to load the Oracle JDBC driver.

My logstash config

input {
  jdbc {    
    jdbc_driver_library => "C:\Daten\tools\logstash-1.5.3\db\ojdbc6.jar"
    jdbc_driver_class => "oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@//localhost:1521/db"
    jdbc_user => "user"
    jdbc_password => "passwd"
    # will execute on the 0th minute of every hour every day
    schedule => "0 * * * *"
    statement => "select 1 from dual"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "1"
  }
}
output {
    stdout { codec => rubydebug }
}

My setup from the windows cmd

C:\Daten\tools\logstash-1.5.3\db>dir
 Volume in drive C is SYSTEM
 Volume Serial Number is 8452-140B

 Directory of C:\Daten\tools\logstash-1.5.3\db

08/18/2015  11:38 AM    <DIR>          .
08/18/2015  11:38 AM    <DIR>          ..
06/04/2014  02:28 PM         3,389,454 ojdbc6.jar
               1 File(s)      3,389,454 bytes
               2 Dir(s)   1,006,612,480 bytes free

Starting the test

C:\Daten\tools\logstash-1.5.3>bin\logstash.bat -f local-db.conf
io/console not supported; tty will not be manipulated
Error: oracle.jdbc.driver.OracleDriver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_
library?

I have checked the jar and database version. Using a java program and a sql client the connection was ok. Any ideas and pointers?

I ran the debug trace:

{:timestamp=>"2015-08-18T11:50:45.581000+0200", :message=>"Reading config file", :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/agent.rb", :level=>:debug, :line=>"309", :method=>"local_config"}
{:timestamp=>"2015-08-18T11:50:45.630000+0200", :message=>"Compiled pipeline code:\n        @inputs = []\n        @filters = []\n        @outputs = []\n        @periodic_flushers = []\n        @shutdown_flushers = []\n\n          @input_jdbc_1 = plugin(\"input\", \"jdbc\", LogStash::Util.hash_merge_many({ \"jdbc_driver_library\" => (\"C:/Daten/tools/logstash-1.5.3/db/ojdbc6.jar\") }, { \"jdbc_driver_class\" => (\"oracle.jdbc.driver.OracleDriver\") }, { \"jdbc_connection_string\" => (\"jdbc:oracle:thin:@//localhost:1521/db\") }, { \"jdbc_user\" => (\"user\") }, { \"jdbc_password\" => (\"password\") }, { \"schedule\" => (\"0 * * * *\") }, { \"statement\" => (\"select 1 from dual\") }, { \"jdbc_paging_enabled\" => (\"true\") }, { \"jdbc_page_size\" => (\"1\") }))\n\n          @inputs << @input_jdbc_1\n\n          @output_stdout_2 = plugin(\"output\", \"stdout\", LogStash::Util.hash_merge_many({ \"codec\" => (\"rubydebug\") }))\n\n          @outputs << @output_stdout_2\n\n  def filter_func(event)\n    events = [event]\n    @logger.debug? && @logger.debug(\"filter received\", :event => event.to_hash)\n    events\n  end\n  def output_func(event)\n    @logger.debug? && @logger.debug(\"output received\", :event => event.to_hash)\n    @output_stdout_2.handle(event)\n    \n  end", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/pipeline.rb", :line=>"29", :method=>"initialize"}
{:timestamp=>"2015-08-18T11:50:45.633000+0200", :message=>"Plugin not defined in namespace, checking for plugin file", :type=>"input", :name=>"jdbc", :path=>"logstash/inputs/jdbc", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/plugin.rb", :line=>"133", :method=>"lookup"}
{:timestamp=>"2015-08-18T11:50:45.651000+0200", :message=>"Plugin not defined in namespace, checking for plugin file", :type=>"codec", :name=>"plain", :path=>"logstash/codecs/plain", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/plugin.rb", :line=>"133", :method=>"lookup"}
{:timestamp=>"2015-08-18T11:50:45.669000+0200", :message=>"config LogStash::Codecs::Plain/@charset = \"UTF-8\"", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.670000+0200", :message=>"config LogStash::Inputs::Jdbc/@jdbc_driver_library = \"C:/Daten/tools/logstash-1.5.3/db/ojdbc6.jar\"", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.671000+0200", :message=>"config LogStash::Inputs::Jdbc/@jdbc_driver_class = \"oracle.jdbc.driver.OracleDriver\"", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.673000+0200", :message=>"config LogStash::Inputs::Jdbc/@jdbc_connection_string = \"jdbc:oracle:thin:@//localhost:1521/db\"", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.674000+0200", :message=>"config LogStash::Inputs::Jdbc/@jdbc_user = \"user\"", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.675000+0200", :message=>"config LogStash::Inputs::Jdbc/@jdbc_password = <password>", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.676000+0200", :message=>"config LogStash::Inputs::Jdbc/@schedule = \"0 * * * *\"", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.677000+0200", :message=>"config LogStash::Inputs::Jdbc/@statement = \"select 1 from dual\"", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.678000+0200", :message=>"config LogStash::Inputs::Jdbc/@jdbc_paging_enabled = true", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.679000+0200", :message=>"config LogStash::Inputs::Jdbc/@jdbc_page_size = 1", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.680000+0200", :message=>"config LogStash::Inputs::Jdbc/@debug = false", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.681000+0200", :message=>"config LogStash::Inputs::Jdbc/@codec = <LogStash::Codecs::Plain charset=>\"UTF-8\">", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.682000+0200", :message=>"config LogStash::Inputs::Jdbc/@add_field = {}", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.683000+0200", :message=>"config LogStash::Inputs::Jdbc/@jdbc_validate_connection = false", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.685000+0200", :message=>"config LogStash::Inputs::Jdbc/@jdbc_validation_timeout = 3600", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.686000+0200", :message=>"config LogStash::Inputs::Jdbc/@parameters = {}", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.690000+0200", :message=>"config LogStash::Inputs::Jdbc/@clean_run = false", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.691000+0200", :message=>"config LogStash::Inputs::Jdbc/@record_last_run = true", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.693000+0200", :message=>"Plugin not defined in namespace, checking for plugin file", :type=>"output", :name=>"stdout", :path=>"logstash/outputs/stdout", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/plugin.rb", :line=>"133", :method=>"lookup"}
{:timestamp=>"2015-08-18T11:50:45.763000+0200", :message=>"Plugin not defined in namespace, checking for plugin file", :type=>"codec", :name=>"rubydebug", :path=>"logstash/codecs/rubydebug", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/plugin.rb", :line=>"133", :method=>"lookup"}
{:timestamp=>"2015-08-18T11:50:45.772000+0200", :message=>"config LogStash::Codecs::RubyDebug/@metadata = false", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.774000+0200", :message=>"config LogStash::Outputs::Stdout/@codec = <LogStash::Codecs::RubyDebug metadata=>false>", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.775000+0200", :message=>"config LogStash::Outputs::Stdout/@type = \"\"", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.776000+0200", :message=>"config LogStash::Outputs::Stdout/@tags = []", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.777000+0200", :message=>"config LogStash::Outputs::Stdout/@exclude_tags = []", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
{:timestamp=>"2015-08-18T11:50:45.777000+0200", :message=>"config LogStash::Outputs::Stdout/@workers = 1", :level=>:debug, :file=>"/Daten/tools/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.3-java/lib/logstash/config/mixin.rb", :line=>"112", :method=>"config_init"}
Error: oracle.jdbc.driver.OracleDriver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
You may be interested in the '--configtest' flag which you can
use to validate logstash's configuration before you choose
to restart a running system.

"no method 'getConnection' for arguments" After upgrade to logstash 2.0

The error reported is:
NameError: no method 'getConnection' for arguments (org.jruby.RubyString,org.jruby.RubyString,org.jruby.RubyObject) on Java::JavaSql::DriverManager

Conf:

input {
  jdbc {
    jdbc_driver_library => "/etc/logstash/jconn3.jar"
    jdbc_driver_class => "com.sybase.jdbc3.jdbc.SybDriver"
    jdbc_connection_string => "jdbc:sybase:Tds:172.26.0.1:4200/dbname"
    jdbc_user => "readonlyuser"
    jdbc_password => "xxx"
    schedule => "* * * * *"
    statement => "exec logstash"
    type => "d2db"
  }
}

Logstash is unable to start...

Not able to install logstash-input-jdbc-1.0.0.gem

Hi Team,

I have downloaded logstash-input-jdbc-1.0.0.gem from https://rubygems.org/gems/logstash-input-jdbc/versions/1.0.0.

Then I tried to run following command:
C:\Vivek\Investment Platform\ELK\logstash-1.5.3\logstash-1.5.3\bin>plugin install logstash-input-jdbc-1.0.0.gem

Below is the error. Please help me resolve this issue.


io/console not supported; tty will not be manipulated
Validating logstash-input-jdbc-1.0.0.gem
Installing logstash-input-jdbc
Error Bundler::GemspecError, retrying 1/10
There was a Errno::ENOENT while loading logstash-input-jdbc.gemspec:
No such file or directory - git from
C:/Vivek/Investment Platform/ELK/logstash-1.5.3/logstash-1.5.3/vendor/local_gems/5aff99d1/logstash-input-jdbc-1.0.0/logstash-input-jdbc.gemspec:13:in `eval_gemspec'
(the same error repeats for retries 2/10 through 10/10)
Too many retries, aborting, caused by Bundler::GemspecError
ERROR: Installation Aborted, message: There was a Errno::ENOENT while loading logstash-input-jdbc.gemspec:
No such file or directory - git from
C:/Vivek/Investment Platform/ELK/logstash-1.5.3/logstash-1.5.3/vendor/local_gems/5aff99d1/logstash-input-jdbc-1.0.0/logstash-input-jdbc.gemspec:13:in `eval_gemspec'

Nested document parser failure

I've written the SQL query so that it returns a JSON document, since I need to be able to pull in nested JSON.

This seems to be working, except for documents that contain arrays: I get a parser error when the second object in an array is hit. Is there a workaround, or another way to accomplish this import?

Logstash - 1.5.3
ElasticSearch 1.7.1

Error message in Logstash:
ParserError: Unexpected character ('{' (code 123)): was expecting double-quote to start field name at [Source: [B@7fc1a174; line: 1, column: 1207]>, :level=>:warn}

Sample of what is returned by the SQL Query:
{"addresses":[{"_id":499,"line1":"123 E Street Ave","line2":"My Store","city":"Bronx","state":"NY","zipCode":"86753","country":"US","isPrimary":"0","types":[{"_id":2,"code":"HOME","description":"HOME Address"}],"zipLocation":{"_id":0,"zipCode":"00000"},"phoneNumbers":[{"_id":29,"number":"8888675309","types":[{"_id":1,"code":"PH - HOME","description":"Phone Number - HOME"}]}]},{"_id":7264579,"line1":"30 E Street Ave","line2":"My Store","city":"Bronx","state":"NY","zipCode":"86753","country":"US","isPrimary":"1","types":[{"_id":2,"code":"HOME","description":"HOME Address"}],"zipLocation":{"_id":2910,"zipCode":"10457","city":"BRONX"},"phoneNumbers":[{"_id":29,"number":"8888675309","types":[{"_id":1,"code":"PH - HOME","description":"Phone Number - HOME"}]}]}]}

SELECT command denied to user 'username'@'logstash_server' exception

I am not sure if this is an issue; I think it's more of a use-case question. Here's what's happening:

I have installed Logstash on one server, and the database I am running the SELECT query against is on a different server. When I run Logstash with the JDBC plugin, I get a SELECT command denied exception for my username. Here's my config:

    input {
        jdbc {
            jdbc_connection_string => "jdbc:mysql://database_server:port/"
            jdbc_user => "username"
            jdbc_password => "password"
            jdbc_validate_connection => true
            jdbc_driver_library => "/tmp/mysql-connector-java-5.0.8-bin.jar"
            jdbc_driver_class => "com.mysql.jdbc.Driver"
            statement => "SELECT * from database.table where time > now() - interval 1 minute"
        }
    }


    output {
        elasticsearch {
            cluster => "ClusterName"
            host => "hostName"
            protocol => "transport"
            port => "9300"
            index => "logstash-%{+YYYY.MM.dd}"
            workers => 4
        }
        stdout {
            codec => rubydebug
        }
    }

When I run the Logstash process with this configuration on logstash_server, I get the following exception:

 Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::ComMysqlJdbcExceptions::MySQLSyntaxErrorException: SELECT command denied to user 'username'@'logstash_server' for table 'table'>, :level=>:warn}

Does that mean I need Logstash to be installed on database_server itself? Can't Logstash installed on a remote server connect to the database_server using the jdbc_connection_string?

Thanks!

Upper/Lower Case of Column Names and other Identifiers is not preserved in some Drivers

In some driver modes, such as MSSQL, the case of column names and other identifiers is not preserved. This results in output with the incorrect key names. i.e., ColumnOne is not the same as columnone.

The Sequel library has an API to change the default behavior, but the corresponding option is not available in Logstash configuration.

In my specific case, I was losing capitalization of my MSSQL column names. As a workaround, I modified plugin_mixins/jdbc.rb, adding the following line:

@database.identifier_output_method = :to_s

under the existing Sequel.connect line.
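A minimal plain-Ruby sketch of what that setting does (a stand-in for Sequel's behavior, not Sequel itself: the chosen output method is applied to each column name before it becomes a result key):

```ruby
# Stand-in for Sequel's identifier_output_method: the configured method
# is applied to every column name before it becomes a hash key.
def output_identifier(name, method)
  name.public_send(method)
end

output_identifier("ColumnOne", :downcase)  # => "columnone" (case lost)
output_identifier("ColumnOne", :to_s)      # => "ColumnOne" (case preserved)
```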

table/database names as an array on parameter for use in the sql statement

I would like to have a single conf file that executes the same SQL across multiple database/table names.

parameters => {
  "tables" => "swd.CONSUMER"
}
statement => "select consumer_name, planned_OWN from :tables where TIME_STAMP >= :sql_last_start"

If one specifies "tables" as an array, it would run multiple SQLs, one for each element.
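A sketch of how such an array-valued parameter could expand into one statement per table (hypothetical behavior, not supported by the plugin today; the second table name is invented for illustration):

```ruby
# Hypothetical expansion: substitute each table name into the statement
# template, producing one SQL string per array element.
tables = ["swd.CONSUMER", "swd.PRODUCER"]
template = "select consumer_name, planned_OWN from :tables " \
           "where TIME_STAMP >= :sql_last_start"
statements = tables.map { |t| template.sub(":tables", t) }
# statements[0] queries swd.CONSUMER, statements[1] queries swd.PRODUCER
```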

jdbc input plugin cannot be found after sucessful installation

Hi, all. I am a newbie to both Logstash and RubyGems, and I am stuck setting up my Logstash pipeline. Please leave a comment if you might know what went wrong.
After installing the jdbc input plugin on my Amazon EC2 server, the plugin still could not be found during configuration verification:

$ bin/logstash -f simple-out.conf --configtest
The error reported is:
Couldn't find any input plugin named 'jdbc'. Are you sure this is correct? Trying to load the jdbc input plugin resulted in this error: no such file to load -- logstash/inputs/jdbc

Gemfile at Logstash home directory shows:
gem "logstash-input-jdbc", "1.0.0", :path => "vendor/local_gems/d8836756/logstash-input-jdbc-1.0.0"

Therefore, it looks to me like the installation is fine.

"Java::JavaLang::OutOfMemoryError: Java heap space" error

Hello,

As explained in another issue, I use this plugin to import MySQL data into Elasticsearch.
Everything works fine as long as my query doesn't return more than 100,000 rows. When it does, I always get a "Java::JavaLang::OutOfMemoryError: Java heap space" error:

Logstash startup completed
Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::JavaLang::OutOfMemoryError: Java heap space>, :level=>:warn}
Logstash shutdown completed

I tried increasing the heap space with export JVM_ARGS="-Xms4096m -Xmx4096m", but it doesn't seem to change anything.

Any idea what I can do to avoid this?

My config file:

input {
  jdbc {
    jdbc_driver_library => "/path/to/mysql-connector-java-5.1.33-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://host:port/database"
    jdbc_user => "user"
    jdbc_password => "password"
    statement => "SELECT ..."
  }
}

filter {
  [some filters here]
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch_http {
    host => "host"
    index => "myindex"
  }
}
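One thing to try (a sketch, not a verified fix): the paging options this plugin exposes, which appear in another config later in this document, fetch the result set in chunks instead of loading it all at once:

```
input {
  jdbc {
    # ... same connection settings as above ...
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
    statement => "SELECT ..."
  }
}
```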

Importing real columns from SQLite

I'm using the SQLite JDBC driver (https://github.com/xerial/sqlite-jdbc) and I have a table with two columns, one REAL and the other INTEGER, both with the exact same value (except for the decimal point):

Column C1 (real)    Column C2 (integer)
1434703799763.0     1434703799763
1434703801135.0     1434703801135
1434703802033.0     1434703802033

When I import the table into an ES index using Logstash, both columns are recognized as numeric; however, the stored values for C1 differ from the originals:

Column C1 (real)    Column C2 (integer)
1434703757312       1434703799763
1434703757312       1434703801135
1434703757312       1434703802033

These are the resulting mappings in ElasticSearch

"C1" : { "type" : "long" },
"C2" : { "type" : "long" },

Is this a bug or am I doing something wrong?
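For what it's worth, the reported C1 values are exactly what you get if 1434703799763.0 is round-tripped through a single-precision (32-bit) float somewhere in the chain — a plausible cause, though where the narrowing happens is not confirmed. A quick Ruby check:

```ruby
# Round-trip 1434703799763.0 through a 32-bit float: the low bits are
# lost, and the result matches the stored value reported above.
v = 1434703799763.0
narrowed = [v].pack("f").unpack1("f")
narrowed.to_i  # => 1434703757312
```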

Add check for driver in classpath

It is a requirement to have the JDBC driver libraries on the classpath. We should have an explicit check for a driver missing from the classpath, instead of a generic "Not Found" error.
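A plain-Ruby sketch of the requested behavior (the real plugin runs on JRuby and would attempt java.lang.Class.forName; the helper and its arguments here are illustrative only):

```ruby
# Fail fast with an explicit message when the configured driver class
# is not among the classes that could be loaded.
def ensure_driver_loaded!(driver_class, loaded_classes)
  return true if loaded_classes.include?(driver_class)
  raise LoadError, "#{driver_class} not loaded. Are you sure you've " \
                   "included the correct jdbc driver in :jdbc_driver_library?"
end

ensure_driver_loaded!("com.mysql.jdbc.Driver", ["com.mysql.jdbc.Driver"])  # => true
```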

Add predefined parameters and format them.

It would be useful to build query statements with predefined parameters:
sql_last_end - the last time a statement finished.
sql_current_start - the time the current statement started.

Also, it would be useful if the predefined parameters could be converted to a specific output format. The conversion may look, for example, like:
parameters_format => [
  "name", "format", "time_zone"
]

name - parameter name
format - parameter output format
time_zone - optional; can be specified as a zone name or a utc_offset

parameters_format => [
  "sql_current_start", "%H:%M:%S", "",
  "sql_last_start", "%d/%m/%Y %H:%M:%S", "+09:00",
  "sql_last_end", "%Y-%m-%dT%H:%M:%S", "CST"
]
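A sketch of the proposed conversion in plain Ruby (strftime handles the format; note that Ruby's Time#getlocal accepts numeric offsets like "+09:00" but not zone names like "CST", so named zones would need extra handling):

```ruby
# Render a timestamp parameter with a format string and an optional
# UTC offset, as the proposed parameters_format triples describe.
def format_parameter(time, format, utc_offset = nil)
  t = (utc_offset.nil? || utc_offset.empty?) ? time : time.getlocal(utc_offset)
  t.strftime(format)
end

t = Time.utc(2015, 8, 18, 11, 50, 45)
format_parameter(t, "%H:%M:%S")                     # => "11:50:45"
format_parameter(t, "%d/%m/%Y %H:%M:%S", "+09:00")  # => "18/08/2015 20:50:45"
```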

JTDS not working

I configured the jTDS driver to connect to SQL Server, but I got this error:

Error: net.sourceforge.jtds.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
You may be interested in the '--configtest' flag which you can
use to validate logstash's configuration before you choose

I configured it like this:

jdbc {
  jdbc_driver_library => "/opt/logstash/lib/jtds-1.3.1-linkedserver-2.jar"
  jdbc_driver_class => "net.sourceforge.jtds.jdbc.Driver"
  jdbc_connection_string => "jdbc:jtds:sqlserver://SERVERDB:1433;DatabaseName=database;domain=MYDOMAIN;appName=LogstashAG.JTDS;socketKeepAlive=true"
  jdbc_user => "USER"
  jdbc_password => "sdfsdf"
  schedule => "0/1 * * * *"
  parameters => { "initial_date" => "2015-06-29 00:00:00.000" }
  statement => "select EventTime, Value, CounterType from tabe (nolock) where EventTime > :initial_date and EventTime > :sql_last_start"
  type => "stats_DB"
}

No results No errors

I only see "Logstash startup completed" and then nothing happens, whether I use sql_last_start or not.
I use Logstash 1.5.0, and the configuration is:
input {
  jdbc {
    jdbc_driver_library => '/Users/someWhere/sqljdbc4.jar'
    jdbc_driver_class => 'com.microsoft.sqlserver.jdbc.SQLServerDriver'
    jdbc_connection_string => 'jdbc:sqlserver://myServer:1433;databaseName=myDB'
    jdbc_user => 'aUser'
    jdbc_password => 'aPassword'
    statement => 'SELECT * FROM LOGSTASH_POINT where CREATED_DATE > :sql_last_start'

    parameters => { "sql_last_start" => "2014-01-01" }

    jdbc_validate_connection => true
    schedule => "* * * * *"
  }
}
output {
  stdout { codec => rubydebug }
}

Any help is appreciated.

Why have a custom JDBC mixin instead of using something like Sequel?

I wonder why we need the custom JDBC mixin when we could be using a library like https://github.com/jeremyevans/sequel, which has adapters for

 ADO, Amalgalite, CUBRID, DataObjects, DB2, DBI, Firebird, FoundationDB SQL Layer, IBM_DB, Informix, JDBC, MySQL, Mysql2, ODBC, OpenBase, Oracle, PostgreSQL, SQLAnywhere, SQLite3, Swift, and TinyTDS.

out of the box.

Is there any design reason that makes us avoid using Sequel?

Commit state at every iteration

Currently the .logstash_since_lastrun file is only flushed to disk upon pipeline shutdown. It should instead be written after every SQL run.

latest max column_name instead of sql_last_start

At the moment, it is not possible to query the database like "SELECT * FROM TABLE_NAME where id > :last_saved_id"; the only option is the built-in parameter "sql_last_start", e.g. "SELECT * FROM TABLE_NAME where timestamp > :sql_last_start".
I don't have a timestamp column in my table, and I want to schedule Logstash with a jdbc input and an elasticsearch output.
I don't want to query the whole table, only the new entries, but I don't know how and where to save the "id" from the previous run so it can be used at the next scheduled time.
If you find this worth implementing in the future, please do.
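A minimal in-memory sketch of the requested behavior (all names hypothetical; a real implementation would persist the value to disk the way last_run_metadata_path does):

```ruby
# Track the maximum seen id between runs and use it as the lower
# bound (:last_saved_id) of the next scheduled query.
class IdTracker
  attr_reader :last_saved_id

  def initialize(start = 0)
    @last_saved_id = start
  end

  def record(rows)
    ids = rows.map { |r| r[:id] }
    @last_saved_id = [@last_saved_id, *ids].max
    rows
  end
end

tracker = IdTracker.new
tracker.record([{ id: 3 }, { id: 7 }])
tracker.last_saved_id  # => 7
```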

Not able to config JDBC input

Hello everyone, below is my configuration -

input {
  jdbc {
    jdbc_driver_class => "org.apache.derby.jdbc.EmbeddedDriver"
    jdbc_connection_string => "jdbc:derby://localhost:3306/test"
    jdbc_user => "root"
    jdbc_password => "password"
    statement => "select * from user"
  }
}

When I run the config file, I get the following error -

The error reported is:
org.apache.derby.jdbc.EmbeddedDriver not loaded

Can anyone please help me resolve it? My current Java version is -

java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)

I'm also confused by the line jdbc_connection_string => "jdbc:derby:memory:testdb;create=true". I have a database named test. How do I specify it in the connection string above?

Make timeouts configurable

Currently, the timeouts (I think?) are hard-coded within Sequel to 5 seconds. It would be useful to make them configurable.

repeat insert data use logstash-input-jdbc plugin dump data from mysql to elasticsearch

Hi,
I use the logstash-input-jdbc plugin to dump data from MySQL to Elasticsearch.
Here is the input section of my logstash.conf:

jdbc {
  # mysql jdbc connection string to our backup database
  jdbc_connection_string => "jdbc:mysql://192.168.0.49:3306/dfb"
  # the user we wish to execute our statement as
  jdbc_user => "test"
  jdbc_password => "test"
  # the path to our downloaded jdbc driver
  jdbc_driver_library => "/opt/logstash/mysql-connector-java-5.1.36-bin.jar"
  # the name of the driver class for mysql
  jdbc_driver_class => "com.mysql.jdbc.Driver"
  jdbc_paging_enabled => "true"
  jdbc_page_size => "50000"
  statement_filepath => "jdbc.sql"
  schedule => "* * * * *"
  type => "jdbc"
}

The jdbc.sql statement is:

select
  h.id as id,
  h.hotel_name as name,
  h.photo_url as img,
  ha.id as haId,
  ha.finance_person
from
  hotel h LEFT JOIN hotel_account ha on h.id = ha.hotel_id
where
  h.last_modify_time >= :sql_last_start

When everything is set up and Logstash starts for the first time, it runs well: all the data is dumped to ES, and the console waits.
When I execute an INSERT statement, the console prints a message and the data appears in ES.
But about 30 seconds later, the console prints the message again, and ES inserts the same data a second time. I tested an UPDATE statement: same result!

I don't know what is wrong with Logstash, and I have no idea how the :sql_last_start field works.

Timestamp type not being filtered

I am using the Oracle JDBC driver and trying to get the date filter to work, but there have been many issues.
Here is my current config. At first there was a class error; that is fixed now. Then there were incompatible types (Ruby vs. Java strings), so I added the to_char. Now it's not recognizing the timestamp. Could it be because of the type?

Name            Null?       Type
CREATION_TIME   NOT NULL    TIMESTAMP(6)

input {
  jdbc {
    jdbc_driver_library => "logstash-1.5.0/ojdbc6.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@//"
    jdbc_user => ""
    jdbc_password => ""

    statement => "SELECT to_char(creation_time,'YYYY-MM-dd HH24:MI:SSxFF TZR') as logStoreTime from log where creation_time > :sql_last_start"
    record_last_run => true
    last_run_metadata_path => "/.logstash_jdbc_last_run"
    type => "log"
  }
}

filter {
  grok {
    match => {"logStoreTime" => "(?\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}.\d{6} +\d{2}:\d{2})"}
  }
  # date {
  #   locale => "en"
  #   timezone => "GMT"
  #   match => [
  #     "logStoreTime" , "YYYY-MM-dd HH:mm:ss.SSSSSS Z"
  #   ]
  # }
}

Sample output of what I am getting:

{
  "logstoretime" => "2015-08-04 18:34:14.747000 +00:00",
  "@version" => "1",
  "@timestamp" => "2015-08-06T12:52:15.850Z",
  "type" => "test",
  "tags" => [
    [0] "_grokparsefailure"
  ]
}
{
  "logstoretime" => "2015-08-04 18:34:18.661000 +00:00",
  "@version" => "1",
  "@timestamp" => "2015-08-06T12:52:15.854Z",
  "type" => "test",
  "tags" => [
    [0] "_grokparsefailure"
  ]
}

And I know the grok pattern matches; it has been tested on http://grokdebug.herokuapp.com/

Exception when executing JDBC query

I left Logstash running with a schedule set to * * * * * and came back the next day to see these errors.
They look harmless...

Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::JavaSql::SQLRecoverableException: No more data to read from socket>, :level=>:warn}
1066833202 rufus-scheduler intercepted an error:
1066833202 job:
1066833202 Rufus::Scheduler::CronJob "* * * * *" {}
1066833202 error:
1066833202 1066833202
1066833202 Java::JavaSql::SQLRecoverableException
1066833202 Closed Connection
136942014 oracle.jdbc.driver.PhysicalConnection.needLine(oracle/jdbc/driver/PhysicalConnection.java:6295)
136942014 oracle.jdbc.driver.OracleStatement.closeOrCache(oracle/jdbc/driver/OracleStatement.java:1364)
136942014 oracle.jdbc.driver.OracleStatement.close(oracle/jdbc/driver/OracleStatement.java:1343)
136942014 oracle.jdbc.driver.OracleStatementWrapper.close(oracle/jdbc/driver/OracleStatementWrapper.java:100)
136942014 java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:606)
...

Exception when executing JDBC query {:exception=>#<Sequel::DatabaseDisconnectError: Java::JavaSql::SQLRecoverableException: Closed Connection>, :level=>:warn}
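One mitigation to try (a sketch, unverified for this Oracle case): the plugin's jdbc_validate_connection option, which appears in other configs in this document, checks the connection before use so a stale one can be re-established rather than raising:

```
jdbc {
  # ... existing connection settings ...
  jdbc_validate_connection => true
  jdbc_validation_timeout => 3600   # seconds between validations (assumed option name)
}
```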

Last successfully received timestamp

I plan to use the plugin to collect logs from an Oracle database. Since these logs are very sensitive, I cannot afford any loss of data.
The situation, as I understand it, is that the plugin currently stores only the time of the last query run, not the actual timestamp of the last data received. So when log rows appear after the run (due to delayed writes to the DB), they will not be collected during the next run.

I suggest adding the option to select a field (in timestamp format) whose last value is stored during each run (instead of the time the run started or ended) and used as the input for the next run.
