Description
Hi there! First of all - thanks a lot for this plugin - it's great!
However, when using my MySQL DB's timestamp in Elasticsearch/Kibana, I noticed it was off by one hour: the MySQL timestamp is interpreted as UTC, but it is in fact CET.
So I added:

```
jdbc_default_timezone => "Europe/Berlin"
```
But now my Logstash pipeline crashes fatally, resulting in an endless loop of retries to read data from MySQL, with this error:

```
A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::Jdbc jdbc_connection_string=>"jdbc:mysql://localhost:3306/MYDBNAME?zeroDateTimeBehavior=convertToNull", jdbc_user=>"MYDBUSER", jdbc_password=><password>, jdbc_driver_library=>"/driver/mysql-connector-java-5.1.35-bin.jar", jdbc_driver_class=>"com.mysql.jdbc.Driver", jdbc_default_timezone=>"Europe/Berlin", statement_filepath=>"/config/queries/shop_order_item.sql", type=>"shop_order_item", codec=><LogStash::Codecs::Plain charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", parameters=>{"sql_last_value"=>1970-01-01 00:00:00 UTC}, last_run_metadata_path=>"/var/lib/logstash/.logstash_jdbc_last_run", use_column_value=>false, clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
Error: TZInfo::AmbiguousTime: 2015-10-25T02:49:45+00:00 is an ambiguous local time. {:level=>:error}
```
I'm aware that this is probably more of a tzinfo-related problem (tzinfo/tzinfo#32): 2015-10-25 is the CEST→CET fall-back day, so the wall-clock time 02:49:45 occurs twice in Europe/Berlin and cannot be converted to UTC unambiguously. Still, I don't see any other way to make the conversion work correctly.
Exporting a UNIX timestamp from MySQL and parsing it with a date filter using the `UNIX` pattern is (for whatever reason) painfully slow, resulting in jammed Elasticsearch input queues.
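For reference, the slow workaround I mean looks roughly like this (table and column names are placeholders for my actual schema):

```
# In the SQL statement file, export epoch seconds instead of a DATETIME:
#   SELECT UNIX_TIMESTAMP(updated_at) AS updated_at_unix, ... FROM shop_order_item

filter {
  date {
    match  => ["updated_at_unix", "UNIX"]
    target => "@timestamp"
  }
}
```

This sidesteps the timezone ambiguity entirely, since epoch seconds are timezone-free, but at an unacceptable throughput cost.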
Best, Max