
Troubleshooting Kafka incompatibility after upgrading Logstash 2.4 to 5.0


Reference documents:

/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.0.5/CHANGELOG.md
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.0.5/DEVELOPER.md
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.0.5/README.md
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.0.5/lib/logstash/inputs/kafka.rb
/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-kafka-5.0.4/lib/logstash/outputs/kafka.rb

Background:

I recently upgraded an ELKB environment from version 2.4 to the latest 5.0 stable release (the main upgrade steps are described at http://www.linuxidc.com/Linux/2016-12/138217.htm). Afterwards the Kafka cluster started reporting errors. The troubleshooting process is recorded below for reference.

Environment before the upgrade:

logstash2.4

logstash-input-kafka-2.0.9

logstash-output-kafka-2.0.5

kafka_2.10-0.8.2.2.tgz

Environment after the upgrade:

logstash5.0

logstash-input-kafka-5.0.5

logstash-output-kafka-5.0.4

Error messages:

[2016-11-16T14:35:44,739][ERROR][logstash.inputs.kafka] Unknown setting 'zk_connect' for kafka
[2016-11-16T14:35:44,741][ERROR][logstash.inputs.kafka] Unknown setting 'topic_id' for kafka
[2016-11-16T14:35:44,741][ERROR][logstash.inputs.kafka] Unknown setting 'reset_beginning' for kafka
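For context, the rejected options are settings from the old 2.x plugin configuration; a minimal sketch of the kind of legacy input block that produces these errors (connection and topic values are placeholders):

input {
  kafka {
    zk_connect => "localhost:2181"
    topic_id => "logstash"
    reset_beginning => false
  }
}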

Troubleshooting steps:

1. Based on the error, find where in the code it is raised

grep "Unknown setting" /usr/share/logstash/ -R
/usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:          self.logger.error("Unknown setting '#{name}' for #{@plugin_name}")

2. Reading the relevant code shows that the plugins' config definition files need to be examined

    def validate_check_invalid_parameter_names(params)
      invalid_params = params.keys
      # Filter out parameters that match regexp keys.
      # These are defined in plugins like this:
      #  config /foo.*/ => ...
      @config.each_key do |config_key|
        if config_key.is_a?(Regexp)
          invalid_params.reject! {|k| k =~ config_key}
        elsif config_key.is_a?(String)
          invalid_params.reject! {|k| k == config_key}
        end
      end
      if invalid_params.size > 0
        invalid_params.each do |name|
          self.logger.error("Unknown setting '#{name}' for #{@plugin_name}")
        end
        return false
      end # if invalid_params.size > 0
      return true
    end # def validate_check_invalid_parameter_names

3. Go into the plugin's top-level directory and look at the details

cd  /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.0.5

The following files turned out to be the key ones to examine

grep config ./* -R |awk '{print $1}' |uniq
./CHANGELOG.md:
./DEVELOPER.md:See
./lib/logstash/inputs/kafka.rb:#
./lib/logstash/inputs/kafka.rb:
./README.md:-
Binary

1) CHANGELOG.md comes first, and it immediately shows that starting with logstash-input-kafka 3.0.0.beta1 the plugin is no longer backward compatible and jruby-kafka was dropped (note a pitfall here, covered in 2) below). Version 4.0.0 says it adds support for Kafka 0.9, and 5.0.0 says it supports 0.10 and is again not backward compatible; that is quite a run of breaking changes. So the problem is found: my Kafka version is kafka_2.10-0.8.2.2.tgz, and the incompatible Kafka version is the cause.

Excerpt from CHANGELOG.md:

## 5.0.4
  - Update to Kafka version 0.10.0.1 for bug fixes
## 5.0.0
  - Support for Kafka 0.10 which is not backward compatible with 0.9 broker.
## 4.0.0
  - Republish all the gems under jruby.
  - Update the plugin to the version 2.0 of the plugin api, this change is required for Logstash 5.0 compatibility. See https://github.com/elastic/logstash/issues/5141
  - Support for Kafka 0.9 for LS 5.x
## 3.0.0.beta1
 - Refactor to use new Java based consumer, bypassing jruby-kafka
 - Breaking: Change configuration to match Kafka's configuration. This version is not backward compatible

2) Earlier, when I read DEVELOPER.md, the configuration syntax all looked correct, so I assumed the problem was a missing jruby-kafka library dependency, which Logstash 2.x still used (comparing with Logstash 5.x also shows that version 5 ships with noticeably fewer plugins). In addition, the Kafka version given there is 0.8.1.1, so DEVELOPER.md apparently has not been kept up to date (it is inconsistent with the kafka.rb file discussed later). If anyone maintaining it reads this, please update it; it is a minor issue, but it can easily mislead ordinary readers like me. Of course, it may also just be that I did not read the documentation thoroughly enough.

The end of DEVELOPER.md reads:

Dependencies
====================
* Apache Kafka version 0.8.1.1
* jruby-kafka library

3) Next I read README.md, paying particular attention to Kafka compatibility. It turns out logstash-input-kafka 5.0.5 and logstash-output-kafka 5.0.4 only work with Kafka 0.10. If you want to keep using Kafka 0.9 with Logstash 5.0, you have to downgrade logstash-input-kafka and logstash-output-kafka to 4.0.0, which the documentation itself describes as an intermediate release, so it is better to follow the mainstream and upgrade Kafka (a downgrade sketch follows the compatibility table below).

## Kafka Compatibility
Here's a table that describes the compatibility matrix for Kafka Broker support. Please remember that it is good advice to upgrade brokers before consumers/producers
since brokers target backwards compatibility. The 0.9 broker will work with both the 0.8 consumer and 0.9 consumer APIs but not the other way around.

| Kafka Broker Version | Logstash Version | Input Plugin | Output Plugin | Why? |
|:--------------------:|:----------------:|:------------:|:-------------:|:-----|
| 0.8  | 2.0 - 2.x   | < 3.0.0 | < 3.0.0 | Legacy, 0.8 is still popular |
| 0.9  | 2.0 - 2.3.x | 3.0.0   | 3.0.0   | Intermediate release before 0.10 that works with old Ruby Event API `[]` |
| 0.9  | 2.4, 5.0    | 4.0.0   | 4.0.0   | Intermediate release before 0.10 with new get/set API |
| 0.10 | 2.4, 5.0    | 5.0.0   | 5.0.0   | Track latest Kafka release. Not compatible with 0.9 broker |
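For the record, if staying on a 0.9 broker were unavoidable, the 4.0.0 plugins could be pinned instead of upgrading Kafka; a sketch, assuming the default Logstash 5.x install path:

/usr/share/logstash/bin/logstash-plugin install --version 4.0.0 logstash-input-kafka
/usr/share/logstash/bin/logstash-plugin install --version 4.0.0 logstash-output-kafka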

4) So now the only option is to upgrade Kafka. Finally, I checked the jar-dependencies and found kafka-clients-0.10.0.1.jar

ls /usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.0.5/vendor/jar-dependencies/runtime-jars/
kafka-clients-0.10.0.1.jar  log4j-1.2.17.jar  lz4-1.3.0.jar  slf4j-api-1.7.21.jar  slf4j-log4j12-1.7.21.jar  snappy-java-1.1.2.6.jar

5) One file was still unread. Out of curiosity I took a look, and realized all the earlier effort had been wasted: this is actually the most valuable primary reference, a real shortcut, hidden deep enough that I almost missed it.

/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.0.5/lib/logstash/inputs/kafka.rb

Excerpt from kafka.rb:

# This input will read events from a Kafka topic. It uses the newly designed
# 0.10 version of consumer API provided by Kafka to read messages from the broker.
#
# Here's a compatibility matrix that shows the Kafka client versions that are compatible with each combination
# of Logstash and the Kafka input plugin:
#
# [options="header"]
# |==========================================================
# |Kafka Client Version |Logstash Version |Plugin Version |Security Features |Why?
# |0.8      |2.0.0 - 2.x.x  |<3.0.0 | |Legacy, 0.8 is still popular
# |0.9      |2.0.0 - 2.3.x  | 3.x.x |Basic Auth, SSL |Works with the old Ruby Event API (`event['product']['price'] = 10`)
# |0.9      |2.4.0 - 5.0.x  | 4.x.x |Basic Auth, SSL |Works with the new getter/setter APIs (`event.set('[product][price]', 10)`)
# |0.10     |2.4.0 - 5.0.x  | 5.x.x |Basic Auth, SSL |Not compatible with the 0.9 broker
# |==========================================================
#
# NOTE: We recommend that you use matching Kafka client and broker versions. During upgrades, you should
# upgrade brokers before clients because brokers target backwards compatibility. For example, the 0.9 broker
# is compatible with both the 0.8 consumer and 0.9 consumer APIs, but not the other way around.

6) Upgrade kafka_2.10-0.8.2.2.tgz to kafka_2.11-0.10.0.1.tgz (since the bundled client jar is kafka-clients-0.10.0.1.jar, I did not use the newest kafka_2.11-0.10.1.0.tgz)

Rough steps:

Stop the old Kafka

/usr/local/kafka/bin/kafka-server-stop.sh /usr/local/kafka/config/server.properties

Back up the old configuration files

server.properties and zookeeper.properties

Remove the old Kafka

rm -rf /usr/local/kafka/

rm -rf /data/kafkalogs/*

Install and configure the new Kafka

wget http://mirrors.hust.edu.cn/apache/kafka/0.10.0.1/kafka_2.11-0.10.0.1.tgz

tar zxvf kafka_2.11-0.10.0.1.tgz -C /usr/local/

ln -s /usr/local/kafka_2.11-0.10.0.1 /usr/local/kafka

A diff of server.properties and zookeeper.properties shows only minor changes, so the old files can be used directly.

Start the new Kafka

/usr/local/kafka/bin/kafka-server-start.sh /usr/local/kafka/config/server.properties &
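Before pointing Logstash at the new broker, it can be sanity-checked with the tooling that ships with Kafka; a quick sketch, assuming ZooKeeper is running locally on its default port:

/usr/local/kafka/bin/kafka-topics.sh --list --zookeeper localhost:2181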

7) Note that several key settings need to be changed (see the pipeline sketch after the option list below)

config :bootstrap_servers, :validate => :string, :default => "localhost:9092"

config :group_id, :validate => :string, :default => "logstash"

config :topics, :validate => :array, :default => ["logstash"]

config :consumer_threads, :validate => :number, :default => 1
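Mapped to an actual pipeline file, a minimal 5.x-style kafka input would look like the following sketch (the option names and defaults come from the declarations above; adjust hosts and topics to your environment):

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    group_id => "logstash"
    topics => ["logstash"]
    consumer_threads => 1
  }
}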

Besides the key settings above, the Kafka topic partition information needs to be created again; otherwise KafkaMonitor cannot draw the Active Topic Consumer graph, even though consumption is actually working.
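The topics can be re-created with the kafka-topics.sh tool that ships with Kafka 0.10; a sketch with placeholder partition and replication counts, assuming a local ZooKeeper:

/usr/local/kafka/bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 3 --topic logstash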

