Kafka Connect Phoenix

Dhananjay edited this page Jul 20, 2018 · 2 revisions

Phoenix Sink Connector for Kafka-connect

  • The supported message format is JSON with schema; make sure the Kafka producer is configured with the appropriate properties.
  • The connector derives table columns from the schema of each received message.
  • The Phoenix table must be pre-created, with a primary key column named "ROWKEY".
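
As an illustration of the last point, a pre-created table might look like the following Phoenix DDL. This is a sketch: the table name, column family, and column names (`TOPIC1`, `cf`, `col1`, `col2`) are hypothetical; only the `ROWKEY` primary key column name is required by the connector.

```sql
-- Hypothetical example: table, family, and column names are illustrative,
-- but the primary key column must be named ROWKEY.
CREATE TABLE IF NOT EXISTS TOPIC1 (
    ROWKEY VARCHAR NOT NULL PRIMARY KEY,
    cf.col1 VARCHAR,
    cf.col2 VARCHAR
);
```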

Configurations

Below are the properties that need to be passed in the configuration file:

| name | data type | required | description |
| --- | --- | --- | --- |
| pqs.url | string | yes | Phoenix Query Server URL, e.g. `http://<host>:8765` |
| event.parser.class | string | yes | Record parser class, e.g. `io.kafka.connect.phoenix.parser.PhoenixRecordParser` |
| topics | string | yes | Comma-separated list of Kafka topics |
| hbase.&lt;topicname&gt;.rowkey.columns | string | yes | The columns that form the rowkey of the HBase table for `<topicname>` |
| hbase.&lt;topicname&gt;.family | string | yes | Column family of the HBase table for `<topicname>` |
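
Because the connector expects JSON with schema, each record on the topic carries the standard Kafka Connect JSON envelope: a `schema` describing the fields and a `payload` holding the values. A sketch with hypothetical fields (`col1`, `col2`) might look like:

```json
{
  "schema": {
    "type": "struct",
    "fields": [
      { "field": "col1", "type": "string" },
      { "field": "col2", "type": "string" }
    ],
    "optional": false
  },
  "payload": {
    "col1": "value1",
    "col2": "value2"
  }
}
```

The field names in `schema.fields` are what the connector uses to derive the target table columns.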

Sample Connector

```json
{
  "name": "APP-PHOENIX-SINK-TABLE",
  "connector.class": "io.kafka.connect.phoenix.sink.PhoenixSinkConnector",
  "event.parser.class": "io.kafka.connect.phoenix.parser.PhoenixRecordParser",
  "tasks.max": "2",
  "topics": "TOPIC1",
  "zookeeper.quorum": "zkip1,zkip2,zkip3",
  "zookeeper.znode.parent": "/hbase",
  "hbase.TOPIC1.table.name": "TABLE_NAME",
  "hbase.TOPIC1.rowkey.columns": "col1,col2",
  "hbase.TOPIC1.family": "cf"
}
```
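
For the connector to receive JSON with schema, the Connect worker must serialize record values with the JSON converter and schemas enabled. A sketch of the relevant worker properties (these are standard Kafka Connect settings, not specific to this connector):

```properties
# Standard Kafka Connect worker settings for schema-carrying JSON records
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true
```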
