https://www.confluent.io/blog/kafka-connect-deep-dive-jdbc-source-connector

 

Kafka Connect Deep Dive – JDBC Source Connector | Confluent

The JDBC source connector for Kafka Connect enables you to pull data (source) from a database into Apache Kafka®, and to push data (sink) from a Kafka topic to a database.

 

This article explains Kafka Connect.

 

The interesting part is the following.

It explains how to read incremental data directly from the database. This actually works well in practice, so it is worth referring to.

    • MySQL

      CREATE TABLE foo (
        …
        UPDATE_TS TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
      );
    • Postgres

      CREATE TABLE foo (
        …
        UPDATE_TS TIMESTAMP DEFAULT CURRENT_TIMESTAMP
      );

      -- Courtesy of https://techblog.covermymeds.com/databases/on-update-timestamps-mysql-vs-postgres/
      CREATE FUNCTION update_updated_at_column()
        RETURNS trigger
        LANGUAGE plpgsql
      AS $$
      BEGIN
        NEW.update_ts = NOW();
        RETURN NEW;
      END;
      $$;

      CREATE TRIGGER t1_updated_at_modtime
        BEFORE UPDATE ON foo
        FOR EACH ROW
        EXECUTE PROCEDURE update_updated_at_column();
    • Oracle

      CREATE TABLE foo (
        …
        CREATE_TS TIMESTAMP DEFAULT CURRENT_TIMESTAMP
      );

      CREATE OR REPLACE TRIGGER TRG_foo_UPD
        BEFORE INSERT OR UPDATE ON foo
        REFERENCING NEW AS NEW_ROW
        FOR EACH ROW
      BEGIN
        SELECT SYSDATE INTO :NEW_ROW.UPDATE_TS FROM DUAL;
      END;
      /
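Once a table maintains an UPDATE_TS column as above, the JDBC source connector can poll for changed rows using its timestamp mode, tracking the highest timestamp it has seen and fetching only newer rows on each poll. A minimal sketch of such a connector configuration follows; the connector name, topic prefix, connection URL, and credentials are placeholders, not values from the original post:

```json
{
  "name": "jdbc-source-foo",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://localhost:3306/demo",
    "connection.user": "connect_user",
    "connection.password": "…",
    "table.whitelist": "foo",
    "mode": "timestamp",
    "timestamp.column.name": "UPDATE_TS",
    "topic.prefix": "mysql-"
  }
}
```

Note that timestamp mode can miss rows updated within the same timestamp granularity between polls; the linked article discusses combining it with an incrementing ID column ("mode": "timestamp+incrementing") for more robust change capture.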

Posted by '김용환'