
Using Logstash to synchronize MySQL data to Elasticsearch for full-text search with highlighted keywords

1、 Background

We are used to storing our data in MySQL, so how do we use ES for full-text retrieval? The Elastic Stack provides us with Logstash, which can synchronize data from the database into Elasticsearch.

2、 Install Logstash

Note: install the same version as your Elasticsearch.

Logstash download address: https://www.elastic.co/downloads/logstash

On Windows you can simply unzip it and use it. I use version 7.2.0 throughout. Unlike the 2.x versions, it no longer requires the logstash-input-jdbc plugin to be installed separately to synchronize data.

3、 Configure Logstash synchronization

1) Create a folder named mysqletc in the Logstash root directory, to hold the files needed for synchronization and the MySQL driver.

Folder contents :

2) Put the MySQL driver JAR matching the database to be synchronized into this folder.

3) Create the SQL file(s) in the new mysqletc folder.

Each SQL file holds the query for the data you need to synchronize to ES; for a full sync, SELECT * FROM [table] is enough.

If you synchronize multiple tables, create one SQL file per table.

For example, mine:

soul.sql

SELECT * FROM soul

mto_user.sql

SELECT * FROM mto_user

Note: do not end the statement with a semicolon; adding one gave me an error.
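As a side note not covered in the original steps: a full SELECT * re-imports every row on each run. If you later want incremental synchronization, the jdbc input can track a column through :sql_last_value. This is only a sketch, assuming the table has an auto-increment id column; the option names below come from the logstash-input-jdbc plugin:

```
-- soul.sql, incremental variant (still no trailing semicolon)
SELECT * FROM soul WHERE id > :sql_last_value
```

with these extra options in the corresponding jdbc block of the conf file:

```
use_column_value => true
tracking_column => "id"
tracking_column_type => "numeric"
```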

4) Create the conf file that connects Logstash to the database and to Elasticsearch

Name it whatever you like; I usually call it mysql.conf. Its contents are as follows:

input {
         stdin {}
         jdbc {
               # MySQL connection string; myblog is the database name
               jdbc_connection_string => "jdbc:mysql://114.67.169.20:3306/myblog"
               # Username and password
               jdbc_user => "root"
               jdbc_password => "xxxxxx"
               # Path to the MySQL driver JAR
               jdbc_driver_library => "C:/softs/elk/logstash-7.2.0/mysqletc/mysql-connector-java-8.0.13.jar"
               # Driver class name
               jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
               jdbc_paging_enabled => "true"
               jdbc_page_size => "50000"
               # Path + name of the SQL file to execute
               statement_filepath => "C:/softs/elk/logstash-7.2.0/mysqletc/soul.sql"
               # Polling schedule; the fields (left to right) are minute, hour, day, month, weekday; all * means run every minute
               schedule => "* * * * *"
               # Type tag, used to route events in the output section
               type => "soul"
         }
         jdbc {
               # MySQL connection string; myblog is the database name
               jdbc_connection_string => "jdbc:mysql://114.67.169.20:3306/myblog"
               # Username and password
               jdbc_user => "root"
               jdbc_password => "xxxxxx"
               # Path to the MySQL driver JAR
               jdbc_driver_library => "C:/softs/elk/logstash-7.2.0/mysqletc/mysql-connector-java-8.0.13.jar"
               # Driver class name
               jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
               jdbc_paging_enabled => "true"
               jdbc_page_size => "50000"
               # Path + name of the SQL file to execute
               statement_filepath => "C:/softs/elk/logstash-7.2.0/mysqletc/mto_user.sql"
               # Polling schedule; the fields (left to right) are minute, hour, day, month, weekday; all * means run every minute
               schedule => "* * * * *"
               # Type tag, used to route events in the output section
               type => "mto_user"
             }
   }
   filter {
         json {
             source => "message"
             remove_field => ["message"]
         }
    }
     output {
         if [type]=="soul"{
             elasticsearch {
                 hosts => ["localhost:9200"]
                 index => "soul"
                 document_id => "%{id}"
             }
         }
         if [type]=="mto_user"{
             elasticsearch {
                 hosts => ["localhost:9200"]
                 index => "mto_user"
                 document_id => "%{id}"
             }
         }
         stdout {
               codec => json_lines
        }
    }

Note: all text files you create must be encoded as UTF-8 without BOM.
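Getting the encoding wrong is easy to miss by eye, so the check can be automated. A minimal sketch in Python (not part of the original article) that verifies a file is valid UTF-8 and does not start with a BOM:

```python
# Checks that a text file is valid UTF-8 and carries no byte-order mark (BOM),
# which is what Logstash expects for the .sql and .conf files above.
import codecs

def is_utf8_without_bom(path):
    with open(path, "rb") as f:
        data = f.read()
    if data.startswith(codecs.BOM_UTF8):
        return False          # file begins with the UTF-8 BOM
    try:
        data.decode("utf-8")  # raises UnicodeDecodeError if not valid UTF-8
    except UnicodeDecodeError:
        return False
    return True

if __name__ == "__main__":
    import tempfile, os
    ok = tempfile.NamedTemporaryFile(delete=False)
    ok.write("SELECT * FROM soul".encode("utf-8")); ok.close()
    bad = tempfile.NamedTemporaryFile(delete=False)
    bad.write(codecs.BOM_UTF8 + b"SELECT * FROM soul"); bad.close()
    print(is_utf8_without_bom(ok.name), is_utf8_without_bom(bad.name))  # True False
    os.unlink(ok.name); os.unlink(bad.name)
```

Run it against every file in mysqletc before starting Logstash to rule out encoding errors early.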

Once the above steps are done, the Logstash configuration is complete.

4、 Start Logstash and begin synchronizing the database

Step 1: run the elasticsearch.bat file to start Elasticsearch. On a successful start you will see:

Step 2: create a batch file run_default.bat in the bin directory with the following content:

logstash -f ../mysqletc/mysql.conf

Step 3: double-click run_default.bat. Here logstash -f is the run command, and ../mysqletc/mysql.conf is the path to the mysql.conf file we configured. After a successful start, you can see the executed SQL and the synchronized data in the terminal.

5、 Test whether the synchronization succeeded

After synchronization, we can open the head plugin to check whether the index we just created exists in ES:

In a basic query, you can see the synchronized data.

Among them, @timestamp and @version are fields that Logstash adds by itself.
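The title promises highlighted keywords, and the original article stops at verifying the index. For reference, a basic highlight query against the soul index could look like the following; the field name title is a hypothetical example, so adjust it to your own mapping:

```
GET /soul/_search
{
  "query": { "match": { "title": "keyword" } },
  "highlight": {
    "fields": { "title": {} },
    "pre_tags": ["<em>"],
    "post_tags": ["</em>"]
  }
}
```

Matched terms come back wrapped in the pre/post tags inside a highlight section of each hit, ready to render in a web page.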

Related article: How to install and use the elasticsearch-head plugin



Copyright notice
This article was written by [Luo Chen]; please include the original link when reposting. Thanks.
https://cdmana.com/2020/12/20201225104724974v.html
