kafka-logger

The kafka-logger Plugin pushes request and response logs as JSON objects to Apache Kafka clusters.

It might take some time to receive the log data. It will be automatically sent after the timer function in the batch processor expires.

Attributes

This Plugin supports using batch processors to aggregate and process entries (logs/data) in a batch. This avoids the need for frequently submitting the data. The batch processor submits data every 5 seconds or when the data in the queue reaches 1000. See Batch Processor for more information or to set your custom configuration.

IMPORTANT

The data is first written to a buffer. When the buffer exceeds the batch_max_size or buffer_duration attribute, the data is sent to the Kafka server and the buffer is flushed.

If the process is successful, it returns true. If it fails, it returns nil along with a string containing the "buffer overflow" error.
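As a sketch of how these limits can be tuned, the snippet below sets the two buffer attributes alongside the Kafka connection details inside a Route's Plugin configuration; the broker address, the topic name (test2), and the chosen values are assumptions for illustration:

```json
{
    "plugins": {
        "kafka-logger": {
            "brokers": [
                {
                    "host": "127.0.0.1",
                    "port": 9092
                }
            ],
            "kafka_topic": "test2",
            "batch_max_size": 1000,
            "buffer_duration": 60
        }
    }
}
```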

  • origin:

    ```
    GET /hello?ab=cd HTTP/1.1
    host: localhost
    connection: close

    abcdef
    ```

You can also set the format of the logs by configuring the Plugin metadata. The following configurations are available:

| Name | Type | Required | Default | Description |
|------|------|----------|---------|-------------|
| log_format | object | False | {"host": "$host", "@timestamp": "$time_iso8601", "client_ip": "$remote_addr"} | Log format declared as key value pairs in JSON format. Values only support strings. APISIX or Nginx variables can be used by prefixing the string with $. |
IMPORTANT

Configuring the Plugin metadata is global in scope. This means that it will take effect on all Routes and Services which use the kafka-logger Plugin.

The example below shows how you can configure the Plugin metadata through the Admin API:
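A sketch of such a call follows, assuming the default Admin API address and the admin key used elsewhere on this page:

```shell
curl http://127.0.0.1:9180/apisix/admin/plugin_metadata/kafka-logger \
-H 'X-API-KEY: edd1c9f034335f136f87ad84b625c8f1' -X PUT -d '
{
    "log_format": {
        "host": "$host",
        "@timestamp": "$time_iso8601",
        "client_ip": "$remote_addr"
    }
}'
```

With this configuration, your logs would be formatted as shown below: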

  1. {"host":"localhost","@timestamp":"2020-09-23T19:05:05-04:00","client_ip":"127.0.0.1","route_id":"1"}
  2. {"host":"localhost","@timestamp":"2020-09-23T19:05:05-04:00","client_ip":"127.0.0.1","route_id":"1"}

Enabling the Plugin

The example below shows how you can enable the kafka-logger Plugin on a specific Route:
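A minimal sketch of such a call is shown below; the broker address, topic name (test2), key, and upstream node are placeholder assumptions, and the Admin API address and key match the ones used in the disable example further down:

```shell
curl http://127.0.0.1:9180/apisix/admin/routes/1 \
-H 'X-API-KEY: edd1c9f034335f136f87ad84b625c8f1' -X PUT -d '
{
    "plugins": {
        "kafka-logger": {
            "brokers": [
                {
                    "host": "127.0.0.1",
                    "port": 9092
                }
            ],
            "kafka_topic": "test2",
            "key": "key1",
            "batch_max_size": 1,
            "name": "kafka logger"
        }
    },
    "upstream": {
        "type": "roundrobin",
        "nodes": {
            "127.0.0.1:1980": 1
        }
    },
    "uri": "/hello"
}'
```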

This Plugin also supports pushing to more than one broker at a time. You can specify multiple brokers in the Plugin configuration as shown below:

  1. "brokers" : [
  2. "host" :"127.0.0.1",
  3. "port" : 9092
  4. },
  5. {
  6. "host" :"127.0.0.1",
  7. "port" : 9093
  8. ],

Now, if you make a request to APISIX, it will be logged in your Kafka server:
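For example, a request like the one below, assuming APISIX is listening on the default port 9080 and the Route above is in place, would produce an entry on the configured Kafka topic:

```shell
curl -i http://127.0.0.1:9080/hello
```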

Disable Plugin

To disable the kafka-logger Plugin, you can delete the corresponding JSON configuration from the Plugin configuration. APISIX will automatically reload and you do not have to restart for this to take effect.

```shell
curl http://127.0.0.1:9180/apisix/admin/routes/1 -H 'X-API-KEY: edd1c9f034335f136f87ad84b625c8f1' -X PUT -d '
{
    "methods": ["GET"],
    "uri": "/hello",
    "plugins": {},
    "upstream": {
        "type": "roundrobin",
        "nodes": {
            "127.0.0.1:1980": 1
        }
    }
}'
```