---
description: This page contains the technical details of the Kafka endpoint plugin
---

# Kafka

{% hint style="warning" %} This feature requires Gravitee's Enterprise Edition. {% endhint %}

## Overview

Use this endpoint to publish and/or subscribe to events in Kafka via web-friendly protocols such as HTTP or WebSocket. The reactive Gateway mediates the protocol between the client and the backend. Refer to the following sections for additional details.

## Quality Of Service

| QoS | Delivery | Description |
| --- | --- | --- |
| None | Unwarranted | Improves throughput by removing auto commit. |
| Balanced | 0, 1 or n | Uses the well-known consumer group and offset mechanism to balance performance and quality. |
| At-Best | 0, 1 or n | Almost the same as Balanced, but does its best to deliver each message only once; depending on the entrypoint, it may rely on extra features to determine which message was sent last. |
| At-Most-Once | 0 or 1 | Depending on the entrypoint, this level can degrade performance by forcing the consumer to commit each message to ensure messages are sent 0 or 1 time. |
| At-Least-Once | 1 or n | Depending on the entrypoint, this level can degrade performance by forcing the consumer to acknowledge each message to ensure messages are sent 1 or more times. |

## Compatibility matrix

| Plugin version | APIM version |
| --- | --- |
| 1.x to 2.1.4 | 3.20.x to 4.0.4 |
| 2.2.0 and up | 4.0.5 to latest |

{% hint style="warning" %}
**Deprecation**

* The Gravitee context attribute `gravitee.attribute.kafka.topics` is deprecated and will be removed in future versions. Use `gravitee.attribute.kafka.producer.topics` or `gravitee.attribute.kafka.consumer.topics` instead.
* Use `gravitee.attribute.kafka.producer.topics` as the message attribute to publish messages to a specific topic.
{% endhint %}

## Endpoint identifier

To use this plugin, declare the `kafka` identifier when configuring your API endpoints.
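
For reference, a minimal endpoint declaration using this identifier could look like the following sketch (the broker host and port are placeholders; the producer and/or consumer capabilities described below still need to be enabled in the shared configuration):

```json
{
  "name": "default",
  "type": "kafka",
  "weight": 1,
  "inheritConfiguration": false,
  "configuration": {
    "bootstrapServers": "kafka:9092"
  }
}
```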

## Endpoint configuration

### General configuration

| Attributes | Default | Mandatory | Description |
| --- | --- | --- | --- |
| bootstrapServers | N/A | Yes | Define the comma-separated list of host/port pairs used to establish the initial connection to the Kafka cluster. |

### Shared Configuration

#### Security configuration

| Attributes | Default | Mandatory | Description |
| --- | --- | --- | --- |
| protocol | PLAINTEXT | No | Define your Kafka-specific authentication flow (PLAINTEXT, SASL_PLAINTEXT, SASL_SSL, or SSL). |
| sasl.saslMechanism | N/A | No | Define the SASL mechanism (GSSAPI, OAUTHBEARER, PLAIN, SCRAM-SHA-256, or SCRAM-SHA-512). |
| sasl.saslJaasConfig | N/A | No | Define the JAAS login context parameters for SASL connections in JAAS configuration file format. |
| ssl.trustStore.type | JKS | No | Define the TrustStore type (NONE, PEM, PKCS12, JKS). |
| ssl.trustStore.location | N/A | No | Define the TrustStore location. |
| ssl.trustStore.password | N/A | No | Define the TrustStore password. |
| ssl.trustStore.certificates | N/A | No | Define the TrustStore certificates. |
| ssl.keystore.type | JKS | No | Define the KeyStore type (NONE, PEM, PKCS12, JKS). |
| ssl.keystore.location | N/A | No | Define the KeyStore location. |
| ssl.keystore.password | N/A | No | Define the KeyStore password. |
| ssl.keystore.key | N/A | No | Define the KeyStore key. |
| ssl.keystore.keyPassword | N/A | No | Define the KeyStore key password. |
| ssl.keystore.certificateChain | N/A | No | Define the KeyStore certificate chain. |
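
As a sketch of how these attributes nest inside the endpoint's `sharedConfigurationOverride`, the snippet below combines SCRAM authentication with a truststore. The nesting of `ssl.trustStore.*` is inferred from the attribute paths above, and the paths, credentials, and mechanism are placeholders to adapt to your environment:

```json
{
  "security": {
    "protocol": "SASL_SSL",
    "sasl": {
      "saslMechanism": "SCRAM-SHA-512",
      "saslJaasConfig": "org.apache.kafka.common.security.scram.ScramLoginModule required username=\"user\" password=\"secret\";"
    },
    "ssl": {
      "trustStore": {
        "type": "PKCS12",
        "location": "/opt/certs/truststore.p12",
        "password": "secret"
      }
    }
  }
}
```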

#### Producer configuration

| Attributes | Default | Mandatory | Description |
| --- | --- | --- | --- |
| enabled | false | No | Allow enabling or disabling the producer capability. |
| topics | N/A | Yes | List of topics. |
| compressionType | none | No | Define the compression type (none, gzip, snappy, lz4, zstd). |

The following is an example of how to produce messages:

```json
{
  "name": "default",
  "type": "kafka",
  "weight": 1,
  "inheritConfiguration": false,
  "configuration": {
    "bootstrapServers": "kafka:9092"
  },
  "sharedConfigurationOverride": {
    "producer": {
      "enabled": true,
      "topics": [ "demo" ]
    },
    "security": {
      "protocol": "PLAINTEXT"
    }
  }
}
```

#### Consumer configuration

| Attributes | Default | Mandatory | Description |
| --- | --- | --- | --- |
| enabled | false | No | Allow enabling or disabling the consumer capability. |
| topics | N/A | No | The topic(s) from which your Gravitee Gateway client will consume messages. |
| topics.pattern | N/A | No | A regex pattern to select the topic(s) from which your Gravitee Gateway client will consume messages. |
| encodeMessageId | true | No | Allow encoding message IDs in base64. |
| autoOffsetReset | latest | No | Define the behavior when there is no initial offset (earliest, latest, none). |

The following is an example of how to consume messages:

```json
{
  "name": "default",
  "type": "kafka",
  "weight": 1,
  "inheritConfiguration": false,
  "configuration": {
    "bootstrapServers": "kafka:9092"
  },
  "sharedConfigurationOverride": {
    "consumer": {
      "enabled": true,
      "topics": [ "demo" ],
      "autoOffsetReset": "earliest"
    }
  }
}
```

## Using SASL OAUTHBEARER

To facilitate support for SASL OAUTHBEARER, this plugin includes a login callback handler for token retrieval. This handler is configured using the following JAAS configuration:

"org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required access_token=\"<ACCESS_TOKEN>\";"

The access token can be provided using EL to retrieve it from a Gravitee context attribute:

```json
{
  "name": "default",
  "type": "kafka",
  "weight": 1,
  "inheritConfiguration": false,
  "configuration": {
    "bootstrapServers": "kafka:9092"
  },
  "sharedConfigurationOverride": {
    "security": {
      "protocol": "SASL_PLAINTEXT",
      "sasl": {
        "saslMechanism": "OAUTHBEARER",
        "saslJaasConfig": "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required access_token=\"{#context.attributes['gravitee.attribute.kafka.oauthbearer.token']}\";"
      }
    },
    "producer": {
      "enabled": true,
      "topics": [ "demo" ],
      "compressionType": "none"
    },
    "consumer": {
      "enabled": true,
      "encodeMessageId": true,
      "topics": [ "demo" ],
      "autoOffsetReset": "latest"
    }
  }
}
```
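
One way to populate that context attribute is with an Assign Attributes policy on the request phase. The sketch below is illustrative only: the `X-Kafka-Token` header name is an assumption, and the policy identifier and schema may differ across APIM versions.

```json
{
  "name": "Assign Kafka OAuth token",
  "policy": "policy-assign-attributes",
  "configuration": {
    "scope": "REQUEST",
    "attributes": [
      {
        "name": "gravitee.attribute.kafka.oauthbearer.token",
        "value": "{#request.headers['X-Kafka-Token'][0]}"
      }
    ]
  }
}
```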

## Using SASL AWS_MSK_IAM

The Kafka plugin includes the Amazon MSK Library for AWS Identity and Access Management, which enables you to use AWS IAM to connect to your Amazon MSK cluster.

This mechanism is only available with the SASL_SSL protocol. Once selected, you must provide a valid JAAS configuration. Different options are available depending on how AWS credentials are provided:

* To use the default credential profile, the client can use the following JAAS configuration:

```
software.amazon.msk.auth.iam.IAMLoginModule required;
```

* To specify a particular credential profile as part of the client configuration (rather than through the environment variable `AWS_PROFILE`), the client can pass the name of the profile in the JAAS configuration:

```
software.amazon.msk.auth.iam.IAMLoginModule required awsProfileName="<Credential Profile Name>";
```

* To configure a client to assume an IAM role and use the role's temporary credentials, the IAM role's ARN and, optionally, accessKey and secretKey can be passed in the JAAS configuration:

```
software.amazon.msk.auth.iam.IAMLoginModule required awsRoleArn="arn:aws:iam::123456789012:role/msk_client_role" awsRoleAccessKeyId="ACCESS_KEY" awsRoleSecretAccessKey="SECRET";
```

More details can be found in the library’s README.
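
Putting this together, an endpoint using IAM authentication could look like the sketch below. The broker address is a placeholder, and the `AWS_MSK_IAM` value for `saslMechanism` is an assumption implied by this section rather than taken from the security table above:

```json
{
  "name": "default",
  "type": "kafka",
  "weight": 1,
  "inheritConfiguration": false,
  "configuration": {
    "bootstrapServers": "b-1.mycluster.kafka.eu-west-1.amazonaws.com:9098"
  },
  "sharedConfigurationOverride": {
    "security": {
      "protocol": "SASL_SSL",
      "sasl": {
        "saslMechanism": "AWS_MSK_IAM",
        "saslJaasConfig": "software.amazon.msk.auth.iam.IAMLoginModule required;"
      }
    },
    "consumer": {
      "enabled": true,
      "topics": [ "demo" ]
    }
  }
}
```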

## Dynamic configuration

The Kafka endpoint includes the dynamic configuration feature, meaning that you can:

* Override any configuration parameter using an attribute (via the Assign Attribute policy). Your attribute needs to start with `gravitee.attributes.endpoint.kafka`, followed by the property you want to override (e.g. `gravitee.attributes.endpoint.kafka.security.sasl.saslMechanism`). To override the `topics` property, add an Assign Attribute policy and set the attribute `gravitee.attributes.endpoint.kafka.consumer.topics` using a request header value or a query param, for example (see the policy sketch after the example below).
* Use EL in any "String" type property. The following example shows how to use EL to populate the consumer `autoOffsetReset` property:
```json
{
  "name": "default",
  "type": "kafka",
  "weight": 1,
  "inheritConfiguration": false,
  "configuration": {
    "bootstrapServers": "kafka:9092"
  },
  "sharedConfigurationOverride": {
    "consumer": {
      "enabled": true,
      "topics": [ "default_topic" ],
      "autoOffsetReset": "{#request.headers['autoOffsetReset'][0]}"
    }
  }
}
```
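
And as a sketch of the attribute-based override mentioned above, an Assign Attribute policy on the request phase could set the consumer topics from a request header. The `X-Kafka-Topic` header name, policy identifier, and exact schema are illustrative and may vary by APIM version:

```json
{
  "name": "Override Kafka consumer topics",
  "policy": "policy-assign-attributes",
  "configuration": {
    "scope": "REQUEST",
    "attributes": [
      {
        "name": "gravitee.attributes.endpoint.kafka.consumer.topics",
        "value": "{#request.headers['X-Kafka-Topic'][0]}"
      }
    ]
  }
}
```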