[AWS Firehose] Clarify where to find ES endpoint (#4784) (#4797)
When setting up an AWS Firehose data stream, users should use the Elasticsearch endpoint URL that contains the .es subdomain.

While the non-.es URLs currently work and will continue to function, the URL containing the .es subdomain is designed as the dedicated endpoint for connecting to Elasticsearch. These URLs are the best option for future-proofing your setup, and we should recommend using them.

---------

Co-authored-by: Arianna Laudazzi <[email protected]>
(cherry picked from commit 417b639)

Co-authored-by: Maurizio Branca <[email protected]>
mergify[bot] and zmoog authored Jan 31, 2025
1 parent 5581a31 commit 63d829b
Showing 4 changed files with 55 additions and 13 deletions.
@@ -85,17 +85,26 @@ For more information on how to set up an Amazon Data Firehose delivery stream to

. Collect {es} endpoint and API key from your deployment on Elastic Cloud.
+
- Elasticsearch endpoint URL: Enter the Elasticsearch endpoint URL of your Elasticsearch cluster. To find the Elasticsearch endpoint, go to the Elastic Cloud console and select *Connection details*.
- API key: Enter the encoded Elastic API key. To create an API key, go to the Elastic Cloud console, select *Connection details* and click *Create and manage API keys*. If you are using an API key with *Restrict privileges*, make sure to review the Indices privileges to provide at least "auto_configure" & "write" permissions for the indices you will be using with this delivery stream.
- *To find the Elasticsearch endpoint URL*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console
.. Find your deployment in the *Hosted deployments* card and select *Manage*.
.. Under *Applications* click *Copy endpoint* next to *Elasticsearch*.

- *To create the API key*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console
.. Select *Open Kibana*.
.. Expand the left-hand menu, then under *Management* select *Stack management > API Keys* and click *Create API key*. If you are using an API key with *Restrict privileges*, make sure to review the Indices privileges to provide at least `auto_configure` and `write` permissions for the indices you will be using with this delivery stream.
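The restricted privileges mentioned above can also be granted when creating the key programmatically through the Elasticsearch security API. The sketch below builds the request body for `POST /_security/api_key`; the key name and the `logs-aws.*` index pattern are hypothetical placeholders, so adjust them to the data streams you actually write to.

```python
import json

# Role descriptor granting only what this delivery stream needs:
# "auto_configure" lets Firehose create the data stream on first write,
# "write" lets it index documents. The index pattern is a placeholder.
api_key_request = {
    "name": "firehose-delivery-stream",
    "role_descriptors": {
        "firehose_writer": {
            "indices": [
                {
                    "names": ["logs-aws.*"],
                    "privileges": ["auto_configure", "write"],
                }
            ]
        }
    },
}

# This body would be POSTed to <your-es-endpoint>/_security/api_key;
# Firehose expects the *encoded* key returned in the response.
print(json.dumps(api_key_request, indent=2))
```

Firehose needs the `encoded` field from the API response, not the raw `id`/`api_key` pair.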

. Set up the delivery stream by specifying the following data:
+
- Elastic endpoint URL
- API key
- Elastic endpoint URL: The URL that you copied in the previous step.
- API key: The API key that you created in the previous step.
- Content encoding: gzip
- Retry duration: 60 (default)
- Backup settings: failed data only to S3 bucket

IMPORTANT: Verify that your *Elasticsearch endpoint URL* includes `.es.` between the *deployment name* and *region*. Example: `https://my-deployment.es.us-east-1.aws.elastic-cloud.com`
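A quick sanity check for the rule in this note could look like the following sketch, which tests that `.es.` immediately follows the deployment name in the URL (the deployment names shown are hypothetical):

```python
import re

def is_dedicated_es_endpoint(url: str) -> bool:
    """Return True when the URL uses the dedicated Elasticsearch endpoint,
    i.e. `.es.` appears directly after the deployment name."""
    return re.search(r"^https://[^./]+\.es\.", url) is not None

# Dedicated endpoint with the .es. subdomain: accepted.
print(is_dedicated_es_endpoint("https://my-deployment.es.us-east-1.aws.elastic-cloud.com"))  # True
# URL without the .es. subdomain: rejected.
print(is_dedicated_es_endpoint("https://my-deployment.us-east-1.aws.elastic-cloud.com"))  # False
```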

You now have an Amazon Data Firehose delivery stream specified with:

- source: direct put
@@ -104,7 +113,7 @@ You now have an Amazon Data Firehose delivery specified with:

[discrete]
[[firehose-cloudtrail-step-four]]
== Step 4: Set up a subscription filter to route Cloudtrail events to a delivery stream
== Step 4: Set up a subscription filter to route CloudTrail events to a delivery stream

image::firehose-subscription-filter.png[Firehose subscription filter]
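The subscription filter shown above routes CloudWatch log events to the delivery stream. A sketch of the parameters it needs, in the shape accepted by boto3's `logs.put_subscription_filter(**params)`; every name and ARN below is a hypothetical placeholder:

```python
# Parameters for a CloudWatch Logs subscription filter that forwards
# CloudTrail events to a Firehose delivery stream. All identifiers are
# placeholders; substitute your own log group, stream ARN, and role ARN.
params = {
    "logGroupName": "aws-cloudtrail-logs-example",   # log group receiving CloudTrail events
    "filterName": "firehose-delivery",
    "filterPattern": "",                             # empty pattern forwards every event
    "destinationArn": "arn:aws:firehose:us-east-1:123456789012:deliverystream/my-stream",
    "roleArn": "arn:aws:iam::123456789012:role/cwl-to-firehose",  # role CloudWatch Logs assumes
}
```

An empty `filterPattern` forwards all events; narrow it only if you want a subset of the log group delivered to Elastic.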

@@ -100,8 +100,25 @@ image::firehose-cloudwatch-firehose-stream.png[Amazon Firehose Stream]
+
NOTE: For advanced use cases, source records can be transformed by invoking a custom Lambda function. When using Elastic integrations, this should not be required.

. In the **Destination settings** section, set the following parameter:
`es_datastream_name` = `logs-aws.generic-default`
. From the *Destination settings* panel, specify the following settings:
+
* *To find the Elasticsearch endpoint URL*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console
.. Find your deployment in the *Hosted deployments* card and select *Manage*.
.. Under *Applications* click *Copy endpoint* next to *Elasticsearch*.
+
* *To create the API key*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console
.. Select *Open Kibana*.
.. Expand the left-hand menu, then under *Management* select *Stack management > API Keys* and click *Create API key*. If you are using an API key with *Restrict privileges*, make sure to review the Indices privileges to provide at least `auto_configure` and `write` permissions for the indices you will be using with this delivery stream.
+
* *Content encoding*: For better network efficiency, leave content encoding set to GZIP.
+
* *Retry duration*: Determines how long Firehose continues retrying the request in the event of an error. A duration of 60-300 seconds is suitable for most use cases.
+
* *es_datastream_name*: `logs-aws.generic-default`

IMPORTANT: Verify that your *Elasticsearch endpoint URL* includes `.es.` between the *deployment name* and *region*. Example: `https://my-deployment.es.us-east-1.aws.elastic-cloud.com`
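The console fields above correspond to Firehose's HTTP endpoint destination. A sketch of an equivalent configuration block, as it could be passed to boto3's `firehose.create_delivery_stream` under `HttpEndpointDestinationConfiguration`; the URL and key are placeholders, and the exact parameter shape should be checked against the AWS documentation:

```python
# Sketch of an HTTP endpoint destination for Firehose -> Elastic Cloud.
# URL and access key are placeholders; an S3Configuration for failed-data
# backup would also be required in a real call.
http_destination = {
    "EndpointConfiguration": {
        "Url": "https://my-deployment.es.us-east-1.aws.elastic-cloud.com",  # placeholder
        "AccessKey": "<encoded-elastic-api-key>",  # placeholder
    },
    "RequestConfiguration": {
        "ContentEncoding": "GZIP",  # matches the Content encoding setting above
        "CommonAttributes": [
            {"AttributeName": "es_datastream_name",
             "AttributeValue": "logs-aws.generic-default"},
        ],
    },
    "RetryOptions": {"DurationInSeconds": 60},  # matches the Retry duration setting
    "S3BackupMode": "FailedDataOnly",
}
```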

The Firehose stream is now ready to send logs to your Elastic Cloud deployment.

@@ -57,9 +57,15 @@ image::firehose-networkfirewall-stream.png[Firehose stream]

. Collect {es} endpoint and API key from your deployment on Elastic Cloud.
+
- Elastic endpoint URL: Enter the Elasticsearch endpoint URL of your Elasticsearch cluster. To find the Elasticsearch endpoint, go to the Elastic Cloud console and select *Connection details*.
+
- API key: Enter the encoded Elastic API key. To create an API key, go to the Elastic Cloud console, select *Connection details* and click *Create and manage API keys*. If you are using an API key with *Restrict privileges*, make sure to review the Indices privileges to provide at least "auto_configure" and "write" permissions for the indices you will be using with this delivery stream.
- *To find the Elasticsearch endpoint URL*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console
.. Find your deployment in the *Hosted deployments* card and select *Manage*.
.. Under *Applications* click *Copy endpoint* next to *Elasticsearch*.

- *To create the API key*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console
.. Select *Open Kibana*.
.. Expand the left-hand menu, then under *Management* select *Stack management > API Keys* and click *Create API key*. If you are using an API key with *Restrict privileges*, make sure to review the Indices privileges to provide at least `auto_configure` and `write` permissions for the indices you will be using with this delivery stream.

. Set up the delivery stream by specifying the following data:
+
@@ -68,7 +74,9 @@ image::firehose-networkfirewall-stream.png[Firehose stream]
- Content encoding: gzip
- Retry duration: 60 (default)
- Parameter *es_datastream_name* = `logs-aws.firewall_logs-default`
- Backup settings: failed data only to s3 bucket
- Backup settings: failed data only to S3 bucket

IMPORTANT: Verify that your *Elasticsearch endpoint URL* includes `.es.` between the *deployment name* and *region*. Example: `https://my-deployment.es.us-east-1.aws.elastic-cloud.com`
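Before wiring up real firewall logs, the stream can be smoke-tested by pushing one record directly. A sketch of the record payload in the shape boto3's `firehose.put_record(DeliveryStreamName=..., Record=record)` expects; the field values are hypothetical:

```python
import json

# A minimal, hypothetical firewall log event. Firehose accepts the raw
# bytes in the "Data" field; boto3 handles the base64 transport encoding.
event = {
    "firewall_name": "example-fw",
    "event": {"timestamp": "2025-01-31T00:00:00Z", "event_type": "alert"},
}
record = {"Data": json.dumps(event).encode("utf-8")}
```

If the record lands in the `logs-aws.firewall_logs-default` data stream, the endpoint URL, API key, and `es_datastream_name` parameter are all wired correctly.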

The Firehose stream is ready to send logs to your Elastic Cloud deployment.

@@ -54,16 +54,24 @@ NOTE: For advanced use cases, source records can be transformed by invoking a cu

. From the *Destination settings* panel, specify the following settings:
+
* *Elastic endpoint URL*: Enter the Elastic endpoint URL of your Elasticsearch cluster. To find the Elasticsearch endpoint, go to the Elastic Cloud console, navigate to the Integrations page, and select *Connection details*. Here is an example of how it looks like: `https://my-deployment.es.us-east-1.aws.elastic-cloud.com`.
* *To find the Elasticsearch endpoint URL*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console
.. Find your deployment in the *Hosted deployments* card and select *Manage*.
.. Under *Applications* click *Copy endpoint* next to *Elasticsearch*.
+
* *API key*: Enter the encoded Elastic API key. To create an API key, go to the Elastic Cloud console, navigate to the Integrations page, select *Connection details* and click *Create and manage API keys*. If you are using an API key with *Restrict privileges*, make sure to review the Indices privileges to provide at least "auto_configure" & "write" permissions for the indices you will be using with this delivery stream.
* *To create the API key*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console
.. Select *Open Kibana*.
.. Expand the left-hand menu, then under *Management* select *Stack management > API Keys* and click *Create API key*. If you are using an API key with *Restrict privileges*, make sure to review the Indices privileges to provide at least `auto_configure` and `write` permissions for the indices you will be using with this delivery stream.
+
* *Content encoding*: For better network efficiency, leave content encoding set to GZIP.
+
* *Retry duration*: Determines how long Firehose continues retrying the request in the event of an error. A duration of 60-300 seconds is suitable for most use cases.
+
* *es_datastream_name*: `logs-aws.waf-default`

IMPORTANT: Verify that your *Elasticsearch endpoint URL* includes `.es.` between the *deployment name* and *region*. Example: `https://my-deployment.es.us-east-1.aws.elastic-cloud.com`
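The `es_datastream_name` values used throughout this guide (`logs-aws.generic-default`, `logs-aws.firewall_logs-default`, `logs-aws.waf-default`) follow Elastic's `{type}-{dataset}-{namespace}` data stream naming scheme. A small helper illustrates the convention:

```python
def es_datastream_name(dataset: str, namespace: str = "default",
                       dstype: str = "logs") -> str:
    """Compose a data stream name following the {type}-{dataset}-{namespace}
    convention used by Elastic integrations."""
    return f"{dstype}-{dataset}-{namespace}"

print(es_datastream_name("aws.waf"))  # logs-aws.waf-default
```

Changing the namespace (for example per environment) routes the same dataset into a separate data stream without touching the integration.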

[discrete]
[[firehose-waf-step-four]]
== Step 4: Create a web access control list
