[8.x](backport #4784) [AWS Firehose] Clarify where to find ES endpoint #4796

Merged 1 commit on Jan 31, 2025
@@ -85,17 +85,26 @@ For more information on how to set up an Amazon Data Firehose delivery stream to

. Collect the {es} endpoint and API key from your deployment on Elastic Cloud.
+
- Elasticsearch endpoint URL: Enter the Elasticsearch endpoint URL of your Elasticsearch cluster. To find the Elasticsearch endpoint, go to the Elastic Cloud console and select *Connection details*.
- API key: Enter the encoded Elastic API key. To create an API key, go to the Elastic Cloud console, select *Connection details* and click *Create and manage API keys*. If you are using an API key with *Restrict privileges*, make sure to review the Indices privileges to provide at least "auto_configure" & "write" permissions for the indices you will be using with this delivery stream.
- *To find the Elasticsearch endpoint URL*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console.
.. Find your deployment in the *Hosted deployments* card and select *Manage*.
.. Under *Applications*, click *Copy endpoint* next to *Elasticsearch*.

- *To create the API key*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console.
.. Select *Open Kibana*.
.. Expand the left-hand menu. Under *Management*, select *Stack management > API Keys* and click *Create API key*. If you are using an API key with *Restrict privileges*, review the Indices privileges to grant at least `auto_configure` and `write` permissions for the indices you will use with this delivery stream. (A scripted sketch of this step follows this list.)
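
For reference, the API key step can also be scripted against the {es} security API. The sketch below is a minimal illustration in Python using the `requests` library; the key name, the `logs-aws.*` index pattern, and the admin credential are placeholder assumptions, not values from this guide.

[source,python]
----
import requests

# Placeholder values: substitute your own deployment endpoint and a credential
# that is allowed to create API keys.
ES_ENDPOINT = "https://my-deployment.es.us-east-1.aws.elastic-cloud.com"
ADMIN_API_KEY = "<existing encoded API key with sufficient privileges>"

# Role descriptor granting only what the delivery stream needs:
# auto_configure and write on the indices it will target (pattern is illustrative).
body = {
    "name": "firehose-delivery-stream",
    "role_descriptors": {
        "firehose_writer": {
            "indices": [
                {
                    "names": ["logs-aws.*"],
                    "privileges": ["auto_configure", "write"],
                }
            ]
        }
    },
}

resp = requests.post(
    f"{ES_ENDPOINT}/_security/api_key",
    json=body,
    headers={"Authorization": f"ApiKey {ADMIN_API_KEY}"},
    timeout=30,
)
resp.raise_for_status()

# Firehose expects the *encoded* form of the key, returned in the "encoded" field.
print(resp.json()["encoded"])
----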

. Set up the delivery stream by specifying the following data:
+
- Elastic endpoint URL
- API key
- Elastic endpoint URL: The URL that you copied in the previous step.
- API key: The API key that you created in the previous step.
- Content encoding: gzip
- Retry duration: 60 (default)
- Backup settings: failed data only to S3 bucket (a scripted sketch of these settings follows the note below)

IMPORTANT: Verify that your *Elasticsearch endpoint URL* includes `.es.` between the *deployment name* and *region*. Example: `https://my-deployment.es.us-east-1.aws.elastic-cloud.com`
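
If you prefer to script the stream instead of using the console, the same settings can be expressed with the AWS SDK. This is a rough boto3 sketch, not the canonical setup: the stream name, IAM role ARN, and backup bucket ARN are placeholders, while the endpoint URL and API key are the values collected in the previous step.

[source,python]
----
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

firehose.create_delivery_stream(
    DeliveryStreamName="elastic-cloudtrail-stream",  # placeholder name
    DeliveryStreamType="DirectPut",                  # source: direct put
    HttpEndpointDestinationConfiguration={
        "EndpointConfiguration": {
            "Name": "Elastic",
            "Url": "https://my-deployment.es.us-east-1.aws.elastic-cloud.com",
            "AccessKey": "<encoded Elastic API key>",
        },
        # Content encoding: gzip
        "RequestConfiguration": {"ContentEncoding": "GZIP"},
        # Retry duration: 60 (default)
        "RetryOptions": {"DurationInSeconds": 60},
        # Backup settings: failed data only to S3 bucket
        "S3BackupMode": "FailedDataOnly",
        "S3Configuration": {
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-backup-role",  # placeholder
            "BucketARN": "arn:aws:s3:::my-firehose-backup-bucket",             # placeholder
        },
    },
)
----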

You now have an Amazon Data Firehose delivery stream specified with:

- source: direct put
@@ -104,7 +113,7 @@ You now have an Amazon Data Firehose delivery specified with:

[discrete]
[[firehose-cloudtrail-step-four]]
== Step 4: Set up a subscription filter to route Cloudtrail events to a delivery stream
== Step 4: Set up a subscription filter to route CloudTrail events to a delivery stream

image::firehose-subscription-filter.png[Firehose subscription filter]
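
As a rough scripted counterpart to this step, a boto3 sketch of such a subscription filter might look like the following; the log group name, delivery stream ARN, and the IAM role that CloudWatch Logs assumes to write to Firehose are placeholders.

[source,python]
----
import boto3

logs = boto3.client("logs", region_name="us-east-1")

logs.put_subscription_filter(
    logGroupName="aws-cloudtrail-logs-example",   # placeholder CloudTrail log group
    filterName="cloudtrail-to-firehose",          # placeholder filter name
    filterPattern="",                             # empty pattern forwards every event
    destinationArn=(
        "arn:aws:firehose:us-east-1:123456789012:"
        "deliverystream/elastic-cloudtrail-stream"  # placeholder stream ARN
    ),
    roleArn="arn:aws:iam::123456789012:role/cwl-to-firehose-role",  # placeholder
)
----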

@@ -100,8 +100,25 @@ image::firehose-cloudwatch-firehose-stream.png[Amazon Firehose Stream]
+
NOTE: For advanced use cases, source records can be transformed by invoking a custom Lambda function. When using Elastic integrations, this should not be required.

. In the **Destination settings** section, set the following parameter:
`es_datastream_name` = `logs-aws.generic-default`
. From the *Destination settings* panel, specify the following settings:
+
* *To find the Elasticsearch endpoint URL*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console.
.. Find your deployment in the *Hosted deployments* card and select *Manage*.
.. Under *Applications*, click *Copy endpoint* next to *Elasticsearch*.
+
* *To create the API key*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console.
.. Select *Open Kibana*.
.. Expand the left-hand menu. Under *Management*, select *Stack management > API Keys* and click *Create API key*. If you are using an API key with *Restrict privileges*, review the Indices privileges to grant at least `auto_configure` and `write` permissions for the indices you will use with this delivery stream.
+
* *Content encoding*: For better network efficiency, leave content encoding set to GZIP.
+
* *Retry duration*: Determines how long Firehose continues retrying the request in the event of an error. A duration of 60-300s should be suitable for most use cases.
+
* *es_datastream_name*: `logs-aws.generic-default`

IMPORTANT: Verify that your *Elasticsearch endpoint URL* includes `.es.` between the *deployment name* and *region*. Example: `https://my-deployment.es.us-east-1.aws.elastic-cloud.com`
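
If you automate this setup, a small sanity check like the one below (an illustrative sketch, not part of any official tooling) can catch endpoint URLs that are missing the `.es.` segment.

[source,python]
----
import re

def looks_like_es_endpoint(url: str) -> bool:
    """Rough check: https, a deployment name, then '.es.' followed by a region."""
    return re.match(r"^https://[^./]+\.es\.[a-z0-9-]+\.", url) is not None

assert looks_like_es_endpoint(
    "https://my-deployment.es.us-east-1.aws.elastic-cloud.com"
)
assert not looks_like_es_endpoint(
    "https://my-deployment.kb.us-east-1.aws.elastic-cloud.com"  # Kibana, not Elasticsearch
)
----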

The Firehose stream is now ready to send logs to your Elastic Cloud deployment.

@@ -57,9 +57,15 @@ image::firehose-networkfirewall-stream.png[Firehose stream]

. Collect the {es} endpoint and API key from your deployment on Elastic Cloud.
+
- Elastic endpoint URL: Enter the Elasticsearch endpoint URL of your Elasticsearch cluster. To find the Elasticsearch endpoint, go to the Elastic Cloud console and select *Connection details*.
+
- API key: Enter the encoded Elastic API key. To create an API key, go to the Elastic Cloud console, select *Connection details* and click *Create and manage API keys*. If you are using an API key with *Restrict privileges*, make sure to review the Indices privileges to provide at least "auto_configure" and "write" permissions for the indices you will be using with this delivery stream.
- *To find the Elasticsearch endpoint URL*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console.
.. Find your deployment in the *Hosted deployments* card and select *Manage*.
.. Under *Applications*, click *Copy endpoint* next to *Elasticsearch*.

- *To create the API key*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console.
.. Select *Open Kibana*.
.. Expand the left-hand menu. Under *Management*, select *Stack management > API Keys* and click *Create API key*. If you are using an API key with *Restrict privileges*, review the Indices privileges to grant at least `auto_configure` and `write` permissions for the indices you will use with this delivery stream.

. Set up the delivery stream by specifying the following data:
+
@@ -68,7 +74,9 @@ image::firehose-networkfirewall-stream.png[Firehose stream]
- Content encoding: gzip
- Retry duration: 60 (default)
- Parameter *es_datastream_name* = `logs-aws.firewall_logs-default`
- Backup settings: failed data only to s3 bucket
- Backup settings: failed data only to S3 bucket

IMPORTANT: Verify that your *Elasticsearch endpoint URL* includes `.es.` between the *deployment name* and *region*. Example: `https://my-deployment.es.us-east-1.aws.elastic-cloud.com`

The Firehose stream is ready to send logs to your Elastic Cloud deployment.
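
To confirm delivery end to end, you can push a single test record through the stream, for example with boto3 as sketched below (the stream name is a placeholder); the record should appear in the `logs-aws.firewall_logs-default` data stream once Firehose flushes its buffer.

[source,python]
----
import json

import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

firehose.put_record(
    DeliveryStreamName="elastic-firewall-stream",  # placeholder stream name
    Record={
        "Data": (json.dumps({"message": "firehose connectivity test"}) + "\n").encode()
    },
)
----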

@@ -54,16 +54,24 @@ NOTE: For advanced use cases, source records can be transformed by invoking a cu

. From the *Destination settings* panel, specify the following settings:
+
* *Elastic endpoint URL*: Enter the Elastic endpoint URL of your Elasticsearch cluster. To find the Elasticsearch endpoint, go to the Elastic Cloud console, navigate to the Integrations page, and select *Connection details*. Here is an example of how it looks like: `https://my-deployment.es.us-east-1.aws.elastic-cloud.com`.
* *To find the Elasticsearch endpoint URL*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console.
.. Find your deployment in the *Hosted deployments* card and select *Manage*.
.. Under *Applications*, click *Copy endpoint* next to *Elasticsearch*.
+
* *API key*: Enter the encoded Elastic API key. To create an API key, go to the Elastic Cloud console, navigate to the Integrations page, select *Connection details* and click *Create and manage API keys*. If you are using an API key with *Restrict privileges*, make sure to review the Indices privileges to provide at least "auto_configure" & "write" permissions for the indices you will be using with this delivery stream.
* *To create the API key*:
.. Go to the https://cloud.elastic.co/[Elastic Cloud] console.
.. Select *Open Kibana*.
.. Expand the left-hand menu. Under *Management*, select *Stack management > API Keys* and click *Create API key*. If you are using an API key with *Restrict privileges*, review the Indices privileges to grant at least `auto_configure` and `write` permissions for the indices you will use with this delivery stream.
+
* *Content encoding*: For better network efficiency, leave content encoding set to GZIP.
+
* *Retry duration*: Determines how long Firehose continues retrying the request in the event of an error. A duration of 60-300s should be suitable for most use cases.
+
* *es_datastream_name*: `logs-aws.waf-default`

IMPORTANT: Verify that your *Elasticsearch endpoint URL* includes `.es.` between the *deployment name* and *region*. Example: `https://my-deployment.es.us-east-1.aws.elastic-cloud.com`
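
Once the web ACL from the next step exists, its logging still has to be pointed at this delivery stream. A rough boto3 sketch with placeholder ARNs is shown below; note that AWS requires a Firehose stream used as a WAF log destination to have a name starting with `aws-waf-logs-`.

[source,python]
----
import boto3

wafv2 = boto3.client("wafv2", region_name="us-east-1")

wafv2.put_logging_configuration(
    LoggingConfiguration={
        # Placeholder web ACL ARN (regional scope).
        "ResourceArn": (
            "arn:aws:wafv2:us-east-1:123456789012:"
            "regional/webacl/my-web-acl/11111111-2222-3333-4444-555555555555"
        ),
        # Placeholder delivery stream ARN; the name must start with "aws-waf-logs-".
        "LogDestinationConfigs": [
            "arn:aws:firehose:us-east-1:123456789012:deliverystream/aws-waf-logs-elastic"
        ],
    }
)
----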

[discrete]
[[firehose-waf-step-four]]
== Step 4: Create a web access control list