// :description: Add data to your {es-serverless} project.
// :keywords: serverless, elasticsearch, ingest, overview

You have many options for ingesting, or indexing, data into {es}.
The best option for your use case depends on whether you are indexing general content or time series (timestamped) data.

[discrete]
[[es-ingestion-overview-apis]]
== Ingest data using APIs

You can use the <<elasticsearch-http-apis,{es} REST APIs>> to add data to your {es} indices, using any HTTP client, including the <<elasticsearch-clients,{es} client libraries>>.
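
For example, here is a minimal sketch that indexes a single document with the Python client library.
The endpoint URL, API key, and index name are placeholders rather than values from this guide:

[source,python]
----
from elasticsearch import Elasticsearch

# Connect to your project. Replace the placeholder endpoint and API key
# with the values for your own deployment.
client = Elasticsearch(
    "https://my-project.example.elastic.cloud:443",
    api_key="YOUR_API_KEY",
)

# Index one document into a hypothetical index named "books".
# By default, the index is created automatically if it does not already exist.
client.index(
    index="books",
    document={"name": "Snow Crash", "author": "Neal Stephenson", "page_count": 470},
)
----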

While the {es} APIs can be used for any data type, Elastic provides specialized tools that optimize ingestion for specific use cases.

[discrete]
[[es-ingestion-overview-general-content]]
== Ingest general content

General content is typically text-heavy data that does not have a timestamp, such as knowledge bases, website content, product catalogs, and files.
This data can be updated, but the value of the content remains relatively constant over time.

You can use these specialized tools to add general content to {es} indices:

* <<elasticsearch-ingest-data-through-integrations-connector-client,Connector clients>>, which sync data from a range of popular data sources to {es}
* https://github.com/elastic/crawler[Elastic Open Web Crawler]
* <<elasticsearch-ingest-data-file-upload,File Uploader>>
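
If you prefer to push general content from your own application instead, you can send it in a single bulk request over the <<elasticsearch-http-apis,{es} REST APIs>>.
The sketch below uses the Python client's bulk helper; the "products" index and the sample documents are made up for illustration:

[source,python]
----
from elasticsearch import Elasticsearch, helpers

# Placeholder connection details.
client = Elasticsearch(
    "https://my-project.example.elastic.cloud:443",
    api_key="YOUR_API_KEY",
)

# A small, made-up product catalog.
catalog = [
    {"name": "Desk lamp", "category": "lighting", "price": 34.50},
    {"name": "Office chair", "category": "furniture", "price": 129.00},
]

# Index all documents in one bulk request against a hypothetical "products" index.
actions = ({"_index": "products", "_source": doc} for doc in catalog)
helpers.bulk(client, actions)
----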

[discrete]
[[elasticsearch-ingest-time-series-data]]
== Ingest time series data

Time series, or timestamped data, describes data that changes frequently and "flows" over time, such as stock quotes, system metrics, and network traffic data.

You can use these specialized tools to add timestamped data to {es} data streams:

* <<elasticsearch-ingest-data-through-beats,{beats}>>
* <<elasticsearch-ingest-data-through-logstash,{ls}>>
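
You can also write timestamped documents to a data stream directly through the <<elasticsearch-http-apis,{es} REST APIs>>.
Here is a minimal sketch with the Python client; the data stream name is a placeholder and assumes a matching index template that enables data streams (for example, the built-in `logs-*-*` template):

[source,python]
----
from datetime import datetime, timezone

from elasticsearch import Elasticsearch

# Placeholder connection details.
client = Elasticsearch(
    "https://my-project.example.elastic.cloud:443",
    api_key="YOUR_API_KEY",
)

# Append one timestamped document to a hypothetical "logs-myapp-default" data stream.
# Omitting the document ID makes this an append-only "create" operation,
# which is what data streams expect.
client.index(
    index="logs-myapp-default",
    document={
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "message": "user logged in",
        "service": {"name": "myapp"},
    },
)
----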