= Elasticsearch

Elasticsearch is a distributed search and analytics engine, scalable data store, and vector database optimized for speed and relevance on production-scale workloads. Elasticsearch is the foundation of Elastic's open Stack platform. Search in near real-time over massive datasets, perform vector searches, integrate with generative AI applications, and much more.

Use cases enabled by Elasticsearch include:

* https://www.elastic.co/search-labs/blog/articles/retrieval-augmented-generation-rag[Retrieval Augmented Generation (RAG)]
* https://www.elastic.co/search-labs/blog/categories/vector-search[Vector search]
* Full-text search
* Logs
* Metrics
* Application performance monitoring (APM)
* Security logs

\... and more!

To learn more about Elasticsearch's features and capabilities, see our
https://www.elastic.co/products/elasticsearch[product page].

For information on https://www.elastic.co/search-labs/blog/categories/ml-research[machine learning innovations] and the latest https://www.elastic.co/search-labs/blog/categories/lucene[Lucene contributions from Elastic], see https://www.elastic.co/search-labs[Search Labs].

[[get-started]]
== Get started

The simplest way to set up Elasticsearch is to create a managed deployment with
https://www.elastic.co/cloud/as-a-service[Elasticsearch Service on Elastic
Cloud].

If you prefer to install and manage Elasticsearch yourself, you can download
the latest version from
https://www.elastic.co/downloads/elasticsearch[elastic.co/downloads/elasticsearch].

=== Run Elasticsearch locally

////
IMPORTANT: This content is replicated in the Elasticsearch repo. See `run-elasticsearch-locally.asciidoc`.
Ensure both files are in sync.

https://github.com/elastic/start-local is the source of truth.
////

[WARNING]
====
DO NOT USE THESE INSTRUCTIONS FOR PRODUCTION DEPLOYMENTS.

This setup is intended for local development and testing only.
====

Quickly set up Elasticsearch and Kibana in Docker for local development or testing, using the https://github.com/elastic/start-local?tab=readme-ov-file#-try-elasticsearch-and-kibana-locally[`start-local` script].

ℹ️ For more detailed information about the `start-local` setup, refer to the https://github.com/elastic/start-local[README on GitHub].

==== Prerequisites

- If you don't have Docker installed, https://www.docker.com/products/docker-desktop[download and install Docker Desktop] for your operating system.
- If you're using Microsoft Windows, then install https://learn.microsoft.com/en-us/windows/wsl/install[Windows Subsystem for Linux (WSL)].

==== Trial license

This setup comes with a one-month trial license that includes all Elastic features.

After the trial period, the license reverts to *Free and open - Basic*.
Refer to https://www.elastic.co/subscriptions[Elastic subscriptions] for more information.

==== Run `start-local`

To set up Elasticsearch and Kibana locally, run the `start-local` script:

[source,sh]
----
curl -fsSL https://elastic.co/start-local | sh
----
// NOTCONSOLE

This script creates an `elastic-start-local` folder containing configuration files and starts both Elasticsearch and Kibana using Docker.

After running the script, you can access Elastic services at the following endpoints:

* *Elasticsearch*: http://localhost:9200
* *Kibana*: http://localhost:5601

The script generates a random password for the `elastic` user, which is displayed at the end of the installation and stored in the `.env` file.
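The `.env` file stores values as plain `KEY=value` lines, so it is easy to read from code as well as from the shell. As a minimal sketch, here is one way to parse such content in Python (using an inline sample with a placeholder password, not your real `.env`):

```python
# Inline sample of .env-style content (placeholder, not a real password).
sample_env = """\
ES_LOCAL_PASSWORD=changeme123
ES_LOCAL_URL=http://localhost:9200
"""

# Split each non-empty, non-comment line on the first '=' only,
# so values that themselves contain '=' survive intact.
env = {}
for line in sample_env.splitlines():
    if line and not line.startswith("#"):
        key, _, value = line.partition("=")
        env[key] = value

print(env["ES_LOCAL_PASSWORD"])
```

In a shell session, `source .env` loads all of these variables at once.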

[CAUTION]
====
This setup is for local testing only. HTTPS is disabled, and Basic authentication is used for Elasticsearch. For security, Elasticsearch and Kibana are accessible only through `localhost`.
====

==== API access

An API key for Elasticsearch is generated and stored in the `.env` file as `ES_LOCAL_API_KEY`.
Use this key to connect to Elasticsearch with a https://www.elastic.co/guide/en/elasticsearch/client/index.html[programming language client] or the https://www.elastic.co/guide/en/elasticsearch/reference/current/rest-apis.html[REST API].

From the `elastic-start-local` folder, check the connection to Elasticsearch using `curl`:

[source,sh]
----
source .env
curl $ES_LOCAL_URL -H "Authorization: ApiKey ${ES_LOCAL_API_KEY}"
----
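The `Authorization: ApiKey <key>` header shown above is plain HTTP, so any HTTP client can send it. A minimal Python sketch that builds (but does not send) such a request with only the standard library; the key value here is a made-up placeholder:

```python
import urllib.request

es_local_url = "http://localhost:9200"
es_local_api_key = "bm90LWEtcmVhbC1rZXk="  # placeholder, not a real key

# Elasticsearch expects the raw key value prefixed with "ApiKey".
request = urllib.request.Request(
    es_local_url,
    headers={"Authorization": f"ApiKey {es_local_api_key}"},
)

# urllib.request.urlopen(request) would send it; here we just inspect it.
print(request.get_header("Authorization"))
```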

To use the password for the `elastic` user, set and export the `ES_LOCAL_PASSWORD` environment variable. For example:

[source,sh]
----
source .env
export ES_LOCAL_PASSWORD
----

// NOTCONSOLE

=== Send requests to Elasticsearch

You send data and other requests to Elasticsearch through REST APIs.
You can interact with Elasticsearch using any client that sends HTTP requests,
such as the https://www.elastic.co/guide/en/elasticsearch/client/index.html[Elasticsearch
language clients] and https://curl.se[curl].

==== Using curl

Here's an example curl command to create a new Elasticsearch index, using basic auth:

[source,sh]
----
curl -u elastic:$ES_LOCAL_PASSWORD \
  -X PUT \
  http://localhost:9200/my-new-index \
  -H 'Content-Type: application/json'
----

// NOTCONSOLE
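The same index-creation request can be driven from Python's standard library. A sketch that constructs the request without sending it (the password is a placeholder, since this assumes a locally running cluster):

```python
import base64
import urllib.request

username = "elastic"
password = "changeme123"  # placeholder; use your real ES_LOCAL_PASSWORD

# Basic auth is the username and password joined by ':' and base64-encoded.
token = base64.b64encode(f"{username}:{password}".encode()).decode()

request = urllib.request.Request(
    "http://localhost:9200/my-new-index",
    method="PUT",
    headers={
        "Authorization": f"Basic {token}",
        "Content-Type": "application/json",
    },
)

# urllib.request.urlopen(request) would create the index on a running cluster.
print(request.get_method(), request.full_url)
```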

==== Using a language client

To connect to your local dev Elasticsearch cluster with a language client, you can use basic authentication with the `elastic` username and the password stored in the `ES_LOCAL_PASSWORD` environment variable.

You'll use the following connection details:

* **Elasticsearch endpoint**: `http://localhost:9200`
* **Username**: `elastic`
* **Password**: `$ES_LOCAL_PASSWORD` (the value you set in the environment variable)

For example, to connect with the Python `elasticsearch` client:

[source,python]
----
import os
from elasticsearch import Elasticsearch

username = 'elastic'
password = os.getenv('ES_LOCAL_PASSWORD')  # Value you set in the environment variable

client = Elasticsearch(
    "http://localhost:9200",
    basic_auth=(username, password)
)

print(client.info())
----

==== Using the Dev Tools Console

Kibana's developer console provides an easy way to experiment and test requests.
To access the console, open Kibana, then go to **Management** > **Dev Tools**.

**Add data**

You index data into Elasticsearch by sending JSON objects (documents) through the REST APIs.
Whether you have structured or unstructured text, numerical data, or geospatial data,
Elasticsearch efficiently stores and indexes it in a way that supports fast searches.

For timestamped data such as logs and metrics, you typically add documents to a
data stream made up of multiple auto-generated backing indices.

To add a single document to an index, submit an HTTP POST request that targets the index:

----
POST /customer/_doc/1
{
  "firstname": "Jennifer",
  "lastname": "Walters"
}
----

This request automatically creates the `customer` index if it doesn't exist,
adds a new document that has an ID of 1, and
stores and indexes the `firstname` and `lastname` fields.

The new document is available immediately from any node in the cluster.
You can retrieve it with a GET request that specifies its document ID:

----
GET /customer/_doc/1
----

To add multiple documents in one request, use the `_bulk` API.
Bulk data must be newline-delimited JSON (NDJSON).
Each line must end in a newline character (`\n`), including the last line.

----
PUT customer/_bulk
{ "create": { } }
{ "firstname": "Monica", "lastname": "Rambeau" }
{ "create": { } }
{ "firstname": "Carol", "lastname": "Danvers" }
{ "create": { } }
{ "firstname": "Wanda", "lastname": "Maximoff" }
{ "create": { } }
{ "firstname": "Jennifer", "lastname": "Takeda" }
----
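When the documents come from your own data structures, the NDJSON body can be assembled with ordinary JSON serialization rather than written by hand. A minimal Python sketch (standard library only; it builds the payload but doesn't send it):

```python
import json

customers = [
    {"firstname": "Monica", "lastname": "Rambeau"},
    {"firstname": "Carol", "lastname": "Danvers"},
]

# Alternate action lines and document lines; every line, including
# the last, must end with a newline character.
lines = []
for doc in customers:
    lines.append(json.dumps({"create": {}}))
    lines.append(json.dumps(doc))
ndjson_body = "\n".join(lines) + "\n"

print(ndjson_body)
```

This body can then be sent to the `_bulk` endpoint with any HTTP client, using the `Content-Type: application/x-ndjson` header.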

**Search**

Indexed documents are available for search in near real-time.
The following search matches all customers with a first name of _Jennifer_
in the `customer` index:

----
GET customer/_search
{
  "query": {
    "match": { "firstname": "Jennifer" }
  }
}
----
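The same query can be issued outside the console from any HTTP client. A minimal Python sketch that prepares the request with the standard library (built but not sent, since it assumes the local cluster from earlier; `_search` also accepts POST, which is easier than GET-with-body in many HTTP libraries):

```python
import json
import urllib.request

query = {
    "query": {
        "match": {"firstname": "Jennifer"}
    }
}

# Build a POST to the index's _search endpoint with a JSON body.
request = urllib.request.Request(
    "http://localhost:9200/customer/_search",
    data=json.dumps(query).encode("utf-8"),
    method="POST",
    headers={"Content-Type": "application/json"},
)

# urllib.request.urlopen(request) would run the search on a live cluster.
print(request.get_method(), request.full_url)
```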

**Explore**

You can use Discover in Kibana to interactively search and filter your data.
From there, you can start creating visualizations and building and sharing dashboards.

To get started, create a _data view_ that connects to one or more Elasticsearch indices,
data streams, or index aliases.

. Go to **Management > Stack Management > Kibana > Data Views**.
. Select **Create data view**.
. Enter a name for the data view and a pattern that matches one or more indices,
such as _customer_.
. Select **Save data view to Kibana**.

To start exploring, go to **Analytics > Discover**.

[[upgrade]]
== Upgrade

To upgrade from an earlier version of Elasticsearch, see the
https://www.elastic.co/guide/en/elasticsearch/reference/current/setup-upgrade.html[Elasticsearch upgrade
documentation].

[[build-source]]
== Build from source

Elasticsearch uses https://gradle.org[Gradle] for its build system.

To build a distribution for your local OS and print its output location upon
completion, run:

----
./gradlew localDistro
----

To build a distribution for another platform, run the related command:

----
./gradlew :distribution:archives:linux-tar:assemble
./gradlew :distribution:archives:darwin-tar:assemble
./gradlew :distribution:archives:windows-zip:assemble
----

Distributions are output to `distribution/archives`.

To run the test suite, see xref:TESTING.asciidoc[TESTING].

[[docs]]
== Documentation

For the complete Elasticsearch documentation visit
https://www.elastic.co/guide/en/elasticsearch/reference/current/index.html[elastic.co].

For information about our documentation processes, see the
xref:docs/README.md[docs README].

[[examples]]
== Examples and guides

The https://github.com/elastic/elasticsearch-labs[`elasticsearch-labs`] repo contains executable Python notebooks, sample apps, and resources to test out Elasticsearch for vector search, hybrid search, and generative AI use cases.

[[contribute]]
== Contribute

For contribution guidelines, see xref:CONTRIBUTING.md[CONTRIBUTING].

[[questions]]
== Questions? Problems? Suggestions?

* To report a bug or request a feature, create a
  https://github.com/elastic/elasticsearch/issues/new/choose[GitHub Issue]. Please
  ensure someone else hasn't created an issue for the same topic.

* Need help using Elasticsearch? Reach out on the
  https://discuss.elastic.co[Elastic Forum] or https://ela.st/slack[Slack]. A
  fellow community member or Elastic engineer will be happy to help you out.