Elasticsearch is a powerful open source, search-oriented document database that supports complex and fuzzy queries. Built on the Apache Lucene engine, it is often used alongside other databases because its search-and-scoring capabilities are so flexible.
Elasticsearch for All is a general introduction to Elasticsearch on Compose.
Are you wondering what you'll get, or can get, with a Compose Elasticsearch deployment? Want to know what you'll need to do to manage it? Check out some of the implementation features and details down in Elasticsearch for Ops and Admins.
Just deployed Elasticsearch and want to get coding with it? Developing an application or want to try a new stack? Then see the Elasticsearch for Developers section for resources on how to connect from different languages, command line tools and more information to get you started.
When deployed on Compose, Elasticsearch comes with these standard Compose features.
- A server stack that automatically scales RAM, CPU, and I/O as your Elasticsearch data grows.
- Daily, weekly, monthly, and on-demand backups.
- Metrics displayed in the Compose UI.
- Deploy, manage, back up, and otherwise automate database tasks through the Compose API.
- Guaranteed resources per deployment.
- Daily logs available for download.
Compose deployments of Elasticsearch also come with a number of Elasticsearch specific features:
- Data Browser For Elasticsearch.
- Optional Kibana add-on for visualizations and analytics provided by a Kibana capsule installed on the same private network as your cluster.
- A choice of version 2.x or 5.x, with an upgrade/migration path from 2.x to 5.x.
- An optional add-on for real-time log access.
- The ability to forward rich metrics to services with an optional add-on.
- Start with 2GB of storage for $45; as you grow, each additional gigabyte costs $18.
Compose Elasticsearch deployments start with three data nodes and two HAProxy portals. Each data node has 2GB of storage and 204MB of memory. The two HAProxy capsules have 64MB of memory each and support authentication, HTTPS, and IP whitelisting for enhanced security.
For standard deployments, automatic vertical scaling occurs as your data set grows. You can also manually scale your HAProxy portals and Elasticsearch deployment from the deployment's Resources panel. A 10:1 ratio of disk to RAM is maintained, so increasing the disk allocated to the deployment also increases the RAM allocated.
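The 10:1 ratio makes the scaling arithmetic straightforward. A minimal sketch (the helper below is purely illustrative, not part of any Compose API):

```python
def ram_for_disk(disk_mb: int) -> int:
    # Compose maintains a 10:1 disk-to-RAM ratio, so RAM is a tenth of disk.
    return disk_mb // 10

# The base deployment allocates 2GB (2048MB) of disk per data node:
print(ram_for_disk(2048))  # 204 -- matching the 204MB base memory allocation
```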
SSH portals can be added from the Security panel and any added SSH portals can also be scaled.
For more information, see the Elasticsearch Resources and Scaling page.
See Compose Datacenter Availability for current location availability.
Compose deployments are billed on an hourly basis and grouped into a single monthly billing cycle. This means that any scaling or add-on usage is charged from the time the new resource was provisioned, rather than for the entire month.
The initial deployment is $45/month. Depending on which scaling and add-on options you choose, the cost increases. For example, extra storage and RAM is billed at an additional $18/month per unit of 1GB storage/102MB memory, so an additional 2GB of storage/204MB of memory costs an additional $36/month.
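As a sanity check on that arithmetic, here is a small sketch of the cost formula described above (the helper name is ours, not a Compose API):

```python
def monthly_cost(storage_gb: int) -> int:
    # $45/month base for the first 2GB; $18/month per additional 1GB unit
    # (each unit also carries 102MB of memory).
    base_gb, base_cost, unit_cost = 2, 45, 18
    extra_units = max(0, storage_gb - base_gb)
    return base_cost + extra_units * unit_cost

print(monthly_cost(2))  # 45 -- the initial deployment
print(monthly_cost(4))  # 81 -- 2 extra GB at $18 each adds $36
```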
General billing information, answers, and details can be found in the Billing FAQ.
All Compose Elasticsearch deployments are high-availability clusters. Each cluster has a master member that coordinates writes; the other members can be promoted to master in the event of a node failure. Data is spread across the cluster based on the replica and shard counts. By default, we preset the replica count to the number of nodes minus one and the shard count to 3. You can specify the replica and shard counts when creating an Elasticsearch index, but we don't recommend lowering the replica count because it could leave data unavailable in the event of a node or hardware issue.
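For example, those defaults on a three-node cluster can be expressed as an index-creation body using the standard Elasticsearch index settings (the index name and endpoint below are placeholders):

```python
import json

nodes = 3  # a standard Compose Elasticsearch cluster

# Compose's defaults: shard count of 3, replica count of nodes - 1.
settings = {
    "settings": {
        "number_of_shards": 3,
        "number_of_replicas": nodes - 1,
    }
}

# This JSON would be sent as the body of:
#   PUT https://<your-deployment>/<index-name>
print(json.dumps(settings))
```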
In addition to the 3-node Elasticsearch cluster, we provide 2 HAProxy nodes that serve as a reverse proxy and provide authentication to the cluster. They ensure HTTPS connections to your deployment and provide load balancing and high availability for your connections; your application can use either portal, but it should be able to fail over to the other if one becomes unreachable.
Elasticsearch backups are taken with the snapshot utility in the Elasticsearch API. The snapshot process for each index is incremental and runs in a non-blocking fashion, so all indexing and search operations continue normally while it is running. It captures a point-in-time picture of each index at the moment the snapshot is created.
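For a sense of what a snapshot request looks like against the Elasticsearch snapshot API, here is a sketch that only constructs the request (the repository and snapshot names are placeholders of our own choosing):

```python
import json

repo, snapshot = "my_backup", "snapshot_1"  # hypothetical names

# The snapshot API endpoint; wait_for_completion=false returns immediately
# while the snapshot runs in the background.
path = f"/_snapshot/{repo}/{snapshot}?wait_for_completion=false"

# Optional body restricting the snapshot to specific indices.
body = {"indices": "my-index", "ignore_unavailable": True}

# This would be sent as: PUT https://<your-deployment>{path}
print(path)
print(json.dumps(body))
```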
Should something happen to your current deployment, Elasticsearch backups can be restored directly into a new Elasticsearch deployment. The Backups panel has all the available daily, weekly, monthly, or on-demand backups for you to restore from. It is also possible to trigger a restore operation from the Compose API.
For more details see the Backups page and the Managing Backups via the Compose API page.
Elasticsearch provides a REST API for communicating with your cluster. This allows you to monitor your deployment, perform administrative tasks, do CRUD operations and searches, and other tasks from cURL or any other tool that allows you to make HTTP/REST calls.
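For instance, indexing a document is a single HTTP PUT. The sketch below only constructs the request rather than sending it; the host, credentials, index, and type names are all placeholders:

```python
import json
import urllib.request

# Placeholder connection string in the shape Compose provides.
url = "https://user:password@portal.example.com:10000/my-index/my-type/1"

doc = {"title": "Hello, Elasticsearch", "views": 1}

req = urllib.request.Request(
    url,
    data=json.dumps(doc).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="PUT",
)

# urllib.request.urlopen(req) would send it; here we just inspect the request.
print(req.get_method(), req.full_url)
```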
The Overview panel of the Compose UI provides the basic information you need to get connected to your databases. In the section Connection info, under the heading HTTP connection, you will find the two endpoints that connect you and your applications to your deployment. For more information and examples in a few popular languages, see the Connecting to Elasticsearch page.
Compose supports a subset of plugins for your Elasticsearch deployment, which can be enabled through the Plugins section of the Compose UI.
The full list of Elasticsearch documentation is in the sidebar, alongside documentation for all things Compose.
For more than just help docs, check out Compose Articles and our curated collection of Elasticsearch-related topics for more how-to's and information on Elasticsearch on Compose.
If this article didn't solve things, summon a human and get some help!