AutoOps for Elastic Cloud Serverless
For Elastic Cloud Serverless projects, AutoOps is set up and enabled automatically in all supported regions. More regions are coming soon.
In your Elastic Cloud Serverless project, Elastic takes care of provisioning, monitoring, and autoscaling resources so you can focus on your business. This is why Elastic Cloud Serverless is billed based on the effective usage of compute and storage resources.
For more information about how Elastic Cloud Serverless is priced and packaged, refer to the following pages:
Since your monthly Serverless bill is directly related to the resources you consume, it's important to understand why your consumption fluctuates and how past usage was influenced by your project's performance. This information lets you adapt your workloads accordingly and gives you better control over your future bills.
This is where AutoOps comes in. With AutoOps for Serverless, you can:
- understand and monitor your usage patterns through project-level and index-level performance metrics.
- access several curated dashboards that let you look at your project from different angles.
- have full visibility into the main Serverless billing dimensions.
Stack Monitoring is not available in Elastic Cloud Serverless because there is no need for it. Elastic takes care of monitoring and managing your Serverless projects. Learn more about the differences between AutoOps and Stack Monitoring.
AutoOps for Serverless focuses on different billing dimensions related to compute and storage, which are explained in the following subsections.
On Elasticsearch Serverless projects, the main compute-related billing dimension is called a Virtual Compute Unit (VCU). 1 VCU contains 1GB of RAM and the corresponding vCPU and local storage for caching.
There are three main types of VCUs:
- Search VCUs powering the search tier, which handles all search operations.
- Indexing VCUs powering the indexing tier, which handles all data indexing operations.
- Machine learning VCUs powering the machine learning tier, which handles all ML-related operations such as inference, anomaly detection, data frame analytics, transforms, and more.
VCUs represent the load that each of the above tiers must sustain to serve your search, indexing, and machine learning needs, respectively. As the load on a given tier rises above or falls below predefined thresholds, the tier autoscales to accommodate that load.
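As a purely illustrative sketch of the idea (the thresholds, scaling steps, and `scale_tier` helper below are hypothetical and do not describe Elastic's actual autoscaling logic), capacity grows or shrinks as sustained load crosses configured bounds:

```python
# Hypothetical sketch of threshold-based tier autoscaling.
# The thresholds and scaling steps are illustrative only and do not
# reflect how Elastic Cloud Serverless scales tiers internally.

def scale_tier(current_vcus: int, load_ratio: float,
               scale_up_at: float = 0.8, scale_down_at: float = 0.3,
               min_vcus: int = 1) -> int:
    """Return a new VCU count for a tier given its current load ratio
    (observed load divided by current capacity)."""
    if load_ratio > scale_up_at:
        return current_vcus * 2                   # add capacity when sustained load is high
    if load_ratio < scale_down_at:
        return max(min_vcus, current_vcus // 2)   # release capacity when load drops
    return current_vcus                           # stay put between the thresholds


print(scale_tier(current_vcus=4, load_ratio=0.9))  # 8  -> scales up
print(scale_tier(current_vcus=4, load_ratio=0.2))  # 2  -> scales down
print(scale_tier(current_vcus=4, load_ratio=0.5))  # 4  -> unchanged
```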
For more information about how autoscaling works in Serverless, refer to the following blogs:
Let's say your constant search workload requires 4GB of RAM, which means your search VCU usage for one day will be 4 search VCUs × 24 hours = 96 VCU-hours.
Given that 1 search VCU costs $0.09 per hour, this translates to $8.64 for that day.
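For a quick sanity check, the same calculation can be expressed in a few lines of Python (the workload size and hourly rate are just the example values above; substitute your own):

```python
# Back-of-the-envelope daily cost for a constant search workload.
search_vcus = 4             # constant search tier size (1 VCU = 1GB of RAM)
hours_per_day = 24
price_per_vcu_hour = 0.09   # USD, example rate from the text above

vcu_hours = search_vcus * hours_per_day        # 96 VCU-hours
daily_cost = vcu_hours * price_per_vcu_hour    # $8.64
print(f"{vcu_hours} VCU-hours -> ${daily_cost:.2f} per day")
```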
On Observability and Security Serverless projects, one storage-related billing dimension is called the Ingest rate, which represents the volume of data (in GB) ingested per unit of time.
On all Elasticsearch, Observability, and Security Serverless projects, the main storage-related billing dimension is called Storage retained or Retention, and it represents the total volume of data (in GB prorated over a month) retained in your project.
Let’s say you ingest 1TB of data into your Observability project.
- Ingest rate: Given that 1GB ingested costs $0.105, ingesting 1TB (1,024GB) translates to an ingest rate cost of $107.52.
- Retention: Given that 1GB retained for a month costs $0.018 and retention is prorated hourly, and assuming it took one hour to ingest the 1TB of data, that 1TB is billed as 1TB / 720 hours per month ≈ 1.42GB for that one-hour slice, which translates to about $0.026. Each subsequent hour in that month costs the same.
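As a rough sketch, the same arithmetic looks like this in Python (the rates are the example values above and 1TB is counted as 1,024GB; actual rates depend on your project type and region):

```python
# Illustrative cost calculation for ingesting and retaining 1TB of data,
# using the example rates above (actual pricing may differ).
gb_ingested = 1024                    # 1TB expressed in GB
ingest_price_per_gb = 0.105           # USD per GB ingested
retention_price_per_gb_month = 0.018  # USD per GB retained for a full month
hours_per_month = 720

ingest_cost = gb_ingested * ingest_price_per_gb
print(f"Ingest: ${ingest_cost:.2f}")  # ~$107.52, charged once

# Retention is prorated hourly: one hour of retaining 1TB is billed as
# 1,024GB / 720 hours worth of GB-months.
hourly_retention_cost = gb_ingested / hours_per_month * retention_price_per_gb_month
print(f"Retention per hour: ${hourly_retention_cost:.3f}")  # ~$0.026
```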
The following features are coming soon to AutoOps for Serverless:
- An Indexing tier view, which will show you how indexing performance influences your use of indexing VCUs.
- A Machine learning tier view, which will provide insight into your machine learning jobs and inference performance, as well as token usage.
- Visibility into other billing dimensions such as data transfer out of Elastic Cloud and the various Observability and Security add-ons.
In this section, you'll find the following information:
- How to access AutoOps in your Serverless project.
- How to use the Search tier view to see the impact of search performance on your use of search VCUs.
- How to use the Search AI Lake view to drill down into your storage-related usage.