Long-term downsampling

After approximately five days, each persisted metric, both raw and aggregated, undergoes a process called long-term downsampling (LTD). This process compacts data into a coarser temporal resolution and then permanently deletes the original, non-downsampled data.

To maintain an accurate representation of the data, Chronosphere uses different downsampling methods depending on the metric type and, for aggregated metrics, the aggregation method used to produce their values.

These behaviors are important to understand in advance, because any unexpected results of LTD become visible only about five days after ingestion.

By default, Chronosphere downsamples long-term data at a five-minute granularity: all data points within each five-minute window are compacted into a single data point. This five-minute window is termed the downsample window.
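The windowing step can be illustrated with a minimal sketch. The function names and the `(timestamp, value)` representation here are hypothetical; only the five-minute (300-second) window size comes from the text above.

```python
# Hypothetical sketch: assign raw data points to five-minute downsample
# windows by flooring each timestamp to the start of its window.
WINDOW_SECONDS = 5 * 60

def window_start(ts):
    """Return the start of the five-minute window containing ts (epoch seconds)."""
    return ts - (ts % WINDOW_SECONDS)

def bucket(points):
    """Group (timestamp, value) pairs by the window they fall into."""
    windows = {}
    for ts, value in points:
        windows.setdefault(window_start(ts), []).append((ts, value))
    return windows
```

Each window's list of samples is then collapsed into a single point using a metric-type-specific rule, as described in the sections below.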

Cumulative counter

Cumulative counters are downsampled by preserving the overall increase (accounting for counter resets) between the start and end of each downsample window. This reduces the temporal granularity to one observed increase every five minutes while keeping the running count accurate.
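A minimal sketch of this rule, assuming a window's samples arrive in time order. The function name is hypothetical; the reset handling (any drop in value is treated as a restart from near zero, so the post-reset value counts as new increase) is the standard convention for cumulative counters and is an assumption here, not quoted from the text.

```python
def counter_increase(samples):
    """Total increase across one window of cumulative counter samples,
    treating any decrease as a counter reset (assumed convention)."""
    increase = 0.0
    prev = None
    for value in samples:
        if prev is not None:
            if value >= prev:
                increase += value - prev
            else:
                # Reset: the counter restarted, so the post-reset
                # value itself is counted as new increase.
                increase += value
        prev = value
    return increase
```

For example, the samples `[10, 15, 3, 8]` contain a reset between 15 and 3; the preserved increase is 5 + 3 + 5 = 13, not the raw difference 8 - 10.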


Gauge

Downsampling of gauges differs depending on how the gauge was ingested.

By default, gauges are downsampled by preserving only the last data point in every downsample window. Any changes to the gauge before the end of the downsample window aren't retained.
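The default last-value rule can be sketched as follows. The function name and window parameter are hypothetical; the 300-second default mirrors the five-minute downsample window.

```python
def downsample_last(points, window=300):
    """Keep only the last (timestamp, value) sample in each window.
    Assumes points are sorted by timestamp."""
    kept = {}
    for ts, value in points:
        # Later samples in the same window overwrite earlier ones.
        kept[ts - ts % window] = (ts, value)
    return sorted(kept.values())
```

With input `[(10, 1.0), (290, 2.0), (310, 3.0)]`, the first window keeps only `(290, 2.0)`; the earlier sample `(10, 1.0)` is discarded.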

If the gauge is an output of a MIN/MAX aggregation, the gauge is downsampled by preserving the MIN/MAX data point in every downsample window, respectively.
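A sketch of the MIN/MAX rule, under the same hypothetical representation as above: the extreme value in each window is kept instead of the last one.

```python
def downsample_extreme(points, window=300, pick=max):
    """Keep the extreme value per window; pass pick=min or pick=max
    to match the aggregation that produced the gauge."""
    buckets = {}
    for ts, value in points:
        start = ts - ts % window
        buckets.setdefault(start, []).append(value)
    return {start: pick(values) for start, values in buckets.items()}
```

Preserving the window's extreme rather than its last value keeps a MAX-aggregated gauge's peaks (or a MIN-aggregated gauge's troughs) visible after downsampling.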

If the gauge was ingested with StatsD, the gauge is downsampled using the Largest-Triangle-Three-Buckets (LTTB) downsampling algorithm, for consistency with the Graphite query engine.
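LTTB selects the points that best preserve a series' visual shape. The sketch below is a generic implementation of the published algorithm, not Chronosphere's code: it always keeps the first and last points, divides the remainder into buckets, and from each bucket keeps the point forming the largest triangle with the previously kept point and the average of the next bucket.

```python
def lttb(points, threshold):
    """Largest-Triangle-Three-Buckets downsampling (generic sketch).
    points: list of (x, y) sorted by x; threshold: number of points to keep."""
    n = len(points)
    if threshold >= n or threshold < 3:
        return list(points)
    sampled = [points[0]]          # always keep the first point
    bucket_size = (n - 2) / (threshold - 2)
    a = 0                          # index of the last kept point
    for i in range(threshold - 2):
        # Average of the NEXT bucket serves as the triangle's third vertex.
        nxt_start = int((i + 1) * bucket_size) + 1
        nxt_end = min(int((i + 2) * bucket_size) + 1, n)
        avg_x = sum(p[0] for p in points[nxt_start:nxt_end]) / (nxt_end - nxt_start)
        avg_y = sum(p[1] for p in points[nxt_start:nxt_end]) / (nxt_end - nxt_start)
        # Scan the CURRENT bucket for the largest triangle area.
        start = int(i * bucket_size) + 1
        end = int((i + 1) * bucket_size) + 1
        ax, ay = points[a]
        best_area, best_idx = -1.0, start
        for idx in range(start, end):
            x, y = points[idx]
            area = abs((ax - avg_x) * (y - ay) - (ax - x) * (avg_y - ay)) / 2
            if area > best_area:
                best_area, best_idx = area, idx
        sampled.append(points[best_idx])
        a = best_idx
    sampled.append(points[-1])     # always keep the last point
    return sampled
```

Unlike last-value or MIN/MAX downsampling, LTTB may keep a point from anywhere in a bucket, which is why it better matches how Graphite renders spiky StatsD gauges.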