Monitoring Glossary

DigitalOcean Monitoring is a free, opt-in service that gathers and displays metrics about Droplet-level resource utilization. Monitoring supports configurable alert policies with integrated email and Slack notifications to help you track the operational health of your infrastructure.

This glossary defines the core concepts behind monitoring to help you build a mental model of how monitoring works and understand the terminology the documentation uses.

An alert interval is the period of time that average usage must exceed a threshold before triggering an alert.
Alerting within a computer monitoring system is the ability to send notifications when certain metrics fall outside of expected ranges.
A data point, or value, is a number and unit representing a single measurement.
A data set is a collection of related data points.
In computing, monitoring is the process of gathering and visualizing data to improve awareness of system health and minimize response time when usage is outside of expected levels.
Percentage units specify a value in relationship to the total available quantity, which is typically set at 100%. Percentages are useful for quantities with a known limit, like disk space.
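As a minimal sketch of the percentage idea, the snippet below computes disk usage relative to total capacity; the disk figures are hypothetical and chosen only for illustration.

```python
# Hypothetical disk figures, for illustration only.
used_gib = 20.0
total_gib = 25.0

# A percentage expresses a value relative to the total available
# quantity, which is set at 100%.
disk_usage_percent = used_gib / total_gib * 100
print(disk_usage_percent)  # 80.0
```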
Rate units specify a value in relation to another measure, most frequently time. Rates usually express the frequency of occurrence over a set time period so that you can compare magnitudes. Rate units are useful when a quantity has no unambiguous upper limit that represents total use, or when it is more helpful to examine usage over time, like incoming bandwidth.
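To make the contrast with percentages concrete, this sketch derives a rate by dividing one measure by another, here bytes transferred by elapsed time; the transfer figures are hypothetical.

```python
# Hypothetical transfer figures, for illustration only.
bytes_received = 6_000_000
elapsed_seconds = 60

# A rate expresses one value in relation to another measure --
# here, bytes in relation to time.
incoming_rate = bytes_received / elapsed_seconds  # bytes per second
print(incoming_rate)  # 100000.0
```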
In computing, a resource is a basic component with limited availability. Resources include CPU, memory, disk space, and network bandwidth. You can track a resource to monitor its usage.
System usage monitoring is a type of monitoring that involves tracking system resources.
In alerting, a threshold is a value that defines the boundary between normal and abnormal usage.
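The interplay between an alert interval and a threshold can be sketched as follows: an alert fires only when the *average* usage across the interval exceeds the threshold, so a brief spike does not trigger it. The function name, sample values, and threshold below are hypothetical, not part of any monitoring API.

```python
def should_alert(samples_in_interval, threshold_percent):
    """Return True when average usage over the alert interval
    exceeds the threshold."""
    average = sum(samples_in_interval) / len(samples_in_interval)
    return average > threshold_percent

# One brief spike does not push the interval average past the threshold...
print(should_alert([40, 95, 42, 38, 41], 70))  # False
# ...but sustained high usage across the interval does.
print(should_alert([85, 90, 88, 92, 87], 70))  # True
```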
Time series data is data collected at regular intervals and arranged chronologically to examine changes over time.
A trend indicates a general tendency in a data set over time. Trends are useful for recognizing changes and for predicting future behavior.
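One common way to surface a trend in time series data is a simple moving average, which smooths out individual data points so the general tendency is easier to see. The sketch below assumes hypothetical CPU usage percentages sampled at regular intervals.

```python
def moving_average(series, window):
    """Average each consecutive run of `window` data points."""
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

# Hypothetical CPU usage (%) collected at regular intervals.
cpu_percent = [10, 12, 11, 20, 30, 45, 60, 75]
smoothed = moving_average(cpu_percent, 3)
print(smoothed)  # steadily rising values indicate an upward trend
```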
Units are standardized measures that make values comparable.