When trying to define intelligent data logging, it is useful to compare it with traditional data logging. In the past, data loggers or recorders would typically be configured to sample a set number of channels at a fixed rate and store the recorded values in memory. The data could then be downloaded to a PC and analyzed to identify trends, anomalies and other events. The analysis was often very time consuming, requiring the user to sift through thousands or millions of data points to find a particular region of interest. Consider, for example, a data logger that measures several inputs once a second; within one day there is already more data than will fit into a single Microsoft Excel spreadsheet, the tool of choice for many types of analysis. Higher-speed loggers that sample at 100 to 1000 Hz can simply overwhelm the user with data, and analysis becomes a matter of finding a needle in a haystack.
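A quick back-of-the-envelope check illustrates the data volumes involved. The sketch below assumes the 65,536-row limit of a classic Excel (.xls) worksheet; newer .xlsx worksheets allow 1,048,576 rows but are still easily exceeded at higher sample rates.

```python
# Rough data-volume check for a logger recording one channel at a fixed rate.
# Row limits: 65,536 for a classic .xls worksheet, 1,048,576 for modern .xlsx.

SECONDS_PER_DAY = 24 * 60 * 60      # 86,400 samples per channel per day at 1 Hz
XLS_ROW_LIMIT = 65_536              # classic Excel worksheet rows
XLSX_ROW_LIMIT = 1_048_576          # modern Excel worksheet rows

for rate_hz in (1, 100, 1000):
    samples_per_day = rate_hz * SECONDS_PER_DAY
    print(f"{rate_hz:>5} Hz -> {samples_per_day:>11,} rows/day per channel "
          f"({samples_per_day / XLS_ROW_LIMIT:,.1f}x a classic worksheet, "
          f"{samples_per_day / XLSX_ROW_LIMIT:,.2f}x a modern one)")
```

Even at 1 Hz, a single channel overflows a classic worksheet in one day, and a 1000 Hz logger produces more rows per day per channel than even a modern worksheet can hold many times over.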
With advances in technology and computing power, modern data loggers can dramatically reduce the time and effort required to extract critical information. These "intelligent" data loggers incorporate features that go well beyond the functionality of traditional data loggers and help the user home in on the information they are looking for. The key is separating the raw data from the information it contains; the information is what the user is really after.
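As a purely illustrative sketch of what "separating data from information" can mean, consider reducing a stream of raw samples to the moments a signal crosses a threshold. The channel, threshold value, and function name below are hypothetical and are not drawn from any particular logger's feature set.

```python
# Illustration only: reduce raw (timestamp, value) samples to the rising
# threshold crossings, i.e. the events a user actually cares about.

from typing import Iterable, List, Tuple

def threshold_events(samples: Iterable[Tuple[float, float]],
                     threshold: float) -> List[Tuple[float, float]]:
    """Return the (timestamp, value) pairs where the signal first exceeds threshold."""
    events = []
    above = False
    for t, value in samples:
        if value > threshold and not above:
            events.append((t, value))   # rising crossing: keep only this sample
        above = value > threshold
    return events

# A day of 1 Hz samples with one brief excursion above 30.0.
raw = [(t, 25.0 + (5.0 if 40_000 < t < 40_010 else 0.0) + 0.001 * (t % 7))
       for t in range(86_400)]
print(len(raw), "raw samples ->", len(threshold_events(raw, 30.0)), "event(s)")
```

The 86,400 raw samples collapse to a single event record, which is the kind of reduction an intelligent logger aims to perform at the point of measurement rather than leaving it to the user afterward.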