
Part I: How Fast of a Sample Rate Does Your Data Logger Need?

09-13-2018 02:07 PM CET | Industry, Real Estate & Construction

Press release from: CAS DataLoggers


We often receive requests for data logging systems that can sample their inputs hundreds or thousands of times per second. While we offer several systems that provide kHz sample rates, we usually try to get a better understanding of the application to see whether that speed is really necessary. As you increase the sample rate, a number of costs go up quickly. It’s not just the cost of the measurement hardware itself; other costs include:

• Data storage necessitated by larger data volume
• Faster processor to maintain data throughput
• More expensive sensors with faster response times
• More expensive power supplies to accommodate increased power requirements
• More advanced software to deal with large data sets
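As a rough illustration of how quickly raw data volume grows with sample rate, here is a minimal sketch; the 8-channel count and 4 bytes per sample are illustrative assumptions, not the record format of any particular logger:

```python
# Rough estimate of raw storage generated per day at a given sample rate.
# Assumes 4 bytes per sample per channel; real logger record sizes vary.

def daily_storage_bytes(sample_rate_hz: float, channels: int = 8,
                        bytes_per_sample: int = 4) -> float:
    """Raw bytes of data generated per day."""
    samples_per_day = sample_rate_hz * 86_400  # seconds in a day
    return samples_per_day * channels * bytes_per_sample

for rate_hz in (1 / 60, 1, 100, 1000):  # once a minute, 1 Hz, 100 Hz, 1 kHz
    mb_per_day = daily_storage_bytes(rate_hz) / 1e6
    print(f"{rate_hz:8.3f} Hz -> {mb_per_day:8.1f} MB/day")
```

Going from 1 sample per second to 1,000 per second multiplies the storage, throughput and processing burden by a factor of 1,000, which is where the costs above come from.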

By getting a better understanding of the application and requirements, you can achieve the best sample rate at an affordable cost.

What Is the Goal?
One of the first questions we usually ask is the “why” question: why are you data logging? There are probably almost as many reasons as there are applications, but broadly speaking they can be broken into three categories:

1. Direct Use of Data
2. Indirect Use of Data
3. Contextual Use of Data – where the use is even further removed than the indirect case

1. Direct Use of the Data. It could be that you need a hard-copy record of the data to show a regulator or customer that your process was within specification. Or maybe you are doing engineering analysis and need to evaluate the behavior of some system, such as the deflection of a beam versus load. Whatever the reason, you need the actual data, even if it just goes into a folder somewhere never to be seen again. Of the three categories, this one probably has the most justifiable reason for faster sample rates.

2. Indirect Use of the Data. Often you don’t really need the raw data; you need to know if it changes or goes above or below a certain point. A very common example is refrigerator/freezer monitoring. While you might want to know whether your vaccine refrigerator is at 38.2 or 40.1 degrees, it is much more common to just want the average temperature and an alarm if it goes outside the CDC recommended range of 35° to 46°F. You could measure the temperature once a second, but the temperature will never change that fast; even if the door were left open, it would take minutes for the temperature of the contents to change by more than a degree or two. If you measure once a minute (which is still fairly fast) and save only the minimum, maximum and average, you have gone from 86,400 data points a day (measuring once a second) to 3 a day, and you have still captured all of the critical information.
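A minimal sketch of that kind of data reduction, assuming once-a-minute Fahrenheit readings held in a plain Python list; the sample values are illustrative only:

```python
# Reduce a day of once-a-minute readings (1,440 values) to the three
# numbers that matter: minimum, maximum and average temperature.

def daily_summary(readings_f: list[float]) -> tuple[float, float, float]:
    """Return (min, max, average) for one day's worth of readings."""
    return min(readings_f), max(readings_f), sum(readings_f) / len(readings_f)

readings = [38.2, 38.1, 38.4, 39.0, 38.6, 37.9]  # in practice, 1,440 values per day
lo, hi, avg = daily_summary(readings)
print(f"min {lo:.1f} °F  max {hi:.1f} °F  avg {avg:.1f} °F")

# Alarm on the CDC recommended range of 35 °F to 46 °F.
if lo < 35.0 or hi > 46.0:
    print("ALARM: temperature outside recommended range")
```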

3. In the third class of applications, sometimes called contextual, the use of the data is even more removed than the indirect use above. Here you’re really just looking for some other phenomenon that is inferred from the signal you are measuring. One recent example was monitoring a remote pump house; in this case, we captured a signal from the contactor which activated the pump and from a current transducer on the power lead. The data itself was not important; what the customer wanted to know was whether the contactor had closed with no current flowing, indicating the pump motor had failed. When picking a sample rate for this type of application, the determining factor is how soon you need to know of an event – often finding out within 15 or 30 minutes is soon enough.
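A minimal sketch of that kind of contextual check follows; the function name, the 0.5 A threshold and the sample readings are illustrative assumptions, not details of the actual system:

```python
# Fault condition: the contactor is closed (pump commanded on) but the
# current transducer reports essentially no current -> motor has likely failed.

def pump_fault(contactor_closed: bool, current_a: float,
               threshold_a: float = 0.5) -> bool:
    """True when the pump is commanded on but drawing (almost) no current."""
    return contactor_closed and current_a < threshold_a

# One evaluation per 15- or 30-minute logging interval is typically soon enough.
samples = [
    (True, 12.3),   # contactor closed, motor drawing current -> normal run
    (False, 0.0),   # contactor open, no current              -> normal idle
    (True, 0.0),    # contactor closed, no current            -> fault
]
for closed, amps in samples:
    state = "FAULT" if pump_fault(closed, amps) else "ok"
    label = "closed" if closed else "open"
    print(f"contactor={label:6s} current={amps:5.1f} A -> {state}")
```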

How Fast Should You Sample?
When trying to determine the appropriate sample rate, there are two main factors to consider: how fast the quantity you’re measuring changes and how fast the sensor responds. If the application is measuring the voltage signal from an electronic circuit, the signal can change in microseconds, but if you are measuring a physical phenomenon such as pressure, temperature or level, the change can be much slower – on the order of seconds to minutes. For example, we usually recommend the use of a thermal buffer for refrigerator/freezer temperature monitoring applications. As we have shown in our white paper “Comparison of Thermal Buffer Effectiveness,” the temperature that the probe sees inside the buffer can take 5 minutes or more to respond to changes in the outside environment.

Equally important is how fast the sensor responds to these changes. Sometimes the sensor data sheet will give a settling time to reach a given percentage of the final value; for example, a humidity sensor may take 30 seconds to get within 3% of the final value. Other times the specifications will give a time constant for the sensor. Typically, the time constant is the amount of time it takes the sensor to reach 63% of the final value when exposed to a step change. Getting to within 1% of the final value requires about five time constants; a ¼” thermocouple probe with a time constant of 10 seconds in still air will require 50 seconds to get within 1% of the actual value. Here are some typical response times:
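For a first-order sensor, the fraction of a step change reached after time t is 1 − exp(−t/τ). A short sketch of that relationship, using the ¼” thermocouple figure from the text (τ = 10 seconds in still air):

```python
import math

def step_response_fraction(t_s: float, tau_s: float) -> float:
    """Fraction of a step change a first-order sensor reaches after t seconds."""
    return 1.0 - math.exp(-t_s / tau_s)

tau = 10.0  # time constant of the 1/4" thermocouple probe in still air
for n in range(1, 6):
    frac = step_response_fraction(n * tau, tau)
    print(f"{n} time constant(s) = {n * tau:4.0f} s -> {frac * 100:5.1f}% of final value")
# 1 tau -> ~63.2%; 5 tau (50 s) -> ~99.3%, i.e. within 1% of the actual value
```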

Visit website for chart @ https://www.dataloggerinc.com/blog/data-logger-sample-rate/

One final note on making fast (> 100 samples/second) measurements – you can’t ignore the effect of electrical noise on the measurement. We often have customers who, for whatever reason, want to measure small, millivolt-level signals at high speed. The problem is that unless you are very careful with the wiring and shielding, these signals are susceptible to picking up electrical noise generated by AC power lines. In the U.S., this noise is at 60 Hz, which has a period of about 16.7 milliseconds. To get accurate measurements you have to integrate or sample over an integer number of periods to average or filter out this noise. Sampling any faster will almost certainly result in noisy or inaccurate data.
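A minimal sketch of why the averaging window matters, simulating a small DC signal with 60 Hz pickup and averaging it over a full line period versus half a period; the signal and noise amplitudes and the ADC rate are illustrative assumptions:

```python
import math

LINE_HZ = 60.0       # U.S. power-line frequency
SIGNAL_MV = 5.0      # the true millivolt-level DC signal we want to measure
NOISE_MV = 2.0       # assumed amplitude of 60 Hz pickup on the leads
SAMPLE_HZ = 12_000   # assumed ADC rate, well above the line frequency

def measure(window_s: float) -> float:
    """Average the simulated signal plus line noise over one measurement window."""
    n = round(window_s * SAMPLE_HZ)
    total = 0.0
    for i in range(n):
        t = i / SAMPLE_HZ
        total += SIGNAL_MV + NOISE_MV * math.sin(2 * math.pi * LINE_HZ * t)
    return total / n

period = 1.0 / LINE_HZ  # about 16.7 ms
print(f"averaged over one full period: {measure(period):.3f} mV")      # noise cancels
print(f"averaged over half a period  : {measure(period / 2):.3f} mV")  # noise biases the result
```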

Look out for Part II of “How Fast of a Sample Rate Does Your Data Logger Need?” where we discuss how sample rate impacts the data logger, as well as how to best process the data for your needs.

Computer Aided Solutions, LLC. dba CAS DataLoggers is a distributor of data loggers, paperless recorders and data acquisition equipment.

We have the industry’s most complete selection of data logging equipment, with hundreds of different models from more than 18 manufacturers. With data loggers from 1 to 300 channels, we can record temperature, humidity, force/strain, pressure, flow, voltage, current, resistance, vibration and other signals, as well as connect to serial (RS-232/RS-485), CAN/OBD or SDI-12 devices. We sell directly to end users and also work through a network of distributors and resellers throughout the United States, Canada, Central and South America.

CAS DataLoggers
8437 Mayfield Rd Unit 104
Chesterland, OH 44026
