How to Simplify Data Analysis and Find Answers Quicker

When trying to define intelligent data logging, it is useful to compare it with traditional data logging. In the past, data loggers or recorders were typically configured to sample a set number of channels at a fixed rate and store the recorded values in memory. The data could then be downloaded to a PC and analyzed to identify trends, anomalies, and other events. This analysis was often very time-consuming, requiring the user to sift through thousands or millions of data points to find a region of interest. Consider, for example, a data logger that measures two inputs once per second; within one day this would fill more than 86,000 rows in a Microsoft Excel spreadsheet. Now imagine several months' worth of data, or a higher-speed logger that samples at 100 to 1,000 Hz. This volume of data can simply overwhelm the user, and analysis becomes a search for a needle in a haystack.
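To put those volumes in perspective, the following sketch (plain Python, purely illustrative) works through the arithmetic for the sampling rates mentioned above and shows the kind of brute-force scan that traditional post-download analysis forces on the user; the threshold-based search is an assumed stand-in for whatever criterion actually defines a region of interest.

```python
# Illustrative only: how quickly conventional logging fills a spreadsheet,
# and the needle-in-a-haystack search it leaves for the user.

SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds

def rows_per_day(sample_rate_hz: float) -> int:
    """Spreadsheet rows produced in one day: one row per sample instant,
    each row holding the values of every logged channel."""
    return int(sample_rate_hz * SECONDS_PER_DAY)

for rate in (1, 100, 1_000):
    print(f"{rate:>5} Hz -> {rows_per_day(rate):>11,} rows per day")
# Output:
#     1 Hz ->      86,400 rows per day
#   100 Hz ->   8,640,000 rows per day
#  1000 Hz ->  86,400,000 rows per day

def find_regions_of_interest(values, threshold):
    """Naive scan over every recorded value for samples exceeding a
    threshold -- the manual review traditional logging requires."""
    return [i for i, v in enumerate(values) if v > threshold]
```

Even at the modest 1 Hz rate, a few months of recording produces millions of rows, which is why downloading everything and scanning it afterward scales so poorly.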