The way sensors are monitored is changing in the oil and gas and petrochemical industry. The number of sensors is increasing while the number of subject matter experts (SMEs) is decreasing. To cope with this trend, the industry needs to become more efficient.
Sensor analysis is shifting from expert inspection of live sensor data in the field to statistical analysis of historical data from a database in the office. Standing next to a sensor, an operator can observe every aspect of it and determine whether it is running properly, but its history remains unknown. When analyzing historical process data, on the other hand, the accompanying diagnostic and maintenance data are often missing. Without diagnostic and fault-alarm data, the quality of the sensor data cannot be confirmed, and unconfirmed data is not reliable.
Performing maintenance on a sensor on a fixed schedule may even harm the quality and performance of the sensor. Maintenance management activities should instead rely on actual, accurate, and controlled data. Which conditions trigger a maintenance task, and how can it be predicted when the task needs to be executed, based on priorities? What data is needed, and how can this data be made available to a (remote) SME? What defines reliable data? How can the current infrastructure and procedures be adjusted to meet these new challenges from a data quality, software, and access perspective?
Sensor data quality can be assessed through a set of controls such as validation/calibration results, operational state, calibrated ranges, and alarms. The result of this assessment needs to be tied to the data being stored; this Sensor Data Quality label guarantees the accuracy and reliability of the data.
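As a minimal sketch, such an assessment could combine the controls into a single label stored alongside each reading. The control names, label values, and rules below are illustrative assumptions, not an industry standard:

```python
# Illustrative sketch: derive a quality label for one reading from a set of
# controls. Names, label values, and rules are assumptions for illustration.

def quality_label(value, low, high, calibration_valid, active_alarms):
    """Return 'good', 'suspect', or 'bad' for a single sensor reading.

    value             -- the measured value
    low, high         -- the calibrated range of the sensor
    calibration_valid -- True if the last validation/calibration passed
    active_alarms     -- list of active diagnostic/fault alarm names
    """
    if not calibration_valid or "fault" in active_alarms:
        return "bad"        # quality cannot be confirmed
    if not (low <= value <= high):
        return "suspect"    # outside the calibrated range
    if active_alarms:
        return "suspect"    # non-fault diagnostics are active
    return "good"

# The label is tied to the stored reading itself.
reading = {"tag": "FT-101", "value": 42.7}
reading["quality"] = quality_label(42.7, 0.0, 100.0, True, [])
```

Storing the label with the reading means any later consumer, such as a remote SME, can filter on confirmed data without re-deriving the controls.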
Historical data needs to be verified and filtered to manage the ever-increasing data flow.
The data needs to be judged against certain criteria. When a number between 1 and 100 is expected and the result is 1000, the conclusion is that the result is incorrect. Judging data in the oil & gas and petrochemical industries is more complex. These industries may also have additional requirements for managing equipment status and metadata. For example:
– The results of statistical process control
– Calibrated ranges
– Measurement uncertainties
– Operational states
– Diagnostic information
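A simple range check like the 1-to-100 example can be extended with criteria of this kind. The sketch below assumes hypothetical field names for operational state and measurement uncertainty:

```python
# Sketch of judging a reading against several criteria. All field names
# are illustrative assumptions, not a real schema.

def is_plausible(reading):
    """Accept a reading only if every configured criterion holds."""
    checks = [
        reading["low"] <= reading["value"] <= reading["high"],  # calibrated range
        reading["state"] == "in_service",                       # operational state
        reading["uncertainty"] <= reading["max_uncertainty"],   # measurement uncertainty
        not reading["diagnostic_alarms"],                       # diagnostics clear
    ]
    return all(checks)

# A value of 1000 where 1-100 is expected fails the range criterion.
r = {"value": 1000, "low": 1, "high": 100, "state": "in_service",
     "uncertainty": 0.5, "max_uncertainty": 1.0, "diagnostic_alarms": []}
```

Keeping each criterion as a separate check makes it straightforward to add site-specific requirements later.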
Evaluating reliability data can confirm the validity of analytical data. Data that is not reliable can be stored for a shorter period and analyzed if needed. One example where such data might still be useful is the determination of outliers for statistical process control.
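As a sketch of that outlier use, a suspect value can be tested against control limits derived from reliable history. This uses a standard three-sigma rule; the data values are made up for illustration:

```python
# Sketch: 3-sigma control limits for statistical process control.
# The history values are invented for illustration.
import statistics

def control_limits(samples, k=3.0):
    """Return (lower, upper) limits at mean +/- k sample standard deviations."""
    mean = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return mean - k * sigma, mean + k * sigma

def is_outlier(value, samples, k=3.0):
    """True if value falls outside the control limits of the samples."""
    lo, hi = control_limits(samples, k)
    return value < lo or value > hi

# Reliable history from a stable period of operation.
history = [50.1, 49.8, 50.3, 50.0, 49.9, 50.2, 50.1, 50.0, 49.7, 50.2]
print(is_outlier(85.0, history))   # -> True
print(is_outlier(50.1, history))   # -> False
```

A point flagged here is not necessarily wrong; it is a candidate for the shorter-retention store, where it can still be inspected if needed.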
Putting quality stamps on real-time data assures that it is reliable and accurate, resulting in trustworthy data. Combining these data with maintenance management software provides evidence that the sensors are working correctly within their operating envelope and limits. Lessons learned can be shared, which shortens the time needed to identify root causes.
The growth of the industry calls for more efficient maintenance. The goal is to use quality data to reduce risk and improve cost efficiency (manpower, cost of maintenance, and cost of production). This is achievable through improved data quality, access, and control, which lead to more effective maintenance scheduling. With these improvements in place, remote SMEs can consult more easily and reliably, and all shared data will be controlled, transparent, and reliable. Condition-based and predictive maintenance tasks can then be created by combining the knowledge of technicians, engineers, and SMEs with the analysis of securely provided, accessible data.
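One way such a condition-based task could be triggered is from the quality labels themselves: raise a maintenance request when too many recent readings carry a degraded label. The window size and threshold below are illustrative assumptions:

```python
# Sketch of a condition-based maintenance trigger driven by quality labels.
# The 10% threshold is an illustrative assumption, not a recommended value.

def needs_maintenance(labels, max_bad_fraction=0.1):
    """Trigger when more than max_bad_fraction of recent labels are not 'good'."""
    degraded = sum(1 for label in labels if label != "good")
    return degraded / len(labels) > max_bad_fraction

# 6 degraded labels out of 50 recent readings = 12%, above the 10% threshold.
recent = ["good"] * 44 + ["suspect"] * 4 + ["bad"] * 2
print(needs_maintenance(recent))   # -> True
```

Because the trigger reacts to the sensor's actual condition rather than a calendar, it avoids the fixed-schedule maintenance that may harm sensor performance.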