It is difficult to argue with the notion that knowing your unit cost is essential for running a successful and profitable business. This is particularly critical for companies that operate large-scale industrial processes, where volumes are high and even a slight change in the unit cost can significantly impact profitability. Of course, all these businesses have their unit cost calculated and closely monitored. However, in most cases, it is done retrospectively by dividing the total spend by the number of units produced over a certain period, e.g., a month or a quarter.
The problem with this approach for process-based industries is that it presents only an average unit cost over the period and hides how the cost is influenced by important process-related variable factors, i.e., how efficiently the process was operated. We argue that this traditional approach is no longer sufficient. In the modern digital age, companies that run complex industrial processes equipped with multiple sensors and data-gathering software should have the real-time (or at least hourly) unit cost available to them. This will enable them to analyze how the process changes during the day, week, or month, identify good and bad performance periods, and investigate what influenced them. This will unlock numerous opportunities for process improvement and optimization.
This article highlights the importance of having a model that calculates the real-time unit cost for complex industrial processes, outlines the main steps in developing such a model, and then explains how the real-time unit cost can be used to improve and optimize the process and achieve the best possible unit cost. Overall, this might be quite a challenging journey. Therefore, Hint and CogInTech recently developed a solution that can help operators achieve tangible results quickly and without significant investments in software or hardware. This solution is briefly presented in this article as well.
Unit cost for process-based industries
A unit cost is traditionally defined as the total expenditure incurred by a company to produce, store, and sell one unit of a product or service. It is usually calculated once a quarter by the company accountants, who sum all the fixed and variable expenses and divide them by the number of units produced. The company's financial statements often report the unit cost, and company management and external stakeholders analyze it closely, as it is an integral indicator that combines most aspects of the company's operations and shows how well the company is run.
However, for process-based industries, the efficiency of the industrial process is one of the main factors with a significant impact on variable costs, and the traditional approach completely hides this influence by reporting only the average value. To illustrate this point, let's look at the plot below, which shows the contribution of process efficiency to the unit cost during a particular day, along with the mean unit cost value that would appear in the financial statement. The graph demonstrates that on 1 April 2020, between 15:00 and 19:00, something happened with the process that caused the unit cost to spike well above the reported average: it reached $900 instead of the average value of $570.
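To make the point concrete, here is a minimal sketch (with hypothetical numbers loosely following the example above) showing how an hourly unit cost series exposes a spike that the period average hides:

```python
# Hypothetical hourly data: (hour, variable cost in USD, units produced).
hourly_data = [
    ("13:00", 5_700, 10),
    ("14:00", 5_650, 10),
    ("15:00", 9_000, 10),   # process upset: variable cost spikes
    ("16:00", 8_900, 10),
    ("17:00", 5_600, 10),
]

# Hourly unit cost vs. the single average a retrospective report would show.
hourly_unit_cost = [cost / units for _, cost, units in hourly_data]
average_unit_cost = (sum(cost for _, cost, _ in hourly_data)
                     / sum(units for _, _, units in hourly_data))

for (hour, _, _), uc in zip(hourly_data, hourly_unit_cost):
    flag = "  <- spike hidden by the average" if uc > 1.25 * average_unit_cost else ""
    print(f"{hour}: ${uc:,.0f}{flag}")
print(f"Period average: ${average_unit_cost:,.0f}")
```

The average here ($697) sits well below the upset-hour values ($900 and $890), so a report containing only the average would never show that anything went wrong.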
Imagine you are the operations manager running that plant. Would you like this graph reported to you every day? The answer is, of course, yes: this graph not only gives valuable insight into how well the process is run but also allows you to identify and investigate such occurrences and eliminate their root causes. Moreover, as we will show below, this approach also opens opportunities for early problem detection, better shutdown and maintenance planning, process optimization, and reducing the unit cost to the minimum possible values.
However, it is currently highly uncommon for managers and engineers to have such reports available. The reasons are complex, but the main ones are that financial, process, and operations departments typically work in silos, and that until now there has been no simple tool to bring process and financial data together. Hint and CogInTech recently developed a solution for the latter issue, and our experience shows that implementing this tool brings different departments together around a common goal, which helps break those silos.
Unit cost model – prerequisites for success
OK, we now seem to agree that it is a good idea to have the real-time unit cost calculated and available for analysis. However, developing a model that will deliver this is a relatively complex project. Therefore, before starting this project, it is essential to highlight a few main prerequisites for its success.
First, since building this model requires cross-departmental cooperation, it is necessary to ensure management buy-in and support, as well as employee involvement and engagement. This is essential because, to build such a model, one needs to talk to process engineers, operators, and financial and commercial personnel, as they hold critical pieces of information required for the model to be reliable and trustworthy. Getting this buy-in is usually quite feasible, as the benefits of having the real-time cost calculated are easy to communicate to management, who are typically quite supportive of the idea.
Secondly, it is necessary to ensure that the process data needed for building the unit cost model is good quality data that is ideally managed and verified by specialized tools.
Maintaining low measurement uncertainty is one way to improve data quality, since every process measurement carries uncertainty. However, data quality is not just about the process measurements themselves. Assessing data quality means looking at all the data of the entire measurement system, including all measurements, usage, value, and historical maintenance, diagnostic, calibration, and validation data.
In today’s highly competitive landscape, to keep your company financially healthy and accountable, you want to:
- have a transparent system
- deliver the exact numbers, with the right quality, without altering the data
- optimize the system
- reduce errors
- work closer to operation limits while keeping it safe
- have traceability of your data
However, this is easier said than done. The desire to fix the problem is often set aside due to the expected high costs, effort, and time required.
Currently, operators use only data validity as a check, i.e., whether a reading falls within its expected range, e.g., a process reading expected to be around 50 bar while the instrument measures 60 bar.
We have found that other parameters are missing and should be added to the current real-time data analysis to get better results:
- Data integrity: the rate of change of the process measurement, e.g., a reading of 60 bar changing to 50 bar within 20 seconds.
- Data redundancy: two process measurements for the same parameter, e.g., two pressure transmitters are measuring 60 bar.
- Data consistency: a holistic view using different pieces of information about the system to verify the quality of a measurement, e.g., the data implies flow going from low to high pressure, which is physically inconsistent.
- Diagnostic real-time data: the current health status of the meter, e.g., is there still flow through the analyzer?
- Real-time uncertainty data: total and calibration uncertainty, recalibration when outside the uncertainty limits.
- Historical data: measurement data history, uncertainty data history, diagnostic data history, validation data history, maintenance data history, operational state history, sample analysis data history.
- Maintenance data: telling you about historical maintenance data like calibration, validation, verification, sample & calibration frequencies, manufacturer requirements, and sensor failures.
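The first three checks in the list above can be sketched as simple predicate functions. This is an illustrative sketch with hypothetical thresholds and function names, not a description of any vendor's implementation:

```python
def check_validity(value, low, high):
    """Validity: is the reading within its expected range?"""
    return low <= value <= high

def check_integrity(previous, current, dt_seconds, max_rate_per_sec):
    """Integrity: is the rate of change physically plausible?"""
    return abs(current - previous) / dt_seconds <= max_rate_per_sec

def check_redundancy(value_a, value_b, tolerance):
    """Redundancy: do two transmitters for the same parameter agree?"""
    return abs(value_a - value_b) <= tolerance

# Example: a pressure reading can pass the validity check yet fail integrity.
assert check_validity(60.0, 0.0, 100.0)            # 60 bar is in range, but...
assert not check_integrity(60.0, 50.0, 20.0, 0.1)  # 10 bar in 20 s is too fast
assert check_redundancy(60.0, 60.3, 0.5)           # paired transmitters agree
```

In a real deployment, each check would be evaluated continuously against live tags, and a failed check would flag the measurement as questionable rather than silently feeding it into the unit cost calculation.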
The Hint AML solution is one such tool, offering a comprehensive solution for Information Management, Maintenance, and Data Acquisition systems. It connects the dots from the sensor to the boardroom, turning real-time data into quality data, then into information, and finally into knowledge. If the quality of the process data is questionable, all the unit cost calculations are likely to be unreliable, so it is probably not worth progressing with the project until the data quality issues are resolved.
Finally, although not essential, it is preferable that the process used in the unit cost model is relatively problem-free and running smoothly. Otherwise, the model might be quite noisy and difficult to utilize. If the process has obvious issues, it is worthwhile to eliminate them first through traditional process improvement techniques, maintenance, etc.
Unit cost model – where to start and how to build it
Once the prerequisites for success listed above are in place, it is possible to start building the unit cost model for the process. Although it is theoretically feasible to model the entire plant, it might be more efficient to begin by modeling a particular process or unit within the plant, focusing on the most significant contributors to the variable cost. For example, this might be the crude oil quality and volumes as a function of chemical consumption for an upstream separation train, or high energy-consuming processes or units for downstream or petrochemical plants, e.g., crude distillation or hydrocracking units. This will build experience with simple models that can then be used to create a complex and comprehensive model of the entire plant or company.
Once the process or unit is selected, the next step is to build a model that takes into account all the main process parameters (e.g., flow rates, energy consumption, liquid levels, valve positions), all the interactions between them, and combines them with all the relevant financial data (e.g., energy cost, raw material cost). The goal is to calculate the unit cost as a function of the primary process parameters and financial figures. This might be tricky, so it is better to use tools that facilitate the process.
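In its simplest form, such a model is just a function from process rates and prices to a cost per unit of product. The sketch below, with entirely hypothetical parameter names and prices, illustrates the shape of that function for a single unit over one hour of operation:

```python
def unit_cost(feed_rate_t_h, energy_mwh_h, chemicals_kg_h,
              feed_price_usd_t, energy_price_usd_mwh, chem_price_usd_kg,
              fixed_cost_usd_h, yield_fraction):
    """Unit cost in USD per tonne of product for one hour of operation.

    Variable costs come from live process measurements multiplied by
    prices; fixed costs are allocated per hour; production is the feed
    rate scaled by the process yield.
    """
    variable_cost = (feed_rate_t_h * feed_price_usd_t
                     + energy_mwh_h * energy_price_usd_mwh
                     + chemicals_kg_h * chem_price_usd_kg)
    product_t = feed_rate_t_h * yield_fraction
    return (variable_cost + fixed_cost_usd_h) / product_t

# Example hour: 100 t/h feed at $300/t, 5 MWh at $80/MWh,
# 20 kg/h of chemicals at $2/kg, $1000/h fixed costs, 90% yield.
print(f"${unit_cost(100, 5, 20, 300, 80, 2, 1000, 0.9):.2f}/t")
```

A real model would add many more terms and the interactions between them, but the principle is the same: every hour of process data maps to one unit cost number.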
You can use whatever tool you prefer, but one option is the CogInTech Digital Twin Builder, which is free in its basic version and available at cogintech.com. This tool is designed specifically for building unit cost models; it is easy to use and supports formulas. It also lets you create a model of your process to make further analysis more accessible: it defines the inputs/features and goals/labels explicitly and identifies which variables can be controlled and which are merely measurable.
The tool allows you to move from having P&IDs or PFDs as your process representation to a simple model of the process where only the main elements of the process are shown. The tool will help you visualize the interactions between the components and supplement it with formulas to calculate the required parameters. An example of such a transition from P&IDs to a simple model is shown below.
It is recommended to run a half-day workshop to develop the process model with all the relevant specialists and stakeholders in the room. The goal of this workshop is to connect the financial numbers with the specific features of the technological process so that the unit cost can be calculated as a result. It is essential that all the knowledge about the process is represented, as it will be required to combine the financial and process details in the model. Thus, the presence of process engineers, operators, and finance specialists is essential for the success of the workshop and for building a reliable and accurate model of the process.
What is next? AI-powered Digital Twin and process optimization
Once the model that connects financial and process data is ready, the unit cost can be calculated by plugging the historical process and financial data into the model. Depending on the process and monitoring equipment installed, the frequency for the unit cost calculations might vary from every few minutes (or even seconds) to only every 12 or 24 hours. It is typically possible to calculate the hourly unit cost, which is sufficient for most practical applications. So, what are the main options for using this hourly unit cost data?
First, the analysis of historical data is likely to reveal numerous insights into how the plant and process have been operated in the past. The variability of the unit cost tells the story of how well the process was run and whether there were periods when process efficiency was suboptimal, leading to higher unit costs. Investigating such periods might reveal equipment problems or show how deviations from the maintenance regime influenced the unit cost. Conversely, if there were periods when the unit cost was lower than usual, it is worth analyzing them to extract the practices and conditions that led to this lower unit cost. After the historical data is studied, it is worthwhile to run weekly or daily unit cost reports, which are likely to be very useful for management, as they are an integral performance indicator of how well the process was run and whether there were any apparent problems.
Secondly, connecting the model to real-time data can give an operator valuable insight into the health status of the process. If the unit cost is close to the average, it is reasonable to assume that things are normal. However, if the unit cost starts creeping up, it is worth investigating the reason for the increase and addressing it as soon as practically possible. An extension to the Hint Global AML Solution can help deliver this option.
Finally, perhaps the most powerful application of such a model is process optimization. Once the process model is built and there is effectively a function that connects process and financial data to calculate the unit cost, it is possible to apply modern machine learning tools and develop a digital twin of the process using historical data. This digital twin can then be interrogated to extract the best practices and operating conditions that minimize the unit cost. These can be used to develop a set of operational instructions that will help run the process at the maximum efficiency level. Moreover, if the digital twin is connected to the live process data, it can provide real-time advice to operators on adjusting the process settings to achieve the best performance. Hint, in partnership with CogInTech, can help operators build such digital twins.
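The "interrogation" step can be sketched as follows. Here a toy quadratic stands in for the trained digital twin (in practice it would be a neural network fitted to historical process and financial data), and the controllable settings, their ranges, and the cost shape are all hypothetical:

```python
def surrogate_unit_cost(temperature_c, reflux_ratio):
    """Toy stand-in for a trained digital twin: predicted unit cost, USD/t.

    A real surrogate would be a model fitted to historical data; this
    quadratic merely gives it a minimum to find.
    """
    return (570
            + 0.05 * (temperature_c - 340) ** 2
            + 80 * (reflux_ratio - 2.5) ** 2)

# Search the controllable settings for the lowest predicted unit cost.
best = min(
    ((t, r)
     for t in range(320, 361)                       # temperature grid, deg C
     for r in [x / 10 for x in range(15, 36)]),     # reflux ratio grid
    key=lambda settings: surrogate_unit_cost(*settings),
)
print(f"Recommended settings: T={best[0]} C, reflux={best[1]}, "
      f"predicted unit cost ${surrogate_unit_cost(*best):.0f}/t")
```

A brute-force grid search is shown for clarity; with more controllable variables one would switch to a gradient-based or Bayesian optimizer, and any recommendation would still need to be checked against process safety and operating constraints.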
Real-time unit cost, an integral characteristic of the process health status, is a valuable tool for modern process-based industries. It opens numerous opportunities for increasing efficiency through process optimization and for improving reliability by detecting problems early. Projects to build unit cost models typically have very high ROI and pay back within months.
Hint, in partnership with CogInTech, is happy to assist companies with building unit cost models and digital twins for optimization. We offer the following tiers of service:
- Development of a simplified digital twin for unit cost calculations
- Analysis of historical data using the simplified digital twin and providing recommendations for the process improvement
- Connecting the simplified digital twin with the live data to provide real-time unit cost and regular reporting
- Development of a comprehensive AI-powered digital twin for a process, with all the interactions within the process accurately modeled by neural networks
- Detailed process optimization recommendations based on the AI-powered digital twin model of the process
- Connecting the AI-powered digital twin to real-time data to provide guidance to operators in real-time and detect problems early
If you are interested, please get in touch, and we will be happy to help. If you want to learn more about Hint, please visit our website www.hint-global.com.
Mr. Vladislav Romashov, CogInTech, UK
Mr. Anton Varentsov, CogInTech, Russia
Mr. Wolter Last, Hint Americas Inc., Houston, Texas, USA