Single-use systems in Life Sciences – a win-win-situation
A technology that is gaining traction while also meeting regulatory compliance
Article · 30.09.2025
Summary
Single-use systems are rapidly gaining traction in biopharma due to their flexibility, reduced contamination risk, and suitability for continuous processing – especially in personalized medicine and gene therapies.
Calibration challenges arise with single-use instruments, as they often lack full traceability and cannot be factory-calibrated as a complete unit, unlike multi-use systems.
Regulatory compliance requires instruments to be verified and calibrated before use. Single-use devices must meet standards like FDA 21 CFR 820.72 and ISO 9001, which is often difficult without innovative solutions.
Risk-based approaches (e.g., FMEA, HAZOP) are essential for integrating single-use instruments into cGMP environments. Tools like Heartbeat Technology support compliance and process safety.
Best practices include combining tight manufacturing tolerances, automatic calibration data transfer, and onboard self-verification to ensure accuracy and meet regulatory expectations.
Table of contents
From multi-use to single-use technology – the ideal single-use instrument
Single-use systems are increasingly becoming the backbone of modern biomanufacturing. While traditional multi-use systems maintain their importance for specific applications, the increasing demand for personalized medicines, cell and gene therapies, as well as the need for greater flexibility, efficiency and contamination control, will strongly drive the adoption and growth of single-use technologies in 2025 and beyond. Single-use technologies are designed for one-time use and then discarded, which simplifies operations and reduces the risk of cross-contamination. The future most likely involves a blend of both technologies, with single-use systems leading in flexibility and speed for many evolving modalities. The growth of intensified continuous processing in biopharmaceutical manufacturing, coupled with the benefits of single-use technology, creates a demand for single-use sensors with a proven performance record for long-term use.
To fulfill regulatory compliance, the instrument must be suitable for its intended purpose and be calibrated, inspected and checked prior to use as well as maintained on a regular basis. These requirements apply to any instrument which is installed in a cGMP (current Good Manufacturing Practice) environment independent of its design (multi-use or single-use technology). The frequency and the technology used to perform these checks are typically established through a risk analysis and depend on the criticality of the measuring point.
Calibration procedures and processes for multi-use instruments are well defined, recognized and implemented globally across the life sciences industry.
Single-use instruments on the other hand typically do not offer the same quality of certificates and the available documentation varies according to instrument type and manufacturer. Documentation may include partial factory calibration certificates or a general manufacturer declaration which states the expected accuracy of the system. The instruments are often accompanied by additional documentation stating the calibration factor that may require manual transfer to the control system during commissioning.
Many single-use flowmeters on the market today provide only very limited self-diagnostic features that give the user some level of information about the health of the device. However, most of them fail to provide conclusive and traceable evidence that the instrument has been commissioned properly and is thus operating according to specification.
From an operator point of view, an “ideal” single-use instrument should include the following features:
Seamless integration into an existing digital infrastructure
Support for PAT (Process Analytical Technology), including analytical technologies suited to high-throughput or continuous process systems
Preferably arriving calibrated and ready to use without manual intervention
Compatible with nonintrusive calibration and integrity testing
The same or better quality of measurement as traditional, multi-use sensing technology
Shift to single-use technology
With advances in therapeutic and diagnostic solutions, DNA sequencing and genome editing, the biotech industry is rapidly evolving, driving demand for genetics and regenerative medicines.
Biopharmaceuticals such as mammalian cell-based recombinant proteins are the most rapidly growing product segment in modern Life Sciences. The availability of high-productivity cell lines delivering titers of up to double-digit grams per liter has resulted in production facilities shrinking in size. This, combined with health care's demand for more economical production processes and an increasing number of potential product candidates, has contributed to multi-use equipment being replaced by single-use technology over the last 20 years. Single-use technologies have been in use in biopharmaceutical manufacturing since the early 1990s, marking a significant shift in manufacturing practices.
About a decade ago, the manufacture of several hundred kilograms per year of a monoclonal antibody, often sufficient for a niche or biosimilar market, would have required multiple runs in 5,000 to 10,000 l or larger stainless-steel bioreactors and other comparably scaled equipment. Now, the same amount can be manufactured faster, and more often at lower cost, with a few or even a single 500 to 2,000 l single-use bioreactor. Single-use systems have lower upfront costs than traditional manufacturing methods, making them an attractive option for many companies in the biopharmaceutical industry and the pharmaceutical industry in general. The costs for repeated purchases of single-use components are generally more than compensated by the avoidance of cleaning, sterilization and validation, and of the corresponding loss of time and flexibility that comes with multi-use stainless-steel systems.
Single-use or disposable bioprocessing equipment is now used for ≥85 % of pre-commercial scale, i.e., preclinical and clinical biopharmaceutical manufacturing, and is increasingly being adopted for commercial product manufacturing. Single-use systems are used for various applications such as cell culture, harvest, purification, formulation, and filling in biopharma, making them versatile tools in the industry.
In addition to the emergence of personalized medicine, improvements in cell culture processes and the growth of biologics have spurred the growth of single-use technology. Process intensification efforts are transforming the biotech process from a pure batch process to a continuous process with the aim of reducing manual operator interaction. Single-use systems require less energy and resources to operate than reusable systems, further enhancing their appeal in modern biomanufacturing. This shift toward a fully automated process also increases the demand for reliable instrumentation.
Quality by Design (QbD) and Process Analytical Technology (PAT)
QbD (Quality by Design) and PAT (Process Analytical Technology) are two interconnected approaches used in the biotech industry to enhance manufacturing processes and product quality. QbD is a proactive strategy that integrates quality considerations throughout the product life cycle, while PAT focuses on real-time monitoring and control of critical process parameters to ensure consistent product quality. Essentially, QbD sets the framework and PAT provides the tools for achieving that framework's goals.
Technology transfer from R&D to a full-scale cGMP production is often problematic and inefficient due to a complex handover of knowledge, information and skills. If a process engineer is missing important information from earlier studies, this can cause significant delays downstream, which means the product will be late to market. To improve this situation, regulators and industries represented in the International Conference on Harmonization (ICH) have adopted the principle of QbD. This means that the product’s Critical Quality Attributes (CQAs) as well as the CQAs of drug substance, excipients, intermediates or Critical Process Parameters (CPPs) impacting drug product CQAs must be controlled and maintained during the manufacturing process within an appropriate limit, range or statistical distribution to ensure the desired product quality. The resulting product quality is guaranteed if all critical production parameters stay within an acceptable range which is defined as the design space. If a process parameter is considered critical for process control (CPP), it is of utmost importance that the relevant instrument provides accurate and reliable measuring results during the entire life cycle.
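The design-space idea can be reduced to a simple check: each Critical Process Parameter is compared against its acceptable range, and a batch is only considered inside the design space if every CPP passes. The following is a minimal sketch; the parameter names and limits are invented for illustration and do not come from any specific process.

```python
# Hedged sketch: checking hypothetical Critical Process Parameters (CPPs)
# against a design space expressed as simple min/max ranges.

DESIGN_SPACE = {
    "temperature_C": (36.5, 37.5),   # hypothetical limits
    "pH": (6.8, 7.2),
    "flow_ml_min": (95.0, 105.0),
}

def within_design_space(readings: dict) -> bool:
    """Return True only if every CPP reading lies inside its range."""
    return all(
        lo <= readings[name] <= hi
        for name, (lo, hi) in DESIGN_SPACE.items()
    )

readings = {"temperature_C": 37.0, "pH": 7.0, "flow_ml_min": 101.3}
print(within_design_space(readings))  # True for these example values
```

In practice the design space is usually multidimensional and may be defined by models rather than independent ranges, but the pass/fail logic at the control layer follows this pattern.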
Risk-based approach to single-use technology in biopharmaceutical manufacturing process
Quality Risk Management (QRM) is now a regulatory expectation, and it makes good business sense. The goal of the risk assessment is to increase process understanding and deliver safe and effective products to the patients. Internationally recognized organizations such as ISPE have published guidance documents to support the implementation of risk-based concepts, e.g., ISPE GAMP® Good Practice Guide: A Risk-Based Approach to Calibration Management (Second Edition).
The good news for single-use applications is – as far as risk management is concerned – that most of the hard work has been done in the multi-use sector already. The challenge is how to take this knowledge and apply it to single-use instrumentation.
Risks are encountered throughout the entire biopharmaceutical manufacturing process – from raw material supply through manufacturing and filling operations to final distribution. Several assessment tools are available to evaluate, manage and mitigate the risk in a process. QRM methods used to identify the risks and develop strategies to minimize or control them include Failure Mode Effect Analysis (FMEA), Fault Tree Analysis (FTA), Preliminary Risk Analysis (PRA), Hazard and Operability Analysis (HAZOP) as well as Hazard Analysis and Critical Control Points (HACCP).
Incorporating the safety parameters of the instruments into the QRM tool is crucial but sometimes problematic. Standard "off the shelf" process automation does not always provide the required parameters and information. Process instruments specifically designed to provide the highest functional safety (e.g., developed according to IEC 61508) are most suitable for critical applications. Results and detailed information about the safety design, such as FMEDA (Failure Modes Effects and Diagnostic Analysis) data, are readily available from the instrument manufacturer and can be used as direct QRM input for risk mitigation calculations.
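In an FMEA, for instance, each failure mode is typically scored for severity, occurrence, and detectability, and the product of the three scores gives a Risk Priority Number (RPN) used to rank mitigation efforts. A minimal illustration follows; the failure modes and scores are invented for demonstration, not taken from any published analysis.

```python
# Hedged sketch: ranking hypothetical failure modes by Risk Priority Number.
# RPN = severity * occurrence * detection (each scored 1-10, 10 = worst).

failure_modes = [
    # (description, severity, occurrence, detection)
    ("wrong calibration factor entered", 8, 4, 7),
    ("flow tube damaged during assembly", 9, 2, 5),
    ("sterilization shifts sensor accuracy", 7, 3, 8),
]

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk Priority Number: higher means higher priority for mitigation."""
    return severity * occurrence * detection

ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
for desc, s, o, d in ranked:
    print(f"RPN {rpn(s, o, d):3d}  {desc}")
```

Note how a hard-to-detect failure (high detection score) such as a wrong calibration factor rises to the top of the list even when its occurrence is moderate.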
Legal requirements for maintenance of multi-use and single-use instrumentation
Instruments must be checked prior to use (factory calibration and on-site commissioning) and regularly as part of a recurring maintenance interval. Such checks can be performed through wet calibrations and/or by utilizing onboard verification functionalities (e.g., Heartbeat Technology).
A calibration of an instrument, for example a flowmeter, is the determination and documentation of the difference between the displayed value and the true value of the measurand without technical intervention. Traceability is accomplished by a formal comparison to a standard which is related to national or international standards. Deviations detected between the displayed value and the measured reference value can be corrected after the calibration by adjusting the calibration factor. A calibration protocol is issued to document the findings and is kept on record for possible audits.
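The arithmetic behind a single calibration point can be shown in a few lines: the relative deviation between the displayed value and the traceable reference value is computed, and an adjusted calibration factor is derived that would correct the reading. The numbers used here are purely illustrative.

```python
# Hedged sketch: deviation found during a wet calibration and the
# adjusted calibration factor that would correct it (illustrative values).

def calibration_error_pct(displayed: float, reference: float) -> float:
    """Relative deviation of the displayed value from the reference, in %."""
    return (displayed - reference) / reference * 100.0

def adjusted_factor(old_factor: float, displayed: float, reference: float) -> float:
    """Scale the existing calibration factor so the corrected reading
    matches the traceable reference value."""
    return old_factor * reference / displayed

displayed, reference = 100.8, 100.0   # e.g. kg/h, illustrative only
print(f"deviation: {calibration_error_pct(displayed, reference):+.2f}%")
print(f"new factor: {adjusted_factor(1.0, displayed, reference):.5f}")
```

Both the deviation found and the adjustment applied would be documented in the calibration protocol described above.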
What is verification?
A verification serves to confirm that an instrument is still performing within the bounds of acceptable performance and specification. It is often used to confirm that the last (wet) calibration of the instrument is still valid. Traceability and documentation requirements are identical to those of a calibration (see above).
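In its simplest form, a verification reduces to a pass/fail decision against the specified tolerance band. A minimal sketch, where the tolerance value is an assumption chosen for illustration rather than a specification of any particular device:

```python
# Hedged sketch: verification as a pass/fail check against a maximum
# permissible error (the 0.5% tolerance here is illustrative).

def verify(measured: float, reference: float, tolerance_pct: float = 0.5) -> bool:
    """Pass if the measured value deviates from the reference by no
    more than tolerance_pct percent."""
    deviation_pct = abs(measured - reference) / reference * 100.0
    return deviation_pct <= tolerance_pct

print(verify(100.3, 100.0))  # True: 0.3% is within the 0.5% tolerance
print(verify(101.0, 100.0))  # False: 1.0% exceeds it
```

A failed verification would typically trigger a recalibration or removal of the instrument from service, with the result documented as for a calibration.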
Paradigm shift for instrument checks in single-use systems
Multi-use instruments typically consist of an entire prefabricated assembly which can be factory calibrated. For instance, a multi-use flowmeter has a sensor (e.g., Coriolis), electronics (including an analog-digital converter) and a transmitter (that converts raw signals to a digital signal for integration in other systems), all in the same housing that is integrated in the process piping.
For single-use systems, separating the sensing element from the measuring system and reusing the electronics can dramatically lower the cost in comparison with the need to dispose of the entire measuring system. Therefore, a single-use flowmeter typically consists of the following elements:
a) A reusable base unit which is stationary and not in contact with the process
b) A disposable component (flow tube) in contact with the process
Components of Proline Promass U 500 single-use flowmeter as skid-mount version (left) and table-top version (right).
Both components (disposable and base unit) are manufactured independently and cannot undergo a combined factory calibration. The first time these two components are assembled into one unit is during commissioning on site. This, however, represents a paradigm shift in how users approach calibration compared to traditional technology.
On-site wet calibration of an installed single-use flowmeter (base unit and disposable part) is typically not possible or practical due to the sterile boundary of the flow path. This leads to a situation today where many single-use instruments installed in a cGMP environment do not fulfill requirements in accordance with FDA 21 CFR sec. 820.72 and ISO 9001:2015 (section 7.1.5/7.1.5.2a) for calibration.
Today’s practice and introduction to risk management with single-use technology
It is common practice today to install a flowmeter as part of a single-use flow assembly in a biotech process without checking its suitability prior to use. As wet calibrations on site are not feasible, the user just relies on the instrument performing to expectations according to the manufacturer’s specifications. It is not surprising that many installations suffer from inaccurate readings due to undetected defects or operator induced error, for example by entering a wrong calibration factor into the system.
Single-use equipment typically undergoes a sterilization step between factory calibration and commissioning which can affect the accuracy of certain types of instruments as well. Such influences often go undetected if the instrument is not checked properly during commissioning on site.
Alternative methods for regulatory check of single-use instrumentation
Best practice alternatives today include the following three approaches. Each method has its drawbacks and cannot fulfill cGMP and industry requirements if used individually:
a) Manufacture a sensing element to such tight tolerances that it will perform within product accuracy specification when connected to the electronics.
b) Determine the individual calibration data during manufacturing (factory calibration) and upload this data to the base unit electronics prior to batch.
c) Check a sensor or instrument combination immediately before use during on-site commissioning.
The following table shows that a combination of the technologies and procedures can drastically improve performance and provide regulatory compliance.
* Calibration data can be entered manually into the control system or automatically via scanner or integrated data transfer. Automatic methods increase reliability and lower the risk of human error.
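The footnote's point about automatic versus manual transfer can be illustrated with a small sketch: if the transferred calibration record carries a checksum, a corrupted or mistyped calibration factor is rejected instead of silently accepted. The record format and checksum choice below are hypothetical, not any vendor's actual scheme.

```python
# Hedged sketch: validating a transferred calibration factor with a
# checksum. Hypothetical record format: "serial;factor;crc32-of-payload".
import zlib

def make_record(serial: str, factor: float) -> str:
    """Encode serial number and calibration factor with a CRC32 checksum."""
    payload = f"{serial};{factor:.6f}"
    return f"{payload};{zlib.crc32(payload.encode()):08x}"

def parse_record(record: str) -> float:
    """Return the calibration factor, rejecting records whose checksum
    does not match (e.g. a transcription error)."""
    payload, crc = record.rsplit(";", 1)
    if f"{zlib.crc32(payload.encode()):08x}" != crc:
        raise ValueError("calibration record corrupted or mistyped")
    return float(payload.split(";")[1])

rec = make_record("SN12345", 0.998731)
print(parse_record(rec))  # 0.998731
```

A manual entry path has no such safeguard, which is why scanner-based or integrated data transfer lowers the risk of the "wrong calibration factor" error discussed earlier in this article.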
High accuracy and new possibilities in the biotech industry with multivariable Coriolis measuring technology, plus comprehensive industry compliance: the Proline Promass U 500 from Endress+Hauser