    One-Size-Fits-All Paperless Validation – Is It Really Fit for Use?

    Synopsis:

    Popular validation tools have a basic flaw: these applications were designed only to carry out qualification activities. This article highlights the differences between qualification and validation, and explains why, for process and cleaning validation/monitoring, such “paperless validation” applications offer little more value than a document management system. An efficient digital validation platform has to be a suite of applications, each designed for its own purpose – qualification, manufacturing process management, and cleaning process management – yet integrated so as to provide a smooth flow of data.

    Introduction

    No manufacturing sector can survive without a relentless focus on quality—because the very essence of manufacturing is to produce a consistent and standardized product. Any lapse in quality in the pharmaceutical industry leads to regulatory scrutiny, fines, damaged reputations, and ultimately business failure.

    In pharmaceutical manufacturing, the regulatory expectation is that the process be in “a state of control”. Historically, we “validate” processes such as manufacturing, packaging, cleaning, and QC methods. The core objective of validation is to demonstrate and document consistent, controlled performance over time. Validation is more than a regulatory requirement—it is the foundation of operational consistency, product quality and, most importantly, patient safety.

    As digital “validation” solutions flood the market, the expectation is that these digital tools do more than merely replicate paper processes in electronic form – that they will enable smarter, more integrated validation approaches that align with regulatory requirements and operational needs. But do they deliver on this promise?

    This article explores the differences between validation and qualification, and why the “one-size-fits-all” approach towards “validation” is an idea that falls well short of customer expectations.

    Let’s Clarify the Basics

    Before delving deeper, it is crucial to differentiate between the two terms that are often confused: qualification and validation.

    • Qualification demonstrates that the equipment, utilities, instruments, and software used in a GMP environment have been installed correctly and function as intended.
    • Validation demonstrates that the processes using them are “in a state of control”.

    In simple terms, Qualification answers, “Does that entity work as per our requirement?” while Validation dives deeper to explore, “Can we get a reliable and consistent product?”

    Understanding Qualification

    The User Requirements Specification (URS) is usually the first document created when a new entity/asset is being planned, detailing the business, compliance, and operational requirements. This activity is carried out by stakeholders who are either responsible or accountable for that asset/entity.

    The selected vendor(s), or the internal resource where applicable, respond with a Functional Specification (FS) detailing how the requirements in the URS will be met.

    The specific test script document requirements vary depending on what is being qualified. However, the intent remains the same: to demonstrate, through test scripts, that the User Requirement Specification (URS) requirements have been met. These test scripts may be documented in Factory Acceptance Tests (FAT), Installation Qualification (IQ), Operational Qualification (OQ), Site Acceptance Tests (SAT), or Performance Qualification (PQ). The selection of which of these to include depends on the particular asset or system being qualified.

    A Traceability Matrix document is then put together mapping each requirement to the corresponding tests, detailing the user requirement ID and the test script ID. This matrix provides clear visibility of how every user requirement specified in the URS is verified through specific test cases. 
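    The mapping described above can be sketched in a few lines of code. This is an illustrative sketch only: the requirement IDs, test script IDs, and data shapes are hypothetical, not taken from any particular qualification system.

```python
# Sketch: build a traceability matrix mapping each URS requirement ID to
# the test scripts (IQ/OQ/PQ, etc.) that verify it. All IDs are hypothetical.

requirements = ["URS-001", "URS-002", "URS-003"]

# Each test script lists the requirement IDs it covers.
test_scripts = {
    "IQ-01": ["URS-001"],
    "OQ-01": ["URS-001", "URS-002"],
}

def build_traceability_matrix(requirements, test_scripts):
    """Return {requirement ID: [test script IDs that verify it]}."""
    matrix = {req: [] for req in requirements}
    for script_id, covered in test_scripts.items():
        for req in covered:
            if req in matrix:
                matrix[req].append(script_id)
    return matrix

matrix = build_traceability_matrix(requirements, test_scripts)

# Any requirement with no mapped test script is a traceability gap.
gaps = [req for req, tests in matrix.items() if not tests]
# matrix -> {'URS-001': ['IQ-01', 'OQ-01'], 'URS-002': ['OQ-01'], 'URS-003': []}
# gaps   -> ['URS-003']
```

    The inverse check (a test script referencing a requirement that does not exist in the URS) is the other gap such a matrix makes visible.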

    In this sense, qualification is a documentation-driven, structured, checklist-based approach, focused on verifying that a vendor-supplied asset/entity meets predefined specifications.

    Understanding Validation

    Validation, by contrast, focuses on ensuring that processes consistently produce outcomes that meet predetermined specifications.

    According to the FDA, validation is: “Establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality attributes.”

    Where qualification proves that an asset/entity is installed and functions correctly, validation demonstrates that the processes using that asset/entity are robust, repeatable, and compliant. Furthermore, it is even more critical that “effective monitoring and control systems” are put in place for “process performance and product quality, thereby providing assurance of continued suitability and capability of processes”. This is a requirement per ICH Q10 and has remained, over the last decade, one of the most cited regulatory observations. The goal in validation and monitoring is not just compliance but smarter, data-driven decisions, with continuous improvement as an important pillar.

    Validation, therefore, is data-driven and is focused on data analysis, providing the assurance that the process will consistently produce product whose quality meets predetermined specifications.

    Given that qualification and validation have very different approaches and end goals, is it possible to use one software tool to manage both requirements?

    Many software vendors market “all-in-one” applications as a one-stop solution, promising a unified platform for validation. At first glance, the appeal of a unified system is undeniable. After all, a single tool that manages documentation, qualification, and validation activities across an entire organization sounds like a great solution.

    However, when the design requirement for a qualification activity has absolutely no relationship with what is to be done in validation and ongoing monitoring, how is it possible to combine the use cases?  

    The main data type handled by a qualification management system is textual information. User requirements are primarily captured as detailed text entries within structured tables, and supporting test scripts are also documented largely in text form. While most data centers around descriptive narratives—such as specifications, acceptance criteria, and rationales—numeric data appears mainly when recording process parameters during performance qualification (PQ) of equipment. This emphasis on text ensures that every requirement and its verification are clearly documented for traceability, auditability, and regulatory compliance.

    Validation and ongoing monitoring activities are fundamentally centered on the defined product specifications, attribute specifications, and equipment process parameter ranges, and on how these are met. The primary data type handled in this context is numeric, as these processes involve the collection and analysis of quantifiable results—such as the equipment process parameter ranges for a given batch, or measured attribute results—to demonstrate compliance with set limits or acceptance criteria. This numerical data provides objective evidence that equipment operates within its approved specifications, and enables continuous, data-driven monitoring to promptly detect deviations or trends. While supporting documentation such as SOPs may include textual explanations or rationales, it is the structured numeric data that forms the backbone of effective validation and monitoring.
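    The numeric backbone of validation and monitoring reduces, at its simplest, to checking measured results against predetermined limits. The following is a minimal sketch under assumed, hypothetical attribute names and specification limits; a real platform would layer trending and alerting on top of this kind of check.

```python
# Sketch: flag batch results that fall outside predetermined specification
# limits. Attribute names and limits below are hypothetical examples.

specs = {
    "assay_pct": (95.0, 105.0),      # (lower limit, upper limit)
    "dissolution_pct": (80.0, None), # lower limit only
}

def check_batch(results, specs):
    """Return (attribute, value, reason) tuples for out-of-spec results."""
    deviations = []
    for attr, value in results.items():
        low, high = specs[attr]
        if low is not None and value < low:
            deviations.append((attr, value, "below lower limit"))
        elif high is not None and value > high:
            deviations.append((attr, value, "above upper limit"))
    return deviations

batch = {"assay_pct": 94.2, "dissolution_pct": 86.5}
deviations = check_batch(batch, specs)
# deviations -> [('assay_pct', 94.2, 'below lower limit')]
```

    Because the inputs are structured numbers rather than free text, the same data can feed trend charts and automated alerts without manual re-entry, which is the point of the distinction drawn above.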

    Pharmaceutical regulators expect manufacturers to proactively show processes are in control, detect deviations, implement timely corrections, and continually improve processes to guarantee product safety and efficacy.  While qualification is an integral part of the process validation program and expects all entities/assets used in the product manufacture to be shown to be “fit for use”, the expectation is also timely access to real-time or trending data, making it easy to promptly identify deviations, detect process drifts, or generate automated alerts.

    An “all-in-one” application designed to aid in qualification and CSV can therefore only act as a storage repository for files associated with process validation. Such applications treat validation or ongoing process verification runs as static activities, with no ability to let the data speak for itself. Teams using them rely on time-consuming, error-prone manual reviews to collate and interpret data, increasing the risk of missed trends and delayed responses to quality issues. Furthermore, static documentation impedes data integration, reduces the potential for cross-functional collaboration, and hinders the effectiveness of continuous improvement initiatives—ultimately compromising regulatory compliance and the robust oversight expected in modern pharmaceutical manufacturing.

    Conclusions

    Ultimately, companies must weigh the true cost of investment, both in terms of efficiency and compliance risk, when selecting validation tools. It may be time for the industry to reconsider whether costly all-in-one solutions are necessary, or whether more focused, built-for-purpose software solutions targeted toward manufacturing processes would better serve their needs.

    ravi

    July 27, 2025
    paperless validation
    digital transformation, digitalization, paperless validation, pharmaceutical
    Digitalization in Pharma: How to make the switch

    Synopsis

    A robust digital ecosystem streamlines processes and improves data management in the pharmaceutical industry, but conflicting priorities can create disconnected software silos. Because digitization needs continuous investment, steady incremental progress is better than sporadic efforts. Success depends on strong leadership, a committed and skilled implementation team, and solid management and stakeholder support.

    Introduction

    The COVID-19 pandemic signalled a shift in mindset for the pharmaceutical industry. Almost overnight, many companies were forced to move from traditional paper documents and desktop-based applications to a digital, remote-working ecosystem. 

    Yet, when the SARS-CoV-2 virus went into an endemic state and employees returned to work, many of the processes that had been converted to the digital domain remained that way.  They were more efficient, drove better compliance and saved the company resources and money, it transpired. And so, willingly or not, pharmaceutical companies started embracing digitalization.

    The industry had, of course, been using software applications before the pandemic, such as Enterprise Resource Planning (ERP), Quality Management Systems (QMS), and Laboratory Information Management Systems (LIMS). But the pandemic instigated a drive to move all processes to digital platforms. Now, as companies increasingly invest in software applications and gravitate toward connected factories and sites, it has become clear that there are significant challenges involved in integrating both hardware and software systems.

    Challenges

    One of the challenges pharmaceutical companies have faced relates to the migration away from paper-based records. A batch manufacturing record (BMR), for example, must be maintained for every product batch, and is typically recorded on paper. The same is true of the logs that track everything from daily equipment use to weighing balance verifications. The shift to digital during the pandemic, however, meant that these records had to be captured on tablet devices instead. For staff to move freely around a facility, from equipment to equipment, these tablets require wireless connectivity to a central server or the cloud. And therein lies the problem – existing cleanroom infrastructures were not designed for wireless connectivity.  Cleanrooms need to be retrofitted with wireless routers.  And this also brings up a whole host of cybersecurity concerns, such as setting up an effective ransomware mitigation plan.

    Another challenge is choosing the right software. No application can manage an entire site, so companies must select and integrate a series of software tools for certain tasks, activities, and processes without duplicating them, which is where things can get confusing. Process parameter data, for example, is captured in both a BMR and a Process Performance Qualification (PPQ) run record or report, while the data in an Annual Quality Review (AQR) Report comes from disparate sources, with some requiring analyses by statistical tools.

    Defining digital requirements

    The first step to a successful digital transformation program is identifying the activities being carried out at a manufacturing site. A plan must then be put together to discern which of them can and should be digitized.

    Not every activity needs to be digitized immediately, however. Just like a solo house buyer might consider a multi-bedroom property, leaving space for future expansion when it comes to digital applications is a sensible move.

    Who decides which activities should go digital? Many companies have created a new team or point person, responsible for getting departments to share their digitization requirements. Approving the overall requirement, meanwhile, is a job for the site head. It’s a critical activity, one that’s similar to putting up a pharmaceutical facility.

    The site head has the power to allocate money across multiple budgets, even when the purse strings are tight, and is the right person to form a team that has collective knowledge of the activities taking place at the site.

    Setting goals

    Once the activities being moved to the digital domain have been identified, specific goals need to be decided on. Many companies prioritize the ability to see and analyze data on demand while others want to streamline their practices before the workflow is digitized, doing away with non-value-adding activities. The latter is a critical issue, one that can lead to heated discussions. In company meetings, it’s common to hear the expression: “If it ain’t broke, don’t fix it”. This is where the site head can offer a (gentle) guiding hand to make sure that digitalization and all the efficiency it brings isn’t sacrificed through fear of disrupting the status quo.

    A connected digital platform will generate a large amount of data. That’s why it’s important to decide which part of the collected data is to be analyzed and how. Companies may also find that some of the data has no purpose, in which case they need to decide whether it’s worth storing. In the same vein, it’s important to examine the relevant procedures and forms to identify those that will become obsolete once transferred to the digital domain. For example, equipment usage log data can be obtained from the Manufacturing Execution System (MES), so why capture it in a logbook? It’s critical to ‘spring clean’ what data is to be collected and from where before moving it to the digital realm.

    Another question to ask is whether the technology currently available can be leveraged to improve efficiency. For instance, digitalization has already impacted training with classroom training being replaced in some cases by virtual reality training, while videos have replaced many stepwise procedures.

    Creating a requirements document

    Once the workflows and associated activities have been sanitized and streamlined, a site requirement document needs to be created. The priority should be on marking mandatory requirements – all others can be thought of as ‘nice-to-haves’.

    A company with multiple manufacturing sites might even consider a global document covering all locations.

    Ultimately,  the goal is to define a system with seamless data flows. This is where an (internal or external) IT source with software architecture knowledge can be included in the team. This resource can assist the team with developing the digital platform’s architecture.

    Revising the requirements document

    Next, the team needs to start looking for applications that are available on the market. The goal at this stage is to find applications that meet all the mandatory requirements. Some of these requirements may already be covered by existing company applications such as ERP, LIMS, and QMS, and it’s possible that new requirements arise when reviewing applications. On the other hand, there may be requirements – especially mandatory ones – that aren’t met by existing applications (more on how to handle these later in the article).

    Once the team has defined its requirements and compared them with what’s available on the market, it may need to include new requirements or redefine existing ones in its documentation. Applications of interest should be categorized into groups covering the same activities. If some overlap, an application can be incorporated into multiple groups. Then, the documentation should be restructured to define the requirements for each application group. With the help of the IT resource, details about how these application groups will ‘talk to’ each other should also be noted in the document.

    One way forward involves identifying applications that can form the nucleus of the digital platform, with other applications interfacing through the nucleus via an Application Programming Interface (API).  An API allows two software applications to ‘speak’ to each other. This streamlines data flows between applications.

    All of an application’s data could simply be transferred to the receiving application, but that would be inefficient. For example, if all the receiving application wants to know is when production on a piece of equipment started and ended, why provide extraneous information? The right design is to identify the fields of interest and query only those when needed.
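    Querying only the fields of interest might look like the sketch below. The base URL, endpoint path, equipment ID, and field names are all hypothetical assumptions for illustration; the point is simply that the request names the two timestamps it needs rather than pulling the whole record.

```python
# Sketch: build an API request URL that asks the source application for
# only the named fields. Endpoint and field names are hypothetical.
from urllib.parse import urlencode

def build_run_query(base_url, equipment_id, fields):
    """Request only the listed fields, not the full equipment record."""
    query = urlencode({
        "equipment_id": equipment_id,
        "fields": ",".join(fields),  # e.g. just start/end timestamps
    })
    return f"{base_url}/runs?{query}"

url = build_run_query("https://mes.example", "GRN-042",
                      ["run_start", "run_end"])
# url -> 'https://mes.example/runs?equipment_id=GRN-042&fields=run_start%2Crun_end'
```

    Whether the field list travels as a query parameter, a GraphQL selection, or a report template is a design detail; the efficiency gain comes from scoping the request.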

    The other option is to consider a data hub. Here, data from disparate sources is assimilated and stored in a database that can be used by any application. The advantage of this kind of architecture is that it moves away from data duplication, avoiding  data inconsistencies and enhancing data integrity. Care should be taken though to ensure that a clear audit trail identifies the changes any application makes to the data and that data changes are broadcast to all applications. This immediately reveals the impact of any changes made.
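    The data-hub behavior described above, a single shared copy of each datum, an audit trail of every change, and change broadcasts to the consuming applications, can be sketched as follows. All names here are hypothetical; a real hub would sit on a database with authenticated applications rather than in-process callbacks.

```python
# Sketch of the data-hub idea: one shared store, an audit trail recording
# who changed what and when, and broadcasts of each change to subscribers.
import datetime

class DataHub:
    def __init__(self):
        self._store = {}        # key -> value: the single copy of each datum
        self.audit_trail = []   # records of every change
        self._subscribers = []  # callbacks standing in for other applications

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def update(self, key, value, source_app):
        old = self._store.get(key)
        self._store[key] = value
        self.audit_trail.append({
            "key": key, "old": old, "new": value,
            "source": source_app,
            "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })
        for notify in self._subscribers:  # broadcast so every app sees the change
            notify(key, value, source_app)

hub = DataHub()
seen = []
hub.subscribe(lambda key, value, src: seen.append((key, value, src)))
hub.update("batch_042/yield_pct", 98.1, source_app="MES")
# seen -> [('batch_042/yield_pct', 98.1, 'MES')]
```

    Because every write passes through one `update` path, the audit trail and the broadcast cannot drift apart, which is what makes the impact of a change immediately visible to all applications.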

    It’s also important to specify the location where an application will be hosted and who it will be hosted by. This is where the team’s input becomes crucial.

    Storage

    Something else to consider is where the applications and the data will reside. The software industry is increasingly moving to the cloud. The main driver behind this is the ability to provide instant support, and, more importantly, to put all infrastructure under the software vendor’s control.

    A problem with site-based servers is that they require localized on-site support, which has become very expensive. It’s also well documented that multi-tenant Software as a Service (SaaS) applications can be rolled out faster than site-based installations, due to the time saved in avoiding application installations and leveraging vendor-executed qualification documentation. The vendor-executed documents can be leveraged because the application has already been installed and qualified. Repeating the same tests for each site does not add any value, except when executing performance qualification.

    Some vendors only provide applications on a multi-tenant SaaS basis. If some of the applications are running on a local server or on the company’s cloud account, the type of data architecture to use when integrating these applications is the next thing to address. 

    Once the requirement document has been revised, the team may prefer splitting the document for vendor evaluation. Care should, however, be taken to ensure that the relevant interface requirements are properly spelled out in the “child requirement documents”.

    Finding a vendor

    Selecting a vendor is critical to successfully setting up a complete digital platform. Some make the mistake of sharing the requirement document with the vendor and then asking it to self-certify whether all requirements are being met. This can lead to either:

    (a) the company only discovering missing features after the software has been purchased, or

    (b) the vendor hastily developing the missing features and then releasing them without proper testing.

    Neither of these options leads to productive outcomes. They usually result in delays caused by incomplete platform setups and sometimes even the selection of a new vendor. The digitalization team must, therefore, verify whether the requirements are being met and make notes on those that aren’t, along with a reason why. This can still lead to the vendor being asked to develop missing features, but it will, at least, be an informed decision.

    Another common pitfall is making decisions based on presentations that rely on scripted videos. In the era of Theranos-type scams, it’s not always clear if a product demo represents a prototype or a working application. This is why it’s important for the team to verify the product’s minimal functionalities. Company data might need to be anonymized when checking requirements against the features being presented. This anonymized data can then be fed into the application and the process verified. This should be a mandatory requirement.

    Some applications require specialist knowledge (e.g., process validation and cleaning validation). So naturally, an application designed by a subject matter expert will be of a higher quality and better suit users’ needs. This becomes very apparent during the demo as the workflow tends to be much smoother. 

    It may also be a good idea to consider a vendor who can provide software for multiple applications, thereby ensuring interconnectivity between them all. But care should be taken to ensure that the vendor has expertise in the other applications of interest. For example, it could be that the vendor is known for their Manufacturing Execution System (MES) application but not for their process validation application.  In this case, it’s better to consider applications from separate vendors. 

    Once candidate applications have been identified, the team faces the challenging task of matching their requirements to each application. Only then can they begin to appraise vendors, discuss goals, and share requirements. Vendors should also be encouraged to look out for and highlight any potential implementation roadblocks so that they can be cleared early on.

    What should the digitalization team do when requirements are not supported by any existing applications? Should it develop solutions on its own? In this case, reflection is key. As a company in the drug manufacturing business – not software development – it’s best to stay in the right lane. There are very few pharmaceutical manufacturers that have successfully developed and managed applications by themselves.

    The problem does not lie in the software’s development phase but rather in its maintenance, where both the underlying framework and the application itself need to be kept up to date. It’s at this point, especially in the manufacturing sector, that a third-party vendor is usually called in to update the code. But what happens when the technology stack used to build these applications becomes obsolete – will another application be developed? Who will test it? The self-development of applications often leads down a rabbit hole of issues such as these. In some companies, there’s something of a ‘software graveyard’ where many tombstones point to abandoned in-house software developments. The best strategy, then, is to identify a suitable vendor who knows what they’re doing.

    Qualifying and rolling out applications

    Many applications fail early on because some members of the team tasked with implementing the roll-out – i.e., qualifying the system, training all the associated users, and implementing changes to affected procedures – do not believe that the application they’ve purchased will meet the relevant requirements.

    Procedural delays and red tape caused by objections can stretch out application implementation times. It’s therefore important that the team leader lay out a clear timeline. Site management should also provide support for addressing delays (whether actual or perceived).

    Sometimes it’s just not possible to roll out all the features of an application at once. This might be due to network or extra hardware requirements. In such cases, the application roll-out can be divided into phases that each have a clearly defined execution timeline. Site management should ensure that an adequate budget is available for rolling out the later phases.

    Qualifying the application can also drain a significant amount of time. The team must, therefore, draw up a clear map of what needs to be qualified. If dealing with a SaaS application, then it will need to discern whether the vendor qualification document can be leveraged rather than repeating the qualification process.

    A clear demarcation must also be made when it comes to establishing what the requirements will be if the application is hosted on a local server instead of a SaaS application. If this is not done, then the computer systems validation team will often create its own installation qualification document in an ad hoc fashion. This can result in an auditor questioning how the team performed an installation on a server that was not its own. 

    A long-term commitment

    Unlocking the full potential of a digital ecosystem demands ongoing evolution and adaptation. Regular assessments and strategic adjustments are essential for its continued success. Changes in technology, regulations, vendor status, and the like can dictate various requirement changes, while activities that could not previously be brought onto the digital platform can be successfully integrated. It’s therefore crucial that pharmaceutical companies look to maintain ongoing digital transformations into the future.

    From defining digital requirements and documenting them, to finding the right vendor and finally rolling out applications, leadership is key on the journey to digitalization. It plays a central role in both budgeting to keep the platform fully functional and ensuring that the right team members are retained.  This team will be responsible for carrying out an annual review of the digital platform and recommending course corrections as required. In all this, there must be active and constructive leadership support.

    In today’s interconnected world, we can harness and leverage data to our advantage. It’s crucial that we understand how to effectively use data and then translate that understanding into action that enhances existing processes. The members of an agile and efficient digital transformation team serve as skilled navigators capable of interpreting ever-changing circumstances and adapting to a shifting landscape.

    Quascenta Team

    April 28, 2025
    Uncategorized
    digital transformation, digitalization, pharmaceutical

Powering Quality from Lab to Label © 2025, Quascenta Pte. Ltd.