Digitalization in Pharma: How to make the switch


The COVID-19 pandemic signalled a shift in mindset for the pharmaceutical industry. Almost overnight, many companies were forced to move from traditional paper documents and desktop-based applications to a digital, remote-working ecosystem. 

Yet, when the SARS-CoV-2 virus went into an endemic state and employees returned to work, many of the processes that had been converted to the digital domain remained that way.  They were more efficient, drove better compliance and saved the company resources and money, it transpired. And so, willingly or not, pharmaceutical companies started embracing digitalization.

The industry had, of course, been using software applications before the pandemic, such as Enterprise Resource Planning (ERP), Quality Management Systems (QMS), and Laboratory Information Management Systems (LIMS). But the pandemic instigated a drive to move all processes to digital platforms. Now, as companies increasingly invest in software applications and gravitate toward connected factories and sites, it has become clear that there are significant challenges involved in integrating both hardware and software systems.

Challenges

One of the challenges pharmaceutical companies have faced relates to the migration away from paper-based records. A batch manufacturing record (BMR), for example, must be maintained for every product batch, and is typically recorded on paper. The same is true of the logs that track everything from daily equipment use to weighing balance verifications. The shift to digital during the pandemic, however, meant that these records had to be captured on tablet devices instead. For staff to move freely around a facility, from equipment to equipment, these tablets require wireless connectivity to a central server or the cloud. And therein lies the problem – existing cleanroom infrastructures were not designed for wireless connectivity. Cleanrooms need to be retrofitted with wireless routers, which in turn raises a whole host of cybersecurity concerns, such as setting up an effective ransomware mitigation plan.

Another challenge is choosing the right software. No application can manage an entire site, so companies must select and integrate a series of software tools for certain tasks, activities, and processes without duplicating them, which is where things can get confusing. Process parameter data, for example, is captured in both a BMR and a Process Performance Qualification (PPQ) run record or report, while the data in an Annual Quality Review (AQR) Report comes from disparate sources, with some requiring analyses by statistical tools.

Defining digital requirements

The first step to a successful digital transformation program is identifying the activities being carried out at a manufacturing site. A plan must then be put together to discern which of them can and should be digitized.

Not every activity needs to be digitized immediately, however. Just as a solo house buyer might choose a multi-bedroom property, leaving room for future expansion is a sensible move when it comes to digital applications.

Who decides which activities should go digital? Many companies have created a new team or point person, responsible for getting departments to share their digitization requirements. Approving the overall requirement, meanwhile, is a job for the site head. It’s a critical activity, one that’s similar to putting up a pharmaceutical facility.

The site head has the power to allocate money across multiple budgets, even when the purse strings are tight, and is the right person to form a team that has collective knowledge of the activities taking place at the site.

Setting goals

Once the activities being moved to the digital domain have been identified, specific goals need to be decided on. Many companies prioritize the ability to see and analyze data on demand while others want to streamline their practices before the workflow is digitized, doing away with non-value-adding activities. The latter is a critical issue, one that can lead to heated discussions. In company meetings, it’s common to hear the expression: “If it ain’t broke, don’t fix it”. This is where the site head can offer a (gentle) guiding hand to make sure that digitalization and all the efficiency it brings isn’t sacrificed through fear of disrupting the status quo.

A connected digital platform will generate a large amount of data, which is why it’s important to decide which parts of the collected data are to be analyzed and how. Companies may also find that some of the data has no purpose, in which case they need to decide whether it’s worth storing. In the same vein, it’s important to examine the relevant procedures and forms to identify those that will become obsolete once transferred to the digital domain. For example, equipment usage log data can be obtained from the Manufacturing Execution System (MES), so why capture it in a logbook? It’s critical to ‘spring clean’ what data is to be collected and from where before moving it to the digital realm.

Another question to ask is whether the technology currently available can be leveraged to improve efficiency. For instance, digitalization has already impacted training with classroom training being replaced in some cases by virtual reality training, while videos have replaced many stepwise procedures.

Creating a requirements document

Once the workflows and associated activities have been sanitized and streamlined, a site requirement document needs to be created. The priority should be on marking mandatory requirements – all others can be thought of as ‘nice-to-haves’.

A company with multiple manufacturing sites might even consider a global document covering all locations.

Ultimately, the goal is to define a system with seamless data flows. This is where an (internal or external) IT resource with software architecture knowledge can be included in the team. This resource can assist the team with developing the digital platform’s architecture.

Revising the requirements document

Next, the team needs to start looking for applications that are available on the market. The goal at this stage is to find applications that meet all the mandatory requirements. Some of these requirements may already be covered by existing company applications such as ERP, LIMS, and QMS, and it’s possible that new requirements arise when reviewing applications. On the other hand, there may be requirements – especially mandatory ones – that aren’t met by existing applications (more on how to handle these later in the article).

Once the team has defined its requirements and compared them with what’s available on the market, it may need to include new requirements or redefine existing ones in its documentation. Applications of interest should be categorized into groups covering the same activities; if an application’s functionality overlaps several groups, it can be listed in more than one. Then, the documentation should be restructured to define the requirements for each application group. With the help of the IT resource, details about how these application groups will ‘talk to’ each other should also be noted in the document.

One way forward involves identifying applications that can form the nucleus of the digital platform, with other applications interfacing through the nucleus via an Application Programming Interface (API).  An API allows two software applications to ‘speak’ to each other. This streamlines data flows between applications.

All of one application’s data could simply be transferred to the other, but that would be inefficient. For example, if all the receiving application wants to know is when production on a piece of equipment started and ended, why provide extraneous information? The right design, then, is to identify the fields of interest and query only those when needed.
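The idea of querying only the fields of interest can be sketched in a few lines of Python. The record layout and field names below are purely illustrative, not taken from any particular MES or vendor API:

```python
# Sketch: instead of pushing a full record between applications, expose a
# query that returns only the fields the consumer asks for. All field and
# record names here are hypothetical.

def query_fields(record: dict, fields: list) -> dict:
    """Return only the requested fields from a full record."""
    missing = [f for f in fields if f not in record]
    if missing:
        raise KeyError(f"unknown fields requested: {missing}")
    return {f: record[f] for f in fields}

# A full MES record might carry far more than the receiver needs...
mes_record = {
    "equipment_id": "GRAN-01",
    "batch_id": "B-2024-117",
    "production_start": "2024-03-02T06:15:00Z",
    "production_end": "2024-03-02T14:40:00Z",
    "operator": "jdoe",
    "cleaning_status": "verified",
}

# ...but the receiving application queries only the start and end times.
subset = query_fields(mes_record, ["production_start", "production_end"])
```

In a real integration the same pattern would sit behind an API endpoint, with the consuming application naming its fields in the request rather than receiving every record wholesale.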

The other option is to consider a data hub. Here, data from disparate sources is assimilated and stored in a database that can be used by any application. The advantage of this kind of architecture is that it moves away from data duplication, avoiding data inconsistencies and enhancing data integrity. Care should be taken, though, to ensure that a clear audit trail identifies the changes any application makes to the data and that data changes are broadcast to all applications. This immediately reveals the impact of any changes made.
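A minimal Python sketch of the data-hub idea, showing the two safeguards described above, an audit trail and a change broadcast to subscribing applications; the application and field names are hypothetical:

```python
# Sketch: a central data hub with a single store, an audit trail of every
# change, and change notifications broadcast to subscribing applications.
# Application and field names are illustrative only.
import datetime


class DataHub:
    def __init__(self):
        self._store = {}
        self.audit_trail = []    # records who changed what, and when
        self._subscribers = []   # callables notified on every change

    def subscribe(self, callback):
        """Register an application callback for change broadcasts."""
        self._subscribers.append(callback)

    def write(self, key, value, source_app):
        """Update a value, log it in the audit trail, and notify subscribers."""
        old = self._store.get(key)
        self._store[key] = value
        self.audit_trail.append({
            "key": key,
            "old": old,
            "new": value,
            "source": source_app,
            "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })
        for notify in self._subscribers:
            notify(key, value, source_app)

    def read(self, key):
        return self._store[key]


hub = DataHub()
seen = []  # stands in for a subscribed application reacting to changes
hub.subscribe(lambda key, value, src: seen.append((key, value, src)))
hub.write("batch_yield_pct", 97.2, source_app="MES")
```

Because every write passes through one place, there is a single authoritative copy of each value, every change is attributable, and every subscriber learns of it immediately, which is exactly how the hub avoids the duplication and inconsistency problems noted above.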

It’s also important to specify the location where an application will be hosted and who it will be hosted by. This is where the team’s input becomes crucial.

Storage

Something else to consider is where the applications and the data will reside. The software industry is increasingly moving to the cloud. The main driver behind this is the ability to provide instant support, and, more importantly, to put all infrastructure under the software vendor’s control.

A problem with site-based servers is that they require localized on-site support, which has become very expensive. It’s also well documented that multi-tenant Software as a Service (SaaS) applications can be rolled out faster than site-based installations, thanks to the time saved by avoiding per-site application installations and leveraging vendor-executed qualification documentation. The vendor’s executed documents can be leveraged because the application has already been installed and qualified; repeating the same tests for each site adds no value, except when executing performance qualification.

Some vendors only provide applications on a multi-tenant SaaS basis. If some of the applications are running on a local server or on the company’s cloud account, the type of data architecture to use when integrating these applications is the next thing to address. 

Once the requirement document has been revised, the team may prefer splitting the document for vendor evaluation. Care should, however, be taken to ensure that the relevant interface requirements are properly spelled out in the “child requirement documents”.

Finding a vendor

Selecting a vendor is critical to successfully setting up a complete digital platform. Some make the mistake of sharing the requirement document with the vendor and then asking it to self-certify whether all requirements are being met. This can lead to either:

(a) the company only discovering missing features after the software has been purchased, or

(b) the vendor hastily developing the missing features and then releasing them without proper testing.

Neither of these options leads to productive outcomes. They usually result in delays caused by incomplete platform setups and sometimes even the selection of a new vendor. The digitalization team must, therefore, verify whether the requirements are being met and make notes on those that aren’t, along with a reason why. This can still lead to the vendor being asked to develop missing features, but it will, at least, be an informed decision.

Another common pitfall is making decisions based on presentations that rely on scripted videos. In the era of Theranos-type scams, it’s not always clear if a product demo represents a prototype or a working application. This is why it’s important for the team to verify the product’s minimal functionalities. Company data might need to be anonymized when checking requirements against the features being presented. This anonymized data can then be fed into the application and the process verified. This should be a mandatory requirement.

Some applications require specialist knowledge (e.g., process validation and cleaning validation). So naturally, an application designed by a subject matter expert will be of a higher quality and better suit users’ needs. This becomes very apparent during the demo as the workflow tends to be much smoother. 

It may also be a good idea to consider a vendor who can provide software for multiple applications, thereby ensuring interconnectivity between them all. But care should be taken to ensure that the vendor has expertise in the other applications of interest. For example, it could be that the vendor is known for their Manufacturing Execution System (MES) application but not for their process validation application.  In this case, it’s better to consider applications from separate vendors. 

Only once candidate applications have been identified can the team begin to appraise vendors, discuss goals, and share requirements. And once a vendor has been decided on, the team faces the challenging task of matching its requirements to the application. Vendors should also be encouraged to look out for and highlight any potential implementation roadblocks so that these can be cleared early on.

What should the digitalization team do when requirements are not supported by any existing applications? Should it develop solutions on its own? In this case, reflection is key. As a company in the drug manufacturing business – not software development – it’s best to stay in the right lane. There are very few pharmaceutical manufacturers that have successfully developed and managed applications by themselves.

The problem does not lie in the software’s development phase but rather in its maintenance, where both the underlying framework and the application itself need to be kept up to date. It’s at this point, especially in the manufacturing sector, that a third-party vendor is usually called in to update the code. But what happens when the technology stack used to build these applications becomes obsolete – will another application be developed? Who will test it? The self-development of applications often leads down a rabbit hole of issues like these. In some companies, there’s something of a ‘software graveyard’, where many tombstones point to abandoned in-house software developments. The best strategy, then, is to identify a suitable vendor who knows what they’re doing.

Qualifying and rolling out applications

Many applications fail early on because some members of the team tasked with the roll-out (qualifying the system, training all the associated users, and implementing changes to affected procedures) do not believe that the application they’ve purchased will meet the relevant requirements.

Procedural delays and red tape caused by objections can stretch out application implementation times. It’s therefore important that the team leader lay out a clear timeline. Site management should also provide support for addressing delays (whether actual or perceived).

Sometimes it’s just not possible to roll out all the features of an application at once. This might be due to network or extra hardware requirements. In such cases, the application roll-out can be divided into phases that each have a clearly defined execution timeline. Site management should ensure that an adequate budget is available for rolling out the later phases.

Qualifying the application can also drain a significant amount of time. The team must, therefore, draw up a clear map of what needs to be qualified. If dealing with a SaaS application, then it will need to discern whether the vendor qualification document can be leveraged rather than repeating the qualification process.

A clear demarcation must also be made between the qualification requirements for an application hosted on a local server and those for a SaaS application. If this is not done, the computer systems validation team will often create its own installation qualification document in an ad hoc fashion, which can result in an auditor questioning how the team performed an installation on a server that was not its own.

A long-term commitment

Unlocking the full potential of a digital ecosystem demands ongoing evolution and adaptation. Regular assessments and strategic adjustments are essential for its continued success. Changes in technology, regulations, vendor status, and the like can dictate various requirement changes, while activities that could not previously be brought onto the digital platform can be successfully integrated. It’s therefore crucial that pharmaceutical companies look to maintain ongoing digital transformations into the future.

From defining digital requirements and documenting them, to finding the right vendor and finally rolling out applications, leadership is key on the journey to digitalization. It plays a central role in both budgeting to keep the platform fully functional and ensuring that the right team members are retained. This team will be responsible for carrying out an annual review of the digital platform and recommending course corrections as required. In all this, there must be active and constructive leadership support.

In today’s interconnected world, we can harness and leverage data to our advantage. It’s crucial that we understand how to effectively use data and then translate that understanding into action that enhances existing processes. The members of an agile and efficient digital transformation team serve as skilled navigators capable of interpreting ever-changing circumstances and adapting to a shifting landscape.