Quality in projects for content migration – “a life necessity” in the Life Sciences industry: Interview with Markus Schneider
Apr 24, 2019 | by Markus Schneider


Markus Schneider, Managing Consultant and Life Sciences Expert at fme AG in Frankfurt, is one of the leading migration specialists in Central Europe. He leads a team of application and content migration experts specializing in the Life Sciences and especially the pharmaceutical industry.

Jens Dahl: Markus, you have been working in the area of ECM in the Life Sciences and pharmaceutical industry for more than 20 years. You know many platforms, applications, client environments and, of course, the processes of that industry by heart. I think one could call you one of the top specialists in this area. What makes this industry so special?

Markus Schneider: Mainly the regulatory requirements; only the aviation and nuclear industries are subject to similarly strict quality and security regulations. This is obviously in all our best interest. Just keep in mind what could happen if there were errors in the approval or manufacturing processes.


Jens Dahl: So it can be said that the industry is so sensitive because it is about people’s lives and health. That is why processes and quality assurance during the introduction of IT systems are strongly regulated, supervised and, if irregularities occur, severely sanctioned. Consequently, these possible sanctions pose high risks for the enterprise itself. The strong regulations and the risks in case of violation naturally apply to any kind of content migration.

According to your experience, what is the biggest challenge in these kinds of content migration projects?

Markus Schneider: The biggest challenges are ensuring data quality and the validation process itself.

For example, if submission-relevant data is to be migrated (data that must be reported to the regulatory authorities), the requirement is often that the data must be migrated 100% error-free. In these migration projects, the effort and complexity depend crucially on the data quality in the legacy system. The actual data quality, however, often cannot be estimated correctly at the start of the project. In general, the client assumes a very high level of data quality, but this is often not the case. The reasons for data errors in the legacy system can be very diverse; that is a separate topic in itself. The fact is, however, that these errors must be identified and corrected during migration. Data migration is therefore not just about copying data from A to B. The challenge lies in cleansing the existing data and, for example, adapting it to controlled value lists in the target system. Since the legacy systems are, as a rule, also validated, necessary corrections often cannot be made there, i.e. the correction must be carried out during the migration. We have therefore developed procedures and approaches in our team to master these challenges.

The second aspect is the validation process. Our customers are of course very skilled and experienced in implementing validation processes in the IT environment, but their experience is most often limited to changes to existing systems or processes and the introduction of new applications. Data migrations, however, cannot be validated in the same way as the introduction of new software. There are fundamental differences that must be considered as early as the validation plan. It is our task to advise our clients in this area and to propose solutions that offer a reasonable cost/benefit ratio.


Jens Dahl: Would you say certain aspects of content migration projects are regularly misjudged or underestimated?

Markus Schneider: As the developer of migration-center, we at fme are repeatedly asked by customers to assess their failed migration projects, so I have already prepared several analyses of failed migrations. Two aspects stand out:

The incorrect selection of the migration tool and / or incorrect migration approach
Once it is clear that data in the legacy system needs to be adapted to fit controlled value lists in the target system, you need a rule-based tool with which these values can be transformed. Static mapping tables or customized scripts won’t do the job, because users will continue working in the legacy system right up until the day of the productive migration. It is therefore likely that there will be new data sets which your planned migration approach cannot process correctly. You will then need to adapt the implementation, followed by additional tests.

In the meantime, users keep creating more and more data, and you finally end up in an endless reengineering cycle without getting any closer to the initial goal.

Incorrect estimation of data quality in the legacy system
As I have already described, this is a particularly important point. If you assume from the beginning of the project that the data in the legacy system corresponds 100% to the object model, it often turns out that the calculated project budget is insufficient and the project milestones cannot be met as planned. This is then often compensated for with compromises in data quality, which regularly leads to failure during the qualification phase: the acceptance criteria cannot be fulfilled, and approval to proceed with the migration to the productive system is not granted.


Jens Dahl: So what’s your advice, or rather, which conclusions should clients draw from these projects?

Markus Schneider: In many cases, the client underestimates his own efforts and contributions in a migration project. There are dozens, if not hundreds, of detailed decisions to be made within the scope of a data migration. We at fme can make sure the migration runs smoothly from a technical point of view, and of course we can bring in our experience and expertise to provide the information needed for a sound decision base. What we cannot do is make business-related content decisions for our clients. That remains their responsibility. For example, when matching data values from legacy systems to controlled value lists of the EMA (European Medicines Agency), the customer must decide which values are to be assigned and how. We will of course ensure that their decisions are implemented correctly and error-free during the migration. My experience is that the more realistically a customer estimates his own efforts and creates the necessary organizational basis, the better and faster the project will run.


Jens Dahl: Well, I would say that having high-quality processes, methods and tools is desirable regardless of the industry. Could you imagine applying these methods in other industries as well, while replacing the high costs caused by additional documentation and regulatory requirements with something more pragmatic?

Markus Schneider: There are two things that need to be clearly separated here. The quality of a migration in validated environments is not guaranteed by a high documentation effort; quite the opposite. Only if the quality is very high can the documentation effort in a validated environment be kept within a reasonable frame. Anyone who has already done projects in validated environments knows what it means to document unexpected deviations and explain them in a deviation analysis. That is why the migration approach we developed aims specifically for high quality and an optimal migration rate. This also means that the costs for a migration project do not drop drastically just because the documentation effort is reduced.

These procedures, implementation standards and the continuous, automated validation of data quality enable us to plan and execute migrations accurately and in a precisely controlled manner.

In the pharmaceutical industry, the cost of the additional mandatory documentation is an added expense; since it is not obligatory to the same extent in other industries, these costs could be reduced considerably there.

The assumption that migrations in regulated environments are always more expensive is not necessarily true. The question is always what causes the high costs in an individual migration project, and whether the underlying costs are actually comparable.

Here is a simple example:

Supplier A calculates a project budget of €100,000 for the migration of 2 million documents and achieves a migration rate of 98%.
Supplier B calculates €130,000 for the same migration but achieves a migration rate of 100%.

From the project point of view, the first offer might seem more attractive, but from the business perspective, the second offer is definitely the better one. Choosing supplier A could turn out to be very expensive for the company, because its employees have to migrate the missing 2% manually. This will take up a significant amount of their time, during which they cannot concentrate on their daily business. This issue is quite often underestimated or neglected by clients.
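The comparison above can be made concrete with a back-of-the-envelope calculation. The manual-effort figures (minutes per document, internal hourly rate) are assumptions made up for this example, not numbers from the interview; they simply show how quickly the remaining 2% of 2 million documents dominates the total cost.

```python
# Illustrative cost comparison of the two offers. The per-document manual
# effort and the hourly rate are assumed values for the sake of the example.

DOCS = 2_000_000
MINUTES_PER_MANUAL_DOC = 5   # assumed time to migrate one document by hand
HOURLY_RATE = 60             # assumed internal cost in euros per hour

def total_cost(budget: float, migration_rate: float) -> float:
    """Supplier budget plus the internal cost of migrating the rest by hand."""
    failed_docs = DOCS * (1 - migration_rate)
    manual_cost = failed_docs * MINUTES_PER_MANUAL_DOC / 60 * HOURLY_RATE
    return budget + manual_cost

cost_a = total_cost(100_000, 0.98)   # 40,000 documents left for manual work
cost_b = total_cost(130_000, 1.00)
```

Under these assumptions the cheaper offer ends up roughly €300,000 in total against €130,000 for supplier B, which is why the migration rate matters more than the headline budget.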


> more about our migration projects and services
> more about Life Sciences at fme AG


This interview was conducted by Jens Dahl, leader of the competence center Migration Services at fme AG. It was originally recorded in German and translated into English.