How to maximise clinical trial data

Anders Mortin, co-founder of consulting company TriTiCon, looks at the three key steps for start-ups and growing companies wanting to maximise the value of trial data.

In the early stages of a company, or with a new product in the pipeline, the focus is primarily on trial execution. And of course, execution is important. However, it is the accumulated data that matters in the end: it is the evidence revealing the value of your product, and it determines future funding, partnering or out-licensing while the product is still in development. It is furthermore the foundation for a future submission. From a regulatory perspective, you, as the sponsor, have a responsibility to ensure that the data is accurately recorded and reported, even if you fully outsource your trials.

Ensure standards compliance for your data

The simple advice? Follow CDISC standards. The standards from the Clinical Data Interchange Standards Consortium (CDISC) are not only a technical data concept for data specialists, but also a best-practice, quality and compliance driver for CRF design and data collection, as well as the FDA's expectation for how data is submitted in the end.

In other words, make standards part of your requirements: specify that your data must be delivered in CDISC format, with the required associated documentation.

Now, whilst having CDISC-compliant data and documentation takes you a long way, these standards do not define everything. You still need to "enrich" them with your own decisions and details to ensure full consistency across trials and to drive decisions on which data to collect (in detail).

Start with statements and references as requirements to your CRO (or internal staff) and gradually enrich these with details such as specific variables or classifications. As the level of detail grows, "your standards" will evolve from a list of contractual requirements, through decisions on what data to collect and how to structure it, into detailed variables and rules.

There are dedicated tools available for managing CRFs and data standards, but in the beginning Word and/or Excel will work just fine; you can then gradually move to more structured formats or a dedicated tool.
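To make the idea concrete, here is a minimal sketch of "your standards" kept as a machine-readable file rather than prose. The variables, labels and codelist shown are illustrative examples only (not an authoritative extract of CDISC content), and the helper is hypothetical, but a structure like this can later be migrated into a dedicated standards-management tool:

```python
# Sketch: keeping "your standards" in a simple machine-readable form (CSV).
# The variables and codelist below are illustrative examples, not an
# authoritative copy of CDISC-controlled terminology.
import csv
import io

STANDARDS_CSV = """variable,label,type,codelist
AETERM,Reported Term for the Adverse Event,text,
AESEV,Severity/Intensity,text,MILD|MODERATE|SEVERE
"""

def load_standards(text):
    """Parse the standards CSV into a dict keyed by variable name."""
    rows = list(csv.DictReader(io.StringIO(text)))
    for row in rows:
        # Split pipe-separated codelists into lists for easy checking later.
        row["codelist"] = row["codelist"].split("|") if row["codelist"] else []
    return {row["variable"]: row for row in rows}
```

A file like this doubles as a contract annex for your CRO ("collect exactly these variables, with these allowed values") and as input to automated checks, which is the main advantage over keeping the same information in a Word document.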

Useful tool: a standards compliance checker - a program that checks delivered datasets against the formal CDISC rules for structure and content. This is also what the FDA does if you send them data, and it is most likely one of the first things a potential buyer would do in a due diligence. You can run this in-house (probably the smallest, cheapest and easiest tool you will ever get, though a bit on the technical side), or, if you outsource your trial, simply require that your vendor runs the checks and addresses any findings. Small solution, big value.
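To illustrate what such a checker does, here is a heavily simplified sketch. Real validators (for example Pinnacle 21 or the CDISC CORE engine) implement the full published conformance rules against the actual submission formats; the two rules below, checking an SDTM Demographics (DM) dataset supplied as CSV, are illustrative assumptions only:

```python
# Minimal sketch of a standards compliance check, stdlib only.
# Real checkers cover hundreds of published CDISC conformance rules;
# the two rules below (required variables present, USUBJID populated
# and unique) are illustrative examples.
import csv

# Illustrative required variables for an SDTM DM domain.
REQUIRED_DM_VARS = {"STUDYID", "DOMAIN", "USUBJID", "SUBJID"}

def check_dm(path):
    """Return a list of human-readable findings for a DM dataset in CSV form."""
    findings = []
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED_DM_VARS - set(reader.fieldnames or [])
        if missing:
            findings.append(f"Missing required variables: {sorted(missing)}")
        seen = set()
        for i, row in enumerate(reader, start=2):  # line 1 is the header
            usubjid = (row.get("USUBJID") or "").strip()
            if not usubjid:
                findings.append(f"Line {i}: USUBJID is empty")
            elif usubjid in seen:
                findings.append(f"Line {i}: duplicate USUBJID {usubjid!r}")
            seen.add(usubjid)
    return findings
```

The point is not the code itself but the workflow: every data delivery is run through the checks, and each finding is either fixed or explained before the delivery is accepted.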

Standards are important for much more than data structure, covering areas such as documentation, outputs (tables, listings and figures) and, not least, which data you want to collect.

See www.triticon.com/resources for an overview of standardised areas and a checklist for required data and documentation deliverables.

Ensure compliance in data-handling and systems validation

It is cumbersome, it is time-consuming and it does require specific knowledge, but handling data in a compliant manner with validated systems is a requirement. Without it, your data might be rejected and, in the worst-case scenario, the entire trial deemed worthless. Therefore, as a minimum, you must have the following two activities in place:

Audit (preferably before vendor selection): documenting that you have verified quality processes and compliance for data handling, and validation of the systems collecting, storing and processing the data.

Oversight of the ongoing trial execution: Documented activities for ensuring quality and compliance by your vendors. 

By “in place” we mean a) having documented processes for performing them (SOPs), b) having qualified personnel performing them, and c) having documented evidence that you have done it.

Remember that even if you fully outsource your trials, you as the sponsor still have full responsibility for the clinical trial. If you run the trial in-house, the tasks and processes are still the same, but the "label" changes to internal quality assurance work. And as usual, if it is not documented, it didn't happen.

Controlled Data Storage

So, now you have ensured a compliant data structure, the processes and systems are in order, and you receive your data and documentation. Now you need to store it safely. There is no point in spending a lot of time and money collecting the data only to waste it through lack of control. There are some truly advanced systems on the market for data storage (and for managing statistical processing of the data). These are heavyweight systems to buy and implement, and they all come with their challenges. As long as you are only storing data, you can (luckily) get a very long way with a basic file server, though you must ensure the following:

Security: protection against external threats and hardware failure (with back-up and disaster-recovery capabilities).

Access control: access must be controlled and documented, showing who has had access to the data over time, including proof that no one has modified any of it.

Version control/traceability/reproducibility: there must be a clear 'line of sight' from statements in reports and submissions back to data and programs, so that results can be reproduced. That means you cannot overwrite data, and you must have documentation of who ran which program on which data to produce a given output or result.

GxP-compliant provider: with defined working procedures (Quality Management System/SOPs), and able to handle and allow audits and inspections.
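One simple, widely used building block behind the access-control and traceability points above is a checksum manifest: record a cryptographic hash of every file at delivery, and re-verify the hashes later to demonstrate that nothing has changed. The sketch below shows the idea with Python's standard library; file and manifest names are illustrative, and a real setup would also log who ran the verification and when:

```python
# Sketch: demonstrating that delivered data has not been modified, by
# recording SHA-256 checksums at delivery time and re-verifying later.
# Paths and manifest format are illustrative assumptions.
import hashlib
import json
import os

def sha256_of(path):
    """Hash a file in chunks so large datasets do not load into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def write_manifest(data_dir, manifest_path):
    """Record a checksum for every file in the delivery directory."""
    manifest = {}
    for root, _, files in os.walk(data_dir):
        for name in files:
            p = os.path.join(root, name)
            manifest[os.path.relpath(p, data_dir)] = sha256_of(p)
    with open(manifest_path, "w") as f:
        json.dump(manifest, f, indent=2, sort_keys=True)

def verify_manifest(data_dir, manifest_path):
    """Return the files whose contents no longer match the manifest."""
    with open(manifest_path) as f:
        manifest = json.load(f)
    return [rel for rel, digest in manifest.items()
            if sha256_of(os.path.join(data_dir, rel)) != digest]
```

On its own this does not make a file server GxP-compliant, but it is the kind of small, documented control that turns "no one has modified the data" from an assertion into evidence.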

If you don't have in-house IT capabilities, your best bet is to outsource your storage to a CRO or an established GxP IT partner/provider who can support you with the above requirements. There are various online providers offering storage services at practically no cost; technically, most of these are safe and stable, but it can be difficult to obtain documentation of their procedures and details of audits or inspections that have been carried out.

Summary

Data standards and system validation will definitely not revolutionise the way you develop your product, and they are often mistakenly associated only with larger trials, or with later stages of a trial. They are, however, critical: mishandling the data may not only reduce the value of your product to a potential buyer or partner, but may also result in non-compliance and possible rejection of your data by the authorities.

Managing your data in the right way doesn’t have to be a huge exercise or investment. It is simply about starting with the basics, getting your focus and strategy right and having the understanding and knowledge to source or implement the right services and systems. Since regulatory focus and expectations are increasing, it is important to remember that even if you source, you must have control and ensure oversight and validation. 

There are three components you need to have in-house: 1) standards management - know what you need, specify what you want, verify that you have got it; 2) safe and controlled data storage; and 3) validation and compliance oversight of how 1) and 2) are being done. A file server together with programs such as Excel and Word, plus small toolkits and processes, will take you through the first step and lay the foundation for more comprehensive system implementations.
