At Decentralized Clinical Trials Digital Week (March 2021), Kai Langel, Director of Clinical Innovation at Janssen, hosted a session on the anatomy and practical use cases for digital measures, including biomarkers, trial endpoints and remote monitoring solutions.
He explored the idea of building an ecosystem around clinical innovation, championing partnerships, through the new Digital Endpoints Ecosystem and Protocols (DEEP) framework. The session featured a lengthy Q&A at the end, with numerous questions from the audience, which we have gathered together here.
This piece originally appeared in the Adoption of Decentralized Clinical Trials eBook (June 2021). Read or download the full ebook here.
Can you explain what synergies an ecosystem approach will provide?
“I think right now there's a lot of untapped synergy in component-level assets out there in the broader ecosystem. The way I look at the different organizations in the space is that there are many aggregators; Janssen and every other pharma company is an aggregator of different measurement assets. A lot of these assets are sitting on our shelves, and we quite often struggle even internally to know what kind of assets we already have available to us. What kind of algorithms we have. What kind of datasets our different scientists have already generated, where they are, how I would get access to them, and so on.
So if you build a catalog and structure these different assets, you will quite likely discover that you actually have many related assets you didn't know about. And if you multiply that opportunity across the industry, we'll discover that many people and many organizations are trying to reinvent a wheel that was already invented 10 years ago. Nobody knew about it because it was invented in some obscure location, and that wheel is sitting in some dark corner of some scientist's office.
So when we start to catalog these sorts of things, we'll discover that, "Look, they actually have already invented the wheel. Let's work together to refine the wheel. Let's make better wheels and different wheels." When we catalog these different things, you'll discover that on the same shelves where your asset is sitting, there are many other related assets.
And in this catalog model that we have, you can start building connections between these different assets. You can show which assets are related to each other, what the building blocks are that we already have, and where the gaps are, the places where you don't yet have an asset and might have to create one. And then you can tap into services to make them more accessible, and so on.
So there's a lot of synergy between companies. There are also lots of publicly funded assets that have been created through IMI and other initiatives that are very hard for different innovators to find. For example, startups often don't have a good way to find out what kind of assets are coming out of IMI and similar sources.
So building a public catalog of those sorts of things makes those assets a lot more accessible, not only to industry stakeholders but to the broader ecosystem, the startup community, academia, and so on. Then we can all focus on problems that are still worth solving, and we can leverage work already done by others in the ecosystem.
How can the Digital Endpoints Ecosystem and Protocols (DEEP) framework aid collaborative studies?
“There are many different ways of doing that. DEEP really enables these new collaboration models, and one of the concepts in DEEP is the desired solution profile.
Today we don't really have a good mechanism for the industry to signal to the innovation ecosystem what kinds of solutions are needed. So startups and others are trying to guess what the customer needs might be.
DEEP provides a mechanism to announce your intent to build something new. So if you need a new kind of wheel, a new kind of solution, you can describe your interest and publish it to the community. This allows others with the same interest to join you, and maybe co-develop that thing together.
It will also signal to the ecosystem who might already be holding pieces of what you are looking for. Somebody might then step up and say, "Hey, I actually have the solution that you're asking for, or parts of it. Why don't you leverage what I already have as a starting point?" And then you don't have to do it alone. Of course, it doesn't work in all cases, because sometimes we want to do some of these things privately. But in many cases, there is a need to align on measurement solutions and technologies.
The more a particular measurement solution is used, the more it becomes a kind of gold standard, which is often in everybody's interest. Like PROs: those instruments are quite standardized, and everybody uses the same measurement tools in their studies. The same principle can apply to these digital tools as well; everybody ends up using the same tools to get the work done that they need to get done. So why not align on creating those solutions from the start, to make sure they stay aligned and can be developed more efficiently by pooling resources and things like that.”
Are all DEEP capabilities available to the public, or is membership required?
“DEEP isn't fully launched yet, so we are still introducing the concept to the community slowly, and we are working through the best ways to make this scalable and accessible to everyone.
One of the principles that has been really important to us is that at least some elements of the platform should be open access. For example, the service protocols: this is going to be our gift to the world, where we'll simply release these service protocols to everybody for free, with no membership payments required.
We want to make many aspects of DEEP broadly available to the community without asking for payment or subscriptions. However, something like this also needs to be sustainable. We don't want this to be a product that we build and that then fades away. This is something that needs to serve the community, and it needs to be sustainable. We are thinking through the best ways to make all of that happen.
We don't have all of those answers yet, but those are some of the principles that are important to us. We want to maximize the value that DEEP brings to the community and the ecosystem, to really drive forward the adoption of these digital measurements and allow the full ecosystem to tap into their potential and power. That's the key motivating driver behind this. But to make it sustainable, some funds also need to be generated, so we're still thinking through that part.”
What are the current major challenges in setting up this ecosystem?
“The main challenge is finding the right format for scaling this up. This was born inside the walls of Janssen, but of course it's not really Janssen's core business to run such cross-ecosystem initiatives. So we are looking at the next phase of DEEP. It might become independent from the Janssen organization, but that makes it a rather unusual project, and that's not something many people have done before. So it's a bit complex to navigate through all of that right now.
Everybody seems to buy into the concept. We have had a very strong response from many different stakeholders: from industry, from tech, academia, and the regulators. So it is clearly needed in the community. The challenge is how to actually make it all happen.”
What are your thoughts around ensuring that digital endpoints and biomarkers are acceptable to regulators such as the FDA?
“The thing is, this is a new challenge for the regulators as well. These things are rather new. They are evolving. They are complicated. We have things like AI; we have self-learning algorithms. So how do you control them? What are the new policy impacts of these sorts of innovations?
They are trying to figure it out as much as we are. For the community as a whole to figure this out, we need to work together in a more agile way than we have perhaps seen in the past, with an open dialogue so we can share these learnings with the broader community. Many pharma companies use private meetings, such as FDA Type B meetings, at the start of clinical development programs to have these sorts of conversations.
One of the problems is that the feedback they get from those discussions is not really shared with the community. So if one company proposes part of a measurement concept to the regulators and gets feedback on it, that feedback is not shared with others. All the companies end up having the same conversations with regulators, about the use of active corrective measures, for example, when it would be much more efficient to share that feedback within the community so they can learn together, make sure they all focus on the right things, and do things in a more consistent way.
So this is where I'm saying that a new era of collaboration and more open dialogue is needed. And we need practical mechanisms to create an environment where we can share this sort of information with each other. The use case is also very important, because the regulatory pathway is very different for a medical device or a biomarker than it is for a novel outcome measure.
It is a different department within FDA, for example, that would look at each of these, and the mindset and the expectations for evidence might differ depending on the use case. So the regulatory pathway might be different, but the actual technical solution might be 100% the same.
There's lots of re-use potential across use cases. A biomarker can become a trial endpoint as well; the regulatory pathway is different, but the solution itself can be the same. There's also lots of re-use potential between diseases. A step-counting measure in one disease can be transitioned for use in another disease with some additional clinical validation work.
So the trick is having the meaningful conversation at the right time with the regulators to make sure that you're aligned with them along the way from the measurement concept, to developing it, to validating it, to understanding its performance, and the interpretation of the results in the end.”
How will the sensitivity of intellectual property be managed across all stakeholder groups?
“We had an IP-focused workshop in the summer, where we invited IP experts from the aerospace industry to facilitate it for us, so they could help inspire us with the different IP options they have seen in their work in that space.
And the thinking we adopted from there is that we really want to standardize the IP options, or what they call asset leverage options. We want to offer a menu of options that the asset owner can choose from for how to make their assets available to the broader community. This could be licensing for money, which is very straightforward: I give you my asset, you give me money. Very easy. It could also be a data exchange transaction: I'm an academic, so I will give you my dataset, you'll give me your dataset, and we both win. That's one way of doing it.
Some of these can be free access for research purposes, for example. Some can simply be open source, free for any use, because they were publicly funded assets. So we have identified nine different leverage mechanisms, and we want to standardize these as very easy-to-use options for asset owners and customers alike.
Once you make your IP choice, we can then support it with standardized legal templates, so that it doesn't take months and armies of lawyers to negotiate these sorts of things. We want to pre-agree on those things using standardized templates and frameworks.
And that's how the marketplace can become a lot more efficient. It's not a very fun marketplace if it takes you six to nine months of negotiation to get access to your purchases. So these kinds of standardization, and helping bring clarity to IP, will really make a big difference.
What is happening today is that some organizations simply don't have an IP strategy around these sorts of assets. And if you don't have a strategy, nothing is going to happen, because nobody knows what they're allowed to do and what they need to keep private.
So we want to bring IP discussions front and center, so that for every asset in the marketplace there's an IP arrangement and a framework around it that is crystal clear to all parties.”
What has been the reaction from different parties so far?
"It has been really refreshing to see how different companies and organizations think about this. What we have seen with some startup companies, is that the DEEP stack, for example, really gives them a very clear framework of expectations when they talk to pharma customers. The problem is that we haven't had a common language between some of these startup innovators and pharma, but through the use of the common stack model, we now have a common framework, that they both understand, and work out what the different pieces of the solution that they have together. We have seen some startups actually put their own solutions into the stack model, translating assets they have, putting them into the stack and then showing that to the pharma customer.
With academia, the tech transfer offices' job is to commercialize and monetize the different research assets coming out of academia. So if they have access to this kind of catalog, where those building blocks from academia can find a new life as part of a startup company, they will find new avenues for accessing new markets. For example, a startup could adopt an algorithm developed by a university, or a pharma company could take the algorithm and put it into a clinical development program.
If you look at this from the regulatory perspective: if you're a regulator, you want to see consistency in the way things are measured. Today that consistency isn't there with these digital measurements; there's very little basis for the industry to align on. Through this kind of common framework, the regulators will see companies start to approach these development programs in a more consistent manner. They'll then be able to make regulatory decisions by looking at the data and comparing apples to apples, instead of all the inconsistencies they see in trials today.
Service companies even see this as enabling new business models for them. One of the new concepts that DEEP introduces, among the services in the framework, is the role of the custodian. Think about the validated questionnaires that clinical trials use: you can license such an instrument from the copyright holder. That equivalent doesn't exist in digital measurements.
So we have introduced the role of the custodian, the role that assembles the full stack. There may be different components coming from different parties, and the custodian coordinates the creation of the full stack: maybe using an algorithm from academia, buying the sensor from another company, leveraging somebody else's work to generate supporting evidence, and then licensing that full package to customers under one license agreement, just as you can license an instrument. That custodian concept can be a whole new business model for service companies who want to start acting as custodians and offer that as a service to the community.
So there's many different opportunities and synergies for many players in the community.”