Transcript
41 min
Ram Yalamanchili (00:20.64)
All right, nice to see you again. And I'm excited about having a conversation with you about what's happening in the world of ophthalmology and also more of the topics on the AI front, which I think is fascinating. So to begin, I'd love to get an introduction about you and your experience so far. Could you maybe speak a few words on that?
Namrata Saroj (00:44.736)
Yeah. First of all, thank you so much for having me on this podcast. And as we've discussed before, AI in the clinical trial arena is happening, and it's happening fast. And I think we all need to embrace this, which has been part of my journey. I've been in clinical trials for over 20 years, primarily in ophthalmology. And I've seen the evolution of not just the trials, but the landscape of the trials as well.
When I used to work as a coordinator back in the day, there were very few therapeutic trials. And now we have some sites doing over 50 trials. So this is where the need for efficiency comes in. I think AI has this misconstrued — it's not misconstrued, maybe one-dimensional — aspect of predictive and imaging. Those have sort of been the buzzwords in ophthalmology.
But recently I've been doing a little bit more research on where AI could help. And there are so many more pillars where AI can contribute as well, Tilda being one of them as an AI teammate. So I think there's a lot of support we can get from AI in making our trials much more efficient and ultimately, you know, expediting the drug development process.
Ram Yalamanchili (02:08.34)
Yeah, and I know you've had a lot of experience in this space. I'm curious, have you seen things in the ophthalmology space which contrast with other therapeutic areas — maybe ophthalmology being either an earlier or a later adopter, that sort of thing, right? Any thoughts to share on that front?
Namrata Saroj (02:30.934)
I wish I could, but I've pretty much worked in ophthalmology. I also hear that our trials — I primarily do retina, even — are much longer than trials in some other specialties. It just depends on the specialty, because you can have very short trials where the biomarkers are very distinct and you can actually have readouts. But for us in retina,
I know that we have to run trials for at least a year, if not two years, right? So that is a pretty long duration.
Ram Yalamanchili (03:05.14)
Yeah, and I think it probably also differs between the anterior and the posterior side of things, right? In terms of durations. Yep. And I thought it was interesting the way you started off with AI having many pillars. And I know the predictive part of it gets a lot of literature, a lot of conferences — I think it gets a good amount of coverage as a pillar. But just to set the stage here, could you explain a little bit more about what you mean by that?
Namrata Saroj (03:09.038)
Great.
Ram Yalamanchili (03:34.372)
What is predictive in your usage of that term?
Namrata Saroj (03:38.026)
Yeah, this is my term now: the four pillars of drug development and where AI can come in. I think the first pillar is target identification or drug identification, right? This sits with the scientists. I have nothing to do with it, and I don't ever want to have anything to do with it — I'm not that smart — but it's using AI to streamline identification of potential therapeutics.
So that's the first pillar, right? And then once that has been identified, the next step is to bring this into humans and design trials. So the second pillar is trial design: can we use existing data to streamline the inclusion and exclusion criteria? Can we use data to understand which biomarkers would be more efficient so that we can actually select the population the right way? Because a lot of times trials fail
because maybe we didn't select the right population, right? In post hoc analysis, you find a subgroup. So that's your second pillar. Then the third pillar is trial execution — the process, the administrative work, the collection of data, the patient management — and that's where the sponsor, the CROs, as well as the sites have probably the most heavy lifting to do. And then finally there's the fourth pillar, which is looking at outcomes and
understanding the data, right? So I think that's sort of the journey of a trial. And at each pillar, there's a different role for AI.
Ram Yalamanchili (05:13.31)
Yeah, and also different parts of the ecosystem, right? I think the individuals who get involved in trials need not be the same across all four pillars. I assume it's actually probably fairly siloed.
Namrata Saroj (05:25.289)
Absolutely. My work usually starts when the trial is ready to be in humans. That's why I say the drug identification part is way above my head. There's overlap of personnel, but I think what is needed is different at different stages.
Ram Yalamanchili (05:43.91)
Right, right. So that kind of brings me to some experience on this front as well. I started off in my biotech career primarily on the predictive side of things, building a molecular diagnostics company. You know, early science was sort of where we were when we got founded as a company at Lexant. And after that, it was more or less proving that early science is actually a product — so running into the clinical trial process. So I assume...
you're sort of talking about that phase of a biotech where you might acquire an early science and then start to build a company around it. And a good chunk of your company building, whether it be fundraising or creating the first team, really depends on how good your clinical development program is and how fast you can go through it. What is your ability to run these large programs, I guess, or whatever size they might be, right?
So something I'm curious about is, given your experience in this space, what are some challenges you've seen, particularly in the area you've worked on?
Namrata Saroj (06:50.926)
Yeah, so there's one stat that I recently read, which we're pretty familiar with — the number keeps going up. The latest stat says bringing a drug to market can cost anywhere between 1.5 and 2 billion dollars and can take up to 15 years. Right? And then if you start looking at how many candidates you have to look through before you can actually get that one approval, that's also pretty remarkable. So
I think keeping that in perspective, that is the challenge, right? It takes so much trial and error sometimes before you can actually get to a drug that's marketable and good for patients. And I'm trained as an optometrist, so vision care is very, very important to me. My goal is to bring the best treatments to patients as soon as possible. And I think that...
the process is lengthy, right? I look back at some of the drugs that I've worked on — they're available now, but the process was so long. And, you know, once you've identified the drug — and I don't want to focus on that, because that's the scientists' thing — you design the trial: you get your feedback, you design your trial, but you're still designing the trial for the masses. And as we're learning more and more about disease, especially right now,
we're realizing we're not treating one disease, we're treating an umbrella of diseases. So if we can streamline that — we've had some unsuccessful trials, and now we look back and say, maybe it was too general a population that was recruited. That's one of the reasons, maybe, right? And then I wanna come to the fact that we have so much development and innovation happening in our field, which is remarkable, but what does that mean? It means a lot of trials.
And there are only so many sites, only so many clinical research sites. So we have overwhelmed some of our sites with so many trials running. I mean, even in neovascular AMD right now, we have so many competing studies — how do you choose patients, and how do you monitor and assign the right amount of staff to each study? So that workload is increasing. And we've got very enthusiastic investigators and sites; they wanna be part of every study.
Namrata Saroj (09:09.024)
One thing investigators don't always pay attention to is how much work goes on behind the scenes. I remember when I was back at the hospital — I mentioned I didn't have that many studies — my investigators would keep taking studies on and then decide, I don't want to do this one anymore. And I'm like, you have no idea how much my team has to work to even get the study up and running. So anything we can do to help the study staff is very critical at this point, just because of the sheer volume of studies we have.
Ram Yalamanchili (09:39.206)
That makes sense, yeah. And I've seen this as well, right? It's not just ophthalmology; there are many other therapeutic areas which probably fit right into this description you just gave. What I'm also fascinated by right now — we can obviously talk about some of the other pillars, the later operational AI parts, which are interesting — but just taking a look at what's happening in the AI domain right now, there's...
Namrata Saroj (09:48.951)
right.
Ram Yalamanchili (10:08.324)
LLMs which are being trained on drug discovery, on predicting protein folding. And I think we've made some pretty impressive breakthroughs there, including the recent Nobel Prize going to Demis and his team. And of course they've launched their own company now. I read this fascinating article, and I think his quote was literally, we're gonna solve all known diseases in 10 years, or something like that.
And I would love for them to do that, but I also think there is an enormous bandwidth problem they're gonna face, exactly like what you're talking about. Unless you eliminate human trials altogether — maybe that day can come at some point — at least given the current regime and process, I sort of see the bandwidth needed massively increasing in certain pillars, as you've categorized them. And in some areas,
Namrata Saroj (10:43.052)
bit.
Ram Yalamanchili (11:05.02)
we're desperately in need of better solutions to improve the bandwidth there and productivity there, right? Which again comes back to the number of sites, number of people at the sites and everything else which goes on at the site and the CRO and the sponsor. So the clinical development aspect of it. So is that sort of how you would see it as well? Like in terms of where we're probably going?
Namrata Saroj (11:24.16)
I do. You know, there's aspiration and then there's realism, right? And I think for a long time, we've been hearing about AI. I remember a few years ago at ARVO, which is one of our big research meetings, as you know, there was like one session on AI. And now we've both just come back from ARVO — I mean, it was everywhere, right? There was no topic that was getting discussed without AI. So that's the progress, and it's really good progress.
But you're right, there are a lot of aspirations, and we have to be realistic about what's going to be achievable, right? Predictive biomarkers are going to be amazing, but I think we still have a lot of work to do — we're not there yet. I think if you look at our day-to-day work — and I'm not talking about the clinical work that I'm engaged with or you're engaged with, it's more about even our personal lives, right? — AI is
assisting us in so many ways. So I think this concept of assisting where execution is needed, it's a no-brainer for me, right? Like, why wouldn't I want that help? You know, I have nephews who always say, Auntie, we start writing everything with ChatGPT, right? Like that's the norm now. So where is our ChatGPT?
Ram Yalamanchili (12:43.993)
Yeah, I agree, and more, right? I also am a big believer that we're gonna be seeing more agentic-style use cases developing from now on. For the past couple of years, we've certainly done a lot of the traditional chat or conversational AI chatbots, that sort of thing. But the technology is maturing to a point where I think you can certainly start to think about
having an assistant or an agent — a very capable assistant, right — which is able to take on some of this work, perform it on your behalf, and come back. And yeah, we certainly need that. I think clinical research can definitely use that. Multiple different roles in the research industry could use that. And you know, the roles are across multiple organizations — we have sponsors, we have CROs, we have sites, and they all have different roles. And I would say
a need exists for all of these different roles to have some form of an AI teammate or assistant.
Namrata Saroj (13:46.934)
Yeah, I think the only comment I'll make is I don't think it's going to be replacing anything. It's going to be assisting. And I think that's the word we want to emphasize — it's an assistant, right? And as you guys put it at Tilda, it's a teammate. And I think that's important for people to understand: you know, we still need the workforce the way we do. We just need to become more efficient because of the needs that we have.
Ram Yalamanchili (13:56.367)
Yeah.
Ram Yalamanchili (14:10.522)
Absolutely. I think there are several camps on this. I've seen people say, you know, AI is going to replace this and that. But I've been pretty convinced for several years now that we're gonna see sort of an abundance of opportunity driven by AI. I think the world where we would...
probably, hopefully, live in soon will be a place where we have many problems in front of us and we have the resources to solve those problems, or at least take shots at them. And I think anyone who's worked in the drug discovery space knows that there are many challenges we haven't even touched upon, right? There are so many areas where we could ask the questions, and hopefully, if we had the resources and the ability, we would go and pursue those. And that's not possible in the current regime. Like, we've been at the current
style of operations for many decades now. And we've only achieved so much productivity or bandwidth, as we call it — number of patients recruited. Like, we've had the same exact issues and problems for many years now.
Namrata Saroj (15:18.37)
Yeah.
It's interesting you say that — I used to say this a lot, and I still actually say it: we've made such great advancements in the actual medicine, but in how we bring those medicines to market, we're still so antiquated in how we work. Right? I mean, when I started my career, everything was paperwork — actual paperwork. And fortunately, we've transitioned to electronic,
but it's just the way that things are managed. And a lot of it still has to do with the regulatory guidelines. You know, nobody wants to take the risk and do something different, because there's so much at stake when you're developing a drug. God forbid something goes wrong — you don't know if it's the drug or the way you did it. And so that's why there's this apprehension about making bold changes in the drug development process. But at the same time, I think we are doing ourselves a disservice by not changing the process,
or not having the process evolve the way it should.
Ram Yalamanchili (16:22.328)
Yeah, and I think changing process is probably one of the biggest aspects of how all of this is going to get adopted in general, right? Like, what is your strategy there? What I've found fascinating is this: put aside the predictive AI parts of it — I think there's a certain regulatory hurdle you would have to cross before you can actually adopt those types of use cases into your automation or AI.
Namrata Saroj (16:30.658)
Right.
Ram Yalamanchili (16:51.138)
I mean, an example of that would be looking at certain imaging and predicting eligibility. And I think there's some excellent science in that aspect, but the hurdle is also pretty high, right? There are companies focused on that, running their own clinical trials on the data, for example, and sort of proving out whether their models can actually predict and whether that can be used in a clinical setting, things like that. But I also see another world where
there are ample opportunities in the workflow where you do not need to cross such a regulatory burden or hurdle. This could be something as simple as, did we dot all the i's and cross all the t's on all my documentation? Did I follow the checklist of things I'm supposed to follow as per the protocol? And is there some kind of AI agent or teammate which can help me do that?
By and large, I think what we've noticed is those types of problems are where the majority of the focus is in terms of actual day-to-day execution. And it eats up a lot of time. There are multiple layers of checks currently in place, because none of these are automated — they're all manual. So you have people checking other people's work and so on and so forth, right? There are layers of checking. And I know the industry calls it monitoring, but overall, I think there are interesting opportunities where
you have to take a more holistic view and say, you know, I'm not going to change so much of my process. It's just the way I come into this execution model and use technology to do exactly what I would have my staff of five or 10 or 20 people do, but have them use AI to be more productive, more consistent, more reliable in terms of the quality of the work they're able to do. And I'm...
I'm seeing that already happen. I think from sites', sponsors', and CROs' perspective, there's just a tremendous amount of opportunity. The entire industry seems to be very much in a manual mode right now, and we're watching that transition slowly into a sort of hybrid model. So I think the question in my mind is, what's going to be that catalyst? I think at some point, every industry needs a catalyst to adopt
Ram Yalamanchili (19:18.229)
any type of technology. I know you mentioned paper, and paper pretty much doesn't exist anymore in certain areas — data capture especially is now electronic. But I'm curious, do you see any other trends coming which will basically drive the need for productivity or efficiency?
Namrata Saroj (19:35.014)
Yeah, I think the number one catalyst is competition, right? Every company right now, whatever they're doing, is trying to get to the next milestone in an expedited way. So whatever you can do to get to the next milestone, you're going to do. And if increasing efficiency can get you to the next milestone, they'll do it. So from a sponsor's perspective, that's essential.
There's also differentiation, right? For the CROs and the sites — you know, what are they doing differently than somebody else? I work on the CRO side as well, and when I'm looking to pitch to companies, I'm thinking, why would you want to choose me? It's because I can give you something that maybe some other company doesn't have. And I think this is where, you know,
the adoption is so critical, because if you don't adopt it, you are going to be left behind. I think that's really the simplicity of it.
Ram Yalamanchili (20:42.838)
So in some ways it's the competition which will drive that adoption among your peers.
Namrata Saroj (20:46.508)
Absolutely. Yeah, because how are you going to do something better? You know, take the tech world: one technology succeeds over another only because it offered something the other didn't, right? So that's going to be the key — how do I differentiate? And as I said, what we're doing in clinical trials has expanded so much. The volume is so much
Ram Yalamanchili (21:03.765)
Mm-hmm. Mm-hmm.
Namrata Saroj (21:15.874)
that you have to stand out from every perspective, right? As a sponsor, you wanna be bringing a drug that is helpful to patients and safe. From a CRO perspective, you have to be able to execute really well. And from a site perspective, you have to show up and say, I can deliver, right? My patients are gonna be well taken care of because I can do this trial efficiently. And again, the one thing I also wanna touch on is, I did do some monitoring way back in my day.
And it was one of the most tedious jobs you can ever do — hats off to every monitor. We do some remote monitoring now, so eCRFs have helped, but to your point, double-checking something just to make sure there's no error probably still leaves errors on the table.
Ram Yalamanchili (22:00.328)
And I have a case study there. I've frequently seen certain worksheets — your visual acuity scores, for example — where you're expecting the person doing the test to also do calculations and, based on that calculation, make a decision: is this particular patient going to be put into a specific arm, or is there an additional dosing coming up in the treatment for the patient, right? And I think it's
Namrata Saroj (22:19.79)
Right.
Ram Yalamanchili (22:29.407)
pretty fascinating that those types of worksheets are not automated, because you would think they could be. I mean, you don't need AI for that — it's much simpler than that. From what I've seen, it's a pretty interesting scenario where something like that could meaningfully change the direction of the trial itself — the data being captured and whether you're doing it right or not — and yet it's not being done this way.
I'm curious — why? Why is this not happening already?
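To make the point concrete, here is a minimal sketch of the kind of deterministic worksheet check being described: recompute the visual acuity letter score from the line-by-line entries and flag a mismatch or a protocol-relevant change. The field names and the 5-letter threshold below are illustrative assumptions, not taken from any specific protocol or sponsor's worksheet.

```python
# Illustrative sketch only: a deterministic re-check of a visual acuity
# worksheet, the kind of "no AI needed" automation discussed above.
# Field names and the 5-letter threshold are hypothetical.

from dataclasses import dataclass


@dataclass
class AcuityWorksheet:
    letters_read_per_line: list[int]  # letters correctly read on each chart line
    recorded_total: int               # total the coordinator wrote down
    baseline_total: int               # baseline letter score for this eye


def check_worksheet(ws: AcuityWorksheet, loss_threshold: int = 5) -> list[str]:
    """Recompute the letter score and flag issues a monitor would otherwise catch by hand."""
    issues = []
    computed_total = sum(ws.letters_read_per_line)

    # 1. Arithmetic check: the recorded total must match the line-by-line sum.
    if computed_total != ws.recorded_total:
        issues.append(
            f"Recorded total {ws.recorded_total} != computed total {computed_total}"
        )

    # 2. Decision check: does the change from baseline trigger the
    #    (hypothetical) retreatment or arm-assignment rule?
    change = computed_total - ws.baseline_total
    if change <= -loss_threshold:
        issues.append(
            f"Letter loss of {-change} meets the {loss_threshold}-letter criterion; "
            "confirm the protocol-specified action was taken"
        )
    return issues


# Example: a transcription error in the recorded total is flagged immediately.
ws = AcuityWorksheet(letters_read_per_line=[5, 5, 5, 4, 3, 2],
                     recorded_total=42, baseline_total=30)
for issue in check_worksheet(ws):
    print(issue)
```

Nothing here needs AI; it is plain arithmetic, which is exactly the point being made above.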
Namrata Saroj (23:03.582)
You're asking the question I asked — remember I said medicines have advanced but our process hasn't, right? Because it's about validations and guidelines. I wish I could give you an answer, but that's the unmet need. Until recently, we were still doing source documents on paper. Actually, a lot of people still do paper source documents, because you need to see where the pen hit the paper. But now,
Ram Yalamanchili (23:05.844)
Ha!
Namrata Saroj (23:28.782)
there are EMRs everywhere, and in a lot of them there are no source documents. So e-source came into being, which is really helpful. But it's not as if e-source has been around for a long time; people are still adapting to it. So I think the process has been slow to evolve because, as I mentioned, change creates fear, I think. We don't want to do this because it's always been done this way.
Ram Yalamanchili (23:52.638)
Yeah.
Namrata Saroj (23:55.374)
But actually, the word I would use now is: instead of looking at it as change, let's look at it as evolution. This is the next evolution.
Ram Yalamanchili (24:03.635)
Yeah, I've also found a lot of the systems today are not easy enough to configure or set up to suit this particular type of use case. So you could try to do it — you might have the intention to try to do it — but then you're becoming a software company yourself. You know what I mean? Like, you would have to budget a certain amount of your clinical trial budget to go do this. And you probably don't have the right resourcing or talent
Namrata Saroj (24:13.045)
okay.
Namrata Saroj (24:20.707)
Peace out.
Ram Yalamanchili (24:33.928)
allocation within the company to go pursue it. So that's another interesting aspect of what I've seen.
Namrata Saroj (24:39.594)
I'll add to that. Most sponsors, you know, outsource to CROs to run the studies, right? So sponsors are not investing in it — they do what they're told to do. It's the CROs that have to invest. And until recently, I don't think the CRO was this big a business; it was a vendor that runs your trials. But we've seen over the last five to ten years that CRO activity has gone up tremendously. They've been acquired by big companies. And, you know, with
acquisition by big companies, as you know, it's good in some ways and not so good in others. The good is that they can invest in progress, right? And what we're seeing — I mean, I'm hearing this from the big CROs right now — is that they're looking into AI technology. Some of them are developing things on their own. So I think it's that movement toward creating these bigger organizations that will invest. To your point, somebody has to do it, right?
A small CRO or a small sponsor, which are very budget-constrained, are not going to invest. They're going to say, hey, this works — why change it? We don't need to change it.
Ram Yalamanchili (25:44.241)
Yeah, yeah. I think it comes down to being able to present a solution which is easy to adopt and isn't so prohibitive in cost that it's hard to adopt or even sell as a value. So I think that's where the gap is, and certainly where, you know, we're trying to play a role here. But I do see quite a few of these sorts of use cases. Even in that situation I was telling you about, we had a study
where we had some access to their retrospective data, looking at essentially scanned documents in their repository. I mean, we were able to showcase the error rate immediately, saying, hey, did you realize that this set of worksheets had these calculation errors? And it's critical from a study execution perspective to know that, right?
It's one of those things where you have that aha moment. You're like, wow, I wish I had this on a real-time basis, every time a site did it — it doesn't matter whether it's on paper or not. Nowadays, LLMs have a visual cortex, so to speak — they can see, they can read just like we would. And you can have workflows built on top of that, and do some sort of quality check just like you and I would if we were reading through this paperwork.
And this is what's really interesting to me. I think there are some pretty clear benefits which are not in the regulatory realm of prediction or data generation. It's more of, I just have to do this. There's so much of this to do, and there's a high error rate because it's a tedious amount of work. And I would rather have an assistant there, and an AI could do this day and night. So it's a...
That's just one example, but I've seen several such examples across the board.
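As a rough sketch of the retrospective review just described — and it is only a sketch: the extraction step below is a placeholder for whatever vision-capable model or OCR service would actually read the scans, and the field names are hypothetical — the workflow is simply: read the scanned worksheet, re-run the arithmetic, and queue any mismatch as a query for a human to resolve.

```python
# Illustrative sketch of a retrospective worksheet review workflow:
# extract fields from scanned pages, re-run the arithmetic, queue discrepancies.
# The extraction function is a stand-in, not a real API; field names are hypothetical.

from dataclasses import dataclass


@dataclass
class ExtractedWorksheet:
    subject_id: str
    letters_per_line: list[int]
    recorded_total: int


def extract_fields(scanned_page: bytes) -> ExtractedWorksheet:
    """Placeholder: in practice this would call a vision-capable model or OCR
    service to read the scanned worksheet. Here it returns a fixed example."""
    return ExtractedWorksheet(subject_id="001-042",
                              letters_per_line=[5, 5, 4, 4, 3],
                              recorded_total=23)


def review(pages: list[bytes]) -> list[str]:
    """Run the deterministic arithmetic check over a batch of scanned worksheets."""
    queries = []
    for page in pages:
        ws = extract_fields(page)
        computed = sum(ws.letters_per_line)
        if computed != ws.recorded_total:
            queries.append(
                f"{ws.subject_id}: recorded total {ws.recorded_total} "
                f"does not match line-by-line sum {computed}; route to coordinator"
            )
    return queries


if __name__ == "__main__":
    for q in review([b"<scanned worksheet image>"]):
        print(q)
```

The deterministic check is the same as in the earlier sketch; the only new piece is treating the scanned document as the input and routing any discrepancy back to a person rather than acting on it automatically.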
Namrata Saroj (27:36.526)
I think the word tedious is really important here, right? We're talking about the volume of trials going up, but also, within a trial, the volume of data we're collecting is going up as well — at least in retina right now, we have many more imaging devices, there are functional tests and PROs, right? So the amount of data we're collecting goes up, and the tedious aspect of it, you know, goes up as well.
And to your point, it's human nature — we're not perfect, so there are going to be some errors.
Ram Yalamanchili (28:13.863)
Yeah. Towards the tail end of some of these questions — you used the word competition. Is that largely about time, is it budget, is it a mix of everything? How do you see that? And what about talent — you know, just hiring, retaining, and keeping the right talent as well, right? So I'm curious, is there more to discuss there in terms of
how you define competition, and being competitive, rather?
Namrata Saroj (28:45.122)
I think it's everything you mentioned, right? It starts with talent, because you need the right people running the studies in all three — let's say sponsor, CRO, and sites, right? You need talented individuals to actually design the right studies, design the right products, and be able to execute the right way. And same thing for CROs — you need to, you know,
have the right people to execute on the ground. And of course at the site level, for sure. At the site level it's a combination: you've got the investigators and you've got the study staff, and the study staff have become a huge department, because it takes so much to run the studies and coordinate all the different aspects of what the sponsors are asking for. So the competition is, again, different for these three different elements, right? For the sponsor, the competition is like, hey,
what are we providing to the sites? Is the drug something that's good for my patients? But also, we wanna be compensating the sites the right amount too, right? There's a lot of hard work that goes in. And over the course of the years, I think it's more because the studies have become so
intensive and time-consuming that you have to be compensating at a really high level. So that's competition as well. You know, I'll tell you, sometimes my sites want to pick a study not because of the sponsor, but based on the CRO. They have very strong opinions about the CROs they want to work with, because that's the interaction on almost a daily level. And if they don't like the people they're working with, they're not going to do it. So if a CRO has a monitor that is
not well trained, or doesn't understand the details of the trials, or doesn't have experience in the therapeutic area, and goes to a site where the coordinators have tremendous therapeutic experience, it's not going to be a good match. Again, there are so many elements of the trial — you mentioned this earlier in the podcast — we haven't even uncovered everything, because we don't know that something is an issue. You can only uncover it when you actually start
Ram Yalamanchili (30:49.487)
Thank
Namrata Saroj (31:02.882)
delving into fixing things and you go, okay, maybe that other thing needs to be fixed as well.
Ram Yalamanchili (31:07.12)
I'm curious — why don't we have more sites participating in this ecosystem of research? Because I do hear sites are a bottleneck, and there are only so many sites and investigators out there. But I'm curious, because it's economically viable, at least it seems like it. And you really are pursuing a very interesting
sort of workflow, right? Like, you're working on cutting-edge drugs or devices, and there's tremendous benefit you can bring to the patient population in general. So I'd be curious to hear your thoughts, given your experience in the space.
Namrata Saroj (31:54.168)
So the impression is that it's not economically viable, right? That's actually how it starts out, and that's probably one of the number one reasons. That said, the number of clinical sites we have now is significantly higher than when I started over 20 years ago. There is an interest in research, but it really depends on the type of practice.
We're seeing more and more fellows and residents get exposed to the clinical trial world during their training, so when they leave, they want to start it. Back in the day, that exposure was not there, so they didn't know any better — you'd join a practice and do your routine clinical practice. But now, because of their training, they're getting exposed.
I have a lot of young investigators who've gone into established practices with no research department and started their own department, right? But it's still a challenge. I have one site where they have really great recruitment, and when I offered them the chance to expand and do more, I was reminded that the rest of the group doesn't really believe in research, so they're not going to invest in it. So that's the thing: you have to invest in it
before you can profit from it, like anything else, right? So there is still a struggle. You would think that, with the number of retina practices we have, we should have so many sites that it would never be an issue, but it still is. It takes a lot of convincing from some of the younger investigators to get their senior partners bought in, and then they have to deliver, and then they have to show that,
you know, they're not losing money, they're actually making money. And that doesn't always happen with the first research project you do. So hopefully more and more people will see the bigger picture of research: you can offer cutting-edge treatment to your patients, and for your academic and career growth, it's really great as well. So hopefully more and more people will see that and more and more
Namrata Saroj (34:14.808)
practices will get established. In fact, I will tell you, most of the great recruiters that we have in our studies right now are not your seasoned researchers. It's the young ones because they're motivated to do it.
Ram Yalamanchili (34:25.935)
Yeah, yeah. I think that's an interesting take. And I can see the activation energy required, especially if you're a new site just getting started — you have to invest, and there's a certain amount of uncertainty in that process in terms of what you're getting into. It's a commitment. But again, I think this is the type of interesting place where automation is
Namrata Saroj (34:44.205)
Yeah.
Ram Yalamanchili (34:55.945)
so much more beneficial than not using any automation and completely relying on manual process, right? So we've definitely seen sites — even in our own experience — getting a lot from bringing some of these cutting-edge technologies to their practice.
Namrata Saroj (35:13.396)
Yeah, so one other aspect: when starting research, some research centers share study staff between the research department and the clinic, right? And I always say that's not always the best move, because the work we do in clinic is not exactly the same as what we do in the study arena. And I think, especially for those folks who might have, you know, resource limitations,
any automation support helps, because it is very demanding to balance both. If you spoke to study coordinators who are doing both, it is very difficult to keep switching: okay, I'm doing the study, which is more documentation; now I'm going to clinic and I need to get the patients through. So any automation help there would definitely be appreciated.
Ram Yalamanchili (36:04.686)
Yeah, it's interesting you put it that way, Namrata. I've seen two versions of this, right? One is the early sites, where they really are looking for any sort of help which can offload work from their current staff. So that's one phenotype. And then there are these other sites which are very high-volume, high-recruiting — successful practices which have built great research programs.
But for them, it's more about, how can I do more with the staff I trust and have worked with for many, many years? They're not too keen on doubling or tripling their staff, because they've done it and there's always some problem or other, and that's not the path they want to go down, right? So either way, I can see benefits from bringing in some sort of automation or AI technology. So that's kind of where we are and what we're seeing.
Namrata Saroj (36:37.09)
Thank you.
Namrata Saroj (37:03.662)
Absolutely. And I think that's the beauty of technology and AI: you can customize it to what you need, right? It's not one formula fits all. And this is why, in the discussions I have with folks right now, when people say AI, it's not one thing — there are so many aspects to it. So what is it that you need, and do you have an AI solution for it? That's very different than saying,
I have AI. That means nothing.
Ram Yalamanchili (37:37.55)
Yeah, I agree. And another realization we've had is that you have to come at it from a persona perspective. So which persona are you looking to help? And can we solve it or can we not, right? So we go that way.
Namrata Saroj (37:55.918)
Yeah, right. In drug development, the biggest thing we go for is unmet need, right? What is that? And I'll give you an example from tech. It's not AI-related, but several years ago I was helping a company on the other side of clinical trials — they do prior authorizations for our drugs. I sit on the clinical side, so I didn't know that this was still being done with faxes and telephone calls. And somebody came up with an electronic system. And when we were trying to implement that,
we got resistance from some sites — oh my God, would my prior auth person get laid off, or whatever. And you're like, no, it's helping them; they're going to be more efficient in the prior auths you're going to do. Right? That's the conversation we're having again now with AI as well: hey, we're not talking about replacement. This is about how we can make everybody more efficient.
Ram Yalamanchili (38:36.877)
Mm-hmm.
Ram Yalamanchili (38:51.947)
Yeah, absolutely. And like you said — I like the way you put it — the competition will eventually drive this towards adoption. The benefits are very clear. It's not just about having a shiny tool; the productivity gains, the economic and quality benefits you're getting are pretty tremendous with what we're seeing right now. So I look at it as a matter of when rather than, you know, a matter of if.
That's sort of the thing, right?
Namrata Saroj (39:22.926)
Yeah, productivity and quality are going to be essential. When we do site selections, we're looking at how they recruit — which, of course, is a different AI that's going to work on that — but the quality of the data really matters. When the monitors go in and tell us a site has so many outstanding queries, that site gets red-flagged. And the next time around, people will think twice: OK, maybe they recruited well, but the data was a disaster, so we can't risk that anymore.
Right? Over the course of running clinical trials, I've had data from sites we had to actually exclude because it was not good. And think about how much of an investment cost that is. It can really affect the outcomes of a trial, and that affects the entire drug development process. So it's really critical that the sites provide good data. And if we can help them do that, their visibility goes up. And guess what? They're going to be
Ram Yalamanchili (40:14.124)
Hmm.
Namrata Saroj (40:20.342)
on top of the list for the next recruitment.
Ram Yalamanchili (40:23.212)
Yeah, yeah. Fascinating. Well, thanks for your time. I think this has been a pretty interesting call. And yeah, I hope to talk to you soon. Thanks for your time again.
Namrata Saroj (40:36.578)
Thank you, and great work on Tilda. I think, you know, going as an AI teammate to the sites, to the sponsors, to the CROs — it's going to be really helpful. And as far as I know, you guys are the only ones doing that right now. So hopefully more sites will adopt, and we'll get some companies to adopt it too.
Ram Yalamanchili (00:20.64)
All right, nice to see you again. And I'm excited about having a conversation with you about what's happening in the world of ophthalmology and also more of the topics on the AI front, which I think is fascinating. So to begin, I'd love to get an introduction about you and your experience so far. Could you maybe speak a few words on that?
Namrata Saroj (00:44.736)
Yeah. First of all, thank you so much for having me on this podcast. as we've discussed before, AI in the clinical trial arena is happening and it's happening fast. And I think we all need to sort of embrace this, which has been sort of part of my journey. I've been in clinical trials for over 20 years, primarily in ophthalmology. And I've seen sort of the evolution of not just the trials, but the landscape of the trials as well.
where when I used to work as a coordinator back in the day, it was very few therapeutic trials. And now we have some sites doing over 50 trials. So this is where sort of the need for efficiency comes in. I think AI has sort of this misconstrued, it's not misconstrued, maybe it's one dimensional aspect of predictive and imaging. And that's sort of been sort of the buzzwords and ophthalmology.
But recently I've been exposed to doing a little bit more research on where AI could help. And there's so many more pillars where AI can contribute as well and Tilda being one of them as an AI teammate. So I think there's a lot of support we can get from AI in making our trials much more efficient and ultimately, know, expediting the drug development process.
Ram Yalamanchili (02:08.34)
Yeah, and I know you've had a lot of experience in this space. I'm curious, have you seen things in the ophthalmology space which contrast with other therapeutic areas, maybe ophthalmology being either an earlier ophthalmology later ophthalmology, that sort of thing, right? you kind of, any thoughts to share on that front?
Namrata Saroj (02:30.934)
I wish I could, but I pretty much have worked in ophthalmology. I also hear that our trials compared to even, I probably do retina even, our trials are much longer than some of the trials. It just depends on the specialty because you can have very short trials where the biomarkers are very distinct and you can actually have readouts. But for us in retina,
I know that we have to do primarily minimum for at least a year, if not two year trials, right? So that is a pretty long duration.
Ram Yalamanchili (03:05.14)
Yeah, and I think it probably also differs between the anterior and the posterior side of things, right? In terms of durations. Yep. And I thought it was interesting the way you started off with AI having many pillars. And I know the predictive part of it is a lot of literature, a lot of conferences. think they get a good amount of coverage on that pillar. But just to kind of set the stage here, could you explain a little bit more about what you mean by that?
Namrata Saroj (03:09.038)
Great.
Ram Yalamanchili (03:34.372)
What is predictive in your usage of that term?
Namrata Saroj (03:38.026)
Yeah, no, it's, it's, it's, I've, this is my term now is the four pillars of drug development and where AI can come in. think the first one, a pillar is, target identification or drug identification, right? So this is, this sits with the scientists. I have nothing to do with it. I don't ever want to have anything to do with it. I'm not that smart, but it's, it's using AI to be able to streamline identification of potential therapeutics.
So that's the first pillar, right? And then we come to the second pillar once that has been identified, what's our next step is to bring this in humans and design trials. So the second pillar is trial design where can we use existing data to streamline the inclusion exclusion criteria? Can we use data to understand where the biomarkers would be more efficient that we can actually select the population the right way? Because a lot of times trials fail,
because maybe we didn't select the right population, right? In postdoc analysis, you find a subgroup. So that's your second pillar. Then eventually the third pillar is about the trial execution where it's the process, it's the administrative work, it's the collection of data, it's the patient management, and that's where the sponsor CROs as well as the sites have probably the most heavy lifting to do. And then finally is the fourth pillar, which is looking at outcomes and
understanding the data, right? So those, think that's sort of the journey of the trials. And I think each, at each pillar, there's a different role of AI.
Ram Yalamanchili (05:13.31)
Yeah, and also different parts of the ecosystem, right? I think the individuals who get involved in trials need not be the same across all four pillars. I assume it's actually not quite quite probably fairly siloed.
Namrata Saroj (05:25.289)
It's absolutely, my work starts usually when the trial is ready to be in humans. That's why I say that the drug identification part is way above my head. I do, there's overlap of personnel, but I think what is needed is different in different stages.
Ram Yalamanchili (05:43.91)
Right, right. So that kind of brings me to some experience on this front as well, right? I started off in my biotech career primarily on the predictive side of things, building a molecular diagnostics company. you know, early science was sort of when we got founded as a company at Lexant. And after that, it was more or less proving that early science is actually a product. So running into the clinical trial process. So I assume...
you're sort of talking about that phase of a biotech where you might acquire an early science and then you now start to build a company around it. And a good chunk of your company building, whether it be fundraising, whether it be sort of creating the first set of team, really depends on how good is your clinical development program and how fast can you go through it? What is your ability to run these large programs, guess, or whatever size they might be, right?
Well, something I'm curious about is given your experience in this space, what are some challenges which you've seen in particularly in the area which you worked on?
Namrata Saroj (06:50.926)
Yeah, so I think there was one one stats that I just recently read, which we're pretty familiar with. The number keeps going up. But the latest stats says to bring a drug to market can cost anywhere between one point five to two billion dollars and it can take up to 15 years. Right. And then if you start looking at the finalists, how many sort of candidates you have to look through before you can actually get that one approval. That's also pretty pretty remarkable. So
So I think looking at that, keeping that in perspective, that is the challenge, right? It takes so many trials and errors sometimes before you can actually get to a drug that's marketable and good for patients. And I'm trained as an optometrist, vision care is very, very important to me. So my goal is to bring the best treatments to patients as soon as possible. And I think that...
the process is lengthy, right? I've worked in a couple, I I look back at to some of the drugs that I've worked on and now we take the program and they're available, but the process was so long and the biggest, you know, once you've identified the drug and I don't want to focus on that because that's scientist thing, running, designing the trial, can, right? That's you get your feedback, you design your trial, but you're also still designing the trial for masses. And as we're learning more and more about disease, especially in right now,
but realizing we're not treating one disease, we're treating like an umbrella of diseases. So if we can streamline that, we've had some unsuccessful trials and now we look back and say, maybe it was too much of a general population that was recruited. One of the reasons maybe, right? And then I wanna come to the fact that we have so much development and innovation happening in our field, which is remarkable, but what does that mean? It means a lot of trials.
And there are only so many sites and there are only so many clinical research sites. So we have overwhelmed some of our sites with so many trials running. mean, even like in Neovascular AMD right now, we have so many competing studies that even how do you choose patients and how do you monitor and assign the right amount of staff to each study? So that workload is increasing. And we've got very enthusiastic investigators and sites. They wanna be part of every study.
Namrata Saroj (09:09.024)
One thing the investigators always don't pay attention to is how much work goes on behind the scene. I remember when I was back in the hospital, I mentioned I didn't have that many studies. And so my investigators would keep taking studies on and decided, I don't want to do it anymore. And I'm like, you have no idea how much my team has to work to actually even get the study up and running. So anything that we can help the study staff is very critical at this point, only because of the fact that we have so much volume of studies that I have.
Ram Yalamanchili (09:39.206)
That makes sense, yeah. And I've seen this as well, right? I think it's not just ophthalmology. I think there's many other therapeutic areas which probably fit right into this description which you just gave. What I'm also fascinated about right now, we can obviously talk about what some of the other pillars, the layer parts of the operational AI parts, which are interesting, but just taking a look at what's happening in the AI domain right now, there's...
Namrata Saroj (09:48.951)
right.
Ram Yalamanchili (10:08.324)
LLMs which are being trained on drug discovery on predicting protein folding. And I think we've made some pretty impressive breakthroughs there, including the recent Nobel Prize going to Demis and his team. And then of course they've launched their own company now. I think I read this fascinating article and I think his quote was literally like, we're gonna solve all known diseases in 10 years or something like that.
And I would love for them to do that, but I also think there is an enormous amounts of bandwidth problem they're gonna face exactly like what you're talking about, is who is the, unless you eliminate human trials altogether, maybe that day can come at some point, but at least given the current regime and process, I sort of see the bandwidth massively increasing in certain pillars as you've categorized it. And in some areas where,
Namrata Saroj (10:43.052)
bit.
Ram Yalamanchili (11:05.02)
we're desperately in need of better solutions to improve the bandwidth there and productivity there, right? Which again comes back to the number of sites, number of people at the sites and everything else which goes on at the site and the CRO and the sponsor. So the clinical development aspect of it. So is that sort of how you would see it as well? Like in terms of where we're probably going?
Namrata Saroj (11:24.16)
I do, you know, there's aspiration and then there's realism, right? And I think for a long time, we've been hearing AI. I I remember a few years ago at Arvo, which is one of our big research meetings, as you know, there was like one session on AI. And now we both have just come back from Arvo. mean, it was everywhere, right? There was no topic that was getting discussed without AI. that's the progress and it's really good progress.
But you're right, there's a lot of aspirations, but we have to be realistic about what's going to be achievable, right? Predictive biomarkers are going to be amazing, but I think we still have a lot of work to do. It's not, we're not there yet. think where, you know, if you look at our day-to-day work, and I'm not talking about the clinical work that I'm engaged with or you're engaged with, it's more about even our personal lives, right?
assisting us in so many ways. So I think this concept of assisting where execution is needed, it's a no brainer for me, right? Like why wouldn't I want that help? you I have nephews who they always say, Auntie, we start writing everything with chat GPT, right? Like that's the norm now. So where is our chat GPT?
Ram Yalamanchili (12:43.993)
Yeah, I agree and more, right? I I also am a big believer that we're gonna be seeing more agentic style use cases developing from now on. I think for the past couple of years, we've certainly done a lot of the traditional like chat or conversational AI chatbots, that sort of thing. But the technology is maturing to a point where I think you can certainly start to think about
having an assistant or an agent which is able to, know, very capable assistant, right, was able to take on some of this work and perform it on your behalf and come back. And yeah, we certainly need that. think clinical research can definitely use that. Multiple different roles in the research industry could use that. And you know, the roles are across multiple organizations. We have sponsors, we have CROs, we have sites, and they all have different roles. And I would say a...
a need exists for all of these different roles to have some form of a IT Maryland assistant.
Namrata Saroj (13:46.934)
Yeah, I think the only comment I'll make is I don't think it's going to be replacing anything. It's going to be assisting. And I think that's the word we want to emphasize. It's an assistant, right? And as you guys put it, Atilda, it's a teammate. And I think that's important for people to understand that, you know, we still need the workforce the way we do. We just need to become more efficient because of the needs that we have.
Ram Yalamanchili (13:56.367)
Yeah.
Ram Yalamanchili (14:10.522)
Absolutely, I I think there's several camps on this. I've seen people say, know, AI is going to replace, you know, this and that. But I've been pretty convinced from, you know, several years now that we're gonna see sort of like an abundance of opportunity driven by AI. I think the ultimate, you know, the world where we would...
probably hopefully live in soon will be a place where we have many problems in front of us and we have the resources to solve those many problems or at least take shots of these many problems. And I think anyone who's worked in the drug discovery space know that there are many challenges which we have not even touched upon, right? There's so many areas we could probably ask the questions and hopefully if we had the resources and the ability, we should go and pursue those. And that's not possible in the current regime. Like we've been at the current
style of operations for the past decades, many decades now. And we've only achieved so much of productivity or bandwidth as we call it, number of patients recruited. Like we've got the same exact issues and problems with for the last many years now.
Namrata Saroj (15:18.37)
Yeah.
So it's interesting you say that is I used to say this a lot and I still actually say is, you we've made such great advancements in the actual medicine, but how we bring those medicines to market are we're so antiquated in how we work still. Right. I mean, I worked when I started my career, we were doing everything was paperwork, right. Actual paperwork. And fortunately, we've transitioned to electronic.
but just the way that things are managed. And a lot of it, still has to do with the regulatory sort of guidelines. you know, nobody wants to take the risk and do something different because there's so much at stake when you're developing a drug. So God forbid something goes wrong. You don't know if it's a drug or the way you did it. And so that's why there's this sort of apprehension to make these bold changes in the drug development process. But at the same time, I think we are doing ourselves a disservice by not changing the process.
or not having the process evolve the way it should be.
Ram Yalamanchili (16:22.328)
Yeah, and I think changing process is probably one of the biggest aspects of how all of this is going to get adopted, right, in general. Like, what is your strategy there? What I found fascinating is things like, you know, put aside the predictive AI parts of it. I think there's a certain amount of regulatory hurdle you would have to cross before you can actually adopt those type of use cases into your automation or AI.
Namrata Saroj (16:30.658)
Right.
Ram Yalamanchili (16:51.138)
I mean, an example of that would be looking at certain imaging and predicting eligibility. I think there's some excellent science in that aspect, but the hurdle is also pretty high, right? There are companies focused on that, running their own clinical trials on the data and proving out whether their models can actually predict and whether that can be used in a clinical setting, things like that. But I also see another world where
there are ample opportunities in the workflow where you do not need to cross such a regulatory hurdle. This could be something as simple as: did we dot all the i's and cross all the t's on the documentation? Did I follow the checklist I'm supposed to follow as per the protocol? And is there some kind of AI agent or teammate which can help me do that?
By and large, I think what we've noticed is that those types of problems are where the majority of the focus sits in terms of actual day-to-day execution, and it eats up a lot of time. There are multiple layers of checks currently in place because none of this is automated, so it's all manual. You have people checking other people's work and so on, right? There are layers of checking, and I know the industry calls it monitoring. But overall, I think there are interesting opportunities where
you take a more holistic view and say, I'm not going to change much of my process. It's just the way I come into this execution model and use technology to do exactly what I would have my staff of five or 10 or 20 people do, but have them use AI to be more productive, more consistent, more reliable in terms of the quality they're able to deliver. And I'm
seeing that already happen. From the perspective of sites, sponsors, and CROs, there's just a tremendous amount of opportunity. The entire industry seems to be very much in a manual mode right now, and we're watching that transition slowly into a hybrid model. So the question in my mind is: what's going to be that catalyst? I think at some point, every industry needs a catalyst to adopt
Ram Yalamanchili (19:18.229)
any type of technology. I know you mentioned paper, and paper in certain areas pretty much doesn't exist anymore, especially data capture, which is now electronic. But I'm curious, do you see any other trends coming that will drive the need for productivity or efficiency?
Namrata Saroj (19:35.014)
Yeah, I think the number one catalyst is competition, right? Every company right now, whatever they're doing, is trying to get to the next milestone in an expedited way. So whatever you can do to get to the next milestone, you're going to do, and if increasing efficiency can get you there, they'll do it. From a sponsor's perspective, that's essential.
There's also differentiation, right? The CROs and the sites: what are they doing differently from everybody else? I'm on the CRO side as well, and when I'm looking to pitch to companies, I'm looking at: why would you want to choose me? It's because I can give you something that maybe some other company doesn't have. And I think this is where
the adoption is so critical, because if you don't adopt it, you are going to be left behind. I think that's really the simplicity of it.
Ram Yalamanchili (20:42.838)
So in some ways it's the competition which will drive that adoption among your peers.
Namrata Saroj (20:46.508)
Absolutely. Yeah, because how are you going to do something better? I mean, take the tech world: one technology only succeeds over another because somebody offered something the other person didn't, right? So that's going to be the key: how do I differentiate? And as I said, what we're doing in clinical trials has expanded so much. The volume is so much
Ram Yalamanchili (21:03.765)
Mm-hmm. Mm-hmm.
Namrata Saroj (21:15.874)
that you have to stand out from every perspective, right? As a sponsor, you want to be bringing a drug that is helpful to patients and safe. From a CRO perspective, you have to be able to execute really well. And from a site perspective, you have to show up and say, I can deliver: my patients are going to be well taken care of because I can do this trial efficiently. And again, the one thing I also want to touch on is that I did do some monitoring way back in my day.
It was one of the most tedious jobs you can ever do. Hats off to every monitor. We do some remote monitoring now, so things have improved with the eCRFs, but to your point, double-checking something just to make sure there's no error probably still leaves errors on the table.
Ram Yalamanchili (22:00.328)
And I have a case study there. I've frequently seen certain worksheets, like your visual acuity scores, for example, where you're expecting the person doing the test to also do calculations, and then based on that calculation make a decision: is this particular patient going to be put into a specific arm, or is there an additional dose coming up in the treatment for the patient, right? And I think it's
Namrata Saroj (22:19.79)
Right.
Ram Yalamanchili (22:29.407)
pretty fascinating that those types of worksheets are not automated, because you would think they could be. I mean, you don't need AI for that; it's much simpler than that. But from what I've seen, it's a pretty interesting scenario where something like that could meaningfully change the direction of the trial itself, the data being captured and whether you're doing it right or not, and yet it's not being done this way.
I'm curious: why? Why is this not happening already?
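(A quick aside on what that worksheet check could look like in practice: the arithmetic is trivial to automate. Below is a minimal sketch in Python; the field names, the ETDRS-style letter arithmetic, and the 15-letter retreatment threshold are illustrative assumptions for the example, not details taken from this conversation or from any particular protocol.)

```python
from dataclasses import dataclass

# Illustrative sketch only: field names, scoring rule, and the 15-letter
# retreatment threshold are assumptions for the example, not protocol values.

@dataclass
class AcuityWorksheet:
    subject_id: str
    baseline_letters: int        # ETDRS letters recorded at baseline
    letters_read_today: int      # letters the examiner counted this visit
    recorded_change: int         # change-from-baseline the examiner wrote down
    retreatment_marked: bool     # whether the examiner ticked "retreat"

RETREATMENT_LOSS_THRESHOLD = 15  # assumed rule: retreat if >= 15 letters lost

def check_worksheet(ws: AcuityWorksheet) -> list[str]:
    """Recompute the derived fields and flag any mismatch with what was written."""
    issues = []
    expected_change = ws.letters_read_today - ws.baseline_letters
    if expected_change != ws.recorded_change:
        issues.append(
            f"{ws.subject_id}: recorded change {ws.recorded_change}, "
            f"recomputed {expected_change}"
        )
    should_retreat = expected_change <= -RETREATMENT_LOSS_THRESHOLD
    if should_retreat != ws.retreatment_marked:
        issues.append(
            f"{ws.subject_id}: retreatment box marked {ws.retreatment_marked}, "
            f"but recomputed decision is {should_retreat}"
        )
    return issues

# Example: an arithmetic slip that would silently change the dosing decision.
print(check_worksheet(AcuityWorksheet("001-014", 68, 52, -6, False)))
```

The logic itself is pure arithmetic; the hard part in practice is getting the handwritten values off the worksheet reliably, which is where the document-reading capability discussed a little later comes in.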
Namrata Saroj (23:03.582)
You're asking the question I asked. Remember I said medicines have advanced but our process hasn't, right? Because it's about validations and guidelines. I wish I could give you an answer, but that's the unmet need. Until recently, we were still doing source documents. Actually, a lot of people still do source documents on paper, because you need to see where the pen hit the paper.
Ram Yalamanchili (23:05.844)
Ha!
Namrata Saroj (23:28.782)
There are EMRs everywhere now, and with a lot of them there are no paper source documents, so e-source came into being, which is really helpful. But e-source hasn't been around all that long, and people are still adapting to it. So I think the process has been slow to evolve because, as I mentioned, change creates fear. We don't want to do this because it's always been done that way.
Ram Yalamanchili (23:52.638)
Yeah.
Namrata Saroj (23:55.374)
But I think the word I would use now is, instead of looking at it as change, let's look at it as evolution. This is the next evolution.
Ram Yalamanchili (24:03.635)
Yeah. I've also found that a lot of the systems today are not easy enough to configure or set up to suit this particular type of use case. So you could try to do it, you might have the intention to, but then you're becoming a software company yourself. You know what I mean? You would have to budget a certain amount of your clinical trial budget to go do this, and that's not something you probably have the right resourcing or talent
Namrata Saroj (24:13.045)
okay.
Ram Yalamanchili (24:33.928)
allocation within the company to go pursue. So that's another interesting aspect of what I've seen.
Namrata Saroj (24:39.594)
I'll add to that. Most sponsors, you know, outsource to CROs to run the studies, right? So sponsors are not the ones investing in this; they do what they're told to do. It's the CROs that have to invest. And until recently, I don't think the CRO was this big a business; it was a vendor that runs your trials. But we've seen over the last five to ten years that CRO activity has gone up tremendously. They've been acquired by big companies, and, you know, when there's
acquisition by big companies, as you know, it's good in some ways and not so good in others. The good is that they can invest in progress, right? And what we're seeing, I mean, I'm hearing this from the big CROs right now, is that they're looking into AI technology; some of them are developing things on their own. So I think it's that movement toward creating these bigger organizations that will invest. To your point, somebody has to do it, right?
A small CRO or a small sponsor, which are very budget constrained, are not going to invest. They're going to say, hey, this works, why change it? We don't need to change it.
Ram Yalamanchili (25:44.241)
Yeah, yeah. I think being able to present a solution which is easy to adopt, and which isn't so prohibitive in cost that it's hard to adopt or even sell as a value, that's where the gap is, and certainly where we're trying to play a role. But I do see quite a few of these use cases. Even in that situation I was telling you about, we had a study
where we had some access to retrospective data, looking at what were essentially scanned documents in their repository. I mean, the error rate we were able to showcase immediately, saying, hey, did you realize that this set of worksheets had these calculation errors? And it's critical from a study execution perspective to know that, right?
It's one of those things where you have that aha moment. You're like, wow, I wish I had this on a real-time basis, every time a site did it, whether it was on paper or not. Nowadays, LLMs have visual capability, almost a visual cortex. They can see and read just like we would, and you can build workflows on top of that and do the sort of quality check you and I would do if we were reading through this paperwork.
And this is what's really interesting to me. I think there are some pretty clear benefits which are not in the regulatory realm of prediction or data generation. It's more: I just have to do this, there's so much of it to do, and there's a high error rate because it's a tedious amount of work. I would rather have an assistant there, and an AI could do this day and night.
That's just one example, but I've seen several such examples across the board.
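(As an aside, the retrospective review described here can be sketched as a two-step workflow: a vision-capable model reads the scanned worksheet into structured fields, and a deterministic check recomputes the derived values. In the sketch below, `call_vision_model` is a hypothetical stub standing in for whichever model endpoint is actually used, and the worksheet fields mirror the acuity example earlier; all of it is illustrative, not a description of any particular product.)

```python
import json

# Sketch of a scanned-worksheet review workflow under stated assumptions:
# `call_vision_model` is a hypothetical placeholder, not a real library call.

EXTRACTION_PROMPT = (
    "Read this scanned visual acuity worksheet and return JSON with keys: "
    "subject_id, baseline_letters, letters_read_today, recorded_change. "
    "Use null for anything you cannot read."
)

def call_vision_model(image_bytes: bytes, prompt: str) -> str:
    """Stub: in practice this would send the page image to a vision-capable LLM."""
    # Canned response so the sketch runs end to end.
    return ('{"subject_id": "001-014", "baseline_letters": 68, '
            '"letters_read_today": 52, "recorded_change": -6}')

def review_scanned_worksheet(image_bytes: bytes) -> list[str]:
    """Have the model read the scan, then re-check the arithmetic deterministically."""
    fields = json.loads(call_vision_model(image_bytes, EXTRACTION_PROMPT))
    if any(value is None for value in fields.values()):
        return [f"{fields.get('subject_id', '?')}: illegible fields, route to a human reviewer"]
    expected = fields["letters_read_today"] - fields["baseline_letters"]
    if expected != fields["recorded_change"]:
        return [f"{fields['subject_id']}: change recomputes to {expected}, "
                f"worksheet says {fields['recorded_change']}"]
    return []

print(review_scanned_worksheet(b""))  # flags the -16 vs -6 discrepancy
```

The design point is that the model only does the reading; the pass-or-fail decision stays in plain, auditable arithmetic, so there is no predictive claim that would need separate validation.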
Namrata Saroj (27:36.526)
I think the word tedious is really important here, right? Because we're talking about the volume of trials going up, but within a trial, the volume of data we're collecting is going up as well. Right now we have many more imaging devices, there are functional tests and PROs, right? So the amount of data we're collecting goes up, and the tedious aspect of it goes up as well.
And to your point, it's human nature. We're not perfect, so there are going to be some errors.
Ram Yalamanchili (28:13.863)
Yeah. Toward the tail end of some of these questions: you used the word competition. Is that largely about time, is it budget, is it a mix of everything? How do you see that? And what about talent, you know, hiring and retaining the right talent as well? I'm curious whether there's more to discuss there in terms of
how you define it as competition, or being competitive rather.
Namrata Saroj (28:45.122)
I think it's everything you mentioned, right? It starts with talent, because you need the right people running the studies in all three. So let's take sponsor, CRO, and sites, right? You need talented individuals to design the right studies, design the right products, and be able to execute the right way. Same thing for the CROs: you need to
have the right people to execute on the ground. And of course at the site level for sure, where it's a combination. You've got the investigators and you've got the study staff, and study staff have become a huge department, because it takes so much to run the studies and coordinate all the different things the sponsors are asking for. So the competition comes in differently for these three different elements, right? For the sponsor, the competition is, hey, what am I,
what are we providing to the sites? Is the drug something that's good for my patients? But we also want to be compensating the sites the right amount, right? There's a lot of hard work that goes in. Over the course of the years, I think it's more because the studies have become so
intensive and time consuming that you have to be compensating at a really high level. So that's competition as well. And you know, I'll tell you, most of my sites sometimes want to pick a study not because of the sponsor, but based on the CRO. They have very strong opinions about the CROs they want to work with, because that's the interaction on almost a daily level, and if they don't like the people they're working with, they're not going to do it. So if a CRO has a monitor who is
not well trained, or doesn't understand the details of the trial, or doesn't have experience in the therapeutic area, and goes to a site where the coordinators have tremendous therapeutic experience, it's not going to be a good match. Again, there are so many elements of the trial, and you mentioned this earlier in the podcast, we haven't even uncovered everything, because we don't yet know that something is an issue. You can only uncover it when you actually start
Namrata Saroj (31:02.882)
delving into fixing things and you go, okay, maybe that other thing needs to be fixed as well.
Ram Yalamanchili (31:07.12)
I'm curious, why don't we have more sites participating in this ecosystem of research? Because I do hear that sites are a bottleneck, and there are only so many sites and investigators out there. But economically it's viable, at least it seems like it, and you really are pursuing something that's a very interesting
sort of workflow, right? You're working on cutting-edge drugs or devices, and there's tremendous benefit you can bring to the patient population in general. So I'd be curious to hear your thoughts, given your experience in the space.
Namrata Saroj (31:54.168)
So the impression is that it's not economically viable, right? And that's actually how it starts out, so that's probably the number one reason. The number of clinical sites we have now is significantly higher than when I started over 20 years ago. There is an interest in research, but it really depends on the type of practice.
We're seeing more and more fellows and residents get exposed to the clinical trial world during their training, so when they leave, they want to start it. Back in the day, that exposure was not there, so they didn't know any better; you'd join a practice and do your routine clinical practice. But now, because they're getting that exposure in training, it's different.
I have a lot of young investigators who've gone into established practices with no research department and started their own department, right? But it's still a challenge. I have one site with really great recruitment, and I'm offering them the chance to expand and do more, and I was reminded that the rest of the group doesn't really believe in research, so they're not going to invest in it. So that's the thing: you have to invest in it
before you can profit from it, like anything else, right? So there is still a struggle. You would think that with all the retina practices out there we should have so many sites that it would never be an issue, but it still is. It takes a lot of convincing for some of the younger investigators to get their senior partners bought in, and then they have to deliver, and then they have to show that
they're not losing money, they're actually making money. And that doesn't always happen with the first research project you do. So hopefully more and more people will see the bigger picture of research: you can offer cutting-edge treatment to your patients, and for your academic and career growth it's really great as well. So hopefully more and more people will see that and more and more
Namrata Saroj (34:14.808)
practices will get established. In fact, I will tell you, most of the great recruiters that we have in our studies right now are not your seasoned researchers. It's the young ones because they're motivated to do it.
Ram Yalamanchili (34:25.935)
Yeah, yeah. No, I think that's an interesting take. And I can see the activation energy required, especially if you're a new site just getting started: you have to invest, and there's a certain amount of uncertainty in that process in terms of what you're getting into. It's a commitment. But again, I think this is the kind of situation where automation is always
Namrata Saroj (34:44.205)
Yeah.
Ram Yalamanchili (34:55.945)
so much more beneficial than not using any automation and relying completely on a manual process, right? So we've definitely seen sites, in our own experience as well, getting a lot from bringing some of these cutting-edge technologies into their practice.
Namrata Saroj (35:13.396)
Yeah. I think one other aspect is that when practices start research, or even at some research centers, they share study staff between the research department and the clinic, right? And I always say that's not the best move, because the work we do in clinic is not exactly the same as the work we do in the study arena. And especially for those folks who might have resource limitations,
any automation support helps, because then they can balance both sides. It is very demanding to balance both. If you spoke to study coordinators who are doing both, it is very difficult to keep switching: okay, I'm doing the study, which is more documentation; now I'm going to clinic and I need to get the patients through. So any automation help there would definitely be appreciated.
Ram Yalamanchili (36:04.686)
Yeah, it's interesting you put it that way, Namrata. I've seen two versions of this, right? One is the early sites, where they really are looking for any sort of help that can offload work from their current staff. So that's one phenotype. And then there are these other sites which are very high volume, high recruiting, successful practices that have built great research programs.
For them, it's more about: how can I do more with the staff I trust and have worked with for many, many years? They're not too keen on doubling or tripling their staff, because they've done it before and there's always some problem or other, and that's not the path they want to go down, right? So either way, I can see benefits from bringing in some sort of automation or AI technology. That's kind of where we are and what we're seeing.
Namrata Saroj (36:37.09)
Thank you.
Namrata Saroj (37:03.662)
Absolutely. And this is the beauty of technology and AI: you can customize it to what you need, right? It's not one formula fits all. And I think this is why, in the discussions I have with folks right now, when people say AI, it's not one thing. There are so many aspects to it. So what is it that you need, and do you have an AI solution for it? That's very different from saying,
I have AI. That means nothing.
Ram Yalamanchili (37:37.55)
Yeah, I agree. And another realization we've had is that you have to go after it from a persona perspective. Which persona are you looking to help? And can we solve it or not, right? So we go that way.
Namrata Saroj (37:55.918)
Yeah, right. In drug development, the biggest thing we go for is unmet need, right? What is that? And I'll give you an example from tech. It's not AI related, but several years ago I was helping a company that, um, is on the other side of clinical trials: they do prior authorizations for drugs. I sit on the clinical side, so I didn't know that it's still being done with faxes and telephone calls. Somebody came up with an electronic system, and when we were trying to implement it,
we got resistance from some sites: my God, is my prior auth person going to get laid off or whatever? And you're like, no, it's helping them; they're going to be more efficient in the prior auths they do. That's the conversation we're having again now, in this respect, with AI as well: hey, we're not talking about replacement. This is about how we can make everybody more efficient.
Ram Yalamanchili (38:36.877)
Mm-hmm.
Ram Yalamanchili (38:51.947)
Yeah, yeah. No, absolutely. And like you said, I like the way you put it: the competition will eventually drive this toward adoption. The benefits are very clear. It's not just about having a shiny tool; the productivity gain and the economic and quality benefits you're getting are pretty tremendous with what we're seeing right now. So I look at it as a matter of when rather than a matter of if.
That's sort of the thing, right?
Namrata Saroj (39:22.926)
Yeah, productivity and quality are going to be essential. When we do site selection, we're looking at how they recruit, and of course that's a different AI that's going to work on that. But the quality of the data really matters. When the monitors go in and tell us a site has so many outstanding queries, that site is getting red-flagged, and the next time around people will think twice: okay, maybe they recruited well, but the data was a disaster, so we can't risk that anymore.
Right? Over the course of running clinical trials, I've had data from sites we actually had to exclude because it was not good. And think about how much of an investment cost that is. It can really affect the outcomes of a trial, and that affects the entire drug development process. So it's really critical that sites provide good data, and if we can help them do that, their visibility goes up. And guess what? They're going to be
Ram Yalamanchili (40:14.124)
Hmm.
Namrata Saroj (40:20.342)
on top of the list for the next recruitment.
Ram Yalamanchili (40:23.212)
Yeah, yeah. Fascinating. Well, thanks for your time. I think this has been a pretty interesting call, and I hope to talk to you soon. Thanks again for your time.
Namrata Saroj (40:36.578)
Thank you, and great work on Tilda. Going in as an AI teammate to the sites, to the sponsors, to the CROs, I think it's going to be really helpful. And as far as I know, you guys are the only ones doing that right now. So hopefully more sites will adopt it, and we'll get some companies to adopt it too.