Behind the Scenes Episode 352: Google Cloud Medical Imaging Suite

Welcome to Episode 352, part of the continuing series called “Behind the Scenes of the NetApp Tech ONTAP Podcast.”


This week on the podcast, Jason Klotzer (jklotzer@google.com) of Google and Kim Garriott (kim.garriott@netapp.com) of NetApp join me to discuss the new Google Cloud Medical Imaging Suite offering!

For more information:

Tech ONTAP Community

We also now have a presence on the NetApp Communities page. You can subscribe there to get emails when we have new episodes.

Tech ONTAP Podcast Community


Finding the Podcast

You can find this week’s episode here:

I’ve also resurrected the YouTube playlist. You can find this week’s episode here:

You can also find the Tech ONTAP Podcast on:

I also recently got asked how to leverage RSS for the podcast. You can do that here:

http://feeds.soundcloud.com/users/soundcloud:users:164421460/sounds.rss

Transcription

The following transcript was generated using Descript’s speech to text service and then further edited. As it is AI generated, YMMV.

Episode 352: Google Cloud Medical Imaging Suite


Justin Parisi: This week on the Tech ONTAP podcast, Kim Garriott and Jason Klotzer join us to tell us all about the new Google Cloud Medical Imaging Suite.

Podcast intro/outro: [Intro]

Justin Parisi: Hello and welcome to the Tech ONTAP podcast. My name is Justin Parisi. I’m here in the basement of my house and with me today on the phone we have a couple of special guests to talk to us all about the Medical Imaging Suite with Google Cloud. So to do that we have Kim Garriott. So Kim, what do you do here at NetApp? And how do we reach you?

Kim Garriott: Hi Justin. I am the Chief Innovation Officer for Healthcare, as well as our general manager for medical imaging at NetApp. And you can reach me at kim.garriott@netapp.com.

Justin Parisi: All right, and also with us, Jason Klotzer here. So Jason, what do you do at Google and how do we reach you?

Jason Klotzer: I’m a customer engineer at Google Cloud within the healthcare life sciences vertical. I work on imaging solutions and strategy within that vertical. So medical imaging related workflows. And that’s my background. I can be reached at jklotzer@google.com or via LinkedIn.

Justin Parisi: Does a customer engineer actually engineer the customers?

Like do you create them from a lab?

Jason Klotzer: That would be awesome. You know, sometimes good, sometimes bad, but that would be really cool. But I think there’s a lot of engineering that goes into just figuring out what customers need in most cases. So I think that’s where a lot of it comes out.

Justin Parisi: Yeah. And that’s partly the goal of this podcast: to try to help customers understand the value of what we’re talking about.

And today we are talking about the Google Medical Imaging Suite. So what I like to do is start off with just a high level overview of medical imaging. So, Kim you’re good at doing this. So the medical imaging overview, tell us all about what medical imaging entails.

Kim Garriott: So medical imaging, and we commonly refer to it as enterprise imaging or enterprise medical imaging in a healthcare organization today, really is any type of visual or multimedia content that ranges from traditional radiology or cardiology exams that we’ve all had, to ophthalmology studies or women’s health studies, even digital photos that your dermatologist may take of a mole or a lesion on your skin. It’s really all of this different type of multimedia content that we see across a healthcare enterprise.

And we manage this data all in a digital format, whether that is in a radiology PACS or vendor-neutral archive, or any of the ancillary systems that may do the same type of image management. And as we talk about the value of data and how artificial or augmented intelligence, as we like to say in healthcare, can really help us get the right kind of care and timely care to patients.

Looking at this data and being able to leverage the value of this data in meaningful ways to improve clinical outcomes is something that we’re all keenly focused on today.

Justin Parisi: So I know that Google Cloud offers a variety of ways to consume compute and storage. So Jason, can you give me an idea of how medical imaging customers use that and what Google’s overall medical imaging history is?

Jason Klotzer: I don’t think I can cover all that in an hour, but I can definitely give you an overview. So Google, much like other hyperscalers focused on bringing infrastructure at scale to customers, has a myriad of different offerings when it comes to compute and storage, and then as you go up to PaaS and SaaS, your databases and those kinds of overlays. But in the medical imaging domain, beyond your classic infrastructure components, those can be used by what we refer to as ISVs, your software vendors out there who create things like the components Kim was referring to: your PACS, your VNAs, or other clinical systems.

Those components, of course, GCP, we’re great at offering those components and have a lot of different offerings for them. But when you kind of take it up a level to different offerings that you would want to maybe use a composite of those different offerings we start moving into the solution space where we have very specific things for industry.

In this case, the use case we’re talking about is healthcare, and even more specifically medical imaging. So that’s actually the main reason why we introduced what we call the Medical Imaging Suite: so that we can take a composite of different offerings from the domain, partner with those who are very close to care and those who are intimately involved in the domain as well, including NetApp, of course, and kind of bring a best-of-breed offering to those in the space.

Now, at a high level, Medical Imaging Suite is focused on secondary use of the imaging data, leveraging services that are available within NetApp as well as services that are already available in Google Cloud, and doing it in a very standardized way. I’m gonna pause just for a sec ’cause I know I spoke a lot about all of that stuff.

But that’s really where we’re going in Google Cloud, and we have been in the domain actually for quite a number of years now.

Justin Parisi: So you, you mentioned secondary use, and I would imagine that involves maybe AI ops processing of the data. Is it something else? Am I understanding that right?

And if so, what is a primary use? Break that down for me.

Jason Klotzer: Sure. Yeah. So primary use of medical imaging data is typically for diagnosis. So a patient comes into a hospital or an imaging center and they’re scanned. That could be a CT, MR, ultrasound, you know, you name it. There are many different modalities in which a patient could be scanned. Kim was mentioning some other ones, dermatology as well, you know, ophthalmology. But effectively, when that imaging data is acquired, somebody needs to read it and determine what’s going on internally within that patient. That’s the diagnosis. That is the primary use for that imaging data.

Now this is done a lot. You could imagine that these imaging organizations could be a large hospital chain, you know, or an outpatient imaging center. Across just the US, there are millions and millions of scans done every single year, and that breaks down into billions or even trillions of images.

They’re all looking for ways to optimize that workflow. So it could be assisting the radiologist or the cardiologist or the ophthalmologist, et cetera, doing sort of menial tasks in a more optimal, more efficient way. In many cases, they can derive insights from the imaging data that they have, the clinical data that they have, the purpose for that procedure. Even just deriving insights on how many of those are done on a yearly basis, or what kind of hardware is used for these types of things. These types of insights are absolutely available, so that kind of yields the secondary use of that data.

So secondary use, broadly, would be performing research: using that data to build AI models, deep learning or otherwise, to be able to gain insights from that data. Even just very basic retrospective analysis of that data would be a secondary use of that information. And of course, secondary use goes much more broadly than that.

So there are organizations within the provider space who also work with pharmaceutical companies for the purpose of drug discovery, and those who are working with MedTech providers who are creating the next scanner or the next piece of software. So there are many different ways in which this data could be leveraged to innovate and to kind of, you know, accelerate the domain and eventually accelerate and improve patient care.

Kim Garriott: I’m happy that you touched on the pharmaceutical industry and MedTech, so the larger life sciences industry. They’ve always had some use for medical images, but in this day, as we’re moving forward and using image analysis, we’re seeing imaging used even more broadly through MedTech and pharmaceutical research.

Now we start to think about medical imaging not only being something that occurs within the healthcare provider or that delivery space, a hospital or an outpatient center, something like that, that we as humans interact with as patients. But now medical imaging is really much more in the forefront of life sciences in general.

So I was really happy to hear you add that on there.

Justin Parisi: So when you say life sciences, does that include things like genomics?

Kim Garriott: Yeah, absolutely.

Jason Klotzer: Yeah, so, within the space, I’d say a lot of the research that at least I’m seeing being done is multi-omics driven, meaning you have genomics, possibly proteomics, you know, your actual imaging data, your core clinical data from a lab standpoint. Even just basic things like notes that are associated with the patient history. All this kind of information aggregated together in a way where you can kind of have a holistic view of the patient’s background as it relates to the clinical setting. That tends to be where a lot of the organizations are going because it gives them the best view of what happened.

So yeah, absolutely. Genomics included.

Justin Parisi: You specifically called out secondary use cases for the Medical Imaging Suite, and I’m wondering if there’s a reason why you would target those as opposed to primary or in addition to primary. What are the benefits of having secondary in Google Cloud versus having primary in Google Cloud and you know, what would be the drawbacks of each?

Jason Klotzer: Primary use is something that has a tremendous amount of complicating factors, not only from the functionality standpoint, but also from the regulatory standpoint and from a quality-of-care standpoint. There are hundreds of commercial providers for things like diagnostic radiology.

And frankly, they do a really great job. I mean, they’ve been in the domain for a number of years. I myself worked for GE Healthcare for many years. And they do really great when it comes to primary care software. Google Cloud, I’d say, does a really great job at data at scale: being able to do analysis on massive amounts of information and give you [inaudible] in that information, having patterns architecturally that allow us to perform those operations. So I’d say the main driver in doing secondary use is that we can accelerate that domain using the same architectural primitives that we use all the time.

And we can accelerate that domain in a meaningful way. I think that’s why we primarily are starting with the secondary use. I don’t want to even comment on the diagnostic use scenario because it’s a whole different business category that kind of offers those products to their customers directly, and that’s just not the business that we’re in.

Kim Garriott: Medical imaging as an industry segment in health IT has always been treated as its own rather special entity within healthcare. Oftentimes, you know, shadow IT will run these imaging environments, and we’re starting to see that go away more and more with centralized IT and with clinicians becoming a lot more comfortable with service level agreements, things like that.

But you’ve even seen some evidence of staying away from the diagnostic imaging applications even in the large EMR vendors, right? They tend to have modules to cover every type of clinical specialty and workflow out there, except largely imaging workflows. Now, they certainly can point to images and integrate that imaging content into the patient’s clinical record and show great continuity-of-care data within a single pane of glass.

But imaging historically has been this very complex thing as it’s used in primary diagnosis. I think that there’s that aspect of it, right? So let those who do what they do best, do what they do best. And certainly there are new players coming into the market all the time. But it’s always interesting to me when someone asks me, so who’s the industry leader in PACS or in VNA?

And when you look at the distribution of market share, it’s all over the place. The market share leader only has 17% of the market, and from there it drops off pretty sharply. So lots of players in this area that have dedicated many years at doing primary diagnosis. Well, I also think that the opportunity for the Google Cloud Medical Imaging Suite is really aligned with where health IT and imaging IT is today. Pre-pandemic, we were not even really talking about cloud in healthcare, but especially when it came to medical imaging because of some of the factors that Jason started to allude to. But post-pandemic, there is a rush to be able to leverage cloud, especially for secondary use cases, and this is important because it helps organizations develop that level of comfort with the cloud and familiarity with the cloud in these non-critical diagnostic workflows. So even a secondary use case could be as simple as backup or a disaster recovery copy, but certainly the power lies in harnessing the data and the value of the data further into feeding the development of AI models or advanced analytics really again, to advance patient care.

So I really think it’s a timely introduction of this environment, and directly in alignment with how we see health IT and imaging IT moving toward the cloud.

Justin Parisi: So what it sounds like is maybe a primary use case wouldn’t require as much compute as something like a secondary use case might. So, Google Cloud is good at scaling, and if you don’t need that scale, then maybe it’s kind of overkill and maybe you’ve already got stuff that already works for you.

And then, as Jason mentioned, the other aspect of that is the data being anonymized, right? So you can’t really have patient data out there floating around without being anonymized. So I guess that the software that’s available out there today does that anonymization so you can turn it into research data.

Is that all accurate, or am I missing something there?

Jason Klotzer: There are a lot of details within that space. There are clinical providers now that absolutely store PHI in their respective cloud providers. So all the compliance efforts that have gone on within the cloud providers over a number of years have been toward HITRUST, HIPAA, FedRAMP, whatever the kind of compliance requirements are within the healthcare space. Those have been kind of box-checked over a number of years so that clinical providers can be comfortable that PHI can be used within cloud providers, and we do see that in a number of providers today. Now, for those who are strictly research based, it could be startups, could be your pharma, your MedTech companies, where quite frankly, you know, the PHI isn’t something that is relevant to them in their research objectives.

We do have services that can perform the de-identification at that scale for DICOM data or FHIR data in GCP, and we do see those leveraged quite heavily. But yeah, just in summary, customers can use PHI within the cloud provider, or they can use de-identified information there. And we have services that support both.
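To make the de-identification idea concrete, here is a minimal, purely illustrative Python sketch that redacts a handful of common DICOM PHI tags from study metadata. The tag names are a tiny subset chosen for the example; the actual Healthcare API de-identify operation works on full DICOM objects with configurable profiles and can also scrub pixel data.

```python
# Minimal sketch of DICOM metadata de-identification.
# The tag list is illustrative only; real de-identification profiles
# cover far more tags (and burned-in pixel data).

PHI_TAGS = {"PatientName", "PatientID", "PatientBirthDate", "PatientAddress"}

def deidentify(metadata: dict) -> dict:
    """Return a copy of the study metadata with PHI tags redacted."""
    return {
        tag: ("REDACTED" if tag in PHI_TAGS else value)
        for tag, value in metadata.items()
    }

study = {
    "PatientName": "DOE^JANE",
    "PatientID": "MRN-12345",
    "Modality": "CT",
    "StudyDescription": "HEAD W/O CONTRAST",
}
clean = deidentify(study)
print(clean["PatientName"])  # REDACTED
print(clean["Modality"])     # CT
```

The clinically useful fields (modality, study description) survive, which is what makes the de-identified copy still valuable for research.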

Kim Garriott: And Justin, there certainly are imaging software vendors, PACS and VNA vendors that are in the process, or already there in the cloud or in the process of getting their capabilities in the cloud to provide software as a service and other managed services to their customers. So I guess what we are saying is with this Medical Imaging Suite, we’re really focused on the secondary use cases.

That’s not to say that primary use cases are not able to be managed in the cloud, but simply that this solution is really focused on those secondary use cases. And that’s quite a heavy lift from a compute perspective and a storage performance perspective, when you think of AI model development and advanced analytics. Oftentimes, that’s why organizations will look to perform these secondary use cases in a cloud environment: so that they do not have to have the compute at scale or the storage performance at scale on-prem, when it’s readily available in the cloud and can be scaled dynamically as it’s needed for that particular project, versus having those very expensive GPU or CPU resources running 24/7 on-prem and being underutilized when those models are not being trained.

Justin Parisi: Yeah, I mean, theoretically you could do anything in the cloud. Right? But do you need to is the question at that point, or do you need to, because you already have a solution in place?

Jason Klotzer: Yeah. I wanna reiterate and add on top of some of Kim’s previous points. In some organizations, they’re perfectly happy with what they have and they’re good to go with prospective patient workflows that they need to cater to.

But when it comes to clinical workflow, it’s going to be operating on each procedure. Let me put it that way. So, a patient comes in, they get scanned, and that in itself could be a gigabyte worth of data. And that data is going to be passed along to a radiologist, cardiologist, or whoever’s gonna do the interpretation, and they’re gonna do what they have to do, and the data’s gonna be passed on with respect to the result. And that’s effectively the transaction.

So that type of thing can of course be moved to the cloud. But there are architecture implications, of course, where you hear the terms lift and shift, or cloud native, in terms of architecture. These are all things that the software vendor who has the PACS or the diagnostic system has to take into account when they move that to the cloud, because it will affect the experience of the application, most times for the better, because of the availability of resources, again, in the cloud. But these are things that they have to take into account. Now, shifting that to research: research workflow in many cases is gonna be very different from the clinical workflow in the sense that when clinical diagnosis occurs, you’re looking at that transaction, and maybe even a comparison to several previous historicals with the same patient. But worst case scenario, you’re looking at several gigabytes worth of data in that transaction. Now when you look at the research case, research is typically conducted over large cohorts of data.

So let’s say somebody is looking at building an algorithm to do intracranial hemorrhage or lesion detection, or something that is around some specific modality and some specific anatomical region and a certain group of patients. Maybe men aged 35 to 50. You know, you get the gist in terms of building the cohort, but that’s gonna be conducted on thousands or tens of thousands or even millions of data sets at a time. Performing that kind of an operation, PACS systems, vendor-neutral archives, you know, these kinds of existing commercial systems, they aren’t built for that purpose. The cloud providers’ infrastructure is in several cases, especially in GCP’s case, built to transact with billions of users or billions of sorts of actors.

They are built for exactly that purpose, whether it be this specific industry or others. So I think that’s why there is a strong pull to leverage cloud for those exact scenarios: because the architecture is purposefully built for that.
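As a rough illustration of the cohort building Jason describes, here is a small Python sketch that filters DICOM-style study metadata by modality, anatomy, sex, and age range. The field names and values are hypothetical, and a real pipeline would run this kind of selection against a metadata index over millions of studies rather than an in-memory list.

```python
# Illustrative cohort selection over DICOM-style study metadata.
# Field names are modeled loosely on DICOM attributes; values are made up.

studies = [
    {"Modality": "CT", "BodyPartExamined": "HEAD",  "PatientSex": "M", "PatientAge": 42},
    {"Modality": "CT", "BodyPartExamined": "HEAD",  "PatientSex": "F", "PatientAge": 44},
    {"Modality": "MR", "BodyPartExamined": "HEAD",  "PatientSex": "M", "PatientAge": 39},
    {"Modality": "CT", "BodyPartExamined": "CHEST", "PatientSex": "M", "PatientAge": 50},
]

def build_cohort(studies, modality, body_part, sex, age_min, age_max):
    """Select studies matching a modality, anatomy, sex, and age range."""
    return [
        s for s in studies
        if s["Modality"] == modality
        and s["BodyPartExamined"] == body_part
        and s["PatientSex"] == sex
        and age_min <= s["PatientAge"] <= age_max
    ]

# Hypothetical cohort: CT head studies, men aged 35 to 50.
cohort = build_cohort(studies, "CT", "HEAD", "M", 35, 50)
print(len(cohort))  # 1
```

The point is that the selection criteria are metadata queries, which is exactly the kind of operation scale-out infrastructure handles well.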

Justin Parisi: So Kim, I understand that NetApp and Google Cloud have had a very close relationship and partnership over the years. How did the idea of creating a partnership for a Medical Imaging Suite come to pass?

Kim Garriott: I wasn’t here at the inception of this idea, so I’ll give a little shout-out to Jeff Tudor for being so engaged with Google and coming up with this initial idea of working together and being able to partner in this dynamic solution.

But the value that NetApp brings to this partnership is that well over a thousand hospitals in the US alone run their imaging workloads on NetApp. And because of the ease of communication between our on-premises NetApp storage running the ONTAP operating system and the cloud, it’s very easy to replicate that data over to Cloud Volumes in Google Cloud. So the beauty of that ease of data replication was one of the really strong benefits that NetApp brings to this partnership. And certainly the value of Google and the power in this suite of tools that they’ve developed was very attractive to NetApp, to be able to help our customers replicate their data into an environment that provides this really robust set of tools to derive the value of your data. So I think those were some really strong partnership tenets as this started out from the very beginning. And Jason, I think you were there in earlier days than I was.

So you might be able to speak to this as well.

Jason Klotzer: Yeah. I was there in the early days. I started out slowly, but things accelerated very, very quickly. In large part, I think because you came on board and then got involved in that conversation. But the original premise is just as you state.

We clearly were aware that NetApp is extremely good when it comes to enterprise storage management, and that it is the backend for a large portion of the clinical data settings out there. So Medical Imaging Suite – a theory at that point – being a kind of secondary use research environment that could accelerate AI development and analysis.

We still were not able to get the data to where it needed to be in a relatively frictionless way, is the way I wanna say it. Our previous direction, or I’d say one of our directions, was that the clinical systems themselves can replicate this data, but that takes a tremendous amount of orchestration to get it into the environment to perform the research.

So we thought, why not bypass all that and just allow the storage layer itself to do this movement or replication of data to an environment where it has services accessible to it? Then people don’t have to do all that orchestration themselves with their clinical IT or, in some cases, shadow IT staff.

That was the original idea.

Kim Garriott: Historically, anytime you try to replicate or migrate imaging data, it’s a multi-year process – even if you are inside the same data center, moving from one system to another. So there truly is great value in being able to just quickly, and I know I’m kind of going into some content that we’ll get into a little bit later in this conversation, but there is great value in being able to quickly replicate via SnapMirror, that imaging data, within minutes or hours versus years. I have been involved in imaging migrations that literally take over five years because you’re constantly having to figure out when you can tax the PACS or the VNA database to be able to route that data. And when you do this at the storage layer, again, it becomes much less complicated and much easier to do again within days.

So you can quickly stand up these environments and quickly start to be able to work with the data in ways that you simply can’t do today without this replication capability in place.

Jason Klotzer: A large portion of the opportunities with customers that we’re seeing are driven through new AI-related or analytics-related development. Basically, there’s an idea in place where they say, hey, we think that we can gain some sort of an insight from this information, so we need it available for our researchers or a third-party research organization to be able to analyze this information. Cool. But that also entails that you have to get that data to the place where it needs to be at that moment in time, in that instance.

So this could be petabytes worth of information. So immediately at the onset, if you’re talking about, Hey, I’m gonna copy all this data from here to here via my existing network, you know, that’s gonna be extremely taxing on the existing clinical systems. It’s gonna be extremely taxing on the network and the storage.

So effectively, all the hardware infrastructure. And it’s gonna take time, months, maybe even longer, to be able to replicate the data. Now, what if you had something where this is already done, where you already have it in place and the data’s already available? So whenever somebody says, hey, I have this new great idea, or we wanna run several projects simultaneously on this data, it’s already available.

That’s exactly the intent. And it doesn’t displace clinical systems. You don’t need some auxiliary pathway for the data to traverse. It will just be in place. Your existing clinical workloads will be exactly the same, and your researchers will have access to the kind of information that they need on an ongoing basis.

That’s the intent.

Justin Parisi: Yeah. And what’s great about that is you can establish a baseline relationship where you have an exact replica of volume A on the destination site, and then if that data changes on volume A, then you update. You don’t replicate the entire volume again; you replicate the blocks that have changed.

So not only is the initial replication faster than, say, doing an rsync or something like that, it’s also faster to get the new updates because you’re not having to deal with all that extra data coming in. You’re reducing the load on the network, and you’re reducing the cost it takes to ingest that data into the cloud.

So there are a lot of benefits to having a replication strategy that works like that.

Jason Klotzer: Absolutely.
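The changed-block idea Justin describes can be sketched in a few lines of Python: hash fixed-size blocks of the baseline and the updated data, and transfer only the blocks whose hashes differ. This is a toy model of the concept, not how SnapMirror is actually implemented (SnapMirror tracks changed blocks via snapshots rather than rehashing everything).

```python
# Toy model of changed-block replication: after the initial baseline
# transfer, only blocks whose content changed are sent, not the volume.

import hashlib

BLOCK_SIZE = 4  # tiny blocks for illustration; real systems use KB-sized blocks

def block_hashes(data: bytes):
    """Hash each fixed-size block of the data."""
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
    return [hashlib.sha256(b).hexdigest() for b in blocks]

def changed_blocks(old: bytes, new: bytes):
    """Indices of blocks that differ between the baseline and the update."""
    old_h, new_h = block_hashes(old), block_hashes(new)
    return [i for i, h in enumerate(new_h) if i >= len(old_h) or h != old_h[i]]

baseline = b"AAAABBBBCCCCDDDD"
update   = b"AAAABXBBCCCCDDDD"  # one byte changed, inside block 1

delta = changed_blocks(baseline, update)
print(delta)  # [1] -- only one of four blocks needs to be transferred
```

Here a single changed byte means one block out of four crosses the wire, which is why incremental updates stay cheap even on large volumes.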

Justin Parisi: So, that’s one aspect of a NetApp benefit for this particular use case. I can think of others, but I’m gonna let Kim tackle this first. So Kim, what are some other things besides the obvious – the SnapMirror piece – that NetApp offers as a benefit for medical imaging?

Kim Garriott: Certainly we’ve hit on the replication, but I don’t want to understate the importance of that and the ease of being able to do it, because this is a game changer in helping customers very quickly spin up AI projects. Maybe they don’t wanna move their entire imaging library over into a different environment. Maybe they only want to move specific cohorts of data. So this is really the easy button to be able to move that data. But beyond that, Cloud Data Sense is a very powerful tool that can add a lot of value.

And again, I go back to the creation of an AI data cohort. Cloud Data Sense does a lot of functions when it comes to auditing and being able to identify stray PII or stray PHI, which is very valuable in a solution like this, but also the ability to interrogate DICOM metadata and identify imaging studies that will be appropriate candidates for an AI data cohort. That is a very powerful thing, and despite the name Cloud Data Sense, whether the data is on-prem or in the cloud, Cloud Data Sense can interrogate that data. So not only is it location agnostic, cloud or on-prem, it’s also storage vendor agnostic. Whether that imaging workload is housed on NetApp storage on-prem or in some other type of storage capacity, Cloud Data Sense can interrogate that imaging data at a DICOM level and help customers identify what data they do wanna replicate over, if it’s not the entire volume of data.

And then once the data is inside Google Cloud, again leveraging Cloud Data Sense, we are able to track and audit the use of that data, the access of that data, and where that data may be residing, if it’s residing somewhere that it wasn’t initially intended to reside. With that toolset, we can dynamically alert users or administrators to say, hey, you need to, you know, look into this. Or maybe we can categorize that data, classify it, and map it appropriately to ensure the highest level of protection and privacy is applied to this sensitive data.
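As a loose illustration of the stray-PII identification Kim describes, here is a small Python sketch that flags identifier-like patterns in free text. The patterns (a US SSN format and a made-up MRN format) are illustrative only; a classification tool like Cloud Data Sense uses far more sophisticated detectors than two regular expressions.

```python
# Rough sketch of scanning free text for stray identifiers.
# Pattern set is illustrative; the MRN format is hypothetical.

import re

PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "mrn": re.compile(r"\bMRN-\d{5,}\b"),  # hypothetical MRN format
}

def find_stray_pii(text: str):
    """Return (kind, match) pairs for identifier-like strings in the text."""
    hits = []
    for kind, pattern in PATTERNS.items():
        hits.extend((kind, m) for m in pattern.findall(text))
    return hits

note = "Follow-up for MRN-88421. SSN on file: 123-45-6789."
hits = find_stray_pii(note)
print(hits)  # [('ssn', '123-45-6789'), ('mrn', 'MRN-88421')]
```

A real scanner would run this kind of detection across files and metadata at scale and feed the hits into alerting and classification, as described above.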

Jason Klotzer: Building on the Data Sense story, just think about the kind of event-driven architecture that could be built from that. Some of those events would be in the PHI and data sensitivity domain, but some of them are also for the purpose of amending a pre-built model, or a purposefully built model that needs to be updated with new data.

And you have new CT head studies that are coming in. Do you want to update that AI model that you previously built that might not have this data associated with it at this point? There are tons and tons of different opportunities from an event-driven mechanism standpoint.

Kim Garriott: Yeah. And then the classic tremendous value of ONTAP, right? So in these secondary use cases, an organization could choose to also store a backup copy or a disaster recovery copy or even a tertiary copy because there are many healthcare organizations that are now not only storing a first and second copy of imaging data, but also a third copy.

So in that, especially if they are an existing NetApp customer, on-prem, being able to span that data fabric, that ONTAP operating system into the Google Cloud, gives the organization the same look and feel and all the capabilities of ONTAP so that their storage administrators or their cloud administrators only have to look at a single pane for administration responsibility.

So that’s also another great benefit.

Justin Parisi: So we’ve kind of danced around this idea of Google Medical Imaging Suite, but we haven’t actually talked about what’s in it, right?

A sandwich is great, but is there peanut butter? Is there jelly? Is there ham? Is there turkey? So what is actually in the Google Medical Imaging Suite that we would be interested in?

Jason Klotzer: We don’t have a lot of time, but just at a very high level Medical Imaging Suite is made up of really five different components within the solution that can be used in various different ways with one another.

It’s not all or nothing.

The first is storage. How do I get my imaging data into the cloud? And this is where in addition to some of the native offerings that we have in Google Cloud, like Healthcare API for DICOM storage, NetApp plays a very, very strong role in being able to do a replication or a filtered sort of replication into the cloud.

Then there’s the imaging lab component, which is used for doing annotations and sort of pre-processing of the imaging data, moving in the direction of being able to have labeled, annotated data for the purpose of model building. Then we have our datasets and dashboards piece, which is really where you’re able to build a metadata view across all of your imaging data, which is where we leverage BigQuery heavily. And you can either query it directly using SQL, or you can build a view of that data using Looker or Data Studio from a UI standpoint. This is probably more of a teaser than anything, but this is also one where we could definitely look at leveraging Data Sense more, to see how we can kind of prepopulate that analytics view of the data to give more immediate insights on an ongoing basis.
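To give a feel for the datasets-and-dashboards idea, here is a local Python sketch that indexes study metadata in SQLite and aggregates it with SQL. In Medical Imaging Suite the equivalent view lives in BigQuery at much larger scale, but the shape of the query is the same; the table layout and UIDs below are made up for the example.

```python
# Local stand-in for a metadata view over imaging studies:
# index study attributes in a SQL table, then aggregate with SQL.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE studies (study_uid TEXT, modality TEXT, body_part TEXT)")
conn.executemany(
    "INSERT INTO studies VALUES (?, ?, ?)",
    [
        ("1.2.3.1", "CT", "HEAD"),
        ("1.2.3.2", "CT", "CHEST"),
        ("1.2.3.3", "MR", "HEAD"),
        ("1.2.3.4", "CT", "HEAD"),
    ],
)

# How many studies do we have per modality?
rows = conn.execute(
    "SELECT modality, COUNT(*) FROM studies GROUP BY modality ORDER BY modality"
).fetchall()
print(rows)  # [('CT', 3), ('MR', 1)]
```

The same aggregate, pointed at a BigQuery metadata table instead of SQLite, is what would feed a Looker or Data Studio dashboard.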

The next component in the solution is what we call our pipelines, our imaging AI pipelines. This is where you actually build AI, so you do training. Google Cloud has what’s referred to as Vertex AI. It’s an AI platform within GCP.

It has everything you need, from MLOps to custom training to feature management, et cetera. But the AI pipeline piece is where you do the building of the ML model. If you have a data science team, you can do custom development. If you don’t have a data science team, you can even leverage things like AutoML, where effectively you just give the input and specify the kind of training job that you want.

In our case it will typically be Vision AI, and it will go to town for you. You don’t have to do anything besides that. It’s no-code.
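To illustrate how little an AutoML training request involves, here is a hedged sketch of the parameters such a job might take with the Vertex AI SDK (google-cloud-aiplatform). The display name and budget are hypothetical, and the SDK calls themselves are shown only in comments since they require GCP credentials to execute.

```python
# Hypothetical AutoML image-classification job parameters.
training_job_spec = {
    "display_name": "cxr-findings-classifier",  # hypothetical model name
    "prediction_type": "classification",        # AutoML Vision task type
    "budget_milli_node_hours": 8000,            # training budget: 8 node hours
}

# The corresponding Vertex AI SDK flow would look roughly like this:
#
# from google.cloud import aiplatform
# aiplatform.init(project="my-project", location="us-central1")
# dataset = aiplatform.ImageDataset.create(
#     display_name="labeled-imaging-data", gcs_source=...)
# job = aiplatform.AutoMLImageTrainingJob(
#     display_name=training_job_spec["display_name"],
#     prediction_type=training_job_spec["prediction_type"])
# model = job.run(
#     dataset=dataset,
#     budget_milli_node_hours=training_job_spec["budget_milli_node_hours"])
```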

And lastly of course is the deployment piece. Once you have your model built within Vertex AI, that’s when you’re gonna decide, okay, well where am I gonna deploy this model?

Meaning, where am I gonna deploy this artifact that, when I give it input, gives output that tells me what the inference or prediction is? And that’s an often-debated conversation item in itself, especially within this space, because of the low-latency requirements in most medical imaging workflows.

So some customers are perfectly comfortable deploying their model directly to GCP, because we have offerings for that. Some of them say, hey, I wanna deploy this model in close proximity to my scanner. We can do that as well, because we have edge capabilities to be able to deploy the model.

And we even have what’s called TensorFlow data, which allows me to deploy a model directly to [inaudible]. So there are a lot of options in the background. These are the different pieces or components within Medical Imaging Suite that comprise the holistic solution, and they all interact with one another if you decide to use them all together.

And it is all based on Healthcare API at the epicenter of it, because it integrates with all those pieces.

Justin Parisi: I would imagine there is a landing page somewhere, or a blog that talks a little bit more about this in detail. Where would I find that information?

Jason Klotzer: That is cloud.google.com/medical-imaging.

Medical dash imaging.

Justin Parisi: That seems way too easy, is that, that’s not a trick, is it?

Jason Klotzer: It’s super easy. Yeah. I’m not gonna magically create that page as we’re discussing it, it exists, so, yep. That’s it.

Justin Parisi: That’s so intuitive. It’s like somebody thought about that and like said this is what it should be.

Jason Klotzer: Yeah, there was a little bit of thought that went into this whole project, just a little bit. Yeah, so I think it’s pretty comprehensive, and it also kind of tells you how you can follow up on anything that you might want to pursue within the space. Yeah. It’s pretty good.

Justin Parisi: All right. Sounds like you’ve given me a lot to think about in regards to Google Cloud and medical imaging and all sorts of things that we can do with that.

So if I wanted to reach you, Jason again, how do we do that?

Jason Klotzer: It’s jklotzer. So my first name and my last name at google.com, or you can find me on LinkedIn.

Justin Parisi: All right. And Kim.

Kim Garriott: It’s kim.garriott@netapp.com, or Kim Garriott on LinkedIn.

Justin Parisi: All right. That music tells me it’s time to go. If you’d like to get in touch with us, send us an email to podcast@netapp.com or send us a tweet @NetApp. As always, if you’d like to subscribe, find us on iTunes, Spotify, Google Play, iHeartRadio, SoundCloud, Stitcher, or via techontappodcast.com. If you liked the show today, leave us a review. On behalf of the entire Tech ONTAP podcast team, I’d like to thank Kim Garriott and Jason Klotzer for joining us today. As always, thanks for listening.

Podcast intro/outro: [Podcast outro]

 
