

4. Christine Sublett "HIPAA has great opportunity for evolution."


On this week's episode of HIPAA Critical, we focus on continued ransomware attacks, coding errors, who's winning and losing this week, and our predictions for the new year. Also, don't miss part 2 of our captivating interview with cybersecurity expert Christine Sublett.

 

 

 

Rather read?

Here’s the full transcript of this episode.

Olena Heu: Thank you for tuning in to the HIPAA Critical podcast. This week's edition is chock full of the latest news. We've, of course, got winners and failures, and joining me again this week is Chief Marketing Officer Rick Kuwahara.

 

Rick Kuwahara: Hey, Olena.

 

[THEME MUSIC] Olena: There's a lot to report this week, and let's just dive right in. Rick, what do you see in terms of what people need to know about?

 

Rick: Well, you're right, there's a lot to cover. And I think one thing that has been in the news a lot lately is ransomware; we've been seeing a lot of attacks popping up.

And recently, Emsisoft released a State of Ransomware report that was really interesting. It showed that in 2019 the U.S. was hit by an unprecedented number of ransomware attacks, impacting at least 948 government agencies, educational establishments like schools and universities, and, of course, healthcare providers.

It's estimated that the cost was in excess of $7.5 billion this year alone, so that's really crazy.

Of the organizations impacted, 759 were healthcare providers, and that's just up until the point where the report was made. There have actually been a couple of other attacks since then.

It shows just how crazy ransomware has been this year and just how much incidents have increased. And it's not just an inconvenience; a lot of it really puts people's health and safety at risk when a healthcare provider, or even a government agency, is hit.

When either of them is impacted by a ransomware attack, it can have a direct effect on care. Emergency patients may have to be redirected to other hospitals, and if medical records are inaccessible or lost, that could affect someone's treatment. Scheduled surgical procedures may have to be canceled or tests postponed. And, of course, 911 services could be interrupted by a ransomware attack. So this is really serious, and it's been happening a lot this year.

We're seeing that government agencies and healthcare providers, primarily, are being targeted because it's been shown that they're actually not ready for these sophisticated attacks that are coming in.

There was also a report recently issued by the state auditor of Mississippi stating there is a disregard for cybersecurity in state government, and that many state entities are operating as if state and federal cybersecurity laws don't apply to them.

So that's a really serious indictment of how a lot of these government agencies are set up to protect themselves against these ransomware attacks.

 

Olena: And when a government declares a state of emergency, I think that really says something about how serious this is.

 

Rick: Absolutely. And unfortunately, it takes a lot of these types of incidents to really wake everyone up to how serious these ransomware attacks can be.

People think, "Oh, it's not gonna happen to us." And it's really not a question anymore. You know, if it's gonna happen, it's when it's gonna happen.

The thing is, we do see that ransomware attacks can be prevented, and even if an attack is successful, you don't have to pay the ransom. You don't have to be hit if you have the right policies and procedures in place.

It's notable that in the report, the banking industry hasn't reported any successful ransomware attacks, and it's not because they weren't targeted; it's because they have better systems in place.

 

Olena: Leading by good example.

 

Rick: Yeah, it's funny, huh? You deal with people's money and they care a lot more. The financial industry is always under attack, so they've learned to be proactive in what they're doing. So we could definitely learn from that.

 

Olena: Did they indicate the source of the attacks? Like, is it more so domestic, or is it international?

 

Rick: It's a great question. They are international. We know it's not just U.S.-based hackers coming in; it's coming from all over. They're targeting the U.S. in particular, and U.S. government agencies and healthcare, just because of the success attackers are having against those organizations. And that really emboldens other hackers to follow suit, so it's coming from all over.

 

Olena: So also in the news, we're gonna transition over to IT and patient data. You know, obviously we've seen this so many times this year with a lot of people's information being compromised. What did we learn this week?

 

Rick: Okay, so there's a lot of great news coming out of Capitol Hill where we are seeing a lot more government agencies looking to really help update privacy laws and regulations.

Technology has leaped ahead of where we were with privacy laws, and there are some gaps that have been created because of it, especially regarding patient data and your medical records.

You know, everybody agrees that we want people to have control over their own healthcare and especially their medical data.

And so on one side, you have the Office of the National Coordinator for Health IT (ONC) proposing a rule that would help technology companies get more access to patient data. And that's great because it helps consumers, everyday people, have access to their own data, and it encourages the flow of data rather than hindering it, which a lot of people accuse HIPAA regulations of doing.

But on the other side, there's the College of Healthcare Information Management Executives, or CHIME, which is made up of CIOs in healthcare.

And although everybody wants the sharing of data to happen and interoperability to become a reality, they are hitting back against the ONC.

They're saying that even if these tech companies are going to have access to data, they should have to comply with data blocking and privacy policies, and there have to be safeguards to prevent PHI from being misused.

So this really is a blurring of the lines between what's considered health data and what's considered consumer data. If your doctor were to give you your medical records, they have to comply with HIPAA regulations because they're a covered entity, a healthcare provider.

But once you have it yourself, it's your own data, and you aren't required by any HIPAA regulations to do anything.

So if you were to willingly give it to a third-party app and say, "Okay, here are my records for you to have," maybe an app that helps you manage your prescriptions or something like that, then once you give them your data, it's considered consumer data rather than health data, so it might not fall under HIPAA anymore.

 

Olena: Okay.

 

Rick: So what about the app you give it to? What do they have to do with it? Do they have to follow any HIPAA regulations? What are the privacy issues around this? That's the concern that CHIME is bringing up.

 

Olena: Gotcha. That's really interesting and very important because you might not think about it. But if you have diabetes or high blood pressure, or whatever, and you're inputting this data, where exactly does it go? Who's responsible? What happens if that information is compromised?

 

Rick: Yeah, and a lot of these tech companies are operating in good faith. No one's really trying to take advantage of consumers, but without regulations in place, it could let a bad actor in.

Because right now there's nothing, outside of public scrutiny, to stop these organizations from giving the data you've given them to a third-party data aggregator without your approval. And then what are they doing with it?

The good news is the House Energy and Commerce Committee released an initial draft of a bipartisan privacy bill on December 18, and the hope is that the bill will help close the privacy gaps across all industries and establish standards and safeguards for how organizations are allowed to collect and use data from individuals for their business.

 

Olena: So what would you say the takeaways are from this?

 

Rick: Well, we're definitely seeing how much faster technology moves than regulations. The tech world moves way faster than Washington, D.C., so we're really behind. But it's great that there are discussions happening now, and there's movement to really standardize and set safeguards for how patient data is used going forward.

 

Olena: I almost feel like that's government in general, always trying to play catch-up, to enforce something where they're like, oops, we need to catch up to this.

 

Rick: Yeah, and it could be good or bad. I mean, you definitely don't want regulation to slow innovation.

Innovation sometimes has to happen ahead of regulation, but at the same time it's great to see movement when there are concerns; we want everybody's data to be secure and for them to be able to control it.

Because in the end, we're all patients, so it's definitely great to see things moving in the right direction.

 

Olena: Absolutely. And moving along also, we have another update that's related to medical devices.

 

Rick: Yeah, so this is really interesting and kind of concerning. Keyfactor researchers recently discovered a vulnerability in RSA keys and certificates used by lightweight internet-connected devices. That's a lot of technical talk, so I'm going to try to break it down a little bit.

 

Olena: Thank you [laughing]. I just heard letters, and, uh, keys.

 

Rick: You can think of RSA certificates as a way to secure the transfer of data between different sources that are usually remote from each other, so they're separated.

This could include things like medical devices and implants, where data is going from the device to a central source to help with your medical care.

The idea is that only the holder of the private key can decrypt the data. If you don't have that key, you're not supposed to be able to see the data, and usually that's pretty secure.

But what the Keyfactor researchers found was that the way these devices generate the random numbers behind their RSA keys, there is actually overlap, so the keys aren't unique. It definitely takes a lot of computing, but there's a chance that hackers could find two keys that aren't completely random and therefore intercept the data transmission between these remote devices.
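To make that concrete, here's a minimal sketch, not taken from the episode or from Keyfactor's tooling, of why reused randomness is so dangerous: if two devices' public RSA moduli happen to share a prime factor, a cheap GCD computation exposes it, and from there both private keys can be reconstructed. The numbers below are tiny toy values.

```python
# Minimal sketch of why duplicated randomness breaks RSA: if two public moduli
# share a prime factor, a simple GCD reveals it, and knowing that prime lets an
# attacker reconstruct both private keys. Toy values only; real keys are 2048+ bits.
from math import gcd

p_shared, q1, q2 = 61, 53, 67          # p_shared reused by two weak random number generators
n1, n2 = p_shared * q1, p_shared * q2  # the two devices' public moduli

common = gcd(n1, n2)                   # cheap to compute even for very large numbers
if common > 1:
    print("Shared factor found:", common)
    print("Remaining primes:", n1 // common, n2 // common)  # both keys now factorable
```

In essence, researchers can run this kind of pairwise comparison at scale across large collections of public certificates to spot keys that were generated with weak randomness.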

 

Olena: I see.

 

Rick: So it's not something that has been confirmed out in the wild yet; it's something the Keyfactor researchers found. But it's definitely a vulnerability that could allow a lot of medical devices to malfunction, which, of course, when you're dealing with people's health, could have really devastating effects.

What it definitely brings out is that medical device manufacturers really have to think about security while they're designing something, and not just use whatever is out there, say, as an open-source tool. They've really got to think about how to build security into the design.

 

Olena: I would think that would be the first thing. Aside from whatever the creation is, I would think you'd obviously have to first determine how secure all this is going to be. But for some, is it maybe an afterthought?

 

Rick: It could be. Hopefully it's not, but there are definitely gaps.

How a cybersecurity expert designs things can be different from how someone whose expertise is in another area does, let's say.

Everybody's working in good faith to try and move forward, and a lot of people might have thought, "Hey, RSA was fine." But attackers are sometimes one step ahead.

And it's great that there are researchers out there who are trying to break things so that we can find out, "Okay, where are the holes?" That helps everyone in the end, but it's definitely something that can be improved upon for sure.

 

Olena: What would you say are the takeaways from that example as well?

 

Rick: Definitely. It just goes back to security by design, making sure it's integrated into the process as product development is happening, so that we're always thinking, "Okay, there are bad actors out there. What can we do to make sure they're not getting in?"

 

Olena: Well, as we mentioned, we like to focus on the winners, and everybody loves a little good news, and that might help to move things along. So let's focus on who's winning this week, and it sounds like NYU, or someone related to it, has something happening in Manhattan.

 

Rick: Yeah, NYU Langone Health recently opened a biotech incubator in Manhattan. They're calling it BioLabs at NYU Langone, and it's pretty exciting reading about it.

They're going to house more than 35 early-stage biotech and life sciences companies, and the aim is to help startups focus on science and innovation and move more quickly into their own spaces, not just stay in this little incubator.

And having gone through 500 Startups ourselves with Paubox, incubators can be great ways to build relationships and collaborate with others. There's an excitement in that space that can really help launch companies forward. So...

 

Olena: This is like a physical space where people can go and gather? Or is it more virtual?

 

Rick: No, it's definitely a physical space, a pretty big place in Manhattan. It'll be a space where the BioLabs staff can provide education and operational support for these startups and companies, just to help boost them along, because a lot of times their expertise is in the science and not necessarily the business side of things.

 

Olena: Understandable.

 

Rick: Yeah. So it's great that there's going to be a place like this for startups to go and grow and hopefully come out with the next big thing in biotech.

Olena: All right, now a familiar name winning this week: Blue Cross Blue Shield. What can you tell us about what they're doing?

 

Rick: Yeah. Earlier this year, Blue Cross Blue Shield launched the Data Innovation Challenge. The idea was for the program to attract and award data-driven tools that improve access to care, patient engagement, and care delivery and health outcomes.

It's a great goal that the program has, and they just announced the winner.

Thrive Earlier Detection won the Blue Cross Blue Shield Data Innovation Challenge. They just announced it.

Thrive's mission is to develop tools designed to integrate earlier cancer detection into routine medical care, so it definitely hits that goal of improving care delivery and outcomes. And it sounds really exciting.

I mean, I've just read about it a little bit, so I don't know too much about it, but it's basically liquid biopsy tests to help find many cancers at an earlier stage of the disease. And as we know with cancer care, the earlier you can diagnose, the better.

Olena: Definitely. Wow. Sounds incredible.

 

Rick: That's really exciting. It'll be great to see what they can do with all the data they will now have access to. They get five years of access to Blue Cross Blue Shield data, which, of course, is de-identified.

So it's not individual data; it's an aggregate set of numbers. Thrive won't actually know individual patient data, but in aggregate, a lot of this de-identified data is going to help them move faster to get their product ready for real use. So exciting.
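For a sense of what "de-identified" means in practice, here's a toy sketch, purely illustrative and not Blue Cross Blue Shield's actual process, of stripping direct identifiers and generalizing quasi-identifiers before data is shared for research. All field names and records are invented.

```python
# Toy sketch of de-identification: drop direct identifiers and coarsen
# quasi-identifiers so researchers work with data that can't name the patient.
# Records and field names are invented for illustration.
records = [
    {"name": "Jane Doe", "ssn": "123-45-6789", "zip": "96815", "age": 52, "diagnosis": "C50.9"},
    {"name": "John Roe", "ssn": "987-65-4321", "zip": "70112", "age": 67, "diagnosis": "C61"},
]

DIRECT_IDENTIFIERS = {"name", "ssn"}

def de_identify(record):
    """Remove direct identifiers and generalize the ZIP code to 3 digits."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    clean["zip"] = clean["zip"][:3] + "**"
    return clean

print([de_identify(r) for r in records])
```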

Olena: Talking about utilizing something like that, it sounds a little like an MRI, where they scan and then inject a different liquid or dye, or whatever. So it sounds very interesting. I'm very curious to learn more because we're all impacted by cancer, and we've all known someone that's either been diagnosed or has passed away from cancer. So I would love to hear more about this as it develops.

 

Rick: Absolutely. Yeah. We'll keep an eye on this one. It sounds very exciting.

Olena: Definitely. Okay, well, we just highlighted our winners this week, props and smiles to them, and now let's also focus on those who are failing. So give us some insight into the failures.

 

Rick: Yeah, unfortunately, there's always an abundance of these. LifeLabs, which is a Canadian diagnostic laboratory, had to pay to retrieve the data of over 15 million patients. This was in the news; it's a big deal, just the size of it.

They discovered unauthorized access on November 1st, and they know the data affected includes names, addresses, email logins, dates of birth, health card numbers, which are like their health insurance numbers, and even lab results.

They didn't specifically say it was ransomware, but we know they had to pay to get access to the data back, so it was most likely a ransomware attack.

Olena: And they didn't say how much they paid.

 

Rick: No, as of now I haven't seen anything. I'm sure this will come out later, but yes, it's really unfortunate. It goes to show, no matter how big you are, ransomware can hit anyone. So again, going back to what we talked about earlier, it's about being ready for it.

Olena: And there's another failure in regards to health data as well.

 

Rick: Yeah. So this one shows that government agencies are not exempt. The Centers for Medicare and Medicaid Services (CMS) had to pull its Blue Button 2.0 API offline after they found a coding error that potentially exposed beneficiary data for about 10,000 patients.

They call this the Blue Button 2.0 API, or BB 2.0. Basically, it's used by Medicare beneficiaries to authorize third-party apps to access their Medicare claims data.

So if people have gone onto, say, an insurer's site to see and manage their Medicare claims, they often have to log in, and a lot of times the third-party company you're using calls the Blue Button API to get the information from CMS.
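For context, here's a rough sketch of how that flow typically looks for a third-party app: the beneficiary authorizes the app via OAuth 2.0, and the app then queries CMS's FHIR endpoints with the resulting access token. The sandbox URL, resource path, and token below are illustrative assumptions, not production values.

```python
# Rough sketch of a third-party app pulling Medicare claims via Blue Button 2.0:
# the beneficiary grants access through OAuth 2.0, then the app queries FHIR
# resources with the access token. URL, path, and token are illustrative assumptions.
import requests

BASE_URL = "https://sandbox.bluebutton.cms.gov/v1/fhir"  # sandbox, not production
ACCESS_TOKEN = "example-oauth-access-token"              # obtained via the consent flow

def fetch_claims():
    """Return the beneficiary's claims (ExplanationOfBenefit) as a FHIR bundle."""
    resp = requests.get(
        f"{BASE_URL}/ExplanationOfBenefit/",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print("Claims returned:", fetch_claims().get("total"))
```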

So a bug was detected, and thankfully they found it, but still, it's been determined that the bug could have potentially exposed that data. Really unfortunate.

They've launched an investigation, but right now they haven't officially determined how much data was exposed. Hopefully we'll find out soon, and people can be made aware.

But it's a really bad situation. Anybody can be affected, and you've got to be really, really careful when dealing with patient data.

Olena: And it looks like the coding error was introduced back in January of 2018.

 

Rick: Yeah, so it's been there a long time, and we don't know if people have actually been exploiting it, but it is great that once they saw the error was there, they took it down.

Olena: And what do you see as far as the takeaways from this example?

 

Rick: Definitely that better testing is needed, because this is something that could have been caught in QA, which is one of the steps you go through when you're pushing code live for it to be used. You usually go through a QA, or quality assurance, testing process, and in this case it could have been done better.

I think the CMS officials determined that a comprehensive review actually was not completed, and the coding error could have been caught earlier.
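As a purely hypothetical illustration, not CMS's actual code, of the kind of automated QA check that can catch this class of bug, a test can assert that a claims lookup never returns records belonging to anyone but the authenticated beneficiary. The data and function names here are invented for the example.

```python
# Hypothetical illustration (not CMS code) of a QA check that catches a
# data-exposure regression: every claim returned must belong to the
# authenticated beneficiary. Data and names are invented for the example.
CLAIMS_DB = {
    "beneficiary-A": [{"claim_id": 1, "owner": "beneficiary-A"}],
    "beneficiary-B": [{"claim_id": 2, "owner": "beneficiary-B"}],
}

def get_claims(authenticated_user):
    """Return only claims owned by the authenticated user."""
    return [c for c in CLAIMS_DB.get(authenticated_user, []) if c["owner"] == authenticated_user]

def test_no_cross_beneficiary_leak():
    for user in CLAIMS_DB:
        assert all(c["owner"] == user for c in get_claims(user)), f"leak for {user}"

if __name__ == "__main__":
    test_no_cross_beneficiary_leak()
    print("QA check passed: no cross-beneficiary data returned.")
```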

 

Olena: Wow. So don't just assume that everything's gonna be okay and run with it. You want to definitely make sure that everything has been tested and it's secure.

 

Rick: Yeah. And luckily for CMS, it was actually not their employees but a third-party developer who found the error, because they weren't just assuming things were fine with the API provided by CMS. They were doing their own QA and found the error, which is great.

 

Olena: Excellent. All right. And also making news, New Orleans is the latest city to be hit with ransomware, and we've been talking about ransomware this whole podcast because it's been so prevalent. What did we learn about this?

 

Rick: Yes, so New Orleans had to declare a state of emergency because its network was hit with a ransomware attack. They think it was a phishing attack that compromised the credentials of a city worker, which led to the ransomware getting deployed.

The ransomware itself was identified as Ryuk, which was used in five other successful attacks on city governments that we know of, resulting in $2 million in ransom, so we don't know if it's the same attacker.

But we do know that, like we said before, if something is successful, other people are going to do it over and over again. And unfortunately, we've seen this particular ransomware result in $2 million in ransom being paid out.

Thankfully, though, New Orleans didn't pay; they were prepared. Where LifeLabs wasn't ready and had to pay a ransom, New Orleans was. They were ready, and they didn't have to pay anything out.

 

Olena: Interesting. So it's a failure. But it's also partially a win as well.

 

Rick: Yeah, yeah, definitely. It's great to see that they were ready for this to happen. They had the training and the systems in place to make sure the damage was mitigated.

 

Olena: And so would we assume that this was something via email?

 

Rick: Yeah, they suspect it was a phishing attack, so usually that means a malicious email was sent. One of the city workers got this email, probably clicked on something, and that downloaded the ransomware, which then got deployed out into the network.

 

Olena: A lot to learn from these examples. All right, well, last week we had a very comprehensive interview with Christine Sublett. Rick, tell us what we're going to hear this week, because she was just a wealth of knowledge.

Rick: Christine, like you said, is a wealth of knowledge. She was great to talk with, and we covered a lot of ground, so we actually had to split her interview into two parts. In this last piece, we talked about mitigating human error in your cybersecurity plan and also a little bit about how HIPAA has to change in 2020.

Olena: All right, take a listen.


 

Rick: A lot of these cyberattacks we're seeing hit healthcare, like those ransomware attacks, are heavily focused on people as the initial threat vector. So how much can be done, when you're doing your planning, about mitigating that risk? Or can anything be done?

 

Christine Sublett: Right. So I do think there are things companies can do, and you're spot on about people.

Twenty-five years ago, I remember listening to some security people joke that if we could just take people out of the loop, our systems would be way more secure. And the truth is, that's actually true, [chuckle] but we don't have that luxury.

So there are really two things, well, actually three, that I think are probably the top things companies can do here.

One, from a technology perspective, is to put in place some type of filtering capability that can actually identify these phishing or other social engineering emails that are trying to come into the organization.

Because you're right, that's the number one threat vector right now. And so, if we could at least stop some of these, and if we can use threat intelligence and indicators of compromise to filter out these email addresses and other things, we would at least have a leg up.
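Here's a hypothetical sketch of the filtering idea Christine describes: comparing inbound senders against indicators of compromise from a threat intelligence feed. The addresses and domains below are made up for illustration, and a real gateway would also check URLs, attachments, and authentication results.

```python
# Hypothetical sketch of filtering inbound mail against indicators of compromise
# (known-bad senders and domains) from a threat intelligence feed. All addresses
# and domains below are made up for illustration.
IOC_SENDERS = {"billing@paypa1-secure.com"}
IOC_DOMAINS = {"paypa1-secure.com", "hr-benefts.net"}

def should_quarantine(sender):
    """Flag a message whose sender or sending domain matches a known indicator."""
    domain = sender.rsplit("@", 1)[-1].lower()
    return sender.lower() in IOC_SENDERS or domain in IOC_DOMAINS

for sender in ["ceo@example-hospital.org", "billing@paypa1-secure.com"]:
    print(sender, "->", "QUARANTINE" if should_quarantine(sender) else "deliver")
```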

Now, it's not gonna stop everything, right? So we do have to train our workforce.

And so I think there are two things I would do with the workforce. One is I would train. There are some great security awareness programs out there to train individuals on how to recognize different types of social engineering attacks.

Certainly, not all social engineering attacks are phishing attacks, but phishing is by far the vast majority. It's really the primary way that a lot of account and credential compromises take place, and ransomware too is coming in off some type of email.

So it's about helping your workforce understand how to recognize them and what to do if they suspect they've received one, so that they make good decisions.

The other is to test. And so there are also some great programs out there and technologies for testing your workforce.

These allow you to set up particular types of social engineering emails to actually test the workforce and see if you catch any of them clicking on things or responding to things they shouldn't be. And I have seen some organizations where, on the first test, 70% of their workforce clicks and maybe 30% of them actually enter their credentials.

 

Rick: Wow.

 

Christine: Right, I know. It's just staggering. [laughter] And I see numbers like this, and I'm thinking, "Oh God, this is not good." Right?

But once you identify the users who've clicked and the users who've also entered credentials, you have further training for those folks and explain, "This is why we're doing this." It's not a punitive thing; they're not in trouble. "But let us help you understand how to identify these things, so you don't do this when it's real."

And what I've seen is that the amount of improvement between the first test and the second test is really significant.

That one company I'm thinking of, where 70% clicked and 30% entered credentials, the second time had less than 10% click and 1% enter credentials.

And so to me, that was worth every cent of the training and the testing. And the fantastic thing is that these programs for testing and training your workforce on social engineering type attacks are actually really inexpensive. So it's probably one of the most cost efficient ways to reduce your risk as a company.

 

Rick: That's great. So I do have one follow-up question that we can kinda weave in. It goes back to your first answer. It struck me when you said HIPAA was written in 1999.

 

Christine: [laughter] Boy. Right?

 

Rick: So how much does HIPAA have to evolve? There's HITECH and everything, but how does HIPAA have to evolve to meet the changing technology?

 

Christine: So, HIPAA has great opportunity for evolution. Let me say that to start with because exactly that, right?

The security rule was written in 1999 and went into effect in 2005, and the privacy rule was written in, I think, 1996 and went into effect in 2003. And as you mentioned, there was the HITECH Act, which did update some of those.

But primarily on the security side, what it did is push the same set of security requirements from the covered entity onto the business associate.

Before the HITECH Act, the business associate only had those requirements if they were contractually obligated based on their contract with the covered-entity customer. In many cases they already were, but this made it a legal requirement as well.

HIPAA was designed to address a fairly limited set of healthcare data. It certainly was never designed to cover all health care data. And so we have a pretty significant gap in our regulatory framework from a healthcare data perspective.

And HIPAA was never intended to cover this massive set of healthcare data. And then you think about how that world has expanded in the last 20, 25 years, and it looks even less adequate today.

So, there are plans at HHS to issue some changes or update the HIPAA regulations. And I believe they're due out first quarter of 2020. So not long.

And of course then we'll have the opportunity to comment on those proposed rule changes. But HIPAA really is fairly narrow because, again, it doesn't cover most healthcare data; it really covers this small subset.

And it doesn't cover any consumer information, any consumer healthcare data.

So when you use all of these different apps that are now collecting data from devices or input by patients or imported from a record you download from your healthcare provider, none of that's covered by HIPAA.

And so it's sitting on your phone. In many cases, it's uploading to a cloud environment from an app provider where you probably know almost nothing about how they use your data or how they share it. And currently, HIPAA doesn't cover any of that.

So the way I view HIPAA and whatever the proposed changes will be is that unless it expands the scope of the data it regulates, we will still have some significant gaps. And so I don't know that HIPAA's the right place to address this versus an overarching privacy law.

Some states are trying to address this at the state level, California being one and the leader on this.

Our CCPA goes into effect January 1st, but it also has significant carve-outs. It doesn't cover nonprofits. It doesn't cover HIPAA-covered data, so HIPAA-covered data is exempt from it. And it also doesn't cover any organization with revenue under $25 million.

And so, if we want to really address privacy law, healthcare privacy from an overarching perspective, probably a federal regulation is our best approach. And it remains to be seen whether or not any of the proposed changes in HIPAA will address it, but frankly, I suspect not.

 

Rick: Right. And just to clarify, when you're saying it doesn't cover consumers, when a consumer downloads or has their own data, there's nothing. It's basically free for anybody after that. There's no regulation as far as if you're, like you said, a Fitbit or something like that. They're not a covered entity, right? So they're not covered under the scope of HIPAA. Is that what you're referring to, or is it something...

 

Christine: Yeah. So, what I said is that consumer data is not covered.

 

Rick: Right. Just making sure.

 

Christine: So if a customer enters data into an app on their phone that's not a HIPAA-covered app, then it's not covered by HIPAA.

Or if they download, even if a patient downloads their record from a covered entity, from a healthcare provider, and then they upload it to an app, it's not covered.

And this is why HIPAA is so confusing to people: the same set of data can be covered in one place and not in another.

So your medical record is covered when it's sitting with your doctor, but once you download it and do something with it yourself, it's not covered.

And that's baffling to most people, including me. [laughter] So it's a question of, "Should privacy and security protections follow the data?" And there are a lot of us who think maybe yes.

But at the very least, my primary concern is that as we look at appropriate security controls for healthcare data, regardless of whether it's covered by HIPAA or not, we're implementing appropriate controls, but also doing it in a way that does not make it difficult for individuals and their families to receive and share their data as they wish because ultimately this is about as personal a set of data as you can imagine.

People should have the ability to do exactly what they want with it.


 

Olena: And to hear more, you can listen to the first half of the interview in last week's podcast at paubox.com, and you can view the transcript there as well. Now we're headed into the predictions. We're in 2020 now, so happy New Year. What do you think we're going to see this year?

 

Rick: Well, we definitely see that HIPAA is going to change in 2020, whether that's another piece of regulation coming in and adding onto it.

But like Christine was saying, and what we were talking about earlier with the news regarding CHIME, there are going to be significant changes in privacy regulations.

As we deal with the rise of consumerism in healthcare, there's a fine line between helping people protect their data and not limiting their access to it. So there are a lot of things in the works, we see movement, and that's going to really affect everybody.

So it's gonna be really interesting to see how that pans out in 2020.

But my prediction, 100%: we're going to see some changes in regard to HIPAA and privacy in 2020.

 

Olena: Alright, well, thanks for tuning in to the HIPAA Critical podcast. For more information, you can log on to our website, paubox.com. That's P-A-U-B-O-X dot com. Thanks for tuning in. Until next time. [THEME MUSIC]
