by Sierra Reed Marketing Manager at Paubox
17. Julie Haney “It’s Really Unrealistic for Organizations to Believe That Their Employees are Going to Always Make Good Security Decisions.”
A lot has been happening in the world of healthcare and cybersecurity. This week we chat with our new Paubox Marketing Manager, Sierra Reed, about the CISA alert on the top 10 routinely exploited vulnerabilities, HHS revealing an increase in the number of cybersecurity attacks due to COVID-19, and a major email breach at Saint Francis Healthcare, while MedPlus Solutions is winning this week. We also chat with Julie Haney from NIST's Visualization and Usability Group.
Here’s the full transcript of this episode.
Olena Heu: Welcome to another edition of the HIPAA Critical podcast. This week, I'm joined by a new addition to the Paubox team. Welcome Sierra Reed, our new Marketing Manager.
Sierra Reed: Hi Olena. Thanks so much for the warm welcome and it is a pleasure to be here.
Olena: We are very excited to have you and it’s a pleasure as well. Tell us more about your background and what you’re doing with Paubox now.
Sierra: Yeah, sure. So I come from a risk management, compliance and healthcare background. And I’m very excited to join the growing Paubox team. We are hiring at a rapid rate in marketing and across the company. And it’s a very, very exciting time to come on board.
Olena: Well, welcome again. And if you’ve been tuning into HIPAA Critical, you know that each podcast we like to highlight a few things, winners, failures and also showcase our encrypted interview series. But first, before we do that, we’re going to talk about what’s happening in the news right now.
Sierra: Yeah, so recently the U.S. Department of Homeland Security's Cybersecurity and Infrastructure Security Agency, wow, that's a mouthful, CISA for short, and the FBI released a joint alert on the top 10 routinely exploited vulnerabilities.
Olena: Oh, tell us what this report is and why it’s significant.
Sierra: Yeah, sure. So, the alert provides technical and mitigation guidance on the most common exposures exploited by foreign cyber actors. So the top 10 list states that most organizations do not install the necessary patches to protect their devices and systems. And some organizations make it even easier by relying on outdated software and hardware.
Olena: I see and what would be an example of outdated software or hardware.
Sierra: Sure. For instance, some organizations still use Windows 7, which Microsoft ended support for in January of 2020. And of the 10 vulnerabilities, seven relate to Microsoft products. So an example would be something like Office, Windows or SharePoint. And the alert mentions three vulnerabilities that have been exacerbated by the pandemic and social distancing. And these include things such as virtual private network (VPN) vulnerabilities, Microsoft Office 365 cloud problems, and then general cybersecurity weaknesses.
Olena: I see it’s interesting also how the pandemic and social distancing have impacted these vulnerabilities and created more of a problem for those who aren’t even protected.
Sierra: It really is. Cybercriminals are using the pandemic to take advantage of weak cybersecurity while others are targeting industries such as healthcare. And all of this points to the need to prioritize patching. IT professionals should most definitely concentrate on patching to make cyber attacks more difficult for foreign actors. And I know that patching is sometimes a significant investment, but in the long run, the cost of mitigating a breach may ultimately be a lot higher.
Olena: And what else is in the news right now? I think it’s also related to cyber attacks and the pandemic.
Sierra: Yes, for sure. So the Department of Health and Human Services has reported an increase in cybersecurity breaches in both hospitals and healthcare provider networks. Between the months of February and May of this year, there have been 132 reported breaches, according to HHS. That is almost a 50% increase in reported breaches compared to the same time last year, which is a huge jump.
Olena: That’s crazy. And how are hackers gaining this access?
Sierra: So hackers find vulnerabilities in a system in a number of ways. So that is a great question. One way is gaining access to a network through phishing emails that target an organization's employees. It also can be accomplished by hacking into a patient's medical devices, or by going into a medical facility and finding vulnerable devices within the hospital.
So it's a very common tactic for hackers to gain control and access through patients' medical devices during COVID because more people are using remote care, and these devices usually don't come with built-in security systems. Even more scary, once a hacker has gained control over a remote device, they can access the hospital's network.
Olena: Wow, that is crazy.
Sierra: Yeah, and as most people know, the main motive for hacking into a hospital is financial gain. Hackers make money by selling patients' PHI or by holding the network for ransom.
Olena: What can hospitals do to prevent this from happening?
Sierra: Using medical devices with built in security can prevent this from happening while alerting hospitals that their network has been compromised. This also helps temporary hospitals because the devices already have built in protection. So then all you really need is better network protection.
Olena: I see. All right. Well, thank you so much, you know, and we always look forward to the latest and news headlines and keeping us abreast of the situation. And now we’re going to transition over to our winners and failures. And we’d like to start with the good news first. So who’s winning this week?
Sierra: Yeah, so our winner of the week is one of our clients, MedPlus Solutions. They were founded in 2011 and serve 10,000 lives a month. Their goals before using Paubox were to make it easier for MedPlus Solutions customers to receive and access PHI, to improve encryption workflow for their internal team, to create a positive customer experience for their clients, and of course, to ensure 100% HIPAA compliance.
Before they tried Paubox Email Suite, MedPlus Solutions used Virtru, which is a portal-based solution, and Virtru did not encrypt 100% of the emails that they were sending. So it was a hassle. Senders had to activate the encryption function or insert specific keywords in order to trigger that encryption.
And as a result of this, fewer than one in five emails that MedPlus Solutions sent were actually encrypted. This was obviously a huge issue. So to solve this problem, MedPlus turned to Paubox Email Suite, and now they encrypt five times more email than before.
Olena: Excellent sounds like Paubox really assisted them with streamlining their process, and also keeping in touch with their customers and patients effectively. Tell me more about Paubox Email Suite.
Sierra: Sure, and first, I would like to provide a little bit of background. So there's a common misconception that making it harder to read an email makes it more secure, and that is simply not true. All that does is make the email more difficult to read, obviously resulting in frustration. And it most definitely shouldn't take five clicks and a password to read a secure email. We've definitely all been there.
It shouldn't take plugins and a key phrase to send a secure email either. So to answer your question after providing a little bit of background, Paubox Email Suite is a way to send and receive HIPAA compliant email. It's a way that requires no portals or extra steps, just secure email that works exactly like regular email. Without getting into the weeds too much, that's pretty much a general overview.
Olena: Great news, for sure and glad that they’re winning.
Sierra: Yeah, me too.
Olena: All right, and moving over to failures and this is a big one.
Sierra: It sure is. So a large fail here, as St. Francis Healthcare Partners in Connecticut is having to notify over 38,000 patients that some of their PHI has potentially been obtained by hackers. Somebody gained access to their email system. The attack actually occurred in December of 2019, but it took until March of this year for the forensic investigation to determine that patients' PHI was actually compromised.
Olena: And so what type of data was stolen?
Sierra: Information that could have been accessed included things such as their names, medical histories, treatment information, dates of service, prescriptions, types of procedures, all things you wouldn't want to get out into the public. But luckily, no financial information or Social Security numbers were compromised during this specific breach. So that is definitely the positive side of this unfortunate situation. And as a result of the breach, St. Francis Healthcare has taken steps to improve data security practices, and all affected patients have been notified by email.
Olena: Well, that is a large PHI breach. What can companies do to protect themselves from email breaches like this one?
Sierra: Great question. Email breaches are only getting more sophisticated with all sorts of new scams arising from COVID-19. Companies should be taking a proactive approach and stopping email threats before they hit their inbox. An example would be implementing some sort of spam filtering, I would say email security against ransomware and phishing attacks. Here at Paubox, we do this with an advanced email security feature called ExecProtect.
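[Editor's note: One common email threat in this category is display-name spoofing, where an attacker impersonates an executive's name while sending from a lookalike address. The sketch below illustrates the general idea of such a check; it is not Paubox's actual ExecProtect implementation, and the directory and names are hypothetical.]

```python
# Hypothetical sketch of a display-name spoofing check, the kind of
# attack products like ExecProtect are designed to stop. Illustrative
# only; the executive directory below is a made-up example.

KNOWN_EXECUTIVES = {
    # display name -> the set of addresses that person actually uses
    "Jane Smith": {"jane.smith@example-hospital.org"},
}

def is_display_name_spoof(display_name: str, from_address: str) -> bool:
    """Flag inbound mail whose From display name matches a known
    executive but whose sending address is not one of theirs."""
    approved = KNOWN_EXECUTIVES.get(display_name.strip())
    if approved is None:
        return False  # display name doesn't impersonate anyone tracked
    return from_address.lower() not in approved

# A message claiming to be from Jane, sent from a lookalike domain
# (note the digit "1" in "examp1e"), gets flagged:
print(is_display_name_spoof("Jane Smith", "jane.smith@examp1e-hospital.org"))  # True
print(is_display_name_spoof("Jane Smith", "jane.smith@example-hospital.org"))  # False
```

A real filter would run before delivery and quarantine or tag the flagged message rather than just returning a boolean.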
Olena: Excellent, and for more information on that you guys can tune in or log on to Paubox.com. That's Paubox.com. And now we're going to segue over to our encrypted interview series, where this week, Rick Kuwahara, Chief Operating Officer, had the chance to chat with Julie Haney, with NIST's Visualization and Usability Group. In this interview, they talk about password protection and policies, phishing and susceptibility, as well as security awareness and training. Take a listen.
Rick Kuwahara: Can you share a little about that? The mission of the usable cybersecurity team that you’re on?
Julie Haney: Yeah. So we’re a small team, but we’re very multidisciplinary. And we really conduct research that’s at the intersection of cybersecurity, human factors and human computer interaction. So we really want to provide actionable guidance based on our research to the folks who can actually do something with it so they can actually apply it.
So talking about people like policymakers and practitioners and organizations, and giving them guidance so that they can incorporate usability and other user considerations when they're making security decisions, when they're developing security processes or developing products. So that those can be better used by people. And ultimately, all of that supports people and organizations being better off security-wise.
So over the years, the team's done research in a variety of different security areas from a user-centered perspective. So things like passwords for adults. And then more recently, we did a study with children's passwords. We've looked at people's kind of general security and privacy perceptions. One of the first projects I worked on at NIST had to do with understanding how difficult it was for developers to implement cryptography in their products.
We’ve done quite a bit of work the last couple years on phishing and how user context plays a big role into susceptibility of phishing attacks. We’ve looked at consumer perceptions of smart home security and privacy. And then I’ve also done some work with security awareness.
Rick: So it's really great. I think that's a great mission that you guys are working on. And I don't think there's a policy that screams the need for more usability than a lot of the password policies. Like, you know, for example, it's been set in stone for years that best practice means changing your passwords every few months. But I know NIST put out some new guidelines just a little while ago. That's really not the case anymore, right?
Julie: Yeah, so NIST and other organizations have definitely evolved their password guidance over the years. Like you said, the kind of strict password policy, which I'm a little embarrassed to say I was part of perpetuating back in my DoD days years ago, was very focused on complexity of passwords and changing your password frequently, not reusing passwords, all that.
So the problem with that approach is that it really puts a lot of burden on the user. So the user has to come up with all these complicated passwords and they have to change pretty frequently. Plus, they’re trying to manage a lot of different passwords. I mean, just think about how many different passwords you have, potentially on lots of different systems and they’re told not to reuse these things.
But then they end up doing that because it’s a coping mechanism and they also ultimately start doing less secure practices as a coping mechanism. So writing passwords down keeping, you know, I’ve seen this before keeping plain text password files on their computer, you know, coming up with more guessable, and predictable passwords, because it’s just difficult to remember these complex passwords that have all these different characters and everything.
So then not to mention how frustrated people were over the whole password situation, right. So people are really starting to come to the realization that these strict password policies were perhaps causing more harm than good, and so NIST is now taking a little bit of a different approach. The password guidance that they have is in Special Publication 800-63, Digital Identity Guidelines.
They underwent a revision within the last couple of years, and some of my colleagues from the usability group, were able to advocate for the user and usability within that document. And so the idea now is to remove some of the burden from the user and to do more of the security work on the back end. So things like moving to two factor authentication, better encryption of password files on the servers, and then alleviating that burden like getting rid of the password change requirement.
Of course, unless something suspicious happens on the network and you feel like you have to do that. And enabling users to have longer passwords and be able to use all the characters on the keyboard, which, although it's not explicitly mentioned in 800-63, kind of opens up the possibility for people to use longer passphrases, which are easier to remember and harder to guess.
So all of that really comes down to alleviating the work and frustration on the user end, because security is not the user's primary task. So alleviating that burden and putting more of it on the system security itself.
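[Editor's note: The guidance Julie describes can be sketched as a simple password-acceptance check: enforce a minimum length, allow any characters including spaces (so passphrases work), screen against a breached/common-password list, and impose no composition or expiration rules. This is a minimal illustration in the spirit of SP 800-63B, not an official NIST reference implementation; the blocklist is a tiny stand-in.]

```python
# Minimal password check in the spirit of NIST SP 800-63B:
# length requirement, any printable characters allowed (passphrases
# with spaces are fine), reject known common/breached passwords,
# and NO "must contain a digit/symbol" composition rules.

COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}  # stand-in list

def password_acceptable(password: str, min_len: int = 8, max_len: int = 64) -> bool:
    if not (min_len <= len(password) <= max_len):
        return False  # too short to resist guessing, or over the cap
    if password.lower() in COMMON_PASSWORDS:
        return False  # appears on a common/breached-password list
    return True  # note: no complexity rules, no forced expiration

print(password_acceptable("correct horse battery staple"))  # True: long passphrase
print(password_acceptable("P@ss1"))                         # False: too short
print(password_acceptable("password"))                      # False: on the blocklist
```

In practice the blocklist would be a large breached-credential corpus checked server-side, with the password file itself salted and hashed.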
Rick: I think that's great. And that really brings up that broader issue that a lot of times we think of good security as equaling complexity. But that's the opposite of what humans seek, right? We don't want complexity in our lives. We want ease, so I think that's a great step. Are there any other areas where you see users being overburdened with complexity, or, as you mentioned, being cognitively overloaded?
Julie: So a couple of things come to mind. So, one, think of all of the kind of security warnings and things you get, like if you’re browsing the web. I mean, what does that really mean to a user and they kind of become numb to those types of warnings over time, right? Like they don’t really understand what they mean or what they should do about it. So they just kind of click okay. Okay. Okay. So that’s one thing.
Another thing is, I think, especially now, with people introducing more smart home devices, having more Internet of Things types of devices and their home network. So now you know, whereas before, you might have a couple laptops or desktops or something on your home network, now you’ve got those plus smartphones.
And I think my husband did some kind of network monitoring and was mapping out our home network and all the devices and it was something like 50 devices on the network or something like for a family of four, plus, we have all these smart home devices and he’s a hobbyist with that.
So I think people don’t really understand how to secure their home networks. I think it’s becoming much more complicated. They want to just install it and forget it. And right now, I think too much burden is being put on the user to assume that they have the knowledge to be able to make those decisions for their home network. So that they’re protecting the security of their devices and the privacy of all of the sensitive information that’s being collected by these devices.
There is a lot, I mean, you can tell a lot about a person's daily habits and all kinds of things. So I think that we need to give the user more help than that. Put more burden on the manufacturers to provide a lot of this, or at least let users know how they can manage the security and privacy of all these devices that we have inundating us now.
Rick: Right. I think that's a good point, helping the user, especially non-technical ones, figure out how to make that device secure, because a lot of software and devices have these settings in place but not set up.
Zoom comes to mind. I'm not going to call out a single company, but they had all the issues with everybody working from home and remote work. And a lot of the security settings were there. It just wasn't easy to find them. And they made a lot of positive changes to make it easier for people to manage the security. Yeah, transparency is a big thing. Definitely, lack of transparency is the problem.
Especially now, there are always attacks, but it seems like it's ramped up even more, with hackers trying to get into Zoom or all these VPN remote networks and things. Do you see it needing to be regulated to have manufacturers kind of step up their game, I guess, or is it just an awareness thing where you're building this... hey, we've got to make it easier for users to be secure?
Julie: I'm hesitant, I'm not going to say push toward regulation necessarily, but I think manufacturers could definitely benefit from guidance, some voluntary guidance that's put out there. Because think about it, take the example of smart home devices again. You have a lot of companies that are used to producing things that went in your house, okay?
So for example, appliances. And now they're moving into this smart type of market, right? Where suddenly these are internet-connected devices. And a lot of these manufacturers don't really understand the kind of traditional secure development processes that other IT product manufacturers have been doing for years. They don't necessarily understand the security and privacy implications because this is all new to them.
So giving the manufacturers guidance I think is definitely needed. Because where else are they going to get that from? For example, the final version should probably be released next week: there's a NIST report that's going to be released that provides some guidance for manufacturers on different security capabilities that they can apply. But it also takes a very user-centered approach.
So it tells the manufacturers….hey, here’s some things that you need to consider. You need to consider your users like… who do you think is going to be using these devices? Who? What’s the scenario? What are their security needs going to be? And wrapping all of that into the product. And then how do you let them know about what’s available? How do you communicate to them about how you’re going to provide updates or what’s going to happen when this product goes end of life. So those type of recommendations and guidelines I think are very much needed in things like the IoT space and then in other security related areas as well.
Rick: Yeah, that’s a great point and in the end, it benefits the manufacturers because nobody wants to be the next headline with a breach.
Julie: Oh, absolutely.
Rick: So, kind of shifting gears a little bit. You know, we talked a little about password policies. And I know you’ve had experience with security awareness training, which is something that all organizations struggle with. Because if it was as simple as watching a video or answering a quiz, we wouldn’t see so many breaches caused by phishing attacks. How can programs improve from just checking off the annual compliance requirements to actually changing behavior and reducing risk?
Julie: Yeah, so this is a topic I'm really interested in, so stop me if I go on too long. Like you said, there are a lot of industries that are very regulated right now, very compliance-heavy when it comes to security. And so first of all, I want to say that I don't think mandated security awareness training is a bad thing. Because I think having some training is better than having none. But I think the problem comes when organizations only do that minimum bar and think that they're being successful as long as they meet their compliance goals.
So I think it's really unrealistic for organizations to believe that their employees are going to always make good security decisions when they only give them one or two hours of training a year, without reinforcing that throughout the year. And a lot of security awareness training is just that. It's the one hour of computer-based training or clicking through a bunch of PowerPoint slides, stereotypically kind of boring. And that's just not going to cut it.
So, organizations really need to move beyond that to do something that, like you said, is more effective at actually changing attitudes and behaviors. So focusing on engaging and empowering employees, trying to help make security a habit, something that they feel personally responsible for, and that they want to do, because they believe it's the right thing for them and the organization.
So there are a few ways that we're finding that organizations can move beyond this compliance mentality. First, you have to be able to really communicate the relevance of why security is important to people, their jobs, and the organization's mission. You have to establish that connection. Keep the security awareness topics related to things that are going on in the organization, in the world, real-world problems, even seasons of the year.
So there might be some possible security pitfalls during certain seasons of the year. So keeping it topical. For example, I know of one federal agency that every December has a security fair type of thing, a drop-in event where they address holiday-related security themes like safe online shopping, or what to look out for when you're buying those fitness trackers or smart home devices, like what are the security and privacy considerations.
So relating those things to people's daily lives. And part of that, too, is establishing the home-work connection, which is another trend I'm seeing a lot of. Habits are much easier to maintain if they carry through people's whole lives, not just when they're at work, so that they keep happening when they leave work. So giving them information that they can bring home to their families, in addition to the work-related stuff.
And then I think we really need to move beyond, like again, that boring one hour of computer based training and you know, need to make it more engaging and memorable and if it’s appropriate, make it okay to make it fun. Use a lot of different communication channels because people have different preferences for how they receive this type of information. And you need to do it on a regular basis.
Don't just have a once a year event. Do different events throughout the year, be creative. I've seen all kinds of creative approaches. A lot of organizations have security days where they bring in speakers or informal drop-in events, but I've seen security-themed calendars. There was one organization that did a Shakespeare-themed performance that they called To Send or Not To Send, where they talked about secure email behaviors. So that was something that was very engaging to people and definitely memorable. Then it's not just providing the awareness.
The awareness is the kind of the understanding that security is important to me and that there are security threats out there but then you have to give them the tools that they can do something about it. So give them prioritized, concrete steps that they can take. That’s super important. Don’t just put the fear in them, tell them what they can do about it. And then finally, you need to be able to measure the effectiveness of the Security Awareness Program.
That's not a trivial thing to do, and it's going to differ from organization to organization. But you know, some things that I've seen others do: they look at, for example, events attended. So they have a training event, they look at how many people are going, but they also look at who is going. Are there certain groups or work roles that aren't attending these events?
So that gives them a clue, like, okay, we need to do more outreach to that group and try to understand more about why they aren't interested in this. Things like doing surveys after training events, getting feedback from employees about what they liked, or didn't like, or if they have some suggestions for future topics. I think what's really valuable is looking at user-generated security events or incidents.
For example, if my organization is really focused on phishing awareness and I'm doing phishing exercises on a regular basis, I can look at the click rates for those exercises... are those going down? If I'm educating people about secure email practices, is the number of PHI email incidents going down? And I'm not just focusing on the negative trends, but looking at the positive trends.
Are more people reporting suspicious emails, are they reporting potential security incidents? All of these types of measures can start to provide some indication on the effectiveness of the security awareness training and also inform which areas may need more attention.
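[Editor's note: The trend metrics Julie describes, falling phishing click rates and rising report rates, are easy to compute from exercise data. The sketch below uses made-up numbers purely to illustrate the idea.]

```python
# Sketch of simple security-awareness trend metrics: phishing-exercise
# click rates and suspicious-email report rates over successive
# exercises. All numbers below are invented for illustration.

# (emails delivered, clicked links, reported as suspicious) per exercise
quarters = [
    (500, 110, 40),
    (500, 85, 70),
    (500, 60, 120),
]

click_rates = [clicked / delivered for delivered, clicked, _ in quarters]
report_rates = [reported / delivered for delivered, _, reported in quarters]

# A falling click rate AND a rising report rate together suggest the
# training is changing behavior, not just ticking a compliance box.
print(all(a > b for a, b in zip(click_rates, click_rates[1:])))    # True: clicks trending down
print(all(a < b for a, b in zip(report_rates, report_rates[1:])))  # True: reports trending up
```

In a real program these figures would also be broken down by department or work role, per the outreach point above.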
Olena: Well, thank you for tuning into this edition of the HIPAA Critical Podcast, and congratulations to Sierra on her inaugural podcast.
Sierra: Yeah, thanks so much. It was a pleasure to be here.
Olena: Wonderful. We look forward to having you more often and welcome to the team. If you like what you hear, be sure to like and subscribe. Until next time.