When working on a plan for this episode, I had two different sources drop some insider breach issues in my lap. When I added those to the news stories we are already following involving insider issues, it was clear the topic was meant to be. Multiple cases and reports are out; clearly this is the topic we must cover because I am reading about insider breaches everywhere around me.
The Summer 2019 OCR Cybersecurity Newsletter just came out. I thought that it was timely since the news just broke that the Capital One breach, which is really an Amazon AWS breach, involves 30 companies. Then, I received an email from our friend, Rob Pruter with SPHER Inc, about the recently released Insider Data Breach Survey 2019, which was commissioned by Egress and conducted by Opinion Matters. I always knew that Rob and I had a connection, but this was sooo very weird. Not only was his information in line with what I was reading at the time, but when I see these things, SPHER always comes to mind. We will get into that more later.
After looking at all that information, certain things rise to the top. Let's look at what they are all talking about as it pertains to why you listen to this podcast. What does it mean to me? It is all about me, me, me. Am-i-right-or-am-i-right.
What does OCR say about insider breaches in their newsletter?
I do appreciate there is a footnote referenced in the first paragraph of the newsletter. That footnote says “This newsletter is focused on malicious threats that insiders can present, but unintentional or inadvertent actions by insiders can also introduce cybersecurity threats.” I want to emphasize that as well. Most data breaches involve an insider of some sort making a mistake if not being outright malicious.
The idea that insiders are a real problem is obvious when you read their newsletter. Remember, they get to see things we never see that are happening in organizations today. When a rash of incidents happens, either behind the scenes or in the news, OCR releases newsletters or guidance like this to educate without confirming specific cases. Insiders can be malicious by stealing data, sharing data, or destroying data. All of those possibilities are pointed out in the newsletter. Another important note: business associates are also part of those defined as insiders. Anyone who has been purposefully granted access to your data is an insider. Do not leave anyone out when you are planning your safeguards.
There are some suggestions for keeping an eye on things even though they acknowledge that it is particularly hard to track these problems. They suggest you do things like:
Follow HIPAA Security Rule requirements
Perform a security risk analysis so that you know where your data is and where it should be going. That is the only way you have a chance of noticing when something is not right. The risk management plan should include building systems, policies, and procedures for auditing and detecting these kinds of insider breaches as soon as possible.
Set access controls to the minimum necessary required to do a job. If a staff member doesn’t need to update data, then they should not be allowed to do so in their user profile.
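To make the "minimum necessary" idea concrete, here is a minimal sketch of deny-by-default, role-based permissions. The role names and permission strings are hypothetical examples of mine, not something from OCR or any particular EHR; the point is simply that nothing is allowed unless a job function explicitly requires it.

```python
# Hypothetical role-to-permission mapping illustrating "minimum necessary":
# each role is granted only the operations its job requires.
ROLE_PERMISSIONS = {
    "front_desk": {"read_demographics", "update_schedule"},
    "nurse":      {"read_demographics", "read_chart", "update_vitals"},
    "physician":  {"read_demographics", "read_chart", "update_chart"},
    "billing":    {"read_demographics", "read_billing", "update_billing"},
}

def is_allowed(role: str, action: str) -> bool:
    """Deny by default: anything not explicitly granted is refused."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

With this shape, a front desk staff member who doesn't need to update charts simply never gets `update_chart` in their set, matching the newsletter's point about user profiles.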
By reviewing “audit logs, access reports, and security incident tracking reports” as required in the Security Rule, you will be able to notice anomalies that lead you to the insider issues. They refer to this as “Real-time visibility and situational awareness.”
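As a toy illustration of the kind of anomalies an audit-log review can surface, here is a hedged sketch. The field names (`user`, `timestamp`, `record_id`), the business-hours window, and the daily threshold are all assumptions of mine; real EHR audit logs vary by vendor and your policies will differ.

```python
from collections import Counter
from datetime import datetime

BUSINESS_HOURS = range(7, 19)   # assumed policy: 07:00-18:59
DAILY_RECORD_LIMIT = 100        # assumed per-user daily threshold

def review_audit_log(rows):
    """rows: iterable of dicts with 'user', 'timestamp' (ISO 8601),
    and 'record_id'. Returns a list of flagged anomalies: after-hours
    access, plus users viewing an unusual number of records in a day."""
    flagged = []
    per_user_day = Counter()
    for row in rows:
        ts = datetime.fromisoformat(row["timestamp"])
        if ts.hour not in BUSINESS_HOURS:
            flagged.append((row["user"], "after-hours access", row["record_id"]))
        per_user_day[(row["user"], ts.date())] += 1
    for (user, day), count in per_user_day.items():
        if count > DAILY_RECORD_LIMIT:
            flagged.append((user, f"{count} records viewed on {day}", None))
    return flagged
```

Nothing here is sophisticated; it exists to show that even a couple of simple rules over the logs you are already required to review can start producing the "real-time visibility and situational awareness" OCR describes.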
They finally make a point of saying that security isn't something you just set and forget. It is a dynamic process. Create reviews of access requirements for promotions or demotions, job changes, department changes, etc. Also, handle termination checklists before someone leaves your employ, not a month or two later. Yes, David's pet peeve.
What is in the Egress insider breaches report?
This report is pretty interesting research for us to reference. Basically, it shows us that the tech folks think insiders are mishandling data constantly. On the flip side, insiders think the data is pretty much theirs to do what they want or need to do with it. Here are some stats and quotes from the report.
There is a serious disconnect between the concerns of IT and employees' understanding (or willingness to admit to understanding) of what constitutes a data breach. They will say they didn't cause a breach, but they did actually share data outside security policies.
One of the statistics the report showed over and over was that disconnect. Many employees felt like the data they worked with was something they could share as they please. The age difference in that opinion was interesting.
Here is a chart that made one thing we have talked about pretty obvious.
How many times have we talked about what a bad idea it is to use "just say no" as a policy? This survey makes it very clear that employees will find a way to do something they feel is required in their job even if it violates policy. If this is something they feel they need to do as a group, you have to give them the ability to do it securely, or they will do it insecurely. It may be a different story if it is just one person, but "just say no" is not a policy that will be enforceable in most cases.
I really like the next chart because it asked employees to explain why they made an accidental improper disclosure.
It was the Friday before a holiday; it was the end of the day and I needed to run; my deadline is here, and I am not ready; or I was just so tired I made a mistake. All of those reasons sound familiar to us. We hear them all the time. When you look at the sum of all of those reasons, almost every person answered with one if not several of them as a reason for their mistake. How do you train around tiredness, rushing, and stress?
This study just reiterated what OCR was pointing out as a problem. Insiders may not even be acting with malicious intent, but the problems they create are huge. This report said that the IT team sees an insider breach as having the largest impact on reputation. I am not sure if they mean the reputation of the company, the IT safeguards, or both.
What should you do about all of this insider breaches information?
As my good friend, Stephen, told me this week, he sometimes elects to rely on his "ostrich syndrome". I think most people are happy that way when it comes to things that are uncomfortable, like these cases. Here is the bottom line: you get to determine how uncomfortable you will be. My bet is ostrich syndrome leaves you with a lot of sand packed into the more uncomfortable places left sticking out of the sand while you hide your head.
We do all kinds of things to monitor users on the network and devices, looking for invalid logins and things being done by people who are not supposed to be inside the network. Unfortunately, there isn't a lot of watching over the ones who are already inside and invited to be there. This is where our friends at SPHER come into play. Their app looks at what is happening within your patient records systems, where the valuable data is kept. Insiders with access to those systems don't get free rein in there. As Severino said, patient records should be treated like bars of gold. Those systems are where all your gold is stored.
All of those certified EHR/EMR systems had to generate a log of activity. However, they did not have to do anything with it other than let you look up info in it. You have to know to look for the problem to find the problem. That is like living in a time when you had no weather forecasts. There was no other way to know what was coming than to look out the front door, or what was left of it, after a storm had passed.
That is where SPHER comes into play. It evaluates activities in those logs and looks for patterns that seem out of whack. It then shows you what looked odd and asks you if that is something to worry about or not. You teach it by answering that it should always let that go, ignore it just this once, or holy crap, let me look into what is going on; you are correct, that is not normal.
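SPHER's internals aren't public, so this is not their design, but the teach-it-by-answering loop described above can be sketched generically. Everything here (the class, the three verdicts, the suppression rule) is my own hypothetical illustration of that kind of reviewer feedback loop.

```python
from enum import Enum

class Verdict(Enum):
    ALWAYS_IGNORE = "always_ignore"   # "let that go always"
    IGNORE_ONCE = "ignore_once"       # "ignore it just this once"
    INVESTIGATE = "investigate"       # "that is not normal"

class AnomalyTriage:
    """Toy feedback loop: patterns the reviewer marks 'always ignore'
    are suppressed from future alerts; everything else keeps alerting."""

    def __init__(self):
        self.suppressed = set()

    def should_alert(self, pattern: str) -> bool:
        return pattern not in self.suppressed

    def record_verdict(self, pattern: str, verdict: Verdict) -> None:
        if verdict is Verdict.ALWAYS_IGNORE:
            self.suppressed.add(pattern)
        # IGNORE_ONCE and INVESTIGATE leave future alerting enabled
```

The design choice worth noticing is that only an explicit "always ignore" quiets an alert; a one-time dismissal or an investigation never does, so odd behavior keeps surfacing until a human rules on it.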
It isn't in real-time, but you will probably know before the FBI or a reporter calls to tell you about it. In fact, you can know in a matter of days when strange things start happening. You may not learn about it the moment the problem starts, but by the time the FBI or a reporter calls you, the problem has been going on for months. When it comes to the damage an insider can do, the only way to catch it before it is out of control is through systems like these. Again, there are competitors. I am not saying there is only one solution. We can compare solutions in another episode.
If you don't have or aren't getting a solution such as SPHER in place, you must develop a regular random audit of patient records access. I mean that at least once a month someone is assigned some random accounts, or a random day of activity, to review. They must look closely for people who don't belong or behavior that doesn't belong. Start this week. If you aren't looking at that stuff, then you are simply unaware that it is happening or has happened. Just look at the results of the study: people think they aren't violating policy when they do what they believe is their right to do with the data they access.
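Picking the monthly random sample can be as simple as the sketch below. The function name and default sample size are my own choices; the one real recommendation baked in is that the pick be genuinely random, so no account ever knows it is safe from review.

```python
import random

def pick_audit_sample(user_accounts, k=10, seed=None):
    """Pick k random accounts for this month's access review.
    Sorting first makes the draw reproducible for a given seed,
    which is handy when documenting what was reviewed and why."""
    rng = random.Random(seed)
    k = min(k, len(user_accounts))
    return rng.sample(sorted(user_accounts), k)
```

Run it once a month, record the seed and the resulting list with the review notes, and you have a documented, repeatable audit trail instead of an ad hoc glance at the logs.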
When I first met Ray Ribble, President & Founder of SPHER, several years ago, I told him he was ahead of his time. I knew that everyone needed this kind of solution connected to their EHR, but it was going to take a while for them to understand that they needed it. Just like the data breaches and ransomware attacks of 2016 made people pay attention to Kardon, his time has come now that insider breaches are everywhere. If you don't get a plan in place to deal with this inside your organization soon, you will be one of the stories that gets others to jump on board.