There’s a news story going around about how horrible computer security tends to be in hospitals. This probably doesn’t surprise anyone who works in the security industry: security is often something that gets in the way, not something that helps get work done.
There are two really important lessons we should take away from this. The first is that a doctor or nurse isn’t a security expert, doesn’t want to be a security expert, and shouldn’t be a security expert. Their job is helping sick people. We want them helping sick people, especially if we’re the ones who are sick. The second is that when security gets in the way, security loses. Security should lose when it gets in the way; we’ve been winning far too often, and it has critically damaged the industry.
They don’t want to be security experts
It’s probably not a surprise that doctors and nurses don’t want to be computer security experts. I keep going back and forth between “you need some basics” and “assume nothing”. I’m back in the “assume nothing” camp this week. In the context of health care workers, security can’t exist, at least not the way we see it today. These are people and situations where seconds can literally be the difference between life and death. Will you feel better knowing the reason your grandma died was that they were using strong passwords? Probably not. In a hospital, if there is any security it has to be totally transparent: the doctors shouldn’t have to know anything about it, and it should work 100% of the time. This is, of course, impossible.
So the real question isn’t how we make security 100% reliable; it’s where we draw our risk line. We want that line as far from perfect security and as close to saving lives as possible. Thinking in this context changes our requirements quite a lot. There will be a lot of “good enough” security. There will be a lot of hard choices to make, and anyone making them will have to be extremely knowledgeable about both health care and security. I bet there aren’t a lot of people who can do that today.
This leads us to point #2.
When security gets in the way, security loses
If you’re a security person, you see people do silly and crazy things all the time. Every day, all day. How many times a day do you ask “why did you do that”? Probably zero. It’s more likely you say “don’t do that” constantly. If you have kids, you know the best way to get them to do something is to say “don’t do that”. In the context of a hospital, the answer to “why did you do that” is pretty simple: the choice was probably between getting the job done and following the security guidelines. A hospital is one of the extreme environments where it’s easy to justify breaking the rules; if you don’t, people die. In most office settings, if you break the rules nobody dies; there may be some sort of security event that costs time and money. Historically, in an office environment, we tell people “don’t do that” and expect them to listen; in many cases they just pretend to listen.
This attitude of “listen to me because” has created a security universe where we don’t pay attention to what people are actually doing; we don’t have to. We get in the way, and then when someone tries to get their work done, we yell at them for not following our bizarre and esoteric rules instead of understanding the challenge and figuring out how we can solve it together. The next great challenge isn’t tighter rules or better training; it’s the big picture. How can we start looking at systems with a big-picture view? It won’t be easy, but it’s where we go next.
What do you think? Let me know: @joshbressers