After we went over the HIPAA security documentation, the explosion of critical emails from the development staff was deafening.
The law doesn’t “require” any specific security measures, and it doesn’t even tell you what your security has to look like. The Security Rule is based on the fundamental concepts of flexibility, scalability, and technology neutrality. Therefore, no specific requirements for types of technology to implement are identified. It allows a business to use any
security measures that enable it to “reasonably and appropriately” implement the standards and implementation specifications. The business must determine which security measures and specific technologies are reasonable and appropriate for its organization.
So, there’s some good and some bad here. Technology is constantly evolving, and evolving fast. Attacks on data come from all angles, and they’re often specific to the business rather than random. Protection, therefore, must be customized to the needs of the business and the data it is responsible for.
Currently, the most effective security I’ve seen is Two-Factor Authentication (2FA). (If you aren’t familiar with it, you can read more about it here.) That’s great, but what about in 10 years? My guess is that 2FA will be laughed at by hackers. There will be some way around it, some way to fake a fingerprint or intercept a phone-generated one-time code. So, they can’t legislate it. The same goes for the actual cryptographic algorithms: we’ve got MD5, SHA, RSA. MD5 went from being a trusted hashing algorithm to being considered by the CMU Software Engineering Institute as “cryptographically broken and unsuitable for further use” (see the CERT note in the references). Today, the most security-conscious systems hash with SHA-2 instead.
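To make that concrete, here’s a minimal sketch (my own illustration, not anything HHS or HIPAA prescribes) of why the algorithm should be a configuration detail rather than something baked into law or code. It uses Python’s standard hashlib; the function name and the `DEFAULT_ALGORITHM` setting are assumptions for the example.

```python
import hashlib

# Hypothetical setting: today this points at SHA-256 (a SHA-2 variant);
# in ten years it may point at something else entirely.
DEFAULT_ALGORITHM = "sha256"

def fingerprint(data: bytes, algorithm: str = DEFAULT_ALGORITHM) -> str:
    """Return a hex digest of `data` using whichever algorithm is configured."""
    if algorithm not in hashlib.algorithms_available:
        raise ValueError(f"Unsupported hash algorithm: {algorithm}")
    return hashlib.new(algorithm, data).hexdigest()

record = b"patient-record-12345"
print(fingerprint(record))         # SHA-256 digest
print(fingerprint(record, "md5"))  # MD5 still runs, but CERT calls it broken
```

If the law had said “use MD5” back when MD5 was the best we had, that last line would be the legally mandated one.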
So, you really can’t legislate specifics when working with technology.
Beyond that, we have to consider the actual information itself. “HIPAA-related data” covers A LOT. Our inventory systems, for example, cover medical equipment, but also staff information. So we have to secure and encrypt, but do we have to encrypt “everything”? NO. Those same systems also hold terabytes of archived images and documents that are not remotely private. Should we be required to encrypt them all? The illustrated manuals for the repair and maintenance of a CAT scan machine from 1995 that isn’t even being used anymore? Encrypt it? No. We dump it on Amazon S3 and give the system a URL.
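For the curious, here’s roughly what “dump it on Amazon S3 and give the system a URL” can look like, sketched with boto3. The bucket, key, and file names are made up for illustration, and whether a time-limited presigned URL or a plain public one is appropriate depends on your own “reasonable and appropriate” call about the document.

```python
import boto3

# Hypothetical bucket and object names, purely for illustration.
BUCKET = "example-archived-manuals"
KEY = "manuals/1995-cat-scan-service-manual.pdf"

s3 = boto3.client("s3")

# Upload the archived, non-sensitive manual as-is; there is no PHI in it.
s3.upload_file("1995-cat-scan-service-manual.pdf", BUCKET, KEY)

# Hand the inventory system a time-limited URL to the document.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": BUCKET, "Key": KEY},
    ExpiresIn=7 * 24 * 3600,  # one week
)
print(url)
```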
We’ve built software where data for healthcare forms is actually pre-loaded onto a barcode scanner and can’t even be accessed without custom programming. Should the government insist we encrypt that?
So, you can’t legislate the specifics of which data needs protecting, because businesses are all different and their data is extremely varied.
Finally, and my most personal issue with security: the “weakest point of entry.” The best story I’ve heard came while we were working on an OLAP system with Wells Fargo back in 2004. Another bank (which I won’t name) had a white-hat security company come in to give them an “audit” after a different security firm had locked down their network. The auditors’ contract simply stated that a large payout would be made only if they could produce secure, internal data within 7 days.
The very same day, one of the auditors dressed up in a blue button-down shirt and khaki pants, loaded himself down with several laptop bags, and trotted right past the front-desk security (a contractor from a temp agency getting paid $12/hr), waving a fake ID while yammering on a cell phone. He went right up to accounting and told one of the clerks he needed to update the security on her machine. He just told her to go get a coffee and he’d be done in a minute or two. Five minutes later, he walked out with a handful of random accounting files from her machine and handed them off to close the contract.
There was a huge argument, heard all over town, about whether that “counted,” because the security firm had only been authorized to secure the network and hadn’t even considered the little laminated badges or the staff. The auditors finally got the payout, because there were no specifics in the contract.
The learning point is that you need to identify the “low-hanging fruit.” Sometimes it’s a back-door password left in the code, or a few forgotten scripts that were put in to help QA but got published anyway (a rough sketch of that kind of sweep follows below). For the most part, though, the security issue is human. Don’t spend your entire budget securing your network when you’ve got a security guard who’s practically illiterate. (That’s not a joke, either. My wife teaches for a volunteer literacy program and has actually had corporate security guards who can’t read as well as my 8-year-old. Seriously.)
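Here’s the sketch promised above: a bare-bones, grep-style sweep for the code-side low-hanging fruit. The patterns and the `*.py` glob are assumptions for the example; a real audit would lean on proper secret-scanning and static-analysis tools, and nothing here helps with the human side of the problem.

```python
import re
from pathlib import Path

# Hypothetical patterns for leftover QA bypasses and back-door credentials.
# Tune these to your own codebase and languages.
SUSPICIOUS = [
    re.compile(r"backdoor", re.IGNORECASE),
    re.compile(r"qa_bypass|skip_auth|debug_login", re.IGNORECASE),
    re.compile(r"password\s*=\s*['\"][^'\"]+['\"]", re.IGNORECASE),
]

def audit(root: str = ".") -> None:
    """Print every line in the source tree that matches a suspicious pattern."""
    for path in Path(root).rglob("*.py"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if any(p.search(line) for p in SUSPICIOUS):
                print(f"{path}:{lineno}: {line.strip()}")

if __name__ == "__main__":
    audit()
```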
HIPAA regulations are frustratingly vague, but that’s not always a bad thing:
- Legislating specifics in security technology is a seriously bad idea. The last thing we need is to be legally forced to implement something that is already outdated.
- The breadth of data that may fall under HIPAA is so wide that there is no way all of it needs the exact same level of security.
- The human factor isn’t even considered here. If someone’s been in prison for identity theft, should they get a job at your company… even as a janitor?
For the more business-minded, like myself, and for those developers who still aren’t convinced: what if the government DID require specifics? Politically, we all know it would be the cheapest, lowest level of encryption possible. A company may insist that it uses a very secure technology, but it will probably get hacked someday, and it knows it. Legally, it wants to be able to go to court and say, “Look, HIPAA only requires that we use ABC, but we actually went over and above and used XYZ!”, suddenly making it the responsible party while the people who are suing because they’ve lost their entire lives to a hacker walk out of court empty-handed. The higher the required security, the higher the cost that comes with it. Even if a company is already far more secure, that “requirement” is itself a “cost.”
The clearer reality is that if ABC were required, nobody in their right mind would be doing XYZ; they’d be doing BCD. The “over and above” argument works just as well whether you’re 10 miles above crap or 10 inches. Specifics in HIPAA would most likely reduce security, because the required method would be outdated and nobody would feel the need to do anything beyond a tiny step more.
References
- http://www.hhs.gov/ocr/privacy/hipaa/administrative/securityrule/techsafeguards.pdf
- http://www.hhs.gov/ocr/privacy/hipaa/faq/securityrule/2001.html
- http://spin.atomicobject.com/2014/11/20/encryption-symmetric-asymmetric-hashing/
- http://www.kb.cert.org/vuls/id/836068
- http://docs.aws.amazon.com/AmazonS3/latest/dev/s3-dg.pdf
- http://www.hhs.gov/ocr/privacy/hipaa/administrative/breachnotificationrule/brguidance.html