Be Still my Heart

I read two articles recently that gave me pause.  First, the Washington Post summarized a recently released Verizon report detailing the increase in breaches, and the poor prognosis for the years to come.  The only silver lining in the article was the admonition that data sharing and monitoring are keys to recognizing breaches early…nice to know MiddleGate is on the right track according to others as well.

Then Business Insider published an article explaining the results of a medical equipment security audit done at a Midwest hospital system.  This article described in detail the types of hacks possible on implantable, or networked medical equipment.  One paragraph in particular caught my eye:

“Though targeted attacks would be difficult to pull off in most cases they examined, since hackers would need to have additional knowledge about the systems and the patients hooked up to them, Erven says random attacks causing collateral damage would be fairly easy to pull off.”

Unfortunately, I disagree with this statement, as targeted attacks are possible if you have specific information on your target (i.e. a stolen medical record).  Pretend for a moment that you or a member of your family receives the following email:

“Hello.  You don’t know me, but I feel as if I know everything about you.  I have a copy of your medical records here, and it has been an interesting read.  I especially liked the part about you having gotten a “SuperTech Pacemaker IV” last Fall…great choice considering that near-death scare you had with an AV heart block.  Of course, I also think it was a great choice because it so happens that I can hack your pacemaker from all the way over here in Europe…small world huh?!  Now, don’t panic…I don’t want to hack your pacemaker…that would ruin your day.  However, I do want you to wire $5,000 to the account below to keep me from being tempted (sometimes I have a bad day and lash out :-).”

Is it really that unlikely that people who have stolen your medical records will be tempted to at least blackmail you, even if they don’t have the capacity to hack your implanted medical devices?  Is it really that unlikely that a few pacemakers will be hacked and shut off to kill a person and make this type of blackmail more credible?

 

Shoot in Foot…Rinse…Repeat.

This blog is normally written in ‘first person plural’ (i.e. “we”) to reflect the multiple inputs that influence the entries.  Today, however, you are getting me, Sean Scorvo, M.D., in all my inglorious ‘first person-ness.’  Why the break in tradition, you ask?  “Current events.”

When I was practicing in the E.R., I took care of several “bullet vs. foot” injuries (PS: bullets tend to win).  I heard all kinds of excuses, but the unifying theme was stupidity.  As CEO of MiddleGate, I’ve been privy to a new kind of “bullet vs. foot” injury, but the unifying theme hasn’t changed.

Three years ago I had the pleasure of meeting Ed Stull.  A year ago, we brought Ed in as our CTO.  I’ve learned many things from Ed, but cryptography and network security top the list.  I don’t profess to be an expert in the arena, but I’ve learned enough to recognize that what we are doing in the field is not only top notch, but cutting edge.  I’ve also come to recognize that events like Adobe’s recently announced mega-breach (150 million usernames and passwords), and the carelessness in their cryptography/encryption system exposed as a result, are the equivalent of the 2AM ambulance call for a GSW (gunshot wound) to the foot (insert ER staff eye-roll here).

Here is where the analogy diverges.  The 2AM GSW to the foot affected only said Darwinian weed-out candidate, and was a self-limited event (the “victim” did not have a tendency to pull the trigger twice).  Breach events exposing an underlying lack of attention to sound cryptography/encryption, on the other hand, affect millions and are not self-limited events.  Once the breach occurs, the poorly encrypted usernames, passwords, financial data, demographic data, etc., etc., are used to infiltrate networks and other systems.  In other words, the damage perpetuates and extends.

This is not “new” news.  That a breach occurred is not surprising…it is to be expected.  That makes the fact that the data contained in the Adobe environment was not adequately secured all the more troubling.  Unfortunately, this is unlikely to be a wakeup call for Chief Security Officers everywhere…if it were, the 2012 LinkedIn breach, wherein an antiquated hashing algorithm (SHA-1…a soon-to-be-retired system) was used without “salting,” or the 2011 Sony Playstation breach, wherein credit card numbers were encrypted but personal data was not, would have elicited a data security sea change.  In other words, had everyone already learned the lessons of their peers, the Adobe breach might still have happened, but the encryption would have been robust enough that the hackers gained nothing.  Alas, that was not the case.
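For readers who haven’t waded into the hashing-and-salting weeds, here is a minimal sketch (in Python, purely for illustration; it is not the code of any company mentioned above) of what “salting” buys you: a unique random salt per user, plus a deliberately slow key-derivation function, means a stolen table of password digests can’t simply be matched against precomputed tables the way unsalted SHA-1 digests can.  A production system would reach for a vetted library (bcrypt, scrypt, Argon2) rather than rolling its own.

```python
# A minimal sketch of salted password hashing (illustrative only; not the
# systems discussed above).  Each user gets a random salt, and the password
# is run through a slow key-derivation function rather than a bare hash.
import hashlib
import hmac
import os

ITERATIONS = 200_000  # deliberately slow, to make brute force expensive

def hash_password(password: str):
    """Return (salt, digest).  A unique salt per user means identical
    passwords produce different digests, defeating precomputed tables."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

if __name__ == "__main__":
    salt, digest = hash_password("correct horse battery staple")
    print(verify_password("correct horse battery staple", salt, digest))  # True
    print(verify_password("wrong guess", salt, digest))                   # False
```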

In the end, I am happy Ed is on our team.  I am happy we are well ahead of this issue. I am happy we will not shoot ourselves (or more importantly, our customers) in the proverbial foot.  I am, however, unhappy in the realization that I naively thought I’d seen my last GSW to the foot back in 2007…I’m just seeing them in a different form now.

Dark Side of the Moon

As you’ve likely noticed, we attempt to liven up the world of HIPAA and all things related…it isn’t easy.  HIPAA gets modified, medical records are breached (again), someone sues someone else, etc., etc., etc.  After a while, the entire discussion sounds a bit like a broken record skipping.

That’s why we like to find that pristine copy of “Dark Side of the Moon” hidden in the back of the vintage record shop, and put it on the turntable.  It hasn’t been played since 1978 and it still creates static as you pull it from the record jacket.  It’s this unfettered, non-skipping record that allows one to break through the annoying background noise to try to figure out how song #3, “On the Run,” relates to song #6, “Money”…remember how you’d listen to the whole album/conversation to put together the big picture? (PS: if we have to explain these references because you’ve never heard of Pink Floyd or an LP record, then this entry isn’t for you…go to another browser window immediately).

When we read the following article by Al Saikali, we had one of those “found a pristine copy of Dark Side of the Moon” moments.  Mr. Saikali describes how, in Resnick/Curry v. AvMed, Inc. in the Southern District of Florida, a $3,000,000 settlement was reached over the loss of two laptops containing unencrypted patient insurance information.  Before the settlement, the 11th Circuit Court wrote an opinion supporting the plaintiffs’ contention that, although the litigants had not (yet) been shown to suffer damage, a portion of the insureds’ premiums was supposed to have gone toward securing (e.g. encrypting) patient data, training employees on proper HIPAA protocols, etc.  Given that the defendant did not, apparently, spend $ in those areas (as evidenced by the breach), the plaintiffs had standing to sue.  Apparently the defendants took this as the writing on the wall and decided to settle.

So how does this take us back to the “Dark Side of the Moon” reference?  Well, we’ve been keeping track, and this past year has been quite interesting on the breach litigation front.  First, Clapper v. Amnesty International said there had to be proven harm in order for the plaintiff to win in a breach case…one would think this would have emboldened the defendants in Resnick/Curry v. AvMed, but read on.  Subsequently, Hinchy v. Walgreens pointed out that HIPAA could be used as a weapon in breach cases regardless of harm, and by private citizens no less, by showing that a Covered Entity had not met the industry standard for patient data security.  Now, in Resnick/Curry v. AvMed, Inc., we have a settlement, based in large part on a Circuit Court opinion, pointing out that, regardless of harm, the plaintiffs had a basis to sue on the expectation that some of their premium was going toward securing their patient data, and it was not apparently secured.

The final outcome is that there is no final outcome.  There appears to be a balance establishing itself in the courts.  Proof of harm in a breach is being balanced by an expectation that patient data is secured according to industry standards (i.e. HIPAA).  There may indeed be a test case that makes its way to the Supreme Court someday, tilting this balance one way or the other, but in the interim this is where we appear to be.

Now, as for the link between “On the Run” and “Money”…come on, really?  And on that note, it is time for us to advance the turntable arm to the last two songs, “Brain Damage” & “Eclipse,” and bid you adieu.

Take 15 U.S.C. §1681b…Please

Take 15 U.S.C. §1681b…Please (sorry, I couldn’t resist the Henny Youngman reference there).  It regulates the sale of consumer reports, which are essentially aggregated data.  The sale of aggregated data is big business, and Data Aggregators serve a number of valuable functions.  Their data allows us to apply for loans and mortgages, spend our business advertising $ effectively, ensure that our daycares aren’t employing pedophiles, etc.  With all due respect to Orwellian protestations (and they are valid), Data Aggregators play an important role in our society.

So we read with interest the investigative article detailing the sale of aggregated data to an identity theft ring by a company owned by a well-known Data Aggregator.  The article chastises said Data Aggregator for having sold data to an unvetted “vendor”, and regulators for having missed the signs, but we took away a different message:  The regulatory environment controlling the sale of this data is convoluted.  It would be challenging for any business to ensure compliance and consumer safety while executing a viable business model.

15 U.S.C.§1681b details the “permissible purposes of consumer reports” (i.e. when it is allowable to sell aggregated consumer data).  While not a defense of the company involved in this particular situation, we do challenge you to read that U.S. Code, put yourself in the shoes of a Data Aggregator, and come up with a business model that allows you to vet all vendors, data requests, etc. in a cost effective manner with a 100% guarantee that a scam artist hasn’t infiltrated the ranks.

Fortunately, the mothers of the MiddleGate team taught us to never point out a problem without offering a solution.  The MiddleGate model, developed to work in the world of HIPAA, may be a model for the future of the Data Aggregation industry.  We believe this case points out that the future of Data Aggregators may not be in the sale of their data, but in the sale of the patterns their data matches to.  We used the model to navigate the complex world of HIPAA in a cost-effective manner, and the same could be done to navigate 15 U.S.C. §1681b, knowing that it is unlikely there will be any meaningful regulatory reform in the near future.  We used the model to share the knowledge our data conveyed without sharing underlying patient information.  In short, we used the model to maintain privacy in a world clamoring for information.
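To make that last idea more concrete, here is a toy sketch (in Python; it is emphatically not MiddleGate’s actual system, whose internals this post does not describe) of sharing the pattern a record matches rather than the record itself: identifiers are tokenized with a keyed hash both parties agree on, so a requester can learn “this applicant matches a flagged pattern” without the underlying personal data ever changing hands.

```python
# Toy illustration of sharing a match signal instead of the raw data
# (a hypothetical scheme, not MiddleGate's actual model).
import hashlib
import hmac

SHARED_KEY = b"key-agreed-upon-out-of-band"  # hypothetical shared secret

def tokenize(identifier: str) -> str:
    """Deterministic keyed token; the raw identifier never leaves its holder."""
    return hmac.new(SHARED_KEY, identifier.strip().lower().encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Aggregator side: publish only tokens for identities tied to known fraud patterns.
flagged_tokens = {tokenize("jane.doe@example.com"), tokenize("555-12-3456")}

# Requester side: check an applicant against the flagged set without sending
# or receiving any personal data in the clear.
def applicant_matches(identifiers) -> bool:
    return any(tokenize(x) in flagged_tokens for x in identifiers)

print(applicant_matches(["jane.doe@example.com"]))       # True  -> pattern match
print(applicant_matches(["john.q.public@example.com"]))  # False -> no match
```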

 

Security Analytics

We try to avoid simply re-posting articles here.  However, a recent paragraph we read in “How Existing Security Data Can Help ID Potential Attacks” (from Information Week Reports) succinctly described a trend we are seeing and a market we are servicing:

Don’t think about security analytics as simply another product you need to buy; think about it first as a new approach to intelligent incident response. That new approach is needed because, frankly, what most of us are doing now isn’t working. By the time most security pros process disconnected forensic information, the bad guys already have your data. According to Verizon’s 2013 Data Breach Investigations Report, in over half of reported incidents, it took malicious hackers only a few hours to go from initial compromise to data exfiltration. However, 85% of breaches took organizations weeks or more to discover. Similarly, according to the Ponemon Institute’s Post Breach Boom study, it took an average of 80 days to discover and resolve a malicious breach. Eighty days!

Jules Verne wrote a novel about traveling around the world in 80 days…it became a couple of movies.  Seems quaint now when we consider that our data can travel around the world in 80 seconds (or less), and that 80 days of undiscovered malicious data use allows a phenomenal amount of time for damage to be done.  We found that healthcare Covered Entities are sitting on vast troves of data that they simply cannot utilize (i.e. share to good effect) because of the restrictions placed on them by HIPAA/HITECH.  We solved that piece of the puzzle and decided to act because, quite honestly, the current silo approach to data security isn’t working and, in the end, it is everyone’s medical records, privacy, and security that are at risk…and that means everyone at MiddleGate and our extended families as well as our customers.
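As a deliberately simplified illustration of the “use the security data you already have” approach the article describes, the sketch below (in Python, with invented log entries) correlates two events that look routine in isolation: a malware alert on a host followed within hours by an unusually large outbound transfer.  Reviewed separately, weeks later, neither log line raises an alarm; correlated promptly, they point straight at compromise-to-exfiltration.

```python
# A simplified correlation sketch with hypothetical log data -- not a product,
# just the shape of the idea: connect disconnected forensic signals early.
from datetime import datetime, timedelta

# (timestamp, host, event_type, bytes_out) -- invented example events
events = [
    (datetime(2013, 10, 1, 2, 5),  "ws-114", "malware_alert",     0),
    (datetime(2013, 10, 1, 3, 40), "ws-114", "outbound_transfer", 900_000_000),
    (datetime(2013, 10, 1, 9, 15), "ws-220", "outbound_transfer", 2_000_000),
]

def flag_fast_exfiltration(events, window=timedelta(hours=6), threshold=100_000_000):
    """Flag hosts where a malware alert is followed by a large outbound
    transfer within the window -- the hours-long pattern described above."""
    last_alert = {}  # host -> time of most recent alert on that host
    flagged = []
    for ts, host, kind, size in sorted(events):
        if kind == "malware_alert":
            last_alert[host] = ts
        elif kind == "outbound_transfer" and size >= threshold:
            if host in last_alert and ts - last_alert[host] <= window:
                flagged.append((host, ts))
    return flagged

print(flag_fast_exfiltration(events))  # -> [('ws-114', ...)]: flagged within hours, not 80 days
```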

The New Standard


GOLD or LEAD?  In medicine, new treatment modalities run through a series of steps before eventually (if ever) being accepted as a “standard of care.”  The business world goes through similar steps before accepting a new modality as a “best practice.”

Interestingly, it seems that the two worlds may be overlapping thanks to the Federal Government.  Although not strictly a “standard of care”, HIPAA is the mandated standard for maintenance of medical record privacy.  Recent court cases have explored the limits of HIPAA’s use as a defensive tool, as well as an offensive tool where medical record privacy issues are concerned (see “For Every HIPAA Yin, a HIPAA Yang” and “Clapper v. Amnesty International”).  Now it seems plausible that HIPAA’s utility may not stop once one crosses the line from the world of medical record privacy to the world at large.

Our company has had discussions regarding use of compliance with the HIPAA Security and Privacy Rules as a competitive advantage in other industries, such as telecommunications and finance, where privacy concerns are growing in the wake of recent news items.  Could it be that companies may one day tout their “ability” to protect one’s personal information on a level equal to the standards set by HIPAA for Protected Health Information?  Could it be that HIPAA standards will facilitate use of government agencies to protect against government intrusion?  The debate opened by this possible use of HIPAA standards outside of healthcare is intriguing, and the debate itself is one of the steps by which a best practice gets decided.  We may be looking at the next gold standard…or the next batch of lead.  Either way, we may just be seeing the beginning of the debate.

For Every HIPAA Yin, a HIPAA Yang

I’m staring out the window at the East side of Portland, OR as I write this.  The clouds have finally started to roll in, likely spelling an end to a spectacular Portland summer.  The 60-90 days between late June and late September when we can count on sun, low humidity, and temperatures in the 80s are Yin to 300-or-so days of cold, wet, and cloudy Yang.  Alas, Yin, we shall miss you, but Yang, you do keep things green and fresh.

The world of HIPAA is not without its Yin and Yang.  Last week we reviewed the HIPAA Yin implications of Clapper v. Amnesty International, showing its utility for defense against damages in breach cases.  Now consider the Yang: Hinchy v. Walgreens, and its use as a roadmap for the use of HIPAA as a weapon for individuals.  Allow me to expand.  The HIPAA Privacy Rule does not give individuals (you and me) the right to sue anyone for violation of our medical information privacy.  Rather, the Federal Govt. metes out fines, publicly shames, decreases reimbursement, and occasionally imprisons the guilty party(-ies).  However, as “The Pathology Blawgger” describes in a spectacular article, an enterprising attorney by the name of Neal Eggeson has been successful in using HIPAA to establish a standard of medical information privacy.  When there is deviation from this established medical information privacy standard, Mr. Eggeson is able to show how individuals (e.g. his clients) are affected.

Thus, in the span of seven months, we’ve gone from use of HIPAA as a defense in the courts, to use of HIPAA as a weapon in the courts.  For every Yin a Yang.

Clapper v. Amnesty International


Those involved in Protected Health Information security are going to come to know this case well if they don’t already.  This US Supreme Court decision from February 2013, at its core, declared that if damage from a breach cannot be proven, damages will not be awarded to class action litigants.

Dry stuff, yes, but consider the implications in relation to HIPAA, HITECH, and the Final Omnibus Rule.

First, given that HIPAA’s definition of breach has been modified from “Risk of Harm” to an Objective Standard of harm (including whether breached information was actually acquired or viewed), the Clapper decision backs up in the courts what has already been decided by the Final Omnibus Rule.

Second, with the Safe Harbor definitions establishing that an inadvertent disclosure of Protected Health Information (PHI) to a person authorized to access PHI, without further use or disclosure not permitted by the HIPAA Privacy Rule, does not constitute a breach, the Clapper decision again backs up in the courts what has already been clarified in the Final Omnibus Rule.

Still too dry for you?  Let us link this back to the real world.  Sutter Health recently experienced another breach, potentially adding to its $4.25 Billion class action suit woes from a previous breach.  What does Clapper v. Amnesty International mean to them?  Well, the Sutter Health legal defense team now has coverage on all fronts.  They may be able to prove that no harm has come of the breach, in which case they are in much better shape on the class action suit front, and potentially on the HIPAA/HITECH front as well (of course, Safe Harbor may still not be achieved).  No harm equates to loss for the class action litigants.  Loss for the class action litigants may very well remove a $4.25 Billion liability for Sutter Health.  You may rest assured that the Clapper decision is going to affect multiple cases in progress, and many cases to come in a similar manner.

 

“I don’t know what HIPAA stands for, but I believe in it, and I practice it.”


Given that football season is upon us, it seems appropriate to kick off this entry with immortal words from an immortal NFL quarterback: Peyton Manning, circa 2011.

That quote, for you football fans, spilled forth as Mr. Manning was trying to dodge questions regarding the status of his neck…nothing like falling back on HIPAA as the reason one can’t divulge personal medical information.

As you can tell, today I’m using football as a cheap entrance to discuss HIPAA, the Final Omnibus Rule and all things related.  You likely just cringed, but please allow me to continue…there is a link here.

As in HIPAA/Final Omnibus, there are (relatively) safe areas on the football field:  On the football field, “the pocket” affords relative safety, and in HIPAA, it is falling within Safe Harbor.

As in HIPAA/Final Omnibus, football allows for “do-overs”:  On the football field, instant replay and challenge flags equate to Affirmative Defense.

And finally, as in HIPAA/Final Omnibus, football is a stickler for rules:  On the football field, the midfield referee conference equates to the Objective Definition of Breach.

Those of you wanting to read how I made this metaphorical leap from the football field to the world of regulatory compliance can do so from our white paper (link at top of page here).

Best of luck to your team(s) this season!

Medical Identity Theft a Growing Problem

By Emily P. Walker, Washington Correspondent, MedPageToday
Published: September 23, 2011
WASHINGTON — Nearly four out of ten doctors and hospitals surveyed have caught a patient trying to use someone else’s identity in order to obtain healthcare services, according to a new survey from accounting firm PricewaterhouseCoopers (PwC).

Patients seeking medical services under someone else’s name was the second most common privacy or security issue reported by healthcare providers, according to PwC’s nationwide survey of 600 executives from U.S. hospitals, doctors’ organizations, health insurance companies, pharmaceutical manufacturers, and life sciences companies.

Medical identity theft is the fastest-growing form of identity theft, affecting 1.42 million Americans in 2010 and costing more than $28 billion, the report said.

http://www.medpagetoday.com/PracticeManagement/InformationTechnology/28696

When I was practicing medicine in the ER, this was a daily occurrence.  In fact, the problem led to the formation of our company, and the launch of our service offerings when I found that the medical identity theft issue was part of a larger problem related to lack of security for Protected Health Information.