HIPAA Compliance for Medical Practices
HIPAA Compliance and HIPAA Risk management Articles, Tips and Updates for Medical Practices and Physicians

Will 2016 be Another Year of Healthcare Breaches?


As I listened to a healthcare data security webinar from a leading security vendor, I had to ask: “Are we now experiencing a ‘New Normal’ of complacency with healthcare breaches?” The speaker’s reply: “The only time we hear from healthcare stakeholders is AFTER they have been compromised.”

 

This did not surprise me. I have seen this trend across the board throughout the healthcare industry. The growing number of cyberattacks and breaches is further evidence of a ‘New Normal’ of security acceptance — a culture of ‘it-is-what-it-is.’ After eye-popping headlines reveal that household names were compromised, one would think security controls would be at the forefront of every healthcare action list. Why, then, are we seeing more reports of healthcare breaches year after year?

 

This idea comes from the fact that, given weak enforcement, penalties treated as tolerable, and a culture of merely mitigating risk, more breaches are to be expected in the healthcare industry. Until stricter enforcement and penalties are implemented, breaches will continue throughout the industry.

 

The Office for Civil Rights (OCR), the agency that oversees HIPAA for the Department of Health and Human Services, originally scheduled HIPAA security audits to begin in October 2014. Unfortunately, very few audits have occurred because the agency is woefully understaffed for a mandate covering the healthcare industry, which accounts for more than 17 percent of the U.S. economy.

 

Why Sweat a Breach?

Last September, the newly appointed OCR deputy director of health information privacy, Deven McGraw, announced the launch of random HIPAA audits. In 2016, it is expected that 200 to 300 covered entities will undergo a HIPAA audit, with at least 24 on-site audits anticipated. However, that figure covers less than one percent of all covered entities — not much of an incentive for a CIO/CISO to request additional resources dedicated to cybersecurity.

 

Organizations within the industry are approaching cybersecurity from a cost/benefit perspective, rather than considering how a breach affects individual patients. For payers who have been compromised, where will their larger customers go anyway? Is it really worth a customer’s effort to lift-and-shift 30,000, 60,000 or 100,000 employee health plans to another payer in the state? The issue is similar to the financial services industry’s protocol when an individual’s credit card has been compromised and replaced, or when individuals want to close a bank account because of poor service: does anyone really want to go through the frustration of switching to an unknown company?

 

For some of the better-known breaches, class-action lawsuits can take years to adjudicate. By then, an individual’s protected health information (PHI) and personally identifiable information (PII) has already been shared on the cybercriminal underground market. In the meantime, customers receive their free two years’ worth of personal security monitoring and protection. Problem solved. Right?

 

The Cost of Doing Business?

When violations occur, the penalties can sting, but they are treated as part of the cost of doing business. In March 2012, Triple-S of Puerto Rico and the U.S. Virgin Islands, an independent licensee of the Blue Cross Blue Shield Association, agreed to a $3.5 million HIPAA settlement with HHS. In 2012, Blue Cross Blue Shield of Tennessee paid a $1.5 million fine, only to report another HIPAA violation in January 2015.

As of December 2015, the total number of data breaches for the year stood at 690, exposing 120 million records. However, organizations are unlikely to be penalized unless they fail to show they have steps in place to prevent attacks. If an organization does not have a plan to respond to a lost or stolen laptop, OCR may well find grounds for fines, but proving that can be a difficult process. Essentially, whether a fine follows a cyberattack or breach is relative.

 

A more recent $750,000 settlement with Cancer Care Group was reached in September 2015, but the underlying incident occurred in August 2012 — nearly three years earlier. A 2010 breach reported by New York-Presbyterian Hospital and Columbia University wasn’t settled until 2014, for $4.8 million. Lahey Hospital and Medical Center’s 2011 violation was only settled in November 2015, for $850,000. With settlements arriving several years after an event, settling can look like a legitimate risk-management decision, further reinforcing the ‘New Normal’ of cybersecurity acceptance.

 

At one HIMSS conference, the speaker emphasized to a Florida hospital the need to enforce security controls. The hospital replied, “If we had to put into place the expected security controls, we would be out of business.”

 

Simply put: The risks of a breach and a related fine do not outweigh the perceived costs of enhancing security controls. For now, cybersecurity professionals may want to keep their cell phones next to the nightstand.

Guillaume Ivaldi's curator insight, April 2, 2016 10:18 AM
Simply amazing: the cost of providing decent security is clearly not aligned with the business outcomes, so it is economically better to endure the fine than to be fully compliant with the regulation ...

Millions Potentially Affected by Premera Data Breach

With so many data breach lawsuits in the news lately, a person would think that companies with access to private consumer or patient information would take cyber security seriously. Unfortunately, every day there seems to be more news about companies that have been hit by hackers and have left customer information vulnerable. One of the more recent of these incidents is the Premera data breach, in which millions of patients had their private information compromised.

Lawsuits have followed, with plaintiffs alleging Premera Blue Cross did not properly or adequately secure customer information. The lawsuits allege negligence on Premera’s part. As of July 15, 2015, the number of lawsuits consolidated for pretrial proceedings sits at around 35, according to court documents. But the multidistrict litigation (MDL Number 2633) was only just approved, and more lawsuits could certainly be filed, given the massive number of patients affected by the cyber attack.

Reports indicate that up to 11 million customers may have had their information compromised, although some reports put the number affected at around 4.5 million. Still, for those whose information was accessed, the results can be disastrous.

That’s because information stored by companies such as Premera could be used to commit identity theft, where thieves file for credit or tax refunds under someone else’s name. That puts the victim at risk of having his or her credit negatively affected and having lenders come after the victim for bogus mortgages and lines of credit, not to mention the trouble he or she could face for a fraudulent tax return.

Even those who aren’t victims of large-scale identity theft face the time and hassle of sorting out the consequences of having credit …

Patient health information is protected under the Health Insurance Portability and Accountability Act (HIPAA). HIPAA also requires timely notification of information breaches, which critics say was violated here. According to some attorneys, Premera allegedly knew about issues in its security systems from an audit, but did not adequately address those issues, leaving patient information vulnerable.

Furthermore, the data breach allegedly started back in May 2014, but Premera reportedly didn’t warn customers until March 2015. Patients were notified by a letter that their personal information might have been accessed.

Reminders on HIPAA Enforcement: Breaking Down HIPAA Rules


HIPAA enforcement is an important aspect of the HIPAA Privacy Rule, and also one that no covered entity actually wants to be a part of. However, it is essential that healthcare organizations of all sizes understand the implications of an audit from the Office for Civil Rights (OCR), and how they can best prepare.


This week, HealthITSecurity.com is breaking down the major aspects of OCR HIPAA enforcement, and what healthcare organizations and their business associates need to understand to guarantee that they keep patient data secure. Additionally, we’ll review some recent cases where the OCR fined organizations because of HIPAA violations.


What is the enforcement process?


OCR enforces HIPAA by investigating filed complaints and conducting compliance reviews to determine whether covered entities are in compliance. Additionally, OCR performs education and outreach to further underline the importance of HIPAA compliance. The Department of Justice (DOJ) also works with OCR in criminal HIPAA violation cases.


“If OCR accepts a complaint for investigation, OCR will notify the person who filed the complaint and the covered entity named in it,” according to HHS’ website. “Then the complainant and the covered entity are asked to present information about the incident or problem described in the complaint. OCR may request specific information from each to get an understanding of the facts. Covered entities are required by law to cooperate with complaint investigations.”


Sometimes OCR determines that HIPAA Privacy or Security requirements were not violated. However, when violations are found, OCR will need to obtain voluntary compliance, corrective action, and/or a resolution agreement with the covered entity:


“If the covered entity does not take action to resolve the matter in a way that is satisfactory, OCR may decide to impose civil money penalties (CMPs) on the covered entity. If CMPs are imposed, the covered entity may request a hearing in which an HHS administrative law judge decides if the penalties are supported by the evidence in the case. Complainants do not receive a portion of CMPs collected from covered entities; the penalties are deposited in the U.S. Treasury.”


During the intake and review process, OCR considers several conditions. For example, the alleged action must have taken place after the dates the Rules took effect. In the case of the Privacy Rule, the alleged incident will need to have taken place after April 14, 2003, whereas compliance with the Security Rule was not required until April 20, 2005.


The complaint must also be filed against a covered entity, and a complaint must allege an activity that, if proven true, would violate the Privacy or Security Rule. Finally, complaints must be filed within 180 days of “when the person submitting the complaint knew or should have known about the alleged violation.”
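These intake conditions can be read as a simple eligibility check. Below is a minimal sketch in Python; the effective dates and the 180-day window come from the text above, while the function and its parameter names are assumptions added only for illustration.

from datetime import date, timedelta

# Effective dates and filing window cited in the article; the helper itself is illustrative.
PRIVACY_RULE_EFFECTIVE = date(2003, 4, 14)
SECURITY_RULE_EFFECTIVE = date(2005, 4, 20)
FILING_WINDOW = timedelta(days=180)

def complaint_passes_intake(incident_date, rule, filed_on, known_on,
                            against_covered_entity, alleges_rule_violation):
    """Return True if a complaint meets the intake conditions described above."""
    effective = PRIVACY_RULE_EFFECTIVE if rule == "privacy" else SECURITY_RULE_EFFECTIVE
    return (incident_date >= effective                    # alleged action after the Rule took effect
            and against_covered_entity                    # filed against a covered entity
            and alleges_rule_violation                    # would violate the Privacy or Security Rule if true
            and (filed_on - known_on) <= FILING_WINDOW)   # filed within 180 days of knowledge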


Recent cases of OCR HIPAA fines


One of the more recent examples of HIPAA enforcement took place in Massachusetts, when the OCR fined St. Elizabeth’s Medical Center (SEMC) $218,400 after potential HIPAA violations stemming from 2012.


The original complaint alleged that SEMC employees had used an internet-based document sharing application to store documents containing ePHI of nearly 500 individuals. OCR claimed that this was done without having analyzed the risks associated with the practice.

“OCR’s investigation determined that SEMC failed to timely identify and respond to the known security incident, mitigate the harmful effects of the security incident, and document the security incident and its outcome,” OCR explained in its report. “Separately, on August 25, 2014, SEMC submitted notification to HHS OCR regarding a breach of unsecured ePHI stored on a former SEMC workforce member’s personal laptop and USB flash drive, affecting 595 individuals.”


OCR Director Jocelyn Samuels reiterated the importance of all employees ensuring that they maintain HIPAA compliance, regardless of the types of applications they use. Staff at all levels “must follow all policies and procedures, and entities must ensure that incidents are reported and mitigated in a timely manner,” she stated.


In April of 2015, the OCR also agreed to a $125,000 settlement fine with Cornell Prescription Pharmacy (Cornell) after allegations that also took place in 2012. In that case, Cornell was accused of improperly disposing of PHI documents. Papers with information on approximately 1,600 individuals were found in an unlocked, open container on Cornell’s property.


“Regardless of size, organizations cannot abandon protected health information or dispose of it in dumpsters or other containers that are accessible by the public or other unauthorized persons,” OCR Director Samuels said in a statement, referring to the fact that Cornell is a small, single-location pharmacy in Colorado. “Even in our increasingly electronic world, it is critical that policies and procedures be in place for secure disposal of patient information, whether that information is in electronic form or on paper.”


However, not all OCR HIPAA settlements stay in the thousand-dollar range. In 2014, OCR fined New York-Presbyterian Hospital (NYP) and Columbia University (CU) $4.8 million following a joint breach report filed in September 2010.


NYP and CU were found to have violated HIPAA by exposing 6,800 patients’ ePHI when an application developer for the organizations tried to deactivate a personally-owned computer server on the network that held NYP patient ePHI. Once the server was deactivated, ePHI became accessible on internet search engines.


“In addition to the impermissible disclosure of ePHI on the internet, OCR’s investigation found that neither NYP nor CU made efforts prior to the breach to assure that the server was secure and that it contained appropriate software protections,” OCR explained in its statement. “Moreover, OCR determined that neither entity had conducted an accurate and thorough risk analysis that identified all systems that access NYP ePHI.”


Regardless of an organization’s size, HIPAA compliance is essential. Regular risk analysis and comprehensive employee training are critical to keeping covered entities up to date and patient data secure. By reviewing federal, state and local laws, healthcare organizations can work on taking the necessary steps to make changes and improve their data security measures.


Hospital Slammed With $218,000 HIPAA Fine


Federal regulators have slapped a Boston area hospital with a $218,000 HIPAA penalty after an investigation following two security incidents. One involved staff members using an Internet site to share documents containing patient data without first assessing risks. The other involved the theft of a worker's personally owned unencrypted laptop and storage device.


The Department of Health and Human Services' Office for Civil Rights says it has entered a resolution agreement with St. Elizabeth's Medical Center that also includes a "robust" corrective action plan to correct deficiencies in the hospital's HIPAA compliance program.

The Brighton, Mass.-based medical center is part of Steward Health Care System.


Privacy and security experts say the OCR settlement offers a number of valuable lessons, including the importance of the workforce knowing how to report security issues internally, as well as the need to have strong policies and procedures for safeguarding PHI in the cloud.

Complaint Filed

On Nov. 16, 2012, OCR received a complaint alleging noncompliance with HIPAA by medical center workforce members. "Specifically, the complaint alleged that workforce members used an Internet-based document sharing application to store documents containing electronic protected health information of at least 498 individuals without having analyzed the risks associated with such a practice," the OCR statement says.


OCR's subsequent investigation determined that the medical center "failed to timely identify and respond to the known security incident, mitigate the harmful effects of the security incident and document the security incident and its outcome."


"Organizations must pay particular attention to HIPAA's requirements when using internet-based document sharing applications," says Jocelyn Samuels, OCR director in the statement. "In order to reduce potential risks and vulnerabilities, all workforce members must follow all policies and procedures, and entities must ensure that incidents are reported and mitigated in a timely manner."


Separately, on Aug. 25, 2014, St. Elizabeth's Medical Center submitted notification to OCR regarding a breach involving unencrypted ePHI stored on a former hospital workforce member's personal laptop and USB flash drive, affecting 595 individuals. The OCR "wall of shame" website of health data breaches impacting 500 or more individuals says the incident involved a theft.

Corrective Action Plan

In addition to the financial penalty - which OCR says takes into consideration the circumstances of the complaint and breach, the size of the entity, and the type of PHI disclosed - the agreement includes a corrective action plan "to cure gaps in the organization's HIPAA compliance program raised by both the complaint and the breach."

The plan calls for the medical center to:


  • Conduct a "self-assessment" of workforce members' familiarity and compliance with the hospital's policies and procedures that address issues including transmission and storage of ePHI;
  • Review and revise policies and procedures related to ePHI; and
  • Revise workforce training related to HIPAA and protection of PHI.
Lessons Learned

Other healthcare organizations and their business associates need to heed some lessons from OCR's latest HIPAA enforcement action, two compliance experts say.


Privacy attorney Adam Greene of the law firm Davis Wright Tremaine notes: "The settlement indicates that OCR first learned of alleged noncompliance through complaints by the covered entity's workforce members. Entities should consider whether their employees know how to report HIPAA issues internally to the privacy and security officers and ensure that any concerns are adequately addressed. Otherwise, the employees' next stop may be complaining to the government."

The settlement also highlights the importance of having a cloud computing strategy, Greene points out. That strategy, he says, should include "policies, training and potential technical safeguards to keep PHI off of unauthorized online file-sharing services."


The enforcement action spotlights the continuing challenge of preventing unencrypted PHI from ending up on personal devices, where it may become the subject of a breach, he notes.

The case also sheds light on how OCR evaluates compliance issues, he says. "The settlement highlights that OCR will look at multiple HIPAA incidents together, as it is not clear that OCR would have entered into a settlement agreement if there had only been the incident involving online file sharing software, but took action after an unrelated second incident involving PHI ending up on personal devices."


Privacy attorney David Holtzman, vice president of compliance at security consulting firm CynergisTek, says the settlement "serves as an important reminder that a covered entity or a business associate must make sure that the organization's risk assessment takes into account any relationship where PHI has been disclosed to a contractor or vendor so as to ensure that appropriate safeguards to protect the data are in place."


The alleged violations involving the document sharing vendor, he says, "involve failure to have a BA agreement in place prior to disclosing PHI to the vendor, as well as failing to have appropriate security management processes in place to evaluate when a BA agreement is needed when bringing on a new contractor that will handle PHI."

St. Elizabeth's Medical Center did not immediately respond to an Information Security Media Group request for comment.

Previous Settlements

The settlement with the Boston-area medical center is the second HIPAA resolution agreement signed by OCR so far this year. In April, the agency OK'd an agreement with Cornell Prescription Pharmacy for an incident related to unsecured disposal of paper records containing PHI. In that agreement, Cornell was fined $125,000 and also adopted a corrective action plan to correct deficiencies in its HIPAA compliance program.


The settlement with St. Elizabeth is OCR's 25th HIPAA enforcement action involving a financial penalty and/or resolution agreement that OCR has taken since 2008.


But privacy advocate Deborah Peel, M.D., founder of Patient Privacy Rights, says OCR isn't doing enough to crack down on organizations involved in HIPAA privacy breaches.


"Assessing penalties that low - St. Elizabeth will pay $218,400 - guarantees that virtually no organizations will fix their destructive practices," she says. "Industry views low fines as simply a cost of doing business. They'll take their chances and see if they're caught."


The largest HIPAA financial penalty to date issued by OCR was a $4.8 million settlement with New York-Presbyterian Hospital and Columbia University for incidents tied to the same 2010 breach that affected about 6,800 patients. The incidents involved unsecured patient data on a network.


Before a Medical Data Breach, Begin Your Response Plan


In the last 18 months, there have been three massive data breaches involving the healthcare industry, scores of smaller breaches, and a growing trend of insider threats posed by employees who have sold protected health information (PHI) for their own personal gain. Unlike stolen credit card numbers that can be deactivated, the personal identifying information needed to commit identity-theft type crimes, such as name, address, Social Security number, and date of birth, cannot be changed easily, if at all. Because of the permanent nature of the information that they contain, health records are approximately 10 times more valuable than stolen credit card numbers on Internet black markets where they can be bought and sold in bulk.


Now more than ever, because of new threats posed by such cybercriminals, any organization that collects, uses, discloses, or stores PHI is a potential breach victim. Covered Entities and their Business Associates subject to HIPAA who suffer a data breach must act quickly and correctly in assessing the situation. They must thoroughly investigate and mitigate risks caused by the breach, attempt recovery of the lost information, and provide required notifications to affected individuals and others. Throughout this process, organizations experiencing a breach should strive to demonstrate publicly that the data loss is being handled responsibly and appropriately.


Defining a "Breach"


HIPAA defines a breach as the acquisition, access, use, or disclosure of PHI in a manner inconsistent with the Privacy Rule that compromises its security or privacy.  In most cases, a breach is presumed to have occurred unless it can be demonstrated that there is a "low probability" that the PHI has been compromised. When performing this initial inquiry, an organization must consider:


1. The nature and extent of the PHI involved, including the types of identifiers and likelihood of re-identification;


2. The unauthorized person who used the PHI or to whom the disclosure was made;


3. Whether the PHI was actually acquired or viewed; and


4. The extent to which the risk to the PHI has been mitigated.
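To make the four factors concrete, here is a minimal checklist-style sketch. HIPAA does not prescribe a scoring formula, so the field names and the simple all-factors-favorable rule below are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class BreachRiskAssessment:
    identifiers_limited: bool        # 1. nature/extent of the PHI; re-identification unlikely
    recipient_has_obligations: bool  # 2. unauthorized recipient is itself bound by privacy duties
    phi_not_actually_viewed: bool    # 3. no evidence the PHI was actually acquired or viewed
    risk_fully_mitigated: bool       # 4. e.g., device recovered or wiped before any use

    def low_probability_of_compromise(self) -> bool:
        # A breach is presumed unless every factor supports a low probability.
        return all([self.identifiers_limited,
                    self.recipient_has_obligations,
                    self.phi_not_actually_viewed,
                    self.risk_fully_mitigated])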


Plan Ahead for Breach Notification



Every Covered Entity and Business Associate that handles PHI should develop its own unique breach response plan, built upon its most recent Security Risk Assessment (SRA), itself a fundamental step in the development of a comprehensive HIPAA security program. This security program should include a complete inventory of all devices containing sensitive data and policies and procedures requiring the immediate reporting of any lost, stolen, or compromised devices or media.


Using the most critical vulnerabilities identified in the SRA as a blueprint, the "worst case" scenario should be used to develop a detailed response plan. This discussion and handling of the "crisis" in a benign environment should be memorialized and refined into a formal breach response plan that identifies clear lines of communication and responsibility, including what gets done, who does it, and when they are supposed to do it.  


Merely having a breach response plan on paper is not enough. Individuals who are expected to implement the plan must understand and be equipped to execute their responsibilities.  


Whether through a medical practice's in-house counsel or an outside law firm, there are important reasons to integrate counsel into a breach response plan. Privacy counsel with breach response experience can bring valuable insight and a steadying presence to an unfamiliar and sometimes chaotic situation. In the event of a follow-up investigation by HHS' Office for Civil Rights (OCR) (which is mandated in breaches affecting 500 or more individuals) or civil litigation, an organization's deliberative processes, internal communications, and actions involving its counsel regarding breach response may be kept confidential under the attorney-client privilege and work-product doctrines. Without the involvement of counsel, the entirety of an organization's actions and communications would be potentially discoverable in the now-familiar class-action lawsuits that inevitably follow data breaches.


Activating the Breach Response Plan


If it is determined that a breach has occurred, an organization should immediately take all possible steps to minimize or limit the impact of the breach while documenting its efforts to do so. Mitigation often occurs parallel with an investigation, and its own document trail, into the cause of the breach. In some cases, such as when a device is physically lost or stolen, mitigation may be impossible unless there is a way to remotely wipe the data contained on it. If the breach involves media or paper that can be tracked or retrieved, every effort should be made to recover it.  Law enforcement should be contacted if criminal activity such as theft or intrusion is suspected.  


Like other aspects of breach response, a medical practice's internal investigation into a breach should be thoroughly documented. The Privacy Officer, in consultation with privacy counsel for the organization, should collect and preserve evidence in accordance with established policies and procedures. This information may include interviews, e-mails, chat logs, voicemails, cellular calling records, computer logs, and any other information regarding the data loss.


If the breach involves cyber intrusion, the Privacy Officer will likely require the assistance of IT vendors or others such as specially-trained law enforcement divisions. Expert forensic assistance from these individuals can be invaluable when investigating a possible breach or determining the scope of known breach.


Formal Notification to Individuals, HHS, and Others


Once a breach has been internally confirmed, HIPAA requires official notification to all affected individuals and the OCR. If the breach involves 500 or more individuals, media organizations in the area where the affected individuals live must also be notified. Most times, these notifications must occur within 60 days of when the breach actually was, or should have been, discovered.
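Based only on the summary above (a 60-day window, OCR in every case, media outlets at 500 or more affected individuals), the notification triggers can be sketched as follows; the function and field names are illustrative, not a statement of the regulation itself.

from datetime import date, timedelta

MEDIA_THRESHOLD = 500                      # individuals affected
NOTIFICATION_WINDOW = timedelta(days=60)   # from actual or constructive discovery

def notification_plan(discovered_on: date, affected_individuals: int) -> dict:
    return {
        "notify_individuals_by": discovered_on + NOTIFICATION_WINDOW,
        "notify_ocr": True,                                      # OCR is always notified
        "notify_media": affected_individuals >= MEDIA_THRESHOLD, # media only at 500+
    }

# Example: a breach discovered June 1, 2015 affecting 650 people.
print(notification_plan(date(2015, 6, 1), 650))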


This does not necessarily mean that the breach will remain private until formal disclosure. In many instances, breaches become public knowledge long before formal notification is made. To prevent such situations from spiraling out of control, it is imperative that an organization's breach response team be prepared to make public the limited information in which there is a high degree of confidence, while stressing that the investigation is ongoing and the information may evolve. Scrambling to figure out a breach response strategy while trying to investigate and mitigate the possible harm can easily lead to inaccurate and/or harmful information being disseminated. Responding with silence will only intensify the scrutiny in such situations. A breach response plan will help a practice follow a "script" through an otherwise unfamiliar and potentially high-stakes crisis.


Poor breach notifications can take many shapes. Some fail to acknowledge the seriousness of the situation. Others provide incomplete or incorrect information. Another poor "response strategy" is complete silence or other tone-deaf actions which demonstrate organizational discord or a misunderstanding of the severity of the situation. Any of these missteps can be severely damaging, not only from a reputational point of view, but also during later phases if there is a formal investigation by OCR.  


After the required notifications have been made, the organization should update its current risk management plan to reflect lessons learned and vulnerabilities addressed as a result of the breach.


Conclusion


Most cyber intrusions are not brutish acts of virtual "smash and grab" thuggery, but well-planned and strategic, with the hallmarks of stealth and patience. As data collection and information sharing among healthcare providers and their affiliates grows in the future, the threats to the security and integrity of this information will continue to increase.

Failing to prepare for a breach is the same as preparing to fail at responding to one. As electronic health information continues to multiply along with data sharing among multiple providers and affiliates, preparing for this threat must become an organizational priority for everyone.


Potential HIPAA Violations Found in LA County DPH Audit


An IT security audit at the L.A. County Department of Public Health (DPH) revealed potential HIPAA violations and several areas of improvement for DPH.


There need to be better system access controls, IT equipment control, and computer encryption, according to a report by the County of Los Angeles Department of Auditor-Controller. The review included testing system access to five systems DPH identified as mission critical, including systems containing sensitive health information. Physical security over IT equipment was also reviewed, along with computer encryption, antivirus software, equipment disposition, and IT security awareness training.


“DPH needs to restrict unneeded access to sensitive/confidential information in their systems, and determine whether unneeded access resulted in a HIPAA/HITECH violation,” the report stated.


In terms of inappropriate systems access, the Auditor-Controller explained that DPH did not remove systems access for 13 users after they were terminated from DPH employment. One of those accounts was used for three years after the employee was terminated to view PHI and to order laboratory tests for approximately 100 DPH clients, according to the report.


DPH’s attached response indicated that a current employee had used the terminated employee’s account in performing her job duties. The current employee failed to obtain her own system account, which violated County policy; however, she was authorized to view PHI, so no reportable HIPAA/HITECH violation occurred. DPH indicates it has reminded IT managers to promptly remove terminated employees’ access. DPH is also developing a procedure to notify managers of personnel changes so they can immediately update systems access.
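The kind of check that would catch this issue is a simple cross-reference of an HR termination feed against active system accounts. The sketch below is illustrative; the data sources, account names, and employee IDs are hypothetical.

def find_orphaned_accounts(terminated_employee_ids, active_accounts):
    """active_accounts maps account name -> owning employee ID."""
    terminated = set(terminated_employee_ids)
    return [acct for acct, owner in active_accounts.items() if owner in terminated]

# Example: one account was never disabled after its owner left.
print(find_orphaned_accounts(["E1073"], {"lab-orders": "E1073", "clinic-12": "E2240"}))
# -> ['lab-orders']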


Device encryption is another area that needs improvement, according to the audit report. DPH needs to ensure that portable computers are encrypted because it is a Board Policy requirement. However, DPH did not have encryption documentation for 18 percent of its 1,773 portable computers. DPH also did not have enough detailed documentation, the report found, as the remaining items’ tag or serial numbers could not be matched to any of the computers in inventory.


“DPH’s response indicates they will recall all portable computers to validate and document that each device is encrypted,” the audit stated. “DPH also worked with the Chief Information Office to acquire software that will allow them to monitor the encryption status of all portable and desktop computers.”
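At its core, the encryption monitoring DPH says it is acquiring amounts to reconciling the asset inventory against encryption-status records by serial number. A minimal illustrative version (field names assumed, not drawn from any specific product) might look like this:

def encryption_gaps(inventory_serials, encryption_status):
    """encryption_status maps serial number -> True/False (encrypted or not)."""
    undocumented = [s for s in inventory_serials if s not in encryption_status]
    unencrypted = [s for s in inventory_serials if encryption_status.get(s) is False]
    return undocumented, unencrypted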


One aspect of the audit that was especially disturbing is that DPH reportedly is lacking in its computer incident response. Specifically, the report stated that DPH managers/staff failed to report 131 missing or stolen IT equipment items to the Department’s Information Security Office (DISO) between 2011 and 2013.


Not only is this another Board Policy requirement, but the oversight also prevented DISO from assessing the impact of any data or software loss. Furthermore, DISO could not make the required notifications to the Chief Information Office, the Auditor-Controller HIPAA Privacy Officer, or the Auditor-Controller Office of County Investigations.


DPH’s response indicates they have reminded all employees to immediately report missing or stolen IT resources to their supervisor. DPH management also told us that subsequent to our review, they investigated and accounted for 100 (76%) of the 131 missing IT equipment items. Of the 31 that remain unaccounted for, DPH indicated that three could have contained PHI, but DPH indicated they believe the risk of a breach is low.


Following this audit, and a less than ideal audit at the L.A. County Probation Department, Supervisor Mark Ridley-Thomas requested that county staff report back on how feasible it would be to conduct annual IT and security review audits on all county departments. The Board of Supervisors unanimously approved the request, according to The Los Angeles Daily News.


“We want to foster accountability and transparency in the county, that’s the move we’re making,” Ridley-Thomas told the news source. “Our security, quality, safeguards and monitoring efforts need to keep up. We need to improve what we’re doing ... We need to step up our game.”



Premera Blue Cross Data Breach Results in Several Lawsuits, Class Actions


Premera is the third largest health insurer in Washington State, and was hit with a cyber attack initiated on May 5 of last year. The Premera attack exposed the personal information of as many as 11 million current and former clients of Premera across the US. While Premera noted on January 29 of this year - the day the data breach was discovered - that according to best information none of the personal data had been used surreptitiously, the fact remains that the data mined by cyber attackers is exactly the kind of information useful for perpetrating identity theft.

To that end, it has been reported that the cyber attackers targeted sensitive personal information such as names, dates of birth, Social Security numbers, mailing addresses, e-mail addresses, phone numbers, member identification numbers, bank account information, and claims and clinical information.

As for why the attack was not discovered for some eight months, Premera has said little. However, the breadth of the attack - affecting some 11 million people - and the delay in discovering the breach (initiated May 5, 2014 and revealed January 29, 2015) will likely provide much fodder for Premera cyber attack lawsuits.

According to the Puget Sound Business Journal, the New York Times had suggested the Premera cyber attack may have been perpetrated by the same China-based hackers who are suspected of breaching the federal Office of Personnel Management (OPM) last month. However, the VP for communications at Premera, Eric Earling, notes there is no certainty the attack originated in China.

“We don’t have definitive evidence on the source of the attack and have not commented on that,” he said. “It continues to be under investigation by the FBI [Federal Bureau of Investigation] and we would leave the speculation to others.”

That said, it has been reported that the US government has traced all of these attacks to China.

Recent data breach attacks, including the Vivacity data breach and Connexion data breach, are reflective of a shift in targets, according to cyber attack experts. The attacks to the data systems of the federal OPM notwithstanding, it seems apparent that hackers are increasingly shifting their targets to health insurers in part due to the breadth of information available from the health records of clients.

The goal of cyber attackers in recent months, according to claims appearing in the New York Times, is to amass a huge trove of data on Americans.

Given such a headline as “Premera Blue Cross Reports Data Breach of 11 Million Accounts,” it appears they have a good start. While it might be a “win” for the hackers involved acquiring such data surreptitiously and illegally, it remains a huge loss in both privacy and peace of mind for millions of Americans who entrust their personal information to insurance providers, who, in turn, require such information in order to provide service. Consumers and clients also have historically assumed that such providers have taken steps to ensure their personal information is secure.

When it isn’t - and it takes eight months for a cyber attack to be identified - consumers have little recourse other than to launch a Premera cyber attack lawsuit in order to achieve compensation for the breach, and as a hedge against the frustration that would follow were the breach to evolve into full-blown identity theft.

To that end, five class-action data breach lawsuits have been filed in US District Court in Seattle. According to the text of the lawsuits, two of the five allege that Premera was warned in an April 2014 draft audit by the OPM that its IT systems “were vulnerable to attack because of inadequate security precautions.”

Tennielle Cossey et al. vs. Premera asserts that the audit in question, “identified… vulnerabilities related to Premera’s failure to implement critical security patches and software updates, and warned that ‘failure to promptly install important updates increases the risk that vulnerabilities will not be.’

“If the [OPM] audit were not enough, the events of 2014 alone should have placed Premera on notice of the need to improve its cyber security systems.”

Moving forward, Premera Blue Cross data breach lawsuits are being consolidated into multidistrict litigation, given the number of Americans affected and their various locations across the country. An initial case management conference has been scheduled for August 7.


Orlando Health reports data breach for 3,200 patients


Orlando Health said Thursday about 3,200 patients’ records were accessed illegally by one of its employees, who was fired during an investigation.



The hospital system said it discovered the data breach on May 27. A news release on Thursday, July 2, said it began notifying patients “today”, which would be more than 30 days after the breach.



According to the release, there was no evidence that the data was copied or used illegally, but Orlando Health reported the incident in accordance with its data breach policies.


Under Florida law, notice to victims of a data breach is required within 30 days, unless the custodian of records has determined that nobody suffered identity theft or any other financial harm.
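As a quick, purely illustrative check of the timeline reported here against Florida's 30-day requirement (both dates come from the article):

from datetime import date

elapsed = (date(2015, 7, 2) - date(2015, 5, 27)).days
print(elapsed, elapsed > 30)  # 36 days: outside the 30-day window unless the statutory exception applies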


The records included certain patients at Winnie Palmer Hospital for Women & Babies, Dr. P. Phillips Hospital and a limited number of patients treated at Orlando Regional Medical Center from January 2014 to May 2015.


Theft of patient information at health-related companies is one of the primary ways that tax refund fraud has been occurring in Florida, according to federal authorities. Thieves can use the information to submit a fake tax return in your name, claiming refunds that could prevent or delay a legitimate refund.


In the Orlando Health incident, stolen data may have included names, dates of birth, addresses, medications, medical tests and results, the last four digits of Social Security numbers, and other clinical information. The former employee may have also accessed insurance information in approximately 100 of those patient records.


Steve Stallard, corporate director for compliance and information security, said in a statement that Orlando Health “deeply regrets any concern or inconvenience this may cause our patients or their family members.”


The organization is providing affected patients with call center and other support, the news release said.


Orlando Health has reported other data breaches, such as a March 2014 incident where over 500 child patient records were misplaced.


Why Hackers Love Healthcare Organizations


If you look at all the data breaches that took place in 2014, you might conclude that healthcare organizations have lax cybersecurity protocols. You’d be wrong, but it’s not hard to see how you might reach that conclusion. Last year, the healthcare sector reported more breaches—333 in all—than any other industry. Like any symptom viewed in isolation, diagnosing the real ailment in the healthcare industry requires a more thorough examination. Want to know why hackers are so intent on breaking into healthcare organizations’ systems—and so successful? Here are the top reasons:


Healthcare data is the most valuable data of all.


If a hacker goes through the trouble of infiltrating, say, an e-commerce vendor or a brick-and-mortar retailer, he’ll walk away with thousands or hundreds of thousands of credit card numbers. That’s no small haul, but credit card companies and consumers have learned to deal with breaches. Banks assign their customers new numbers, issue them new cards and promise to wipe any suspicious charges. By the time hackers can sell their stolen card data, much of it is useless.


Healthcare data, by contrast, gives criminals just about everything they need to steal identities, creating valuable goods to sell on the black market. A breach at a health insurance company, for example, could yield data ranging from bank account and Social Security numbers to medical history to family names and beyond. Think of all of the fraudulent accounts a criminal could open simply by getting ahold of a customer’s Social Security number, her address and her mother’s maiden name.


In an industry where everything is sensitive and regulated, workers resist additional controls.


Just like chief information security officers in other industries, CISOs working in healthcare evaluate their vulnerabilities and their priority technology upgrades on an ongoing basis. Because of healthcare information’s depth, deploying new technology can be complex, but selling users on that technology and its associated security protocols can be seriously challenging. A doctor who has to endure multiple controls just to  prescribe medication or complete another mundane task might understandably bristle when the security team introduces multi-factor authentication or some other process that he views as just another obstacle to doing his job.


Human beings—including medical providers—are fallible, and hackers know it.


When my wife was in the hospital for the birth of our daughter, I noticed something during every nursing shift. The staff left patient folders open on the front desk. There was ample security to protect newborns themselves, but not to protect their data. Harried working conditions also contribute to the potential exposure of digital data. If an over-tired doctor heads home after a 20-hour shift and forgets his laptop in the taxi, that could be just the opening a criminal needs to access an entire healthcare system. Humans aren’t error proof, which is why the technology, particularly in healthcare, has to be.


A hacker only needs to be right once; the healthcare organization needs to be right all the time.


For every high-profile data breach affecting a healthcare organization during the past 18 months, there are experts ready to say, “They should have known better.” “They should have known laptops have to be encrypted.” “They should have known they had to train their staff to avoid phishing scams.” “They should have known...” Whatever security protocol completes that should-have-known statement, the reality is that no one can predict every scenario. If you try to manage data security through prediction, you will fail. It’s always a race between the good guys and the bad guys, and the bad guys only have to get it right one time to do serious damage. Instead of trying to predict and prevent every possible attack method, security teams need to implement technology capable of understanding normal user behavior and sounding alerts when activity deviates from established patterns.
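As a deliberately simple sketch of that baseline-and-deviation idea, one could flag a user's daily record-access count when it strays far from that user's own history. Real user-behavior analytics products use much richer models; everything below, including the data and threshold, is illustrative.

from statistics import mean, stdev

def is_anomalous(history, today_count, threshold=3.0):
    """history: past daily access counts for a single user."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today_count != mu
    return abs(today_count - mu) / sigma > threshold

print(is_anomalous([12, 9, 15, 11, 10, 14, 13], 240))  # True: a sudden spike in record accesses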


The healthcare industry is at a pivotal point in terms of its data security. After a record year of data leaks and losses, security leaders know the havoc breaches wreak, and they know it’s time to re-evaluate their defenses. Instead of deploying tools that can only withstand one type of attack or implementing processes that ignore the inherent fallibility of human end users, CISOs need to pay attention to the user data itself. By focusing on user behavior intelligence, healthcare organizations can spot and stop attacks before hackers fatally damage their reputations.

Ashley Anne Abeling's curator insight, July 15, 2015 6:54 PM

Technology has its advantages, but this is one of the downsides of using it to store very personal and important information. Making sure the offices I work for are protected, and educating my students on the importance of internet safety, are priorities of mine as an educator. We take technology for granted, and when something goes wrong we have to be prepared for the aftermath.


Smaller companies seek cybersecurity


When it comes to hacker attacks and data breaches, small businesses are targets, too.


In fact, the origin of the infamous Target breach of 2013, which compromised information on 40 million customer credit cards, has been traced to the hacking of the computer system of a refrigeration company in the Pittsburgh area doing work for the Minneapolis-based retailer.


One or more of Fazio Mechanical Services’ employees had access to Target’s data network for electronic billing, contract submissions and project management, and one of its employees apparently opened a malware-laced email from an outside hacker, according to published reports.



How owners can keep data out of harm’s way
  1. Keep paper files, flash drives and CDs with sensitive data on them under lock and key. Shred such paper documents before recycling.
  2. Require employees to have a unique user name and a strong password that is changed several times a year. Also, encrypt data (a minimal encryption sketch follows this list).
  3. Use antivirus and anti-spyware software, and don’t open email attachments or other downloads unless they are from trusted sources.
  4. Install updates to security, Web browser, operating system and antivirus software as soon as they become available. These contain the "patches" that address security vulnerabilities.
  5. Enable your operating system’s firewall. Take special precautions when allowing remote access.
  6. If you have a Wi-Fi network, make sure it is encrypted, hidden from the public, and that users need a password to get in.
  7. Make sure third parties that have access to your data have proper data protection practices.
  8. Talk to your insurance broker about cybersecurity insurance.
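
Item 2 above calls for encrypting data. The sketch below shows one minimal way to encrypt a sensitive file at rest using the third-party cryptography package (pip install cryptography); the file name is hypothetical, and key storage and rotation, the hard part in practice, are out of scope here.

from cryptography.fernet import Fernet

key = Fernet.generate_key()        # keep this key in a secrets manager, never next to the data
fernet = Fernet(key)

with open("customer_records.csv", "rb") as f:        # hypothetical file name
    ciphertext = fernet.encrypt(f.read())

with open("customer_records.csv.enc", "wb") as f:    # encrypted copy stored at rest
    f.write(ciphertext)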

 


Cybersecurity consultant Karl Kispert used the experience of Fazio as a cautionary tale Wednesday in speaking to a gathering of about 70 information technology professionals and small-business owners at William Paterson University in Wayne. The event, a cybersecurity conference presented by the New Jersey Small Business Development Center, emphasized the risks and the potential for financial liabilities at smaller companies, which experts said are real and growing, and business owners need to be prepared.


"If I’m a hacker, the weakest link is a vendor with few, if any, controls around their IT environment," Kispert said. "Small to midsized companies are as at risk as any of the companies you read about in the newspaper."


But when hacking incidents and data breaches occur at small companies, they rarely become known to the public, Kispert said Thursday. Thomas Ryan, enterprise security expert at Hewlett Packard, said that, on average, it takes 240 days to discover that you’ve been breached.


"It’s not will you get breached, it’s when," said Roseland lawyer Karen Painter Randall, a partner at Connell Foley LLP. "No one is immune."

The art of hacking, pioneered by clever young tricksters who got their kicks vandalizing websites, has evolved into a global underworld of thieves, spies and saboteurs, sometimes state-sponsored, who embrace the Internet as their playing field of choice. Stolen personal information is sold to identity thieves and other criminals on shady websites.


In January, President Obama called for new laws requiring companies to disclose when they’ve been hacked, and several cybersecurity-related bills are now pending.


Obama’s statement followed a breach at Home Depot involving 56 million customer credit cards as well as an incident at Staples that exposed information on more than 1 million payment cards.


Fortune 500 companies are likely to have cybersecurity insurance to offset the cost of responding to data breaches — which typically involve notifying victims and providing them with credit monitoring for a year. Target said earlier this year that its 2013 data breach would cost an estimated $252 million and the company expected insurance to cover only $90 million.


But smaller companies by and large are not insured, and general liability insurance does not include coverage for cyberattacks.

"Cyber insurance is becoming one of those essential products," said North Brunswick business consultant Carol Gabel, chief executive officer of Seven Pearls LLC, a risk management specialist and one of the conference organizers. As demand increases, "it will become more affordable and more available," she said.


The cyber insurance market has existed since the late 1990s, and the amount of coverage being sold is small compared to coverage for more traditional risks.


Cyber insurance, however, is one of the fastest-growing sectors in the insurance market, according to Marsh’s U.S. Insurance Market Report 2015. Factors considered in pricing policies include the size of the company, the number of customers, its presence on the Web, and the type of data collected and stored.


Policies might include coverage for liability from exposing confidential information, as well as the costs of notifying customers of a breach and providing them with credit-monitoring services.


Costs from business interruption, especially from denial-of-service attacks by hackers, are another risk that can be covered.


From Jan. 1 through June 23, 380 breaches were reported by companies and other organizations in the United States. More than 117 million records were exposed, according to the Identity Theft Resource Center’s website, which gleans information from law enforcement and media sources.


The resource center said a breach occurs when an individual's name, along with his or her Social Security number, driver's license number, medical record, or a financial record such as a credit- or debit-card number, is put at risk.


The breaches included 10 companies either based in New Jersey or with a major presence in the state, including TD Bank, Toys "R" Us and Chubb & Son.


A Toys "R" Us representative said Friday in an email that the breach involved an attempt in late January to gain unauthorized access to the accounts of "a small percentage" of its Rewards "R" Us members who had requested password changes.


In only one of the New Jersey breaches, which occurred at Jersey City Medical Center, was the number of exposed records reported — 1,447.


In February, health insurance giant Anthem announced a data breach, caused by a hacking attack from outside the company, that compromised as many as 80 million records. The exposed information includes names, Social Security numbers and dates of birth.

"In the past, in the health care industry, it’s been mishaps — somebody loses a laptop or maybe a disgruntled employee steals some data," said Jeanmarie Tenuto, chief executive officer of Healthcare Technical Solutions in Newark. "Now, it’s attacks from the outside."


Earlier this year, reports emerged that an employee with a company working for a private physicians group that provided emergency services at The Valley Hospital in Ridgewood, Englewood Hospital and Medical Center, and Holy Name Medical Center in Teaneck stole names, Social Security numbers and birth dates of patients.


The average loss for a breach of 1,000 records would be between $52,000 and $87,000, according to Mount Laurel-based Verizon Enterprise Solutions’ latest report on data breaches worldwide.

Verizon’s Data Breach Investigations Report said there were 2,122 confirmed data breaches last year around the world. The report estimates the global financial loss from 700 million compromised records at $400 million.


Public entities and the information and financial services industries were the top victims, and a large majority of the attacks were from external sources.


RAM scraping is a kind of malware that grabs payment-card information from memory before it is encrypted. It is reportedly the technique used in the Target breach.


"RAM scraping has grown in a big way," the report said. "This type of malware was present in some of the most high-profile retail breaches."

"Malware is getting more sophisticated," and "phishing remains a serious threat," said Bhavesh Chauhan, regional security engineering manager at Verizon Enterprise Solutions.

more...
No comment yet.
Scoop.it!

How can hospitals protect their medical equipment from malware?

How can hospitals protect their medical equipment from malware? | HIPAA Compliance for Medical Practices | Scoop.it

The challenges in protecting hospitals from cyber attacks are very similar to those faced in ICS and SCADA environments; the equipment used in hospitals is not user-serviceable and therefore often running out-of-date software or firmware. This creates a dangerous situation where:

  • The devices have known vulnerabilities that can be easily exploited by bad actors
  • Administrators are not likely to notice malware running on the device as long as nominal operation is maintained

The end goal of bad actors infecting a medical device is to use it as an entry and pivot point in the network. Valuable patient records are not likely to be present on the medical devices, but those devices often have some level of network connection to the systems that do contain patient records.

What exactly is a bad actor likely to do after getting a foothold on the network? Move laterally, typically with one of the following goals:

  • Identity theft
  • Blackmail
  • Stealing research data for financial gain
  • Deploying ransomware like CryptoLocker, effectively crippling the facility unless a ransom is paid
  • Triggering widespread system malfunctions as an act of terrorism
  • Carrying out a 'hit' on a specific patient


The first four items are strictly motivated by financial gain, and such attacks account for everything observed to date. The fifth, terrorism, seems possible but unlikely, whether because of morals or the relatively higher value of attacking other targets like power plants or defense facilities. The last item hasn't been detected yet, but that doesn't exclude the possibility that it has happened. Carrying out a silent assassination with malware would be very hard to trace back to the attacker, and could even be sold as a service (similar to DDoS as a service).

That last scenario sounds like something out of a Tom Clancy novel, but it is completely plausible. The attacker (or the entity paying for the attack) would only need to know the target, have knowledge of an upcoming procedure, and know where the procedure was to take place. One caveat is that identifying which device(s) would be used with that patient, and when, could be difficult, though not impossible.

Real-world vulnerability examples
Billy Rios, a security researcher, recently went public with a vulnerability that affects drug pumps and could potentially be exploited to administer a fatal dose of medication to a patient. Rios had notified DHS and the FDA about the vulnerability as much as 400 days earlier and saw no response, so he went public to put pressure on the manufacturer to fix the issue. Faced with the reality that some medical equipment manufacturers do not invest in securing their devices from exploitation, the onus of security falls on the users of such equipment.

This discovery shows a real-world example of how a cyber attack could affect a medical device and potentially endanger lives. There is no question that this type of threat needs to be taken seriously. The real question is, how can hospitals effectively protect devices such as these?

It's clear that installing antivirus software on medical equipment is impractical, if not impossible. Furthermore, healthcare IT staff are relatively helpless to patch the software and firmware running on these devices. Considering those vulnerabilities, and the difficulty of remotely scanning these devices, the best solution is simply to prevent malware from ever reaching them. Thankfully, this challenge has already been solved in ICS and SCADA environments.

In a recently profiled attack on hospitals, one of the infection vectors was thought to be a technician visiting a compromised website on a PC with direct access to a picture archiving and communication system (PACS). The report details that the malware was detected, but not before it infected the PACS system. Due to the nature of the system, it could not be scanned for malware, let alone cleaned. It was then used as a pivot point to find a system with medical records that could be exfiltrated back to the attacker.

Medical facilities share vulnerabilities with SCADA and ICS, so why shouldn't they also share protection mechanisms? Critical infrastructure providers, especially power plants, often use air-gapped networks as a very effective defense. Taking the above story as an example, the PC with a web browser and internet access should not also have had access to PACS. This simple step would have stopped the infection from doing any damage at all. If the technician needed to download something from the internet and move it to PACS, it would have to be carried across a controlled transfer point onto the air-gapped network.

How sanitization of the operating room compares to preventing cyber infections
Hospitals and their staff are very accustomed to preventing the spread of biological infections, and they must now apply similar discipline to preventing the spread of cyber infections. Defending against cyber infections is, by comparison, much easier. The medical industry isn't alone in fighting this threat; it doesn't have to invent new techniques for preventing infection, it simply needs to adapt the proven strategies employed by other industries.

Simply employing an air gap doesn't guarantee security. The point of the air gap is to create a point through which data movement is carefully controlled. Additional measures must be employed to ensure that pathogens are not allowed through. In medicine these measures consist of removing foreign material with soap and water and disinfecting with various antimicrobial agents. It's not practical to scan doctors and nurses for bacteria, so every surface is assumed to be contaminated until sufficiently cleaned and disinfected. The control point in a data flow is comparatively easier to maintain, as there are techniques for quickly finding infections on media moving through the air gap. For extra protection, any files deemed 'clean' can still be disinfected to eliminate the possibility of a threat going undetected.
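As a rough sketch of what that control point can look like (my own illustration, assuming ClamAV's `clamscan` command is installed on the transfer station and using a hypothetical `transfer_media` staging folder): restrict incoming files to an expected allowlist, scan them, and log a hash of everything that crosses the gap.

```python
import hashlib
import subprocess
from pathlib import Path

ALLOWED_SUFFIXES = {".dcm", ".pdf", ".csv"}   # hypothetical allowlist for this facility

def vet_file(path: Path) -> bool:
    """Allow a file across the air gap only if its type is expected and the scanner calls it clean."""
    if path.suffix.lower() not in ALLOWED_SUFFIXES:
        return False
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    print(f"transfer log: {path.name} sha256={digest}")
    scan = subprocess.run(["clamscan", "--no-summary", str(path)])
    return scan.returncode == 0               # clamscan exits 0 for clean, 1 for infected

if __name__ == "__main__":
    for candidate in sorted(Path("transfer_media").glob("*")):
        verdict = "ALLOW" if candidate.is_file() and vet_file(candidate) else "BLOCK"
        print(f"{verdict}: {candidate.name}")
```

The hash log is the audit trail: if an infection is later found on the isolated network, the facility can trace exactly which file crossed the gap and when.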

more...
No comment yet.
Scoop.it!

4 HIPAA compliance areas your BAs must check

4 HIPAA compliance areas your BAs must check | HIPAA Compliance for Medical Practices | Scoop.it

It finally looks like the feds are starting up the next phase of HIPAA audits — but there’s still time to ensure your business associates (BAs) are staying compliant. 


In preparation for the next round of audits, the Department of Health and Human Services' (HHS) Office for Civil Rights (OCR) has begun sending out pre-audit surveys to randomly selected providers, according to healthcare attorneys from the law firm McDermott Will & Emery.

Originally, the surveys were meant to go out during the summer of 2014, but technical improvements and leadership transitions put the audits on hold until now.

Moving toward Phase 2

The OCR has sent surveys asking for organization and contact information from a pool of 550 to 800 covered entities. Based on the answers it receives, the agency will pick 350 for further auditing, including 250 healthcare providers.

The Phase 2 audits will primarily focus on covered entities’ and their BAs’ compliance with HIPAA Privacy, Security and Breach Notification standards regarding patients’ protected health information (PHI).

Since most of the audits will be conducted electronically, hospital leaders will have to ensure all submitted documents accurately reflect their compliance program since they’ll have minimal contact with the auditors.

4 vendor pitfalls

It’s not clear yet to what extent the OCR will evaluate BAs in the coming audits due to the prolonged delay. However, there are plenty of other good reasons hospital leaders need to pay attention to their vendors’ and partners’ approaches to HIPAA compliance and security.


Why?


Mainly because a lot of BAs aren’t 100% sure what HIPAA compliance entails, and often jeopardize patients’ PHI, according to Chris Bowen, founder and chief privacy and security officer at a cloud storage firm, in a recent HealthcareITNews article.


A large number of data breaches begin with a third party, so it's important hospital leaders keep their BAs accountable by ensuring they regularly address these four areas:


  • Risk Assessments. As the article notes, research has shown about a third of IT vendors have failed to conduct regular risk analysis on their physical, administrative and technical safeguards. Ask your vendors to prove they have a risk analysis policy in place, and are routinely conducting these kinds of evaluations.
  • System activity monitoring. Many breaches go unnoticed for months, which is why it's crucial your BAs have continuous logging, keep those logs protected and regularly monitor systems for strange activity (a small monitoring example follows this list).
  • Managing software patches. Even the feds can struggle with this one, as seen in a recent HHS audit of the department's own branches. Applying security patches as soon as they're released is an important part of provider and BA security. Decisions about patching should also be documented.
  • Staff training. Bowen recommends vendors include training for secure development practices and software development lifecycles, in addition to the typical General Security Awareness training that HIPAA requires.
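To make the system activity monitoring item concrete, here is a small hypothetical example (not from the article) of sweeping a CSV audit log (columns assumed to be timestamp, user, record_id) for record lookups outside business hours, one of the simpler signals a covered entity could ask a BA to demonstrate.

```python
import csv
from datetime import datetime

BUSINESS_HOURS = range(7, 19)   # 07:00-18:59; an assumed policy, adjust per organization

def after_hours_access(log_path):
    """Yield audit-log rows whose access time falls outside business hours."""
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):                       # expected columns: timestamp,user,record_id
            ts = datetime.fromisoformat(row["timestamp"])   # e.g. "2015-07-18T02:14:00"
            if ts.hour not in BUSINESS_HOURS:
                yield row["user"], row["record_id"], ts

if __name__ == "__main__":
    for user, record_id, ts in after_hours_access("phi_access_log.csv"):   # hypothetical log file
        print(f"REVIEW: {user} opened record {record_id} at {ts:%Y-%m-%d %H:%M}")
```

A review queue like this only works if the logs themselves are protected and retained, which is why the list item pairs monitoring with continuous, tamper-resistant logging.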
more...
No comment yet.
Scoop.it!

OCR launches new HIPAA resource on mobile app development

OCR launches new HIPAA resource on mobile app development | HIPAA Compliance for Medical Practices | Scoop.it

The Office of Civil Rights (OCR) of the Department of Health and Human Services (HHS) recently launched a new resource: a platform for mobile health developers and “others interested in the intersection of health information technology and HIPAA privacy protection.”


In the announcement of this platform, OCR noted that there has been an “explosion” of technology using data regarding the health of individuals in innovative ways to improve health outcomes. However, OCR said that “many mHealth developers are not familiar with the HIPAA Rules and how the rules would apply to their products,” and that “[b]uilding privacy and security protections into technology products enhances their value by providing some assurance to users that the information is safe and secure and will be used and disclosed only as approved or expected.”


The OCR platform for mobile app developers has its own website. Anyone – not just mobile app developers – may browse and use the website. Users may submit questions, offer comments on other submissions and vote on a topic's relevance. OCR noted that to do so users will need to sign in using their email address, “but their identities and addresses will be anonymous to OCR.” 


OCR asked stakeholders to provide input on the following issues related to mobile app development: What topics should we address in guidance? What current provisions leave you scratching your heads? How should this guidance look in order to make it more understandable and more accessible?


Users can also submit questions about HIPAA or use cases through this website. OCR explained that although "we cannot respond individually to questions, we will try to post links to existing relevant resources when we can." Finally, in the announcement OCR stated that posting or commenting on a question on this website "will not subject anyone to enforcement action."

more...
No comment yet.
Scoop.it!

Ex-Hospital Worker Sentenced in $24 Million Fraud Case

Ex-Hospital Worker Sentenced in $24 Million Fraud Case | HIPAA Compliance for Medical Practices | Scoop.it

A former military hospital worker has been sentenced to 13-plus years of federal prison time for her involvement in a $24 million identity theft and tax fraud scheme, which also involved a former Alabama health department employee and several other co-conspirators.


On Aug. 10 in the U.S. District Court for the Middle District of Alabama, Tracy Mitchell, a former worker at a military hospital at Fort Benning, Georgia, was sentenced to serve 159 months in federal prison for crimes including one count of conspiracy to file false tax claims, one count of wire fraud and one count of aggravated identity theft, to which she pleaded guilty in April.


Eight others were also sentenced on Aug. 10 for their roles in the same fraud ring, which federal prosecutors say involved 9,000 identities stolen from the U.S. Army, various Alabama state agencies, an unidentified Georgia call center, and an unidentified Columbus, Georgia company.

Case Details

The U.S. Department of Justice in a statement says that while Mitchell worked at the military hospital, she had access to the identification data of military personnel, including soldiers who were deployed to Afghanistan. Mitchell stole soldiers' personal information and used it to file false tax returns. Court documents do not specify the job Mitchell held at the hospital.


Prosecutors say that between January 2011 and December 2013, Mitchell and a co-conspirator, Keisha Lanier, led the large-scale identity theft ring in which they and their co-defendants filed over 9,000 false tax returns that claimed in excess of $24 million in fraudulent claims. The IRS paid out close to $10 million in fraudulent refunds, the justice department says. Sentencing for Lanier is scheduled for Aug. 24.


Other members of the fraud ring who were sentenced on Aug. 10 included Sharondra Johnson, who worked at a Walmart money center in Columbus, Georgia. As part of her employment, Johnson cashed checks for customers of the money center. Prosecutors say Johnson cashed tax refund checks issued in the names of other individuals whose identities were stolen by the fraud ring. For her crimes, Johnson received a 24-month prison sentence.


Also, in another related case linked to the same fraud ring, Tamika Floyd, a former worker of the Alabama Department of Public Health from 2006 to May 2013, and the Alabama Department of Human Resources from May 2013 to July 2014, was sentenced in May to serve 87 months in federal prison after pleading guilty to fraud conspiracy and ID theft crimes. While working in her state jobs, Floyd had access to databases that contained identification information of individuals, which she stole and provided to the crime ring's co-conspirators for the filing of false tax returns, prosecutors say.

Of those sentenced so far, Mitchell received the stiffest penalty. Sentences for the other defendants in the case so far range from 60 months of prison time to two years of probation. Restitution will be determined at a later date, the DOJ says.

Preventing Insider Crimes

There are steps that healthcare organizations can take to deter insiders from committing fraud related crimes using patient data, say privacy and security experts.


Mac McMillan, CEO of security consulting firm CynergisTek, suggests that entities enhance personnel screening, improve authorization practices, eliminate excess access, invest in monitoring technologies, and diligently and proactively monitor users. Also, "we need to change our monitoring and audit practices and focus more on behavioral analysis," he adds.


Indeed, some healthcare CISOs say their organizations are putting those types of efforts in place to help safeguard patient data from being used in identity crimes.


"We are in close partnership with all the three-letter [law enforcement] agencies, and are constantly reviewing the crimes, such as identity theft, which continues to be on the FBI's top list of crimes throughout the nation in general," says Connie Barrera, CISO of Jackson Health System in Miami.


Unfortunately, "South Florida is a big repository of different kinds of issues, and crimes" involving identity fraud, including tax refund fraud, she says. "It's not only about educating our population [of workers] but having the right monitoring in place."


For instance, "with our medical records, we have various ways to monitor that [access], and we let our workforce and constituents know that we are monitoring, and we do that on a regular basis," she says. "Employees are made aware, and word spreads."


Also, the organization provides access to data only "on a need to know basis, and we review that on a periodic basis." Still, "ensuring that the people who do have [authorized] access to data are only using it appropriately, that's a huge challenge."


On top of those efforts, law enforcement, prosecutors and the justice system pursuing fraud cases involving patient identities are also an important deterrent, McMillan says.


"These sentences should send the message that the government is serious about punishing those that abuse their trust and take advantage of others," he says. "If you do the crime and get caught, you can get serious time."

more...
No comment yet.
Scoop.it!

Data Breaches, Lawsuits Inescapable, but Liability Can Be Mitigated

Data Breaches, Lawsuits Inescapable, but Liability Can Be Mitigated | HIPAA Compliance for Medical Practices | Scoop.it

If your organization experiences a data breach—an increasingly likely scenario—and PHI is exposed, chances are you will be hit with a lawsuit in short order.

There's not much you can do about that, just like it's impossible to prevent every criminal attack. What you can do, though, is take steps to minimize the likelihood of being found liable for damages in court, says Reece Hirsch, Esq., a partner and regulatory attorney at Morgan Lewis in San Francisco, and a BOH editorial advisory board member.

Hirsch says companies should have two things in place as part of standard policy and procedure: an evolving breach response plan and an incident response team that meets on a regular basis. While class-action suits haven't gained much traction with judges yet—except in cases of clear financial damage to consumers—most of the claims boil down to some form of alleged negligence, he says.

"Given the increasingly sophisticated cyberthreats that companies face … you cannot have perfect security and you cannot completely insulate yourself from these types of events, but what you can do is show you acted reasonably and took reasonable measures to prevent a breach and not make yourself a target," Hirsch says.

Organizations demonstrate this with a good breach response plan to show they've identified the problem, mitigated damage, notified victims, and taken further action as necessary, he says. The team should represent each department that might be affected by a breach or that has to be mobilized to interact with the public, including legal, human resources, privacy, security, IT, communications, and investor relations. Part of the team's role is to analyze risks to data, data flow, and worst-case scenarios.

"Everything needs to be encrypted, data at rest as well as data in transit, which is something HIPAA specifically points out," says Jan McDavid, Esq., the compliance officer and general legal counsel at HealthPort, an Atlanta-based healthcare services firm. McDavid, who is a regular speaker on this subject, agrees that it's essential to have proper security policies as well as dedicated staff to regularly review systems and respond to incidents.

Comprehensive risk analyses, which HIPAA requires, should not just be done after a breach to assess the extent of damages after private data is "let out the door," she says, but up front as well to identify the risks. Inevitably, though, healthcare organizations with large electronic databases will likely experience a data breach.


"Once [companies] are put on notice that something has happened, they need to immediately stop the bleeding," McDavid says. Even though public breach notification may not be required on day one, the company should immediately shut off or fix whatever happened so it can't occur again, she says.

One of the issues she sees often is that as healthcare organizations struggle to keep pace with technology, security is affected too. In the rush to automation and interoperability with limited funds available, parts of older systems and databases may get upgraded and replaced, but in the process, new vulnerabilities may be created, McDavid says. It seems organizations don't always realize how their systems interact, leading them to overlook peripheral connections that may allow access to protected systems, she adds.

Federal legislation that called for providers to implement EHRs didn't contain the funding to help facilities make the switch—those incentives came later. Many of the hospitals McDavid works with have a hodgepodge of computer systems that were installed piecemeal as the hospitals received technology funding, and that may inadvertently lead to vulnerabilities.

Taking proactive measures to have strong security policies, plans, and personnel in place goes a long way toward mitigating company liability in a class-action suit, Hirsch and McDavid say.

Lawsuits may be unavoidable


"If people are going to sue you, they're going to sue you," Hirsch says. "But [proactive preparation] will position the company much better to defend the lawsuit." And even more importantly, he adds, it may deflect some of the greatest damage to a company's reputation and image, which occurs in the "court of public opinion" and in news media reports.

McDavid agrees. "Their name becomes mud when the news is out that they've had a major breach," she says, although she believes the public has become oversaturated with the plethora of recent breaches in the news to the point that such incidents are no longer viewed as alarming or unusual.

Chris Apgar, CISSP, president of Apgar & Associates, LLC, in Portland, Oregon, and a BOH editorial advisory board member, says the breach announced by Anthem, Inc., in February 2015 actually offers a good example of how to take the right approach to a data breach.

Apgar doesn't believe the health insurer took a big hit to its reputation because it acted relatively quickly to put security experts on the case and notify consumers and law enforcement authorities about the breach as required by HIPAA security regulations. In addition, he says, Anthem had relatively good security protections; however, those protections could only slow down a sufficiently skilled hacker, not stop the breach from occurring.


By comparison, Apgar says the class-action suits against Community Health Systems, Inc., are for actual negligence in responding to a known security vulnerability. The Franklin, Tennessee-based company announced hackers accessed data of 4.5 million individuals who were referred to or received care from physicians affiliated with its system over the last five years, according to an August 18, 2014, filing with the U.S. Securities and Exchange Commission.


Anthem disclosed on February 4 that it uncovered a massive breach affecting 80 million people that had occurred two months earlier. Less than 12 hours later, an Indianapolis attorney was already filing a class-action suit against the health insurer for failure to secure customers' data, negligence, breach of contract, and failure to notify victims in a timely manner.

In the days and weeks that followed, the class-action suits started to pile up across the country—dozens of complaints argued Anthem was lax in securing members' personal data, which wasn't encrypted. Plaintiffs argued Anthem only implemented reasonable security measures after it discovered the breach January 29—more than a month after the incident occurred.

Even if it were eventually proven in court that Anthem didn't follow industry best practices to secure data or that the breach was due to negligence, the bigger question is whether the plaintiffs can demonstrate harm as a result, Apgar says.

Building up case law


Currently, legal precedent favors the defendants, but that's an evolving process too.

McDavid explains there is no established federal law that stipulates companies are liable for damages just because they experienced a data breach that exposed clients' or patients' personal information.

That's where class-action attorneys enter the picture, she says. They're trying to make case law by obtaining favorable court opinions to set a legal precedent, but it's an uphill battle, she says. Under many federal and state laws, victims have to prove they were harmed in order to win damages.

"In the majority of cases now, the courts are ruling that you cannot certify a class unless you can prove the class has damages," McDavid says. "What that means is that even if you've breached 2 million records, if you don't have any notice that any of that [data] has been misused, then in most courts right now you have no damages."

In April, a federal judge dismissed a class-action suit against Horizon Blue Cross Blue Shield of New Jersey, ruling the plaintiffs didn't demonstrate they suffered financial harm. Two company laptop computers were stolen in 2013 from the health insurer's Newark headquarters, and nearly 840,000 customers' personal information was potentially exposed.


McDavid also points to a May Pennsylvania case where a county judge dismissed a suit from 62,000 employees of the University of Pittsburgh Medical Center following a criminal breach of the hospital's payroll database. Several hundred employees were victims of tax fraud, but the judge ruled the plaintiffs didn't prove that they were all financially harmed, that the medical center was negligent in its actions, or that there was any contract holding the university liable for security breaches.

What usually happens, Hirsch explains, is that the parties reach a settlement outside of court, and that's where many of the large payouts to affected consumers or patients happen.

Finding other ways in


It's becoming increasingly common, however, for class-action attorneys to file suit for violations of state privacy and security laws or various other federal statutes, which may contain stronger protections than HIPAA, McDavid says. Arguments under those laws have been more successful at convincing courts that the victims still have legal standing to sue even if they haven't experienced actual harm.

Apgar notes an early example from 2010, when the Connecticut Attorney General's office sued Health Net of Connecticut in federal court for violations of HIPAA and state privacy protections regarding personal data. The attorney general's office alleged the health insurer failed to secure PHI and financial information prior to a 2009 data breach in which a computer disk drive was lost that contained unencrypted records on more than 500,000 Connecticut residents and 1.5 million consumers nationwide. Health Net also allegedly delayed notifying plan members and law enforcement authorities until several months after it discovered the breach.

Ultimately, the company agreed to a settlement that included the following:

  • Extended credit monitoring for affected plan members
  • Increased identity theft insurance and reimbursement for security freezes
  • An internal corrective action plan for stronger security measures
  • A $250,000 state fine
  • A $500,000 contingent payment to the state if it was established that affected individuals later became victims of identity theft or fraud


This was the first legal action taken by an attorney general since the HITECH Act in 2009 authorized state attorneys general to enforce violations of HIPAA.

Federal laws, such as the Fair Credit Reporting Act (FCRA), are also becoming an avenue for class-action attorneys. Hirsch says although it's not related to healthcare, one case winding its way through the U.S. Supreme Court—Spokeo, Inc. v. Robins—could change the legal landscape if the nation's highest court issues an opinion against the online company.

In February 2014, federal appellate judges for the 9th Circuit reversed a district court ruling that had originally dismissed plaintiff Thomas Robins' class-action suit alleging willful violations of the FCRA. He claimed Spokeo, an online information gathering service, published and marketed inaccurate personal information about him on its website, which he had no control over. While not claiming actual financial damages, he argued that since he was unsuccessful in securing employment, he was concerned the inaccurate report was affecting his ability to obtain employment, insurance, credit, etc.

The appellate panel found Robins did have constitutional standing to sue under the FCRA. This speaks to the same issues that are raised by victims of healthcare data breaches, who worry they will suffer financial harm from the exposure of their PHI, Hirsch says. Large technology companies urged the Supreme Court to take up an appeal of the 2014 decision, fearing it could cripple the industry by paving the way for billions of dollars in damages to consumers, he says.

In addition, there's another federal healthcare data breach suit—Smith, et al. v. Triad of Alabama—making a case for violations under the FCRA that will have big implications if the court finds the plaintiffs have legal standing for a class-action suit, McDavid says.

"They can keep it in court if the judge buys into their theory that they don't have to have damages in order to sue," she says.

more...
Jan Vajda's curator insight, August 13, 2015 9:44 AM

add your view ...

Scoop.it!

The UCLA Health System Data Breach: How Bad Could It Be…?

The UCLA Health System Data Breach: How Bad Could It Be…? | HIPAA Compliance for Medical Practices | Scoop.it

Just hours ago, a Los Angeles Times report broke the news that hackers had broken into the UCLA Health System, creating a data breach that may affect 4.5 million people. This may turn out to be one of the biggest breaches of its kind in a single patient care organization to date in the U.S. healthcare system. And it follows by only a few months the enormous data breach at Anthem, one of the nation's largest commercial health insurers, a breach that potentially compromised the data of as many as 80 million Americans.


The L.A. Times report, by Chad Terhune, noted that the university said there was no evidence yet that patient data were taken, but that it can't rule out that possibility while the investigation continues. It quoted Dr. James Atkinson, interim president of the UCLA Hospital System, as saying, "We take this attack on our systems extremely seriously. For patients that entrust us with their care, their privacy is our highest priority. We deeply regret this has happened."


But Terhune also was able to report a truly damning fact. He writes, "The revelation that UCLA hadn't taken the basic step of encrypting this patient data drew swift criticism from security experts and patient advocates, particularly at a time when cybercriminals are targeting so many big players in healthcare, retail and government." And he quotes Dr. Deborah Peel, founder of Patient Privacy Rights in Austin, Texas, as saying, "These breaches will keep happening because the healthcare industry has built so many systems with thousands of weak links."


What's startling is that the breach at the Indianapolis-based Anthem, revealed on Feb. 5, which compromised the data of up to 80 million health plan members, shared two very important characteristics with the UCLA Health breach, so far as we know at this moment: both were carried out by hackers, and both involved unencrypted data. That's right, according to the L.A. Times report, UCLA Health's data was also unencrypted.


Unencrypted? Yes, really. And the reality is that, even though the majority of patient care organizations do not yet encrypt their core, identifiable, protected health information (PHI) within their electronic health records (EHRs) when not being clinically exchanged, this breach speaks to a transition that patient care organizations should consider making soon. That is particularly so in light of the Anthem case. Indeed, as I noted in a Feb. 9 blog on the subject, “[A]s presented in one of the class action lawsuits just recently filed against it,” the language of that suit “contains the seeds of what could evolve into a functional legal standard on what will be required for health plans—and providers—to avoid being hit with multi-million-dollar judgments in breach cases.”


As I further stated in that blog, “I think one of the key causes in the above complaint [lawsuits were filed against Anthem within a few days of the breach] is this one: ‘the imminent and certainly impending injury flowing from potential fraud and identity theft posed by their personal and financial information being placed in the hands of hackers; damages to and diminution in value of their personal and financial information entrusted to Anthem for the sole purpose of obtaining health insurance from Anthem and with the mutual understanding that Anthem would safeguard Plaintiff’s and Class members’ data against theft and not allow access and misuse of their data by others.’ In other words, simply by signing up, or being signed up by their employers, with Anthem, for health insurance, health plan members are relying on Anthem to fully safeguard their data, and a significant data breach is essentially what is known in the law as a tort.”


Now, I am not a torts or personal injury lawyer, and I don’t even play one on TV. But I can see where, soon, the failure to encrypt core PHI within EHRs may soon become a legal liability.


Per that, just consider a March 20 op-ed column in The Washington Post by Andrea Peterson, with the quite-compelling headline, “2015 is already the year of the health-care hack—and it’s going to get worse.” In it, Peterson,  who, according to her authoring information at the close of the column, “covers technology policy for The Washington Post, with an emphasis on cybersecurity, consumer privacy, transparency, surveillance and open government,” notes that “Last year, the fallout from a string of breaches at major retailers like Target and Home Depot had consumers on edge. But 2015 is shaping up to be the year consumers should be taking a closer look at who is guarding their health information.” Indeed, she notes, “Data about more than 120 million people has been compromised in more than 1,100 separate breaches at organizations handling protected health data since 2009, according to Department of Health and Human Services data reviewed by The Washington Post.” Well, at this point, that figure would now be about 124.5 million, if the UCLA Health breach turns out to be as bad as one imagines it might be.


Indeed, Peterson writes, “Most breaches of data from health organizations are small and don't involve hackers breaking into a company's computer system. Some involve a stolen laptop or the inappropriate disposal of paper records, for example -- and not all necessarily involve medical information. But hacking-related incidents disclosed this year have dramatically driven up the number of people exposed by breaches in this sector. When Anthem, the nation's second-largest health insurer, announced in February that hackers broke into a database containing the personal information of nearly 80 million records related to consumers, that one incident more than doubled the number of people affected by breaches in the health industry since the agency started publicly reporting on the issue in 2009.”


And she quotes Rachel Seeger, a spokesperson for the Office for Civil Rights in the Department of Health and Human Services, as saying in a statement, following the Anthem breach, “These incidents have the potential to affect very large numbers of health care consumers, as evidenced by the recent Anthem and Premera breaches."


So this latest breach is big, and it is scary. And it might be easy (and lazy blogging and journalism) to describe this UCLA Health data breach as a “wake-up call”; but honestly, we’ve already had a series of wake-up calls in the U.S. healthcare industry over the past year or so. How many “wake-up calls” do we need before hospitals and other patient care organizations move to impose strong encryption regimens on their core sensitive data? The mind boggles at the prospects for the next 12 months in healthcare—truly.

more...
No comment yet.
Scoop.it!

21.5M Data Breach Victims Left Clueless

21.5M Data Breach Victims Left Clueless | HIPAA Compliance for Medical Practices | Scoop.it

In the two months since discovering that the sensitive personal information of 21.5 million Americans was compromised by a hack of U.S. federal computer networks, the government has yet to officially notify any of the victims, Reuters reported Tuesday (July 14).


Officials from multiple agencies told Reuters the Office of Personnel Management (OPM), which oversaw the stolen data, is working to set up a system to alert those affected, but the mechanism will most likely take weeks to complete.


While OPM recently claimed to impede 10 million intrusion attempts in an average month, the fact that such a massive breach in security was able to take place has cast a dark shadow over the office and its management. Just last week, OPM Director Katherine Archuleta resigned shortly after the office released the actual number of those left exposed by the considerable personnel data breach.


While it may have been understandable for OPM to take time confirming the full scope of the cyberattacks, as it needed time to perform its own forensic investigation, it remains unclear why there has been such a delay in reaching out to those impacted by the attacks.

An OPM official, who requested not to be identified, told Reuters the complicated nature of the data, coupled with the fact that government employees and contractors frequently move among various agencies, means it may be some time before all of the victims are notified. The government is attempting to establish a centralized system for notification rather than relying on separate agencies, the official said, also noting there is an expectation OPM may hire an outside contractor to perform the work.


Considering the nature of the data exposed, along with reports of the data breach going undetected for a full year, it seems there would be a higher priority placed on protecting those who have been impacted by first providing them with official notification.


In a recent agency release outlining the details of its internal investigation into the attacks, OPM said: “Following the conclusion of the forensics investigation, OPM has determined that the types of information in these records include identification details such as Social Security Numbers; residency and educational history; employment history; information about immediate family and other personal and business acquaintances; health, criminal and financial history; and other details.”


“Some records also include findings from interviews conducted by background investigators and fingerprints. Usernames and passwords that background investigation applicants used to fill out their background investigation forms were also stolen,” the statement continued.


While no official communication may be sent to victims for some time, OPM has confirmed that anyone who went through a security clearance background investigation performed by the office since 2000 can likely assume their information was compromised by the data breach.

more...
No comment yet.
Scoop.it!

Bill That Changes HIPAA Passes House

Bill That Changes HIPAA Passes House | HIPAA Compliance for Medical Practices | Scoop.it

The U.S. House of Representatives on July 10 passed a bill aimed at accelerating the advancement of medical innovation that contains a controversial provision calling for significant changes to the HIPAA Privacy Rule.


The House approved the 21st Century Cures bill by a vote of 344 to 77. Among the 309-page bill's many provisions is a proposal that the Secretary of Health and Human Services "revise or clarify" the HIPAA Privacy Rule's provisions on the use and disclosure of protected health information for research purposes.


Under HIPAA, PHI is allowed to be used or disclosed by a covered entity for healthcare treatment, payment and operations without authorization by the patient. If the proposed legislation is eventually signed into law, patient authorization would not be required for PHI use or disclosure for research purposes if only covered entities or business associates, as defined under HIPAA, are involved in exchanging and using the data.


That provision - as well as many others in the bill - aim to help fuel more speedy research and development of promising medical treatments and devices.


"The act says ... if you're sharing [patient PHI] with a covered entity [or a BA], you don't necessarily need the individual's consent prior to sharing - and that's something our members have been receptive too," notes Leslie Krigstein, interim vice president of public policy at the College of Healthcare Information Management Executives, an organization that represents 1,600 CIOs and CISOs.


"The complexity of consent has been a barrier [to health information sharing] ... and the language [contained in the bill] will hopefully move the conversation forward," she says.


Some privacy advocates, however, have opposed the bill's HIPAA-altering provision.


Allowing the use of PHI by researchers without individuals' consent or knowledge only makes the privacy and security of that data less certain, says Deborah Peel, M.D., founder of the advocacy group Patient Privacy Rights.


"Researchers and all those that take our data magnify the risks of data breach, data theft, data sale and harms," she says. "Researchers are simply more weak links in the U.S. healthcare system which already has 100s of millions of weak links."

Changes Ahead?

If the legislation is signed into law in its current form, healthcare entities and business associates would need to change their policies related to how they handle PHI.


"If the bill is enacted, it will not place additional responsibilities on covered entities and business associates. Rather, it will provide them with greater flexibility to use and disclose protected health information for research," says privacy attorney Adam Greene, partner at law firm Davis Wright Tremaine. "Covered entities and business associates who seek to take advantage of these changes would need to revise their policies and procedures accordingly." For instance, some covered entities also may need to revise their notices of privacy practices if their notices get into great detail on research, Greene notes.

Other Provisions

In addition to the privacy provisions, the bill also calls for penalizing vendors of electronic health records and other health IT systems that fail to meet standards for interoperable and secure information exchange.


The bill calls for HHS to develop methods to measure whether EHRs and other health information technology are interoperable, and authorizes HHS to penalize EHR vendors with decertification of their products if their software fails to meet interoperability requirements.


In addition, the bill also contains provisions for "patient empowerment," allowing individuals the right to "the entirety" of their health information, including data contained in an EHR, whether structured or unstructured. An example of unstructured data might include physician notes, for instance, although that is not specifically named in the legislation.


"Healthcare providers should not have the ability to deny a patient's request for access to the entirety of such health information," the bill says.


A House source tells Information Security Media Group that the Senate has been working on an "Innovation Agenda" for the past few months calling for policies similar to those contained in the 21st Century Cures bill. House leaders say it's their goal to have a bill sent to the president's desk by the end of the year, the source says.

more...
No comment yet.
Scoop.it!

State AGs clash with Congress over data breach laws

State AGs clash with Congress over data breach laws | HIPAA Compliance for Medical Practices | Scoop.it

Attorneys general from all 47 states with data breach notification laws are urging Congress not to preempt local rules with a federal standard.

“Any additional protections afforded consumers by a federal law must not diminish the important role states already play protecting consumers from data breaches and identity theft,” they wrote in a letter sent to congressional leaders on Tuesday.

Lawmakers have been weighing a number of measures that would create nationwide guidelines for notifying customers in the wake of a hack that exposes sensitive information. Industry groups have argued that complying with the patchwork set of rules in each state is burdensome and costly.


The rapidly rising number of breaches at retailers, banks and government agencies has only raised pressure on Congress to pass legislation.

While the concept of a federal standard has bipartisan appeal, the two parties have split over whether to totally preempt state laws.

Democrats fear a nationwide rubric that preempts state law could weaken standards in states that have moved aggressively on data breach laws. Republicans fear that an overly strict federal standard could empower overzealous government regulators.

Lawmakers also disagree on what type of breaches should trigger a notification.

The differing views have spawned a cavalcade of bills on Capitol Hill, many of which would preempt state laws.

"Given the almost constant stream of data security breaches, state attorneys general must be able to continue our robust enforcement of data breach laws," said Vermont Attorney General William Sorrell, who oversees a law that requires companies to notify officials within 14 days of discovering a breach, in a statement. "A federal law is desirable, but only if it maintains the strong consumer protection provisions in place in many states."

Many state attorneys general, including Sorrell, favor a Senate data breach offering from Sen. Patrick Leahy (D-Vt.) and co-sponsored by five other Democrats.

Notably the bill does not preempt state laws that are stricter than the standard delineated in Leahy’s bill.

It also provides a broad definition of what type of information would constitute a notification-worthy breach. It includes photos and videos in addition to more traditional sensitive data such as Social Security numbers or financial account information.

But most important for states is retaining their ability to set their own standards.

“States should also be assured continued flexibility to adapt their state laws to respond to changes in technology and data collection,” the letter said. “As we have seen over the past decade, states are better equipped to quickly adjust to the challenges presented by a data-driven economy.”

more...
No comment yet.
Scoop.it!

Data Breaches on Record Pace for 2015

Data Breaches on Record Pace for 2015 | HIPAA Compliance for Medical Practices | Scoop.it

Data breaches in 2015 are on pace to break records both in the number of breaches and records exposed, the San Diego-based Identity Theft Resource Center said.


In 2014, the number of U.S. data breaches tracked by ITRC hit a record high of 783, with 85,611,528 confirmed records exposed.

So far this year, as of June 30, the number of breaches captured on the ITRC report totaled 400 data incidents, one more than on June 30, 2014. Additionally, 117,576,693 records had been confirmed to be at risk.


That is significant given the finding of the IBM Cost of Data Breach Study conducted by the Ponemon Institute, which reported that the cost incurred for each lost or stolen record containing sensitive information averaged $154. By that measure, the roughly 117 million records confirmed at risk so far this year would represent on the order of $18 billion in potential costs.

ITRC reported a significant jump of about 85% in the number of breaches in the banking sector over the same period last year. The biggest credit union breach so far this year took place at the $308 million Winston-Salem, N.C.-based Piedmont Advantage Credit Union, which notified its entire membership of 46,000 in early March that one of its laptops containing personal information, including Social Security numbers, was missing.




Year-to-date, the five industry sectors broken down by ITRC based on the percentage of breaches were: business, 40.3%; medical/healthcare, 34.8%; banking/credit/financial, 10%; educational, 7.8%; and government/military, 7.3%.

Based on the number of confirmed records, the medical/healthcare sector reported 100,926,229 records breached, government/military reported 15,391,057, educational had 724,318, banking/credit/financial reported 408,377 and business had 126,712.


The ITRC 2015 Breach Report was compiled using data breaches confirmed by various media sources and/or notification lists from state governmental agencies.


Some breaches were not included in the report because they do not yet have reported statistics or remain unconfirmed, the firm said. 

more...
No comment yet.
Scoop.it!

HIPAA Criminal Violations on the Rise

HIPAA Criminal Violations on the Rise | HIPAA Compliance for Medical Practices | Scoop.it

Stories appear almost every day about medical records being improperly accessed, hacked or otherwise stolen. The number of stories about such thefts is almost matched by the number of stories about the high value identity thieves and others place on medical records. This confluence highlights the pressure the healthcare industry faces to protect the privacy and security of medical records in all forms.


While stories about hacking and other outside attacks garner the most attention, the biggest threat to a healthcare organization's records is most likely an insider. The threat from an insider can range from snooping (accessing and viewing records out of curiosity) to more criminal motives, such as wanting to sell medical information. Examples of criminally motivated insiders, unfortunately, are increasing.


One recent example occurred at Montefiore Medical Center in New York, where an assistant clerk allegedly stole patient names, Social Security numbers and birth dates from thousands of patients. The hospital employee then sold the information for as little as $3 per record. The individuals who acquired the information allegedly used it to go on a shopping spree across New York, running up more than $50,000 in charges.

Another recent example comes out of Providence Alaska Medical Center in Anchorage, AK, where a financial worker at the hospital provided information about patients to a friend. Unfortunately, the friend had injured those patients and was under criminal investigation; he wanted to know whether either of them had reported him to the police. Clearly, the access by the financial worker was improper.


While it could previously be said that instances of criminal convictions or indictments were rare, such examples now appear to be coming with increasing frequency. What should organizations do? Is this conduct actually preventable? As is true with HIPAA compliance generally, the key is to educate and train members of an organization’s workforce. If someone is unaware of HIPAA requirements, it is hard for them to comply.

However, it can also be extremely difficult to prevent criminal conduct altogether. If an individual has an improper motive, that individual will likely find a way to do what they want to do. From this perspective, organizations cannot prevent the conduct entirely, but they should consider what measures can be taken to mitigate the impact of improper access to or taking of information. It is a good idea to monitor and audit access to and use of information so that data leaving the organization, or being accessed inappropriately, can be caught early. Ultimately, the issue becomes how well an organization monitors its systems and how quickly it takes action when a suspected issue presents itself.
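As a minimal sketch of what such monitoring might look like in practice (the log file name, column names, and daily threshold below are illustrative assumptions, not a prescribed standard), a practice could periodically scan its EHR access logs and flag any user who views an unusually large number of distinct patient records in a single day:

import csv
from collections import defaultdict

# Hypothetical audit-log format: one row per record access,
# with columns "user_id", "patient_id", and "date" (YYYY-MM-DD).
ACCESS_LOG = "ehr_access_log.csv"   # assumed file name
DAILY_THRESHOLD = 50                # assumed threshold; tune to the practice's workload

def flag_unusual_access(log_path: str, threshold: int) -> dict:
    """Return (user_id, date) pairs whose distinct-patient access count exceeds the threshold."""
    counts = defaultdict(set)
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            counts[(row["user_id"], row["date"])].add(row["patient_id"])
    return {key: len(patients) for key, patients in counts.items() if len(patients) > threshold}

if __name__ == "__main__":
    for (user, day), n in sorted(flag_unusual_access(ACCESS_LOG, DAILY_THRESHOLD).items()):
        print(f"{day}: user {user} accessed {n} distinct patient records -- review this activity")

A simple report like this does not prevent misuse, but it shortens the time between improper access and detection, which is the point of the monitoring step described above.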


4 in 10 Midsize Businesses Have Experienced A Data Breach


Most midsize business leaders view a data breach as among their top risks, and a majority consider IT security ‘very important’ when selecting a supplier, according to The Hartford’s survey of midsize business owners and C-level executives. They have good reason to be concerned: 43 percent had experienced a data breach in the prior three years, and 13 percent had had a supplier’s data breach affect their business information.


The Hartford survey found most midsize business leaders (82 percent) consider a data breach at least a minor risk to their business. Nearly one-third (32 percent) view it as a major risk.


“All types of businesses have networks and networks can be vulnerable to a breach,” said Joe Coray, vice president of The Hartford’s Technology & Life Science Practice. “As we have seen in recent years, a breach involving a supplier or vendor can impact a business as much as a breach of its own IT systems. Whether businesses are hosting their data internally or entrusting it to external business partners, it is important that they validate how their information is being secured.”


Recognizing the data risks involving suppliers, more than half of the midsize business leaders (53 percent) surveyed consider IT security and data protection practices very important when selecting a supplier. By comparison, 36 percent consider a supplier’s contingency planning and 28 percent view a supplier’s location relative to their business as very important.


The most dangerous data breach ever known


From time to time I have the depressing task to write about yet another data loss event that caused the personal details of millions of people to fall into the hands of criminals. Usually this is credit card data, along with names and email addresses. Sometimes physical addresses are included, and occasionally even more sensitive data like Social Security numbers goes along for the ride. Usually this data was collected by a large retailer that had no qualms about storing the sensitive information, but clearly neglected to properly secure it.


Stolen data is primarily used for credit card fraud, though if there's enough information available, identity theft is a definite possibility. Millions of affected people have been forced to get new credit cards, check their statements for fraudulent charges, and rework any automated payment arrangements and whatnot. It's a big pain in the ass, and frankly, it has happened far too often, especially when once should be considered more than enough.




Heartland, Target, TJX, Anthem ... we've seen some massive data breaches over the years. But none can hold a candle to the breach the U.S. government announced last week. Not even close. On a scale of one to 10, with one being the loss of credit card numbers and names, this data loss event would conservatively be a 15.


Most people aren't aware of exactly what type of information the federal government collects on its employees, especially those with security clearances. We all have some idea that government employees have relatively strict reporting requirements for financial information, and we know that federal workers with higher clearances undergo thorough background checks and must submit to interviews of both themselves and their family and friends. This is done to flag potential problems and to prevent outside agents from having undue influence over people who may have access to sensitive information and materials.


Put simply, if you have a security clearance, the government would like to know if you have a drug problem or if you are in serious debt, because a foreign interest may try to use that situation as leverage to coerce you into revealing sensitive information. In the interest of national security, these safeguards make sense.


But the true nature and scope of the information required by the government and subsequently collected by the government on an employee is massive. Take a look at Standard Form 86. This is a 127-page form that usually takes a week or more to complete and requires the entry of the applicant's Social Security number on each page. The data included on this form is not just enough for identity theft, but enough to allow a person to literally become another person. Each Standard Form 86 fully documents the life of the subject. The only thing missing is the name of your first crush, though that might be in there somewhere too.


Some 18 million people had this level of personal data -- and more, including data collected by observers -- lost to foreign agents last week. If the government collected this data to determine whether an employee was vulnerable to undue outside influence, it has now closed that loop itself by releasing the data into the wild. All of those vulnerabilities are now known and available for exploitation by whoever stole the data, or by anyone to whom they choose to sell it. This is very, very bad.


I should also mention that many of those whose personal information was swept up in this data loss event were never even government employees in the first place. They may have filled out the forms and submitted applications, but they were never hired or they declined the job. This includes prospective TSA agents right on up through CIA employees -- the higher the position, the higher the clearance, the more sensitive the data that was collected and lost. Information on these people's infidelities, sexual fetishes, mental illnesses, criminal activities, debts, and other highly personal details is now in the hands of cyber-attackers. This is damage that cannot be undone or mitigated. We can change credit card numbers and refund fraudulent charges, but we can't change the personal data and intimate details of these people's lives. That's a permanent loss.

One could argue that however disastrous this data loss event is, the government had a requirement to store this data. It needed to collect and maintain this data, even if it failed to secure it. That said, this is the same government that is collecting a massive amount of data on all of us, whether we're prospective federal employees or not, via Internet and phone surveillance. If the federal government is lax enough to lose immeasurably sensitive information on its employees, how secure is the data that it has decided it needs to collect on everyone in the world?


Many people believe that the U.S. government shouldn't be collecting and storing this data in the first place, and that there's no need for it to do so. This event underscores the fact that maintaining such data is not just privacy invasion on a massive scale; it is actually dangerous. What happens when the next data loss event contains highly sensitive data on hundreds of millions of people? We can't put that cat back in the box no matter how we might try. The best way to guard against that possibility may be to stop collecting the data in the first place.


Patients Demand the Best Care … for Their Data


Whether it's a senior's first fitting for a hearing aid or a baby boomer in for a collagen injection, both are closely scrutinizing the new-patient forms handed to them by the office clerk. With 100 million medical records breached and stolen to date, patients have every reason to be reluctant when they're asked to fill out forms that require their Social Security number, driver's license, insurance card, and date of birth: all the ingredients for identity fraud. Patients are so squeamish about disclosing their personal information that even Medicare plans to remove Social Security numbers from patients' benefits cards.


Now patients have as much concern about protecting their medical records as they do about receiving quality care, and they’re getting savvy about data protection.  They have every right to be assured by their physician that his practice is as concerned about their privacy as he is about their health.


But despite ongoing reports of HIPAA violations and continuous breaking news about the latest widespread patient data breach, medical practices continue to treat ePHI security as a lesser priority. And they neglect to train front-office staff, so the patient who asks a receptionist where the practice stores her records either gets a quizzical look, is told the records are protected in an EHR without any explanation of how, or is told they're filed in a bank box in “the back room” without any explanation of why.


In some cases, the practice may hide the fact that office staff are throwing old paper records in a dumpster. Surprisingly, this happens over and over. Or, on the darker side, the receptionist accesses the EHR, steals patients' Social Security numbers and other personal information, and texts them to her criminal boyfriend for medical identity theft.


Another cybercrime threatening medical practices comes from hackers who compromise a server with malware and encrypt all of its medical files. They hold the records hostage and demand a ransom. Medical records can vanish, and the inability to access critical information about a patient's medical condition could end up being life threatening.

Physicians should not only encrypt all mobile devices, servers, and desktops, regularly review system activity, back up their servers, and have a disaster recovery plan in place; they should also share their security practices and policies with the patient who asks how the office is protecting her records.
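As one minimal sketch of the backup piece of that advice (the directory paths, key handling, and use of Python's cryptography library are illustrative assumptions, not a statement of how any particular practice operates), a nightly job could archive the records directory and encrypt the archive before it is copied off-site:

import tarfile
from datetime import date
from pathlib import Path
from cryptography.fernet import Fernet  # pip install cryptography

# Assumed paths; in a real deployment the key (generated once with Fernet.generate_key())
# would live in a key vault or hardware token, never alongside the backups themselves.
RECORDS_DIR = Path("/srv/ehr/records")
BACKUP_DIR = Path("/mnt/backup")
KEY_FILE = Path("/etc/backup/backup.key")

def nightly_backup() -> Path:
    """Create a dated tar archive of the records directory and encrypt it with Fernet."""
    key = KEY_FILE.read_bytes()
    archive = BACKUP_DIR / f"records-{date.today():%Y%m%d}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(RECORDS_DIR, arcname="records")
    # Reads the whole archive into memory, which is fine for a sketch but not for huge archives.
    encrypted = archive.with_suffix(archive.suffix + ".enc")
    encrypted.write_bytes(Fernet(key).encrypt(archive.read_bytes()))
    archive.unlink()  # keep only the encrypted copy
    return encrypted

if __name__ == "__main__":
    print(f"Wrote encrypted backup to {nightly_backup()}")

Pairing a job like this with periodic restore tests is what turns a backup routine into an actual disaster recovery plan.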


Otherwise, the disgruntled patient whose question about security is dismissed won't just complain to her friends over coffee; she'll spread the word on Facebook. The next time a friend on Facebook asks for a referral, the patient tells her not to go to her doctor, not because he's an incompetent surgeon, but because he didn't know the answer when she asked specifically whether the receptionist has unlimited access to her records.


And word gets out through social media that the practice is ‘behind the times.’ The doctor earns a reputation for not taking patients' questions seriously and for not putting the proper measures in place to secure their data. This is the cockroach running through the restaurant that ends up on Yelp.


It’s time to pull back the curtain and tell patients how you’re protecting their valuable data.  Hand them a HIPAA security fact sheet with key measures you’ve put in place to gain their confidence.  For example, our practice:


  • Performs annual risk assessments and implements additional safeguards, including encryption and physical security of systems that contain patient information
  • Shows patients that the organization has policies and procedures in place
  • Trains employees to recognize breach risks
  • Limits employee access to medical records
  • Backs up systems daily
  • Reviews system activity regularly


Practices that communicate to patients how they are protecting their information, whether through the front-office staff, a fact sheet, or their websites, not only instill confidence and maintain their reputations; they actually differentiate themselves in the marketplace and attract new patients away from competitors.
