HIPAA Compliance for Medical Practices
HIPAA Compliance and HIPAA Risk management Articles, Tips and Updates for Medical Practices and Physicians

When does HIPAA require more than encryption?


Encryption of sensitive electronic protected health information (ePHI) on mobile devices – including PCs – is often considered sufficient to protect that data and achieve HIPAA compliance. However, it’s important that those handling this data understand the circumstances in which encryption alone is not enough.


These situations do exist – and can be nightmares if they occur. The Department of Health and Human Services' HIPAA Security Rule describes satisfactory encryption as “an algorithmic process to transform data into a form in which there is a low probability of assigning meaning without use of a confidential process or key … and such confidential process or key that might enable decryption has not been breached.” That last part means that encryption is only adequate as a safeguard for HIPAA-protected ePHI if the situation is such that the encryption still secures the data.


There are several scenarios where even encrypted data can be breached relatively easily and, unfortunately, there are many real world examples of each of these scenarios occurring. The trouble with encrypted data is that it needs to be decrypted to be useful to those who would access it legitimately, and the bad guys will look to take advantage of those moments when encryption’s defenses are down. Encryption is a powerful defense for data when a device’s power is off and for when the password is unknown and can’t be learned or hacked. But putting it that way, we’ve actually rather narrowly defined where encryption is effective.
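
To make that boundary concrete, here is a minimal sketch of symmetric encryption at work. It assumes the third-party Python cryptography package (not mentioned in the article) and simply illustrates the point above: the ciphertext is unreadable without the key, and fully readable to anyone who obtains it.

```python
# Minimal sketch: encrypted ePHI is only as safe as its key.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # the "confidential process or key" HIPAA refers to
cipher = Fernet(key)

ephi = b"Patient: Jane Doe (dummy record), Dx: hypertension"
token = cipher.encrypt(ephi)       # safe to store on a lost or stolen laptop

# Anyone holding the key recovers the plaintext instantly, which is why a
# stolen password or a key taped to the device negates the protection.
print(Fernet(key).decrypt(token))
```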


Here are some cases where it isn’t.


1. The data thief gains the password needed to get around the encryption on an ePHI-filled device. This can happen when the password is stolen along with the device - for example, if a laptop is taken along with a user’s notepad containing the password needed to access ePHI. HIPAA requires not only encrypting sensitive data but also paying attention to the safety of passwords and other means of access. Bad password security effectively negates encryption. Too often we’ve seen a sticky note of passwords attached to a laptop – or even passwords written on USB devices themselves – a clear example of encryption that is not HIPAA-secure.


In another type of case at Boston’s Brigham and Women’s Hospital, a physician was robbed at gunpoint and threatened into disclosing the pass codes on the laptop and cellphone that were taken from him, each of which contained ePHI. The doctor appears to have done all that could be done to comply with HIPAA as far as keeping data encrypted, but when forced to choose between personal health information and actual personal health, he made the reasonable choice. Still, the incident was a HIPAA breach, requiring patients and officials to be notified.


2. The stolen device is already running and an authorized user has already been authenticated. In this scenario, the legitimate user has already given his or her credentials and has a session accessing ePHI running when an unauthorized user gains control of the device. HIPAA contains measures to minimize the likelihood of this scenario, calling for the issue to be addressed with automatic log-off capability to “terminate an electronic session after a predetermined time of inactivity.” Still, authorized users should take care to close out sessions themselves if stepping away from their devices and leaving them unguarded.
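
To illustrate the automatic log-off safeguard, the sketch below shows an inactivity timer that terminates a session after a predetermined idle period. The session class and time limit are hypothetical examples, not code from any EHR product.

```python
import time

INACTIVITY_LIMIT_SECONDS = 10 * 60   # predetermined time of inactivity (hypothetical)

class EphiSession:
    """Hypothetical session wrapper that logs itself off when left idle."""

    def __init__(self, user: str):
        self.user = user
        self.active = True
        self.last_activity = time.monotonic()

    def touch(self) -> None:
        # Called on every legitimate user action to reset the idle timer.
        self.last_activity = time.monotonic()

    def is_active(self) -> bool:
        # Terminate the electronic session once the idle limit is exceeded.
        if self.active and time.monotonic() - self.last_activity > INACTIVITY_LIMIT_SECONDS:
            self.terminate()
        return self.active

    def terminate(self) -> None:
        # After this, an unattended device presents only a login prompt.
        self.active = False
```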


3. A formerly authorized user becomes unauthorized, but still has access. This can happen when an employee quits or is terminated but still possesses the hardware and passwords needed to bypass encryption. A case such as this occurred at an East Texas hospital, where a former employee was recently sentenced to federal prison for obtaining HIPAA-protected health information with the intent to sell, transfer or otherwise use the data for personal gain. Criminals in these cases often use ePHI for credit card fraud or identity theft, demonstrating how important HIPAA safeguards can be to the patients they protect.


So how can ePHI be protected beyond encryption?


The safest security system to have in place when encountering each of these scenarios is one where the organization retains control over the data, and the devices containing ePHI are equipped with the ability to defend themselves automatically.


The fact is that employees will always seek and find ways to be their most productive, meaning that policies trying to keep ePHI off of certain devices are, for all intents and purposes, doomed to be burdensome and disrespected. For doctors and other healthcare staff, productivity trumps security. It’s best to take concerns around security off their plate and provide it at an organizational level. Organizations can implement strategies that maintain regular invisible communications between the IT department and all devices used for work with ePHI in a way that isn’t cumbersome to the user. Through these communications, the IT department can access devices to remotely block or delete sensitive data and revoke access by former employees. Software installed on devices can detect security risks and respond with appropriate pre-determined responses, even when communication can’t be established.
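
In rough outline, the organization-level control described above might look like the following device check-in sketch. The URL, directory, and response fields are hypothetical; in practice this role is usually filled by a mobile device management (MDM) or endpoint-security product rather than hand-written code.

```python
import shutil
import requests  # third-party: pip install requests

CHECK_IN_URL = "https://mdm.example-hospital.org/api/device-status"  # hypothetical
EPHI_CACHE_DIR = "/var/ephi-cache"                                   # hypothetical

def check_in(device_id: str) -> None:
    """Ask the IT back end whether this device may keep its locally cached ePHI."""
    try:
        resp = requests.get(CHECK_IN_URL, params={"device": device_id}, timeout=10)
        resp.raise_for_status()
        status = resp.json().get("status", "unknown")
    except requests.RequestException:
        # No communication possible: fall back to a pre-determined local response.
        status = "unreachable"

    if status in ("revoked", "reported_stolen"):
        # Remote block/delete: wipe the locally cached ePHI.
        shutil.rmtree(EPHI_CACHE_DIR, ignore_errors=True)
    elif status == "unreachable":
        lock_local_access()

def lock_local_access() -> None:
    # Placeholder for whatever local lockdown the organization has pre-determined.
    pass
```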


Given the high stakes of HIPAA compliance – where a single breach can lead to government fines and costly reputational damage – it would be wise for healthcare organizations to consider encryption only the beginning when it comes to their data security.


Don't confuse EHR HIPAA compliance with total HIPAA compliance


Electronic health records (EHR) systems are revolutionizing the collection and standardization of patient medical information. Never before has it been so easy for healthcare practitioners to have patient information so readily available, allowing for more efficient and accurate care.


Unfortunately, what many organizations today don’t realize is that even if their EHR system is compliant with HIPAA security standards, their entity as a whole may not be fully compliant.

Every healthcare organization under HIPAA is responsible for the protection of patient data, regardless of whether it uses a vendor to process or store its patient records. If your EHR vendor claims you don’t have to worry about HIPAA compliance, don’t believe them – it’s just not true.


Privacy and security are much more than simply having a HIPAA-compliant EHR. It is truly frightening when I hear a healthcare company, or even worse, an EHR vendor, claim their EHR system covers all of a healthcare company’s HIPAA requirements. Even for cloud-based EHR systems, this simply is not the case.

Maintaining a secure EHR system

The newly revised HIPAA Security Rule requires providers to assess the security of their databases, applications, and systems that contain patient data against a list of 75 specific security controls. These controls specify safeguards that must be in place to protect PHI.


In our ever-changing digital environment, it’s critical that healthcare organizations regularly assess their security programs as a whole to ensure they have the policies, procedures, and security measures in place to better protect patient information and avoid costly regulatory enforcements.


Unfortunately, addressing risks to electronic patient data is not always a top priority.


We need to get the message out that HIPAA compliance (and the protection of patient data) cannot be relegated to simply checking a box (i.e., my EHR system is compliant, therefore, my practice is compliant, too). HIPAA compliance must, instead, be addressed across an organization wherever patient data is present.

Understand current security measures

The ongoing responsibility of managing patient data throughout an organization requires an organized, well-thought-out approach to risk management. No matter how small or long established, it’s critical for healthcare entities to understand what they are doing to protect patient data, what they are not doing, and what they should be doing in the future.


While some EHR systems and their related equipment have security features built into or provided as part of a service, they are not always configured or enabled properly. In addition, medical equipment is often web-enabled (can connect remotely to send information to a server), but that equipment may not be checked for proper security.

As the guardian of patient health information, each healthcare organization must learn and understand the basic features of its IT assets and medical devices, what security mechanisms are in place, and how to use them.


There are a number of actions an entity can take to make sure that their EHR systems and IT assets are secure. Such measures leverage an integrated use of data loss prevention tools, intrusion prevention, anti-malware, file integrity monitoring, robust identity management and authentication programs, role-based access and data security solutions.
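
One of the measures listed above, file integrity monitoring, comes down to a simple idea: record a cryptographic hash of each sensitive file and alert when a hash changes. A minimal sketch follows; the directory and baseline file names are hypothetical.

```python
import hashlib
import json
from pathlib import Path

MONITORED_DIR = Path("/opt/ehr/config")          # hypothetical directory to watch
BASELINE_FILE = Path("integrity_baseline.json")  # hypothetical baseline location

def file_hash(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def build_baseline() -> None:
    # Record a known-good hash for every file under the monitored directory.
    baseline = {str(p): file_hash(p) for p in MONITORED_DIR.rglob("*") if p.is_file()}
    BASELINE_FILE.write_text(json.dumps(baseline, indent=2))

def check_integrity() -> list[str]:
    """Return files that were changed or removed since the baseline was taken."""
    baseline = json.loads(BASELINE_FILE.read_text())
    return [p for p, digest in baseline.items()
            if not Path(p).is_file() or file_hash(Path(p)) != digest]
```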

The road to HIPAA compliance

Creating adequate safeguards does not happen overnight. While it may seem overwhelming and time-consuming at first (due to HIPAA’s complex nature), the biggest obstacle to overcome is actually getting the entire process started.


Begin by carving out a regular weekly routine – perhaps starting with 30 minutes per week in which the staff members responsible for HIPAA compliance meet to discuss the privacy and security of patient data.


Here are some specific actions your entity should take when working to protect patient information:

  • Designate a HIPAA compliance officer or team member. Clearly and specifically lay out the roles of everyone in your organization with HIPAA compliance responsibilities.
  • Ensure that access to ePHI is restricted based on an individual’s job roles and/or responsibilities.
  • Conduct an annual HIPAA security risk analysis (specifically required under HIPAA rules). This can involve regularly engaging with a trusted provider that can remotely monitor and maintain your network and devices to ensure ongoing security.
  • Mitigate and address any risks identified during your HIPAA risk analysis, including deficient security, administrative, and physical controls; access to environments where ePHI is stored; and gaps in your disaster recovery plan.
  • Make sure your policies and procedures match up to the requirements of HIPAA.
  • Require user authentication, such as passwords or PINs, that limits access to patient information to authorized individuals only.
  • Encrypt patient information using a key known or made available only to authorized individuals.
  • Incorporate audit trails, which record who accessed your information, what changes were made, and when they were made, providing an additional layer of security (a brief sketch of this, together with role-based access, follows this list).
  • Implement workstation security, which ensures the computer terminals that access individual health records cannot be used by unauthorized persons.
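
As noted in the audit-trail item above, role-based access and audit logging reinforce each other: the access check decides, and the audit trail records the decision. The sketch below is illustrative only; the roles, record IDs, and log file are hypothetical, and a real practice would rely on the access-control and auditing features built into its EHR.

```python
import json
import logging
from datetime import datetime, timezone

# Append-only audit trail: who accessed what, when, and whether it was allowed.
logging.basicConfig(filename="ephi_audit.log", level=logging.INFO, format="%(message)s")

ROLE_PERMISSIONS = {              # hypothetical role model
    "physician": {"read", "write"},
    "billing": {"read"},
    "front_desk": set(),
}

def access_record(user: str, role: str, record_id: str, action: str) -> bool:
    """Allow the action only if the role permits it, and audit every attempt."""
    allowed = action in ROLE_PERMISSIONS.get(role, set())
    logging.info(json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "role": role,
        "record": record_id,
        "action": action,
        "allowed": allowed,
    }))
    return allowed

# Example: a billing clerk may read a record but not modify it.
access_record("jsmith", "billing", "record-123", "read")   # returns True, logged
access_record("jsmith", "billing", "record-123", "write")  # returns False, logged
```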


Privacy and security concerns are key when it comes to HIPAA, but it’s also important to ensure your enterprise as a whole is protected. With 75 different requirements that fall under the HIPAA Security Rule umbrella, it’s critical to ensure all systems where ePHI resides are protected. Otherwise, organizations are placing themselves and their patients at serious risk.



Doctors' offices must be wary of data breaches as use of electronic records grows


As physician practices switch from paper charts to electronic medical records, they're coming under more pressure to guard against data breaches that deliver sensitive patient records and valuable personal information into the hands of hackers.


Nelson Gomes is chief executive of PriorityOne Group, which specializes in providing information technology for medical facilities. He said he’s concerned that small physician practices, which are the norm throughout New Jersey, aren’t doing enough to safeguard patient information. Federal law requires medical facilities to protect the privacy and security of patient records and in 2013 that rule was extended to the business associates of medical facilities.

Gomes said a small medical practice “needs to understand that just because they are small doesn’t mean they aren’t targets; in fact, they may just be easier targets.”

He tells clients that safeguarding patient data is a continuous commitment. Physicians need to have a plan in place and be able to demonstrate that it’s working: “Your staff needs to be trained and you should be able to prove and document that your staff is trained on how to deal with a breach.”

The public hears on almost a daily basis about data breaches at big national retailers like Target. Data breaches at small physician practices get less media attention, in part because each incident may involve only a few thousand patient records, or may go undetected, according to a report on physician practice data security by the IT consulting firm Spruce Technology in Clifton.

According to Spruce Technology, “With a strong data security strategy in place and the right person designated as data security officer, physicians’ practices and other small health care organizations can go a long way toward protecting themselves and their patients.”

Anthony Curro, chief executive of Spruce Technology, said hackers increasingly “are looking at doctor practices, which have an incredible amount of data. Vigilance (to protect that data) needs to continue to grow — and I think we are starting to see that happen.”

Srini Penumella, managing partner of Spruce Technology, said data security “is a combination of people, processes and technology.”  He said for a small physician office “there are (IT security) tools that they can implement, and that they can afford.” He said physicians “may not be aware of the tools that are available, or think it’s too much of a hassle to change their existing system. But if you look at the benefits, it’s really doable and accessible.”

Jarrett Farrell is vice president of business development at the Flemington insurance agency Cedar Risk Management.

Farrell pointed out that, when a data breach occurs, the physician practice will typically provide free credit monitoring for a year — which he said can cost between $250 and $400 per patient. “Those numbers can add up pretty quickly.”

One strategy he recommends is cyber-liability insurance protection to help cover the costs incurred after a data breach.

Sean Keegan is director of benefit services at Cedar Risk Management, which develops health benefits plans for employers.
“We deal with paperwork that has personal information, and we use encrypted emails when we are conversing with insurance carriers, or changing someone’s coverage,” Keegan said. “Everything is going back and forth in encrypted files: there’s no more paper faxes sitting around on machines.”

But not all physician offices have embraced encryption, experts said. And that can become a weak link that is compounded by the human error factor.

Farrell said his company was called in recently by a dental practice that had lost a laptop.

“Someone was at Starbucks doing work on a Saturday. They got up and came back and their laptop was stolen. All the patient information was there — and it did not have the encryption that should have been there.”

Princeton attorney Helen Oscislawski specializes in regulatory matters, with an emphasis on compliance with the main federal law covering the privacy of patient information: the Health Insurance Portability and Accountability Act, or HIPAA. She pointed out that, since 2008, there have been a growing number of settlement agreements stemming from HIPAA enforcement actions brought by the federal government, resulting in fines paid by health care organizations.

“Some of the lessons learned are that you have no excuse other than to implement encryption,” she said.

She said physicians need to spend more time on prevention of data breaches. “Physicians are not fully aware of what they need to do, because still to this day there is a huge lack of education.”

She said physician practices may rely on outside IT vendors to provide them with security technology — but when something goes wrong, the physician group is responsible.

“This really is a collaborative effort and, at the end of the day, neither the physicians nor the (IT) vendor should be pointing fingers. Both need to take ownership and responsibility.”

Darryl Neier, who heads the forensic accounting/litigation group at Sobel & Co., said data breaches in the health care space are on the rise “because the people who are perpetrating these crimes will go where the data is” — and health care is a treasure trove of personal data.

He said health care providers have to be accountable for how they safeguard that data and, in the event of a breach, they have to be prepared to answer questions like: “How are you protecting your data, how are you locking this down — and are you allowing this to happen?”




Stage 3 Meaningful Use: Breaking Down HIPAA Rules


CMS released its Stage 3 Meaningful Use proposal last month, with numerous aspects that covered entities (CEs) need to be aware of and pay attention to. While the proposal has a large focus on EHR interoperability, it continues to build on the previously established frameworks in Stage 1 and Stage 2 – including keeping patient information secure.


HIPAA rules and regulations cannot be thrown out the window as CEs work toward meeting meaningful use requirements. We’ll break down the finer points of Stage 3 Meaningful Use as it relates to data security, and how organizations can remain HIPAA compliant while also making progress in the Meaningful Use program.


Stage 3 further protects patient information


One of the top objectives for Stage 3 Meaningful Use is to protect patient information. The proposal recommends new technical, physical, and administrative safeguards that impose stricter, more specific requirements for keeping patient data secure.


The new proposal addresses how the encryption of patient electronic health information continues to be essential for the EHR Incentive Programs. Moreover, it explains that relevant entities will need to conduct risk analysis and risk management processes, as well as develop contingency plans and training programs.


In order to receive EHR incentive payments, covered entities must perform a security risk analysis. However, these analyses must go beyond just reviewing the data that is stored in an organization’s EHR. CEs need to address all electronic protected health information they maintain.


It is also important to remember that installing a certified EHR does not fulfill the Meaningful Use security analysis requirement. This security aspect ensures that all ePHI maintained by an organization is reviewed. For example, any electronic devices – tablets, laptops, mobile phones – that store, capture, or modify ePHI need to be examined for security.

“Review all electronic devices that store, capture, or modify electronic protected health information,” states the ONC website. “Include your EHR hardware and software and devices that can access your EHR data (e.g., your tablet computer, your practice manager’s mobile phone). Remember that copiers also store data.”


It is also important to regularly review the existing security infrastructure, identify potential threats, and then prioritize the discovered risks. For example, a risk analysis could reveal that an organization needs to update its system software, change its workflow processes or storage methods, review and modify policies and procedures, schedule additional staff training, or take other corrective action to eliminate identified security deficiencies.

A security risk analysis does not necessarily need to be done every year; CEs only need to conduct one when they adopt an EHR. When a facility changes its setup or alters its electronic systems, for example, it is time to review the analysis and update it for any resulting changes in risk.


Stage 3 works with HIPAA regulations


In terms of patient data security, it is important to understand that the Stage 3 Meaningful Use rule works with HIPAA – the two are able to complement one another.


“Consistent with HIPAA and its implementing regulations, and as we stated under both the Stage 1 and Stage 2 final rules (75 FR 44368 through 44369 and 77 FR 54002 through 54003), protecting ePHI remains essential to all aspects of meaningful use under the EHR Incentive Programs,” CMS wrote in its proposal. “We remain cognizant that unintended or unlawful disclosures of ePHI could diminish consumer confidence in EHRs and the overall exchange of ePHI.”

As EHRs become more common, CMS explained that protecting ePHI becomes more instrumental to the success of the EHR Incentive Programs. However, CMS acknowledged that there had been some confusion in the previous rules when it came to HIPAA requirements and the requirements for the meaningful use core objective:


For the proposed Stage 3 objective, we have added language to the security requirements for the implementation of appropriate technical, administrative, and physical safeguards. We propose to include administrative and physical safeguards because an entity would require technical, administrative, and physical safeguards to enable it to implement risk management security measures to reduce the risks and vulnerabilities identified.


CMS added that even as it worked to clarify security requirements under Stage 3, their proposal was not designed “to supersede or satisfy the broader, separate requirements under the HIPAA Security Rule and other rulemaking.”


For example, the CMS proposal narrows the requirements for a security risk analysis in terms of meaningful use requirements. Stage 3 states that the analysis must be done when CEHRT is installed or when a facility upgrades to a new certified EHR technology edition. From there, providers need to review the CEHRT security risk analysis, as well as the implemented safeguards, “as necessary, but at least once per EHR reporting period.”


However, CMS points out that HIPAA requirements “must assess the potential risks and vulnerabilities to the confidentiality, availability, and integrity of all ePHI that an organization creates, receives, maintains, or transmits” in all electronic forms.


Working toward secure exchange


The Stage 3 Meaningful Use proposal encourages CEs to work toward health information exchange and to focus on better health outcomes for patients. As healthcare facilities work toward both of these goals, it is essential that health data security still remains a priority and that PHI stays safe.


While HIPAA compliance helps CEs avoid federal fines, it also helps ensure that those facilities keep patient information out of the wrong hands. The right balance needs to be found between health information security and health information exchange.



Why health IT companies may not take HIPAA seriously until 2016 | mHealthNews


When the Final Omnibus Rule came into effect on March 23, 2013, the intent was to make business associates (BAs) more accountable for the protection of the data they were managing on behalf of covered entities (CEs) such as hospitals or health plans. Prior to this, BAs were only liable for whatever was put into a business associate agreement (BAA) by the CE, and even then that liability was restricted to any civil action that might be taken by the CE.

However, the Final Omnibus Rule extended the same federal provisions to BAs that had previously been restricted to CEs, meaning that whether a business associate signed a BAA or not, they were federally required to operate in accordance with the Security, Privacy and Breach Notification rules. Failure to do so could result in federal penalties of up to $1.5 million per breach type, and even criminal prosecution.

This change was driven by the fact that an increasing percentage of healthcare data is being managed by BAs such as health IT vendors. While covered entities still account for the majority of breach incidents, BAs are responsible for most of the records breached.

However, after an initial flurry of activity before and after this date, most business associates have responded to this change with general apathy. Being in a position to talk every day to companies that operate as business associates, I am repeatedly underwhelmed by their efforts to take security and compliance seriously, despite this change in the law. Indeed, even when offered the chance to enhance their security posture and, by extension, their compliance with HIPAA regulations in a simple and affordable manner, many decline to do so, citing a conflict of priorities. It's not that they are necessarily unaware of the potential consequences – rather, they simply do not see it as a sufficient priority. They often see themselves as being too small, or believe they first need to build a business before worrying about protecting it. And the reality is they see no immediate consequence to their procrastination.

It's like the speed limit being reduced from 65 mph to 55 mph. While notices are posted, after initial caution by drivers, they see no police cars on the side of the road or any evidence that anyone is being pulled over, so they don’t reduce their speed. Indeed, as more cars come onto the freeway some start to go faster, which encourages others to follow suit. Everyone knows they are speeding, but then everyone else is doing it and no one seems to be getting penalized for it.

The challenge for companies is that while there may not be visible enforcement right now, that is because it takes a while for breaches to be discovered, investigated and adjudicated – on average about three years. Most HIPAA judgments being pronounced today relate to breaches that occurred in 2011.

So to extend the previous analogy, while there may not be police visible on the side of the road, there are speed cameras. The violators will not receive their speeding ticket until a considerable time after the offence was committed, meaning they continue to speed long after their first offence.

In terms of HIPAA enforcement that means most judgments will not become public until 2016, at which time I would hope most BAs will already have realized that it can happen to them, and will have started making adequate protections an imperative.  But until they do, they will need to hope they do not drive past an OCR speed camera.


