HIPAA Compliance for Medical Practices
HIPAA Compliance and HIPAA Risk Management Articles, Tips and Updates for Medical Practices and Physicians

How Do HIPAA Regulations Affect Judicial Proceedings?


HIPAA regulations are designed to keep healthcare organizations compliant, ensuring that sensitive data - such as patient PHI - stays secure. Should a healthcare data breach occur, covered entities or their business associates will be held accountable, and will likely need to make adjustments to their data security approach to prevent the same type of incident from happening again.


However, there are often questions and concerns about how HIPAA regulations tie into certain judicial or administrative proceedings. For example, if a subpoena or search warrant is issued to a hospital, is that organization obligated to supply the information? What if the information being sought qualifies as PHI? Can covered entities be held accountable if they release certain information, and that data then falls into unauthorized individuals' control?


This week, HealthITSecurity.com will break down how judicial proceedings, and other types of legal action, could potentially be impacted by HIPAA regulations. We will discuss how PHI could possibly be disclosed, and review cases where search warrants and similar issues were affected by HIPAA.


What does HIPAA say about searches and legal inquiries?

The HIPAA Privacy Rule states that there are several permitted uses and disclosures of PHI. This does not mean that covered entities are required to disclose PHI without an individual’s permission, but healthcare organizations are permitted to do so under certain circumstances.


“Covered entities may rely on professional ethics and best judgments in deciding which of these permissive uses and disclosures to make,” the Privacy Rule explains.


The six examples of permitted uses and disclosures are the following:

  • To the Individual (unless required for access or accounting of disclosures)
  • Treatment, Payment, and Health Care Operations
  • Opportunity to Agree or Object
  • Incident to an otherwise permitted use and disclosure
  • Public Interest and Benefit Activities
  • Limited Data Set for the purposes of research, public health or health care operations.


Under the public interest and benefit activities, the Privacy Rule dictates that there are “important uses made of health information outside of the healthcare context.” Moreover, a balance must be found between individual privacy and the interest of the public.

There are several examples that relate to disclosing PHI due to types of legal action:


  • Required by law
  • Judicial and administrative proceedings
  • Law enforcement purposes


Covered entities and their business associates are permitted to disclose PHI as required by statute, regulation or court orders.

“Such information may also be disclosed in response to a subpoena or other lawful process if certain assurances regarding notice to the individual or a protective order are provided,” according to the HHS website.


For “law enforcement purposes,” HIPAA regulations state that PHI can also be disclosed to help identify or locate a suspect, fugitive, material witness, or missing person. Law enforcement can also make requests for information if they are trying to learn more about a victim - or suspected victim. Another important aspect to understand is that a covered entity can disclose sensitive information if it believes that PHI is evidence of a crime that took place on the premises. Even if the organization does not think that a crime took place on its property, HIPAA regulations state that PHI can be disclosed “when necessary to inform law enforcement about the commission and nature of a crime, the location of the crime or crime victims, and the perpetrator of the crime.”


Essentially, covered entities and business associates must use their own judgment when determining whether it is appropriate to release PHI without an individual’s knowledge. For example, if local law enforcement wants more information from a hospital about a former patient who it believes is dangerous, it is up to the hospital to weigh the options of releasing the information.

How have HIPAA regulations affected court rulings?

There have been several court rulings in the last year discussing HIPAA regulations and how covered entities are allowed to release PHI.


Connecticut: The Connecticut Supreme Court ruled in November 2014 that patients can sue a medical office for HIPAA negligence if it violates regulations that dictate how healthcare organizations must maintain patient confidentiality. In that case, a patient found out that she was pregnant in 2004 and asked her medical facility to not release the medical information to the child’s father. However, the organization released the patient’s information when it received a subpoena. The case claimed that the medical office was negligent in releasing the information, and that the child’s father used the information for “a campaign of harm, ridicule, embarrassment and extortion” against the patient.


Florida: Just one month earlier, a Florida federal appeals court ruled that it is not a HIPAA violation for physician defendants to have equal access to plaintiffs’ health information. In this case, a patient sued his doctor for medical negligence. Florida law states that the plaintiff must provide a health history, including copies of all medical records the plaintiff’s experts relied upon in forming their opinions and an “executed authorization form” permitting the release of medical information. However, the plaintiff claimed the move would violate his privacy. The appeals court ruled that two instances applied in this case where HIPAA regulations state that covered entities are permitted to release PHI.


As demonstrated in these two court cases, it is not always easy for covered entities to determine on their own when they are compromising patient privacy and when they are adhering to a court order. However, by seeking appropriate counsel, healthcare organizations can work on finding a solution that meets the needs of all parties involved.


Mega-Mergers: The Security, Privacy Concerns


Mergers and acquisitions, such as two pending mega-deals in the health insurance sector, pose security and privacy risks that need to be addressed before the transactions are completed, during the integration process and over the long haul.


In recent weeks, Anthem Inc. announced plans to buy rival Cigna for $48 billion, and Aetna unveiled a proposed $37 billion purchase of Humana.


"I can't speak specifically to these mergers, but in general they share the same challenges as others going through M&As," says Mac McMillan, CEO of the security consulting firm CynergisTek. Interoperability of systems, consolidation or merging of databases, differing architectures, disparate platforms, consolidation of accounts and accesses conversion of users are among the potential hurdles these companies face, he notes.


"For organizations this large, there is nothing trivial about integrating their networks, systems or controls," McMillan says. "The biggest issues are always disparate systems, controls and interoperability and the privacy and security issues those challenges can create."


When it comes to mergers, privacy and security attorney Stephen Wu of the law firm Silicon Valley Law Group notes, "I'm most worried about companies not doing enough diligence about security when these acquisitions are being considered. ... It's becoming increasingly complex to integrate two companies' IT infrastructures, and those transitions create new vulnerabilities."


Concerning Anthem's proposed purchase of Cigna, Wu says Anthem's recent hacker attack, which affected nearly 80 million individuals, "shouldn't be downplayed, but I'd be more concerned about Cigna and whether that company also potentially had a breach that perhaps hasn't been discovered yet."


Privacy attorney Kirk Nahra of the law firm Wiley Rein LLP notes that the transition period after two companies merge presents new risks. "Because of the tremendous concerns about data security and cybersecurity breaches, integration of overall security is a particular challenge," he says. "It is easier to attack a hybrid, half-integrated company than two separate companies."


Anthem's proposed acquisition of Cigna comes "at a time where Anthem is under a lot of pressure with respect to its information security, [and] the acquisition of another large insurer represents a lot more to add to its plate," notes privacy attorney Adam Greene of the law firm Davis Wright Tremaine.


"It will need to integrate its information security processes into a host of new systems, with each new, potentially unfamiliar system bringing new risks if not properly integrated," he says.

Critical Decisions

When mergers and acquisitions are completed, a big challenge is picking and choosing whose information security program will dominate going forward.


"Often times, the information security program of the larger entity takes over the smaller," Greene notes. "In good situations, each entity learns from the other and the overall information security is improved, after a painful integration process. But sometimes the reverse happens, and good information security practices are abandoned because they are not practiced by the larger entity."


McMillan says merging organizations should "take an inventory of which set of controls, processes, technologies, etc. are either the most mature or the best overall." Then they can consider merging the programs, "the same way they merge organizations - capitalizing on the best of both."


While that best-of-breed-themed approach might work well in some mergers and acquisitions, typically things don't end up going that smoothly, Nahra contends.


"There are two kinds of challenges - inconsistencies in practices, either involving data security or privacy, and then operational implications of these inconsistencies, where one of the entities tries to apply its process or practices to the differing practices or operations of the other," Nahra says. "These challenges are exacerbated when there hasn't been a lot of due diligence on privacy/data security issues."

Access Control

One issue that's frequently overlooked during the blending of IT networks of merging companies is access control, says Rebecca Herold, partner and co-founder of SIMBUS Security and Privacy Services.


When an organization is undergoing a merger, some employees typically lose their jobs because their role duplicates another's role, Herold says. "But the company keeps them on for a certain amount of time because they are training another person or finishing up on a project," she says. "However, during this time, I've seen disgruntled insiders who have access to information or administrative controls and have tried to sabotage the company that fired them."


Often executives don't have insight into all the risks that are involved with blending computer networks, says Herold, who's served as an adviser to merged organizations.


"They want to join or connect the networks in some way, but there are huge risks. When you start connecting one huge network with another one, and start sharing data without proper planning, there are new vulnerabilities and risks that emerge," she says.


If the companies involved in the latest wave of healthcare sector mergers and acquisitions get the regulatory and shareholder approval needed to complete their transactions, they need to keep a few security tips in mind, McMillan says.


"The biggest tip is common sense: Don't undo anything that is currently in place to ensure continuity until what's new is in place and backed up," he says.


Hospital with repeat security failures hit with $218K HIPAA fine


Does your hospital permit employees to use a file-sharing app to store patients' protected health information? Better think again. A Massachusetts hospital is paying up and reevaluating its privacy and security policies after a file-sharing complaint and a subsequent HIPAA breach.


St. Elizabeth's Medical Center in Brighton, Mass. – a member hospital of Steward Health Care system – will pay $218,400 to the Office for Civil Rights for alleged HIPAA violations. The settlement resulted from a 2012 complaint filed by hospital employees, stating that the medical center was using a Web-based document-sharing application to store data containing protected health information. By failing to adequately analyze the security risks of that application, the hospital put the PHI of nearly 500 patients at risk.


"Organizations must pay particular attention to HIPAA's requirements when using Internet-based document sharing applications," said Jocelyn Samuels, OCR director, in a July 10 statement announcing the settlement. "In order to reduce potential risks and vulnerabilities, all workforce members must follow all policies and procedures, and entities must ensure that incidents are reported and mitigated in a timely manner."


It wasn't just the complaint that got St. Elizabeth's in hot water, however. A HIPAA breach reported by the medical center in 2014 also called attention to the lack of adequate security policies. The hospital notified OCR in August of last year of a breach involving unsecured PHI stored on the personal laptop and USB drive of a former hospital employee. The breach ultimately impacted 595 patients, according to a July 10 OCR bulletin.


As part of the settlement, St. Elizabeth's will also be required to "cure the gaps in the organization's HIPAA compliance program," OCR officials wrote in the bulletin. More specifically, this includes conducting a self-assessment of its employees' awareness and compliance with hospital privacy and security policies. Part of this assessment will involve "unannounced visits" to various hospital departments to assess policy implementations. Officials will also interview a total of 15 "randomly selected" employees with access to PHI. Additionally, at least three portable devices across each department with access to PHI will be inspected.


Then there's the policies and training piece of the settlement: based on the assessment, St. Elizabeth's will submit revised policies and training materials to HHS for approval.


In addition to the filed complaint and the 2014 breach, the medical center also reported an earlier HIPAA breach in 2012, when paper records containing billing data, credit card numbers and security codes of nearly 7,000 patients were not properly shredded by the hospital. Some of the files containing the data were reportedly found blowing in a field nearby.


To date, OCR has collected nearly $26.4 million in settlements and penalties from covered entities and business associates found to have violated HIPAA privacy, security and breach notification rules.


The largest settlement to date was the whopping $4.8 million fine paid by New York Presbyterian Hospital and Columbia University Medical Center after a single physician accidentally deactivated an entire computer server, resulting in ePHI being posted on Internet search engines. 


State AGs clash with Congress over data breach laws


Attorneys general from all 47 states with data breach notification laws are urging Congress not to preempt local rules with a federal standard.

“Any additional protections afforded consumers by a federal law must not diminish the important role states already play protecting consumers from data breaches and identity theft,” they wrote in a letter sent to congressional leaders on Tuesday.

Lawmakers have been weighing a number of measures that would create nationwide guidelines for notifying customers in the wake of a hack that exposes sensitive information. Industry groups have argued that complying with the patchwork set of rules in each state is burdensome and costly.


The rapidly rising number of breaches at retailers, banks and government agencies has only raised pressure on Congress to pass legislation.

While the concept of a federal standard has bipartisan appeal, the two parties have split over whether to totally preempt state laws.

Democrats fear a nationwide rubric that preempts state law could weaken standards in states that have moved aggressively on data breach laws. Republicans fear that an overly strict federal standard could empower overzealous government regulators.

Lawmakers also disagree on what type of breaches should trigger a notification.

The differing views have spawned a cavalcade of bills on Capitol Hill, many of which would preempt state laws.

“Given the almost constant stream of data security breaches, state attorneys general must be able to continue our robust enforcement of data breach laws,” said Vermont Attorney General William Sorrell, who oversees a law that requires companies to notify officials within 14 days of discovering a breach, in a statement. “A federal law is desirable, but only if it maintains the strong consumer protection provisions in place in many states.”

Many state attorneys general, including Sorrell, favor a Senate data breach bill offered by Sen. Patrick Leahy (D-Vt.) and co-sponsored by five other Democrats.

Notably the bill does not preempt state laws that are stricter than the standard delineated in Leahy’s bill.

It also provides a broad definition of what type of information would constitute a notification-worthy breach. It includes photos and videos in addition to more traditional sensitive data such as Social Security numbers or financial account information.

But most important for states is retaining their ability to set their own standards.

“States should also be assured continued flexibility to adapt their state laws to respond to changes in technology and data collection,” the letter said. “As we have seen over the past decade, states are better equipped to quickly adjust to the challenges presented by a data-driven economy.”


When does HIPAA require more than encryption?


Encryption of sensitive electronic protected health information (ePHI) on mobile devices – including PCs – is often considered sufficient to protect that data well enough to achieve HIPAA compliance. However, it’s important that those handling this data understand the circumstances where encryption alone is not enough.


These situations do exist – and can be nightmares if they occur. The Department of Health and Human Services' HIPAA Security Rule describes satisfactory encryption as “an algorithmic process to transform data into a form in which there is a low probability of assigning meaning without use of a confidential process or key … and such confidential process or key that might enable decryption has not been breached.” That last part means that encryption is only adequate as a safeguard for HIPAA-protected ePHI if the situation is such that the encryption still secures the data.


There are several scenarios where even encrypted data can be breached relatively easily and, unfortunately, there are many real world examples of each of these scenarios occurring. The trouble with encrypted data is that it needs to be decrypted to be useful to those who would access it legitimately, and the bad guys will look to take advantage of those moments when encryption’s defenses are down. Encryption is a powerful defense for data when a device’s power is off and for when the password is unknown and can’t be learned or hacked. But putting it that way, we’ve actually rather narrowly defined where encryption is effective.
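To make that scope concrete, here is a minimal Python sketch of encryption at rest, assuming the third-party cryptography package; the file names and key handling are illustrative only, not a prescribed HIPAA control.

# Minimal sketch: encrypting an ePHI export at rest with a symmetric key.
# Assumes the third-party "cryptography" package; paths are illustrative.
from cryptography.fernet import Fernet

def encrypt_file(plain_path: str, cipher_path: str, key: bytes) -> None:
    # Fernet provides authenticated symmetric encryption (AES-CBC plus HMAC).
    with open(plain_path, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    with open(cipher_path, "wb") as f:
        f.write(ciphertext)

if __name__ == "__main__":
    key = Fernet.generate_key()  # in practice, keep the key in a key manager, never beside the data
    encrypt_file("patient_export.csv", "patient_export.csv.enc", key)

The sketch underlines the article's point: the ciphertext is only as safe as the key, so anyone who obtains the key, the password protecting it, or an already unlocked session gets the plaintext.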


Here are some cases where it isn’t.


1. The data thief gains the password needed to get around the encryption on an ePHI-filled device. This can happen when the password is stolen along with the device - for example, if a laptop is taken along with a user’s notepad containing the password needed to access ePHI. HIPAA requires not only encrypting sensitive data but also paying attention to the safety of passwords or any such methods of access. Bad password security effectively negates encryption. Too often we’ve seen a sticky note of passwords attached to a laptop – or even passwords written on USB devices themselves – which is a great example of an encryption that is not HIPAA-secure.


In another type of case at Boston’s Brigham and Women’s Hospital, a physician was robbed at gunpoint and threatened into disclosing the pass codes on the laptop and cellphone that were taken from him, each of which contained ePHI. The doctor appears to have done all that could be done to comply with HIPAA as far as keeping data encrypted, but when forced to choose between personal health information and actual personal health, he made the reasonable choice. Still, the incident was a HIPAA breach, requiring patients and officials to be notified.


2. The stolen device is already running and an authorized user has already been authenticated. In this scenario, the legitimate user has already given his or her credentials and has a session accessing ePHI running when an unauthorized user gains control of the device. HIPAA contains measures to minimize the likelihood of this scenario, calling for the issue to be addressed with automatic log-off capability to “terminate an electronic session after a predetermined time of inactivity.” Still, authorized users should take care to close out sessions themselves if stepping away from their devices and leaving them unguarded.
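As an illustration of that automatic log-off safeguard, here is a minimal Python sketch of an idle-timeout check; the 15-minute window and the session model are assumptions for the example, not values prescribed by HIPAA.

# Minimal sketch of automatic log-off: terminate a session after a
# predetermined period of inactivity. The timeout value is illustrative.
import time

IDLE_TIMEOUT_SECONDS = 15 * 60

class Session:
    def __init__(self, user_id: str):
        self.user_id = user_id
        self.last_activity = time.monotonic()
        self.active = True

    def touch(self) -> None:
        # Call on every request or keystroke to record user activity.
        self.last_activity = time.monotonic()

    def enforce_timeout(self) -> bool:
        # Returns True if the session is still valid, False once it has been ended.
        if self.active and time.monotonic() - self.last_activity > IDLE_TIMEOUT_SECONDS:
            self.active = False  # terminate the electronic session
        return self.active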


3. A formerly authorized user becomes unauthorized, but still has access. This can happen when an employee quits or is terminated from a job but still possesses hardware and passwords to bypass encryption. A case such as this occurred at an East Texas hospital, where a former employee was recently sentenced to federal prison for obtaining HIPAA-protected health information with the intent to sell, transfer or otherwise use the data for personal gain. Criminals in these cases often use ePHI for credit card fraud or identity theft, demonstrating how important HIPAA safeguards can be to the patients they protect.


So how can ePHI be protected beyond encryption?


The safest security system to have in place when encountering each of these scenarios is one where the organization retains control over the data, and the devices containing ePHI are equipped with the ability to defend themselves automatically.


The fact is that employees will always seek and find ways to be their most productive, meaning that policies trying to keep ePHI off of certain devices are, for all intents and purposes, doomed to be burdensome and disrespected. For doctors and other healthcare staff, productivity trumps security. It’s best to take concerns around security off their plate and provide it at an organizational level. Organizations can implement strategies that maintain regular invisible communications between the IT department and all devices used for work with ePHI in a way that isn’t cumbersome to the user. Through these communications, the IT department can access devices to remotely block or delete sensitive data and revoke access by former employees. Software installed on devices can detect security risks and respond with appropriate pre-determined responses, even when communication can’t be established.
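A rough sketch of that check-in-and-revoke idea follows; the management endpoint, device identifier and local cache path are hypothetical, and real mobile-device-management products differ in detail.

# Hypothetical sketch: a managed device periodically checks in with the
# organization and wipes its local ePHI cache if its access has been revoked.
import json
import shutil
import urllib.request

CHECKIN_URL = "https://mdm.example.org/devices/{device_id}/status"  # hypothetical endpoint
LOCAL_EPHI_CACHE = "/var/app/ephi_cache"                            # hypothetical path

def check_in(device_id: str) -> None:
    with urllib.request.urlopen(CHECKIN_URL.format(device_id=device_id), timeout=10) as resp:
        status = json.load(resp)
    if status.get("access") == "revoked":
        # Pre-determined response: remove sensitive data without user involvement.
        shutil.rmtree(LOCAL_EPHI_CACHE, ignore_errors=True)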


Given the high stakes of HIPAA compliance – where a single breach can lead to government fines and costly reputational damage – it would be wise for healthcare organizations to consider encryption only the beginning when it comes to their data security.


Website Error Leads to Data Breach


An error in a coding upgrade for a Blue Shield of California website resulted in a breach affecting 843 individuals. The incident is a reminder to all organizations about the importance of sound systems development life cycle practices.


In a notification letter being mailed by Blue Shield of California to affected members, the insurer says the breach involved a secure website that group health benefit plan administrators and brokers use to manage information about their own plans' members. "As the unintended result of a computer code update Blue Shield made to the website on May 9," the letter states, three users who logged into their own website accounts simultaneously were able to view member information associated with the other users' accounts. The problem was reported to Blue Shield's privacy office on May 18.
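Blue Shield has not described the underlying defect, but one common pattern consistent with this account is per-user data cached in shared state, so that concurrent sessions read one another's results. The Python illustration below is purely hypothetical and is not the insurer's actual code.

# Hypothetical illustration only; not Blue Shield's actual code.
_last_result = None  # shared across all requests: the bug

def get_members_buggy(session_user: str) -> dict:
    global _last_result
    if _last_result is None:  # the first user's data ends up served to everyone
        _last_result = load_members_for(session_user)
    return _last_result

def get_members_fixed(session_user: str) -> dict:
    # Scope the lookup to the authenticated account on every request.
    return load_members_for(session_user)

def load_members_for(session_user: str) -> dict:
    # Placeholder for the real per-account database query.
    return {"account": session_user, "members": []}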


Blue Shield of California tells Information Security Media Group that the site affected was the company's Blue Shield Employer Portal. "This issue did not impact Blue Shield's public/member website," the company says. When the issue was discovered, the website was promptly taken offline to identify and fix the problem, according to the insurer.


"The website was returned to service on May 19, 2015," according to the notification letter. The insurer is offering all impacted individuals free credit monitoring and identity theft resolution services for one year.


Exposed information included names, Social Security numbers, Blue Shield identification numbers, dates of birth and home addresses. "None of your financial information was made available as a result of this incident," the notification letter says. "The users who had unauthorized access to PHI as a result of this incident have confirmed that they did not retain copies, they did not use or further disclose your PHI, and that they have deleted, returned to Blue Shield, and/or securely destroyed all records of the PHI they accessed without authorization."


The Blue Shield of California notification letter also notes that the company's investigation revealed that the breach "was the result of human error on the part of Blue Shield staff members, and the matter was not reported to law enforcement authorities for further investigation."

Similar Incidents

The coding error at Blue Shield of California that led to the users being able to view other individuals' information isn't a first in terms of programming mistakes on a healthcare-sector website leading to privacy concerns.


For example, in the early weeks of the launch of HealthCare.gov in the fall of 2013, a software glitch allowed a North Carolina consumer to access personal information of a South Carolina man. The Department of Health and Human Services' Centers for Medicare and Medicaid Services said at the time that the mistake was "immediately" fixed once the problem was reported. Still, the incident raised more concerns about the overall security of the Affordable Care Act health information exchange site.


Software design and coding mistakes that leave PHI viewable on websites have led to at least one healthcare entity paying a financial penalty to HHS' Office for Civil Rights.


An OCR investigation of Phoenix Cardiac Surgery P.C., with offices in Phoenix and Prescott, began in February 2009, following a report that the practice was posting clinical and surgical appointments for its patients on an Internet-based calendar that was publicly accessible.

The investigation determined the practice had implemented few policies and procedures to comply with the HIPAA privacy and security rules and had limited safeguards in place to protect patients' information, according to an HHS statement. The investigation led to the healthcare practice signing an OCR resolution agreement, which included a corrective action plan and a $100,000 financial penalty.


The corrective action plan required the physicians practice, among other measures, to conduct a risk assessment and implement appropriate policies and procedures.

Measures to Take

Security and privacy expert Andrew Hicks, director and healthcare practice lead at the risk management consulting firm Coalfire, says that to avoid website-related mistakes that can lead to privacy breaches, it's important that entities implement appropriate controls as well as follow the right systems development steps.


"Organizations should have a sound systems development life cycle - SDLC - in place to assess all systems in a production environment, especially those that are externally facing," he says. "Components of a mature SDLC would include code reviews, user acceptance testing, change management, systems analysis, penetration testing, and application validation testing."


Healthcare entities and business associates need to strive for more than just HIPAA compliance to avoid similar mishaps, he notes.

"Organizations that are solely seeking HIPAA compliance - rather than a comprehensive information security program - will never have the assurance that website vulnerabilities have been mitigated through the implementation of appropriate controls," he says. "In other words, HIPAA does not explicitly require penetration testing, secure code reviews, change management, and patch management, to name a few. These concepts are fundamental to IT security, but absent from any OCR regulation, including HIPAA."

Earlier Blue Shield Breach

About a year ago, Blue Shield of California reported a data breach involving several spreadsheet reports that inadvertently contained the Social Security numbers of 18,000 physicians and other healthcare providers.


The spreadsheets submitted by the plan were released 10 times by the state's Department of Managed Health Care. In California, health plans electronically submit monthly to the state agency a roster of all physicians and other medical providers who have contracts with the insurers. Those rosters are supposed to contain the healthcare providers' names, business addresses, business phones, medical groups and practice areas - but not Social Security numbers. DMHC makes those rosters available to the public, upon request.


10 ways to prevent a data breach and protect your small business


Today, virtually all businesses collect personal information about customers, employees and others. This information is valuable to hackers – evidenced by the increasing frequency and severity of data breaches across the globe.

Big businesses are not the only ones who are vulnerable. Small and medium-sized businesses with fewer data security resources are often targets for cybercriminals. In fact, research we’ve conducted with the Ponemon Institute shows that more than half have experienced a data breach and nearly three out of four report they can’t restore all their data.


The good news is that businesses can take steps to protect themselves from destructive cyber intrusions. To preempt hacking activity, you must think like a hacker. Here are a few tips to get you started.

1. Think beyond passwords. Never reuse them and don’t trust any website to store them securely. To increase the level of security, set up two-factor authentication for all your online business accounts. This authentication relies on something only you should know (your password) and authenticates something only you should have (typically your phone) to verify your identity.
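The article describes codes texted to your phone; a closely related second factor is the time-based one-time code that authenticator apps generate. Here is a minimal sketch of that variant (RFC 6238 style) using only the Python standard library.

# Minimal sketch of a time-based one-time code (TOTP) as a second factor.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval          # current 30-second window
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify_second_factor(secret_b32: str, submitted_code: str) -> bool:
    # Accept only a code matching the current time window.
    return hmac.compare_digest(totp(secret_b32), submitted_code)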

2. Stop transmission of data that is not encrypted. Mandate encryption of all data. This includes data at “rest” and “in motion.” Consider encrypting email within your company if personal information is transmitted. Avoid using public WiFi networks, as they may permit interception of data.
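As a sketch of protecting data "in motion," the standard-library snippet below wraps a plain socket in TLS with certificate verification before anything is sent; the host, port and payload are hypothetical.

# Minimal sketch: send data only over a TLS-wrapped, certificate-verified connection.
import socket
import ssl

def send_over_tls(host: str, port: int, payload: bytes) -> None:
    context = ssl.create_default_context()  # verifies the server certificate and hostname
    with socket.create_connection((host, port)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            tls_sock.sendall(payload)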

3. Outsource payment processing. Avoid handling credit card data on your own. Reputable vendors, whether it’s for point-of-sale or web payments, have dedicated security staff that can protect data better than you can.

4. Separate social media activity from financial activity. Use a dedicated device for online banking and other financial activities, and a different device for email and social media. Otherwise, just visiting one infected social site could compromise your banking machine and sensitive business accounts.

5. “Clean house” and update procedures. Evaluate your assets and valuable data to identify where your organization is most at risk. It’s important to reduce the volume of information you keep on hand (only keep what you need!) and properly destroy all paper documents, CDs/DVDs and disks before disposal. Consider assessing your business’s email infrastructure, browser vulnerability, and ID system. Do not use Social Insurance Numbers as employee ID numbers or client account numbers. You should also question the security posture of your business lines, vendors, suppliers or partners.

6. Secure your browser. Watering holes – malicious code installed on trusted websites – are a common method of attack against businesses. How do you know which websites to trust? Focus on keeping up-to-date with the latest version of your browser. Then, test your browser’s configuration for weakness.

7. Secure your computers and operating system. Implement password protection and “time out” functions (requires re-login after period of inactivity) for all business computers. Require strong passwords that must be changed on a regular basis. Also be sure to update all operating systems, which have major security improvements baked in. It’s far easier to break into older operating systems like Windows XP or OS X 10.6.
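A minimal sketch of enforcing a "strong password" rule at account setup follows; the length and character-class thresholds are illustrative policy choices rather than a standard.

# Minimal sketch: require a minimum length and all four character classes.
import string

def is_strong_password(password: str, min_length: int = 12) -> bool:
    classes = [
        any(c.islower() for c in password),
        any(c.isupper() for c in password),
        any(c.isdigit() for c in password),
        any(c in string.punctuation for c in password),
    ]
    return len(password) >= min_length and all(classes)

print(is_strong_password("Summer2015"))                 # False: too short, no punctuation
print(is_strong_password("Give'em-a-Challenge-2015"))   # True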

8. Secure your internet router. Make sure someone can’t intercept all the data sent through it. Consider configuring your wireless network so the Service Set Identifier (SSID) – the name the wireless network broadcasts to identify itself – is hidden.

9. Safeguard and back up your data. Lock physical records containing private information in a secure location and create backups. These should be encrypted and off-site in case there’s a fire or burglary.

10. Educate and train employees. Establish a written policy about data security, and communicate it to all employees. Educate them about what types of information are sensitive or confidential and what their responsibilities are to protect that data. In addition, restrict employee usage of computers for only business purposes. Do not permit use of file sharing peer-to-peer websites or software applications and block access to inappropriate websites.

It’s important to remember that no business is “too small” for a hacker; all businesses are vulnerable. The sooner you can get ahead of potential hacking activity, using the above steps, the sooner you’ll be prepared to thwart, mitigate and manage a data breach.


Unencrypted Device Breaches Persist


Although hacker attacks have dominated headlines in recent months, a snapshot of the federal tally of major health data breaches shows that stolen unencrypted devices continue to be a common breach cause, although these incidents usually affect far fewer patients.


As of June 23, the Department of Health and Human Services' Office for Civil Rights' "wall of shame" website of health data breaches affecting 500 or more individuals showed 1,251 incidents affecting nearly 134.9 million individuals.


Those totals have grown from 1,213 breaches affecting 133.2 million individuals in an April 29 snapshot prepared by Information Security Media Group.


The federal tally lists all major breaches involving protected health information since September 2009, when the HIPAA Breach Notification rule went into effect. As of June 23, about 52 percent of breaches on the tally listed "theft" as the cause.


Among the breaches added to the tally in recent weeks are about a dozen involving stolen unencrypted computers. Lately, those types of incidents have been overshadowed by massive hacking attacks, such as those that hit Anthem Inc. and Premera Blue Cross.


"Although we've seen some large hacking attacks, they are aimed at higher-profile organizations than the more typical provider organization," says privacy and security expert Kate Borten, founder of the consulting firm, The Marblehead Group. "Attackers know that these organizations have a very high volume of valuable data. But I continue to believe that unencrypted PHI on devices and media that are lost or stolen is 'the' most common breach scenario affecting organizations of any size."


Borten predicts that many incidents involving unencrypted devices will continue to be added to the wall of shame. "Getting those devices encrypted is an ongoing challenge when we expand the requirement to tablets and smartphones, particularly when owned by the users, not the organization," she says. "We also shouldn't overlook encryption of media, including tapes, disks and USB storage drives."

Unencrypted Device Breaches

The largest breach involving unencrypted devices that was recently added to the tally was an incident reported to HHS on June 1 by Oregon Health Co-Op, an insurer.


That incident, which impacted 14,000 individuals, involved a laptop stolen on April 3. In a statement, the insurer says the device contained member and dependent names, addresses, health plan and identification numbers, dates of birth and Social Security numbers. "There is no indication this personal information has been accessed or inappropriately used by unauthorized individuals," the statement says.

Also recently added to the federal tally was a breach affecting 12,000 individuals reported on June 10 by Nevada healthcare provider Implants, Dentures & Dental, which is listed on the federal tally as "doing business as Half Dental." The incident is listed as a theft involving electronic medical records, a laptop, a network server and other portable electronic devices.


In addition to the recent incidents involving stolen or lost unencrypted devices, several breaches added to the wall of shame involve lost or stolen paper records or film.


"Breaches of non-electronic film and paper will never end, but at least these breaches are typically limited to one or a small number of affected individuals," Borten says. Because many of the breaches involving paper or film are often due to human error, "effective, repeated training is essential" to help prevention of such incidents, she says.

Hacking Incidents Added

The largest breach added to the tally in recent weeks, however, is the hacker attack on CareFirst BlueCross BlueShield, which was reported on May 20 to HHS and affected 1.1 million individuals. Baltimore-based CareFirst has said that an "unauthorized intrusion" into a database, dating back to June 2014, was discovered in April by Mandiant, the cyberforensics unit of security vendor FireEye. CareFirst had asked Mandiant to conduct a proactive examination of its environment following the hacker attacks on Anthem and Premera.


Another hacker incident added to the tally affected South Bend, Ind.-based Beacon Health System. That incident, reported to HHS on May 20, is listed as affecting about 307,000 individuals. The organization has said patients' protected health information, including patient name, doctor's name, internal patient ID number, and in some cases, Social Security numbers and treatment information, was exposed as a result of phishing attacks on some employees that started in November 2013. The attacks led to hackers accessing "email boxes" that contained patient information.

Addressing Multiple Threats

Healthcare organizations need to continue their efforts to protect data from the threats posed by cyber-attackers, insiders or street thieves, says Borten, the consultant.


"There's no simple answer, but security is complex, and so the solutions, or mitigating controls, must be numerous and varied."


Data security and HIPAA guidelines: A delicate balance


How can healthcare organizations reassure patients that their personal data is secure, while also satisfying their demands for quick and easy access to their own data? In a recent interview, Chris White, head of commercial data protection services at Booz Allen, offered tips to healthcare providers trying to balance health data security with HIPAA requirements.


White says in the HealthITSecurity interview that maintaining data security is absolutely critical at the patient, doctor and provider organization levels. Each has to take a focused approach. That involves understanding what data is most important and putting a risk evaluation behind that to prioritize protecting different types of data.


Providing encryption across the data life cycle is important, too.

Nonetheless, providers have to realize that a breach is inevitable, and it's crucial to understand what's going on with data inside the network.


White says HIPAA needs to change with regard to the level of accountability for organizations from a regulatory standpoint, and how much investment they need to make in structuring the appropriate data protection program.


Public healthcare agencies are not immune to data security issues. On April 21, the Texas Department of Aging and Disability Services inadvertently made Medicaid patients' information available online. The breach impacted about 6,600 people.


Against those existing risks, technology continues to move forward, White said, noting that the proliferation of mobile devices is adding another layer of risk for potential cyberattacks.


Physicians: Protect Your Data from Hackers in 5 Steps


According to a recent CNBC report, hackers may have stolen personnel data and Social Security numbers for every single federal employee last December. If true, the cyberattack on federal employee data is far worse than the Obama administration has acknowledged.

J. David Cox, president of the American Federation of Government Employees union, believes "hackers stole military records and veterans' status information, address, birth date, job and pay history, health insurance, life insurance, and pension information; [as well as] age, gender, race data," according to the report. This would be all that is needed for cybercriminals to steal identities of the employees, divert funds from one account to another, submit fake healthcare claims, and create fake accounts for everything from credit cards to in-store credit card purchases.


Although physicians maintain personal and professional data that is especially valuable to thieves, you are not the federal government. Make it hard enough on cybercriminals, and they will move on to lower-hanging fruit. Reader's Digest offers good advice in five simple steps in its article, "Internet Security, How not to Get Hacked":


1. Be aware of what you share.


On Facebook, Twitter, or social media, avoid posting birth dates, graduation years, or your mother's maiden name — info often used to answer security questions to access your accounts online or over the phone.


2. Pick a strong password.


Hackers guess passwords using a computer. The longer your password and the more nonsensical characters it contains, the longer it takes the computer to guess it. The idea here is that longer, more complicated passwords could take a computer 1,000 years to crack. Give 'em a challenge.


3. Use a two-step password if offered.


Facebook and Gmail have an optional security feature that, once activated, requires you to enter two passwords to access your account: your normal password plus a code that the company texts to your phone. "The added step is a slight inconvenience that's worth the trouble when the alternative can be getting hacked," CNET tech writer Matt Elliot told Reader's Digest. To set up the verification on Gmail, click on Account, then Security. On Facebook, log in, click on the down icon next to Home, and then click on Account Settings, Security, and finally Login Approvals.


4. Use Wi-Fi hot spots sparingly.


By now, you probably know that Internet cafés and free hotspots are not secure. You shouldn't be doing your online banking from these spots. However, the little button that turns off your laptop's Wi-Fi so that your laptop cannot be accessed remotely is also handy. In Windows, right-click on the wireless icon in the taskbar to turn it off. On a Mac, click the Wi-Fi icon in the menu bar to turn off Wi-Fi.


5. Back up your data.


Hackers can delete years' worth of e-mails, photos, documents, and music from your computer in minutes. Protect your digital files by using a simple and free backup system available on websites such as CrashPlan and Dropbox.


Take this basic instruction and build on it yourself. Google, for example, offers advice expanding on the concept of "strong passwords." The worst thing you can do is use "dictionary words," the word "password," or sequential keystrokes, such as "1234" or "qwerty," because hackers' computers will try these first. For e-mail, pick a phrase, such as "[m]y friends Tom and Jasmine send me a funny e-mail once a day," and then use numbers and letters to recreate it as a cryptic password: "MfT&Jsmafe1ad."
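As a worked version of that passphrase trick, the short Python sketch below turns the article's sample phrase into its sample password; the substitution table is illustrative and should be varied rather than reused.

# Sketch: derive a cryptic password from a memorable phrase,
# in the spirit of the article's "MfT&Jsmafe1ad" example.
def phrase_to_password(phrase: str) -> str:
    substitutions = {"and": "&", "one": "1", "once": "1", "to": "2", "for": "4"}
    parts = []
    for word in phrase.split():
        key = word.lower().strip(".,")
        parts.append(substitutions.get(key, word[0]))  # keep the first letter, swapping in symbols/digits
    return "".join(parts)

print(phrase_to_password("My friends Tom and Jasmine send me a funny e-mail once a day"))
# -> MfT&Jsmafe1ad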


Cybersecurity in Healthcare – The Human Factor


Following the Anthem breach, and more recent Premera breach, cybersecurity and protecting patient data is top of mind for every organization in the healthcare industry. Every cybersecurity solution out there will tell you they have the latest and greatest technology for detecting the bad guys and keeping them out. The truth is, you can have the best systems in the world, but how your staff interacts with the technology is just as important. For example, if a phishing email makes its way to a staff person’s inbox, all it takes is one employee to activate a malicious file on their desktop and the bad guys have access to your entire network.


Cyber-criminals are advancing right along with technology, so educating your staff is an absolute priority. However, it can sometimes be a challenge to get everyone on the same page. Here are some tips to ensure organizations are in the best position to protect against today’s evolving threat environment:

  • Bring all departments into the fold – Ensuring security isn’t just the realm of the IT department. All groups, on both the clinical and administration sides, need to have a stake in the protection of patient data. An internal security committee made up of representatives from each department can make sure that all groups, including board members and the C-suite, have buy-in. The group should also conduct formal risk assessments and identify any areas at risk for a data breach, then develop plans to educate and communicate protocols throughout the organization.
  • Spread the word on new procedures – To ensure cybersecurity measures are taken seriously across an organization, the message needs to be delivered from the top and repeated often. Organizations must provide employees with training sessions on a regular basis, frequent reminders to speak up about suspicious emails, prompts to change passwords regularly, and reminders to encrypt communications containing protected health information. This way it’s clear that the matter isn’t taken lightly.
  • Learn from recent cyber-criminal activity – Cyber-threats are a new territory for everyone. Use recent breaches and cyber-criminal activity to educate your staff and provide training. Chances are that when the media is covering a breach, people will be interested in learning how to protect themselves both at home and at work.


Unfortunately these cyber-criminals are advanced in their tactics, and there’s no end-all-be-all solution to guarantee they are kept out of your organization. But there are ways to make it harder for them to get in, and it starts with educating your team on security best practices, as well as how to recognize a potential threat.



4 keys to HIPAA audit prep


With the delay of the Office for Civil Rights (OCR) HIPAA audits, organizations would be wise not to push compliance further down the priority list. Yet many are woefully unprepared for both data breaches and the audits, writes Mark Fulford, partner at LBMC Security & Risk Services, in an article at Health IT and Security Review.


"If organizations let down their guard, they will become vulnerable to both data breaches and the OCR audits themselves when they inevitably arrive," he says. "And all indications are that the audits will bring an unprecedented level of scrutiny and enforcement to healthcare security."


Being chosen for an audit means submitting documentation of your organization's compliance. Yet HIPAA guidance isn't specific, he says, allowing you to explain your reasoning behind your security approach.

Among his recommendations:

  1. Conduct a risk assessment. Evaluate your organization before OCR does, making sure you have everything covered, including servers, personal computers, mobile devices and more.
  2. Document everything. Keep detailed records of your security measures and procedures, as well as your incident response plans.
  3. Identify your business associates. Verify that these entities also maintain appropriate security.
  4. Train your team and stay up to date. Security is a team effort; ensure that your employees are trained to respond to phishing, social engineering, malware and other attacks.


Despite a proliferation of healthcare breaches and warnings from the Office for Civil Rights that it plans to crack down on organizations that don't effectively protect patient data, research from ProPublica found that few organizations have actually been fined for it.


However, that's expected to change. Privacy attorney Adam Greene said he's heard that OCR has a pipeline of "unprecedented" settlements in the works.


An OCR attorney made a similar statement nearly a year ago. Jerome B. Meites, OCR chief regional counsel for the Chicago area, said the HIPAA enforcement actions over the past year would pale in comparison to the following 12 months.



Stage 3 Meaningful Use: Breaking Down HIPAA Rules


CMS released its Stage 3 Meaningful Use proposal last month, with numerous aspects that covered entities (CEs) need to be aware of and pay attention to. While the proposal has a large focus on EHR interoperability, it continues to build on the previously established frameworks in Stage 1 and Stage 2 – including keeping patient information secure.


HIPAA rules and regulations cannot be thrown out the window as CEs work toward meeting meaningful use requirements. We’ll break down the finer points of Stage 3 Meaningful Use as it relates to data security, and how organizations can remain HIPAA compliant while also making progress in the Meaningful Use program.


Stage 3 further protects patient information


One of the top objectives for Stage 3 Meaningful Use is to protect patient information. New technical, physical, and administrative safeguards are recommended that provide more strict and narrow requirements for keeping patient data secure.


The new proposal addresses how the encryption of patient electronic health information continues to be essential for the EHR Incentive Programs. Moreover, it explains that relevant entities will need to conduct risk analysis and risk management processes, as well as develop contingency plans and training programs.


In order to receive EHR incentive payments, covered entities must perform a security risk analysis. However, these analyses must go beyond just reviewing the data that is stored in an organization’s EHR. CEs need to address all electronic protected health information they maintain.


It is also important to remember that installing a certified EHR does not fulfill the Meaningful Use security analysis requirement. This security requirement ensures that all ePHI maintained by an organization is reviewed. For example, any electronic devices – tablets, laptops, mobile phones – that store, capture, or modify ePHI need to be examined for security.

“Review all electronic devices that store, capture, or modify electronic protected health information,” states the ONC website. “Include your EHR hardware and software and devices that can access your EHR data (e.g., your tablet computer, your practice manager’s mobile phone). Remember that copiers also store data.”


It is also important to regularly review the existing security infrastructure, identify potential threats, and then prioritize the discovered risks. For example, a risk analysis could reveal that an organization needs to update its system software, change its workflow processes or storage methods, review and modify policies and procedures, schedule additional training for staff, or take other corrective action to eliminate identified security deficiencies.

A security risk analysis does not necessarily need to be done every year. CEs only need to conduct one when they adopt an EHR. When a facility changes its setup or makes alterations to its electronic systems, for example, then it is time to review and make updates for any subsequent changes in risk.


Stage 3 works with HIPAA regulations


In terms of patient data security, it is important to understand that the Stage 3 Meaningful Use rule works with HIPAA – the two are able to complement one another.


“Consistent with HIPAA and its implementing regulations, and as we stated under both the Stage 1 and Stage 2 final rules (75 FR 44368 through 44369 and 77 FR 54002 through 54003), protecting ePHI remains essential to all aspects of meaningful use under the EHR Incentive Programs,” CMS wrote in its proposal. “We remain cognizant that unintended or unlawful disclosures of ePHI could diminish consumer confidence in EHRs and the overall exchange of ePHI.”

As EHRs become more common, CMS explained that protecting ePHI becomes more instrumental in the EHR Incentive Program succeeding. However, CMS acknowledged that there had been some confusion in the previous rules when it came to HIPAA requirements and requirements for the meaningful use core objective:


For the proposed Stage 3 objective, we have added language to the security requirements for the implementation of appropriate technical, administrative, and physical safeguards. We propose to include administrative and physical safeguards because an entity would require technical, administrative, and physical safeguards to enable it to implement risk management security measures to reduce the risks and vulnerabilities identified.


CMS added that even as it worked to clarify security requirements under Stage 3, their proposal was not designed “to supersede or satisfy the broader, separate requirements under the HIPAA Security Rule and other rulemaking.”


For example, the CMS proposal narrows the requirements for a security risk analysis in terms of meaningful use requirements. Stage 3 states that the analysis must be done when CEHRT is installed or when a facility upgrades to a new certified EHR technology edition. From there, providers need to review the CEHRT security risk analysis, as well as the implemented safeguards, “as necessary, but at least once per EHR reporting period.”


However, CMS points out that HIPAA requirements “must assess the potential risks and vulnerabilities to the confidentiality, availability, and integrity of all ePHI that an organization creates, receives, maintains, or transmits” in all electronic forms.


Working toward secure exchange


The Stage 3 Meaningful Use proposal encourages CEs to work toward health information exchange and to focus on better health outcomes for patients. As healthcare facilities work toward both of these goals, it is essential that health data security still remains a priority and that PHI stays safe.


HIPAA compliance not only helps CEs avoid federal fines, it also helps ensure that those facilities keep patient information out of the wrong hands. The right balance needs to be found between health information security and health information exchange.



Hospital Slammed With $218,000 HIPAA Fine


Federal regulators have slapped a Boston area hospital with a $218,000 HIPAA penalty after an investigation following two security incidents. One involved staff members using an Internet site to share documents containing patient data without first assessing risks. The other involved the theft of a worker's personally owned unencrypted laptop and storage device.


The Department of Health and Human Services' Office for Civil Rights says it has entered a resolution agreement with St. Elizabeth's Medical Center that also includes a "robust" corrective action plan to correct deficiencies in the hospital's HIPAA compliance program.

The Brighton, Mass.-based medical center is part of Steward Health Care System.


Privacy and security experts say the OCR settlement offers a number of valuable lessons, including the importance of the workforce knowing how to report security issues internally, as well as the need to have strong policies and procedures for safeguarding PHI in the cloud.

Complaint Filed

On Nov. 16, 2012, OCR received a complaint alleging noncompliance with HIPAA by medical center workforce members. "Specifically, the complaint alleged that workforce members used an Internet-based document sharing application to store documents containing electronic protected health information of at least 498 individuals without having analyzed the risks associated with such a practice," the OCR statement says.


OCR's subsequent investigation determined that the medical center "failed to timely identify and respond to the known security incident, mitigate the harmful effects of the security incident and document the security incident and its outcome."


"Organizations must pay particular attention to HIPAA's requirements when using internet-based document sharing applications," says Jocelyn Samuels, OCR director in the statement. "In order to reduce potential risks and vulnerabilities, all workforce members must follow all policies and procedures, and entities must ensure that incidents are reported and mitigated in a timely manner."


Separately, on Aug. 25, 2014, St. Elizabeth's Medical Center submitted notification to OCR regarding a breach involving unencrypted ePHI stored on a former hospital workforce member's personal laptop and USB flash drive, affecting 595 individuals. The OCR "wall of shame" website of health data breaches impacting 500 or more individuals says the incident involved a theft.

Corrective Action Plan

In addition to the financial penalty - which OCR says takes into consideration the circumstances of the complaint and breach, the size of the entity, and the type of PHI disclosed - the agreement includes a corrective action plan "to cure gaps in the organization's HIPAA compliance program raised by both the complaint and the breach."

The plan calls for the medical center to:


  • Conduct a "self-assessment" of workforce members' familiarity and compliance with the hospital's policies and procedures that address issues including transmission and storage of ePHI;
  • Review and revise policies and procedures related to ePHI; and
  • Revise workforce training related to HIPAA and protection of PHI.
Lessons Learned

Other healthcare organizations and their business associates need to heed some lessons from OCR's latest HIPAA enforcement action, two compliance experts say.


Privacy attorney Adam Greene of the law firm Davis Wright Tremaine notes: "The settlement indicates that OCR first learned of alleged noncompliance through complaints by the covered entity's workforce members. Entities should consider whether their employees know how to report HIPAA issues internally to the privacy and security officers and ensure that any concerns are adequately addressed. Otherwise, the employees' next stop may be complaining to the government."

The settlement also highlights the importance of having a cloud computing strategy, Greene points out. That strategy, he says, should include "policies, training and potential technical safeguards to keep PHI off of unauthorized online file-sharing services."
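As one possible technical safeguard of the kind Greene describes, a practice could scan files staged for upload and hold back anything that appears to contain patient identifiers. The sketch below is a rough illustration under assumed file paths and patterns; a real data-loss-prevention control would need far broader coverage and should be chosen with security staff and counsel.

    import re
    from pathlib import Path

    SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")            # e.g., 123-45-6789
    MRN_PATTERN = re.compile(r"\bMRN[:#]?\s*\d{6,10}\b", re.I)    # hypothetical record-number format

    def flag_possible_phi(folder: str) -> list:
        """Return files under `folder` that appear to contain PHI-like identifiers."""
        flagged = []
        for path in Path(folder).glob("**/*.txt"):
            text = path.read_text(errors="ignore")
            if SSN_PATTERN.search(text) or MRN_PATTERN.search(text):
                flagged.append(path)
        return flagged

    if __name__ == "__main__":
        # "outbound_uploads" is an assumed staging folder checked before any file leaves the network.
        for f in flag_possible_phi("outbound_uploads"):
            print(f"Held for review before upload: {f}")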


The enforcement action spotlights the continuing challenge of preventing unencrypted PHI from ending up on personal devices, where it may become the subject of a breach, he notes.

The case also sheds light on how OCR evaluates compliance issues, he says. "The settlement highlights that OCR will look at multiple HIPAA incidents together, as it is not clear that OCR would have entered into a settlement agreement if there had only been the incident involving online file sharing software, but took action after an unrelated second incident involving PHI ending up on personal devices."


Privacy attorney David Holtzman, vice president of compliance at security consulting firm CynergisTek, says the settlement "serves as an important reminder that a covered entity or a business associate must make sure that the organization's risk assessment takes into account any relationship where PHI has been disclosed to a contractor or vendor so as to ensure that appropriate safeguards to protect the data are in place."


The alleged violations involving the document sharing vendor, he says, "involve failure to have a BA agreement in place prior to disclosing PHI to the vendor, as well as failing to have appropriate security management processes in place to evaluate when a BA agreement is needed when bringing on a new contractor that will handle PHI."

St. Elizabeth's Medical Center did not immediately respond to an Information Security Media Group request for comment.

Previous Settlements

The settlement with the Boston-area medical center is the second HIPAA resolution agreement signed by OCR so far this year. In April, the agency OK'd an agreement with Cornell Prescription Pharmacy for an incident related to unsecure disposal of paper records containing PHI. In that agreement, Cornell was fined $125,000 and also adopted a corrective action plan to correct deficiencies in its HIPAA compliance program.


The settlement with St. Elizabeth's is the 25th HIPAA enforcement action involving a financial penalty and/or resolution agreement that OCR has taken since 2008.


But privacy advocate Deborah Peel, M.D., founder of Patient Privacy Rights, says OCR isn't doing enough to crack down on organizations involved in HIPAA privacy breaches.


"Assessing penalties that low - St. Elizabeth will pay $218,400 - guarantees that virtually no organizations will fix their destructive practices," she says. "Industry views low fines as simply a cost of doing business. They'll take their chances and see if they're caught."


The largest HIPAA financial penalty to date issued by OCR was a $4.8 million settlement with New York-Presbyterian Hospital and Columbia University for incidents tied to the same 2010 breach that affected about 6,800 patients. The incidents involved unsecured patient data on a network.


The Cloud is Good, But Know Where Data Go

A recent settlement announcement from the U.S. Department of Health and Human Services Office for Civil Rights (“OCR”) highlights the need to evaluate web-based applications and storage solutions. Web-based or cloud solutions are viable options and tools for healthcare entities to utilize, but those tools need to be evaluated for compliance with HIPAA security requirements.

Saint Elizabeth’s Medical Center (“SEMC”), located outside of Boston, MA, learned this lesson the hard way. On November 16, 2012, certain workforce members at SEMC reported suspected non-compliance with HIPAA to OCR. The report focused upon use of an internet-based document sharing and storage application. The specific site is not identified in the OCR Resolution Agreement, but Dropbox is an example of an online storage site that does not meet HIPAA security requirements. OCR notified SEMC of the results of its investigation on February 14, 2013. Fast forward a year and SEMC then reported a breach regarding a workforce member’s unsecured laptop and USB storage device. The combination of events led OCR to conclude that SEMC failed to implement sufficient security measures required by HIPAA and SEMC did not timely identify or mitigate harmful effects from identified deficiencies.

As a result of the two reported incidents, SEMC is now paying $218,400 to OCR in settlement funds. The settlement continues the trend of fines whose amounts cannot be accurately predicted in advance. As stated in the announcement, OCR “takes into consideration the circumstances of the complaint and breach, the size of the entity, and the type of PHI disclosed.” This statement offers some insight, and can be interpreted to mean that entities with deeper pockets will be hit with larger fines because such entities can absorb them.

The other consideration raised by the SEMC settlement is what to do about cloud-based storage and sharing solutions. Should all such tools be locked away from use by healthcare organizations? That is not necessarily the answer, because some tools do follow HIPAA security requirements. For example, some cloud storage services were built specifically for healthcare, and as such are more cognizant of applicable regulatory requirements. More general sites, such as Box, have noted HIPAA requirements and claim to meet the required standards. As such, it is possible for organizations to utilize cloud-based options.

However, it is not necessarily the choices of an organization as a whole that are troublesome. In SEMC’s case, it is not clear whether the workforce members acted under SEMC’s direction or utilized the cloud sites without SEMC’s direct knowledge. The unsupervised actions of workforce members are what can cause an organization a lot of concern. Organizations need to train and educate workforce members, but cannot always control their actions. Despite the inability to constantly track what a workforce member is doing, certain steps can be taken to alleviate concerns. One measure would be to block access to websites that could lead to a potential breach or other non-compliance. Such a measure may not make all workforce members happy, but an organization should assess its risks and take appropriate measures. Additionally, an organization can recommend compliant sites for workforce use.
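A crude illustration of the blocking measure described above: check each outbound destination against an allow list of sharing services that have been vetted (and have a business associate agreement in place) and deny everything else. The domains here are hypothetical placeholders, and in practice this control would normally live at the web proxy or firewall rather than in application code.

    from urllib.parse import urlparse

    APPROVED_SHARING_DOMAINS = {"hipaa-ready-storage.example.com"}   # hypothetical vendor with a signed BAA
    BLOCKED_SHARING_DOMAINS = {"consumer-fileshare.example.net"}     # hypothetical consumer sharing site

    def upload_allowed(url: str) -> bool:
        host = urlparse(url).hostname or ""
        if any(host == d or host.endswith("." + d) for d in APPROVED_SHARING_DOMAINS):
            return True
        if any(host == d or host.endswith("." + d) for d in BLOCKED_SHARING_DOMAINS):
            return False
        return False   # default-deny: unknown destinations require a security review first

    print(upload_allowed("https://www.consumer-fileshare.example.net/upload"))   # False
    print(upload_allowed("https://hipaa-ready-storage.example.com/files"))       # True

The default-deny choice reflects the settlement's lesson: an unreviewed destination should be treated as unapproved until someone has evaluated it and, where needed, put a BA agreement in place.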

Regardless of the approach taken, organizations need to be cognizant of the risks posed by cloud-based storage, especially at the individual level. OCR’s settlement with SEMC is only the most recent action to highlight the concern. As has been stated before, once OCR releases a settlement addressing an issue, subsequent organizations with the same issue can expect greater focus on that issue and less leniency when it comes to a violation.

Bill That Changes HIPAA Passes House


The U.S. House of Representatives on July 10 passed a bill aimed at accelerating the advancement of medical innovation that contains a controversial provision calling for significant changes to the HIPAA Privacy Rule.


The House approved the 21st Century Cures bill by a vote of 344 to 77. Among the 309-page bill's many provisions is a proposal that the Secretary of Health and Human Services "revise or clarify" the HIPAA Privacy Rule's provisions on the use and disclosure of protected health information for research purposes.


Under HIPAA, PHI is allowed to be used or disclosed by a covered entity for healthcare treatment, payment and operations without authorization by the patient. If the proposed legislation is eventually signed into law, patient authorization would not be required for PHI use or disclosure for research purposes if only covered entities or business associates, as defined under HIPAA, are involved in exchanging and using the data.


That provision - as well as many others in the bill - aims to help fuel speedier research and development of promising medical treatments and devices.


"The act says ... if you're sharing [patient PHI] with a covered entity [or a BA], you don't necessarily need the individual's consent prior to sharing - and that's something our members have been receptive too," notes Leslie Krigstein, interim vice president of public policy at the College of Healthcare Information Management Executives, an organization that represents 1,600 CIOs and CISOs.


"The complexity of consent has been a barrier [to health information sharing] ... and the language [contained in the bill] will hopefully move the conversation forward," she says.


Some privacy advocates, however, have opposed the bill's HIPAA-altering provision.


Allowing the use of PHI by researchers without individuals' consent or knowledge only makes the privacy and security of that data less certain, says Deborah Peel, M.D., founder of Patient Privacy Rights, an advocacy group.


"Researchers and all those that take our data magnify the risks of data breach, data theft, data sale and harms," she says. "Researchers are simply more weak links in the U.S. healthcare system which already has 100s of millions of weak links."

Changes Ahead?

If the legislation is signed into law in its current form, healthcare entities and business associates would need to change their policies related to how they handle PHI.


"If the bill is enacted, it will not place additional responsibilities on covered entities and business associates. Rather, it will provide them with greater flexibility to use and disclose protected health information for research," says privacy attorney Adam Greene, partner at law firm Davis Wright Tremaine. "Covered entities and business associates who seek to take advantage of these changes would need to revise their policies and procedures accordingly." For instance, some covered entities also may need to revise their notices of privacy practices if their notices get into great detail on research, Greene notes.

Other Provisions

In addition to the privacy provisions, the bill also calls for penalizing vendors of electronic health records and other health IT systems that fail to meet standards for interoperable and secure information exchange.


The bill calls for HHS to develop methods to measure whether EHRs and other health information technology are interoperable, and authorizes HHS to penalize EHR vendors with decertification of their products if their software fails to meet interoperability requirements.


In addition, the bill also contains provisions for "patient empowerment," giving individuals the right to "the entirety" of their health information, including data contained in an EHR, whether structured or unstructured. Physician notes are one example of unstructured data, although they are not specifically named in the legislation.


"Healthcare providers should not have the ability to deny a patient's request for access to the entirety of such health information," the bill says.


A House source tells Information Security Media Group that the Senate has been working on an "Innovation Agenda" for the past few months calling for policies similar to those contained in the 21st Century Cures bill. House leaders say it's their goal to have a bill sent to the president's desk by the end of the year, the source says.


HIPAA Criminal Violations on the Rise


Stories appear almost every day about medical records being improperly accessed, hacked or otherwise stolen. The number of stories about such thefts is almost matched by the number of stories about the high value placed upon medical records by identity thieves and others. This confluence of events highlights the pressure the healthcare industry faces to protect the privacy and security of medical records in all forms.


While stories about hacking and other outside attacks garner the most attention, the biggest threat to a healthcare organization’s records is most likely an insider. The threat from an insider can range from snooping (accessing and viewing records out of curiosity) to more criminal motives, such as wanting to sell medical information. Examples of criminally motivated insiders, unfortunately, are increasing.


One recent example occurred at Montefiore Medical Center in New York, where an assistant clerk allegedly stole patient names, Social Security numbers, and birth dates from thousands of patients. The hospital employee then sold the information for as little as $3 per record. The individuals who acquired the information allegedly used it to go on a shopping spree across New York, spending over $50,000.

Another recent example comes out of Providence Alaska Medical Center in Anchorage, AK. In Anchorage, a financial worker at the hospital provided information about patients to a friend. Unfortunately, the friend had injured those patients and was under criminal investigation as a result. The friend wanted to know whether either of the patients had reported him to the police. Clearly, the access by the financial worker was improper.


While it could previously be said that instances of criminal convictions or indictments were rare, the examples do appear to be coming with increasing frequency. What should organizations do? Is this conduct actually preventable? As is true with HIPAA compliance generally, the key is to educate and train members of an organization’s workforce. If someone is unaware of HIPAA requirements, it is hard to comply.

However, it can also be extremely difficult to prevent criminal conduct altogether. If an individual has an improper motive, that individual will likely find a way to do what they want to do. From this perspective, organizations cannot prevent the conduct entirely, but should consider what measures can be taken to mitigate the impact of improper access or taking of information. It is a good idea to monitor and audit access to and use of information, in order to catch instances where information is leaving the organization or being accessed inappropriately. Overall, the issue becomes how well an organization monitors its systems and takes action when a suspected issue presents itself.
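As a rough sketch of the monitoring idea above, an organization could compare each workforce member's daily record lookups against a historical baseline and flag outliers for human review. The log format, names, baselines, and threshold below are all assumptions for illustration.

    from collections import Counter
    from statistics import mean

    # (user, patient_record_id) access events extracted from one day of audit logs -- assumed format
    todays_accesses = [
        ("clerk_a", 101), ("clerk_a", 102), ("clerk_a", 103),
        ("nurse_b", 140), ("nurse_b", 141),
        ("clerk_c", 200), ("clerk_c", 201), ("clerk_c", 202), ("clerk_c", 203),
        ("clerk_c", 204), ("clerk_c", 205), ("clerk_c", 206), ("clerk_c", 207),
    ]

    # Historical average daily lookups per user -- assumed baseline values
    baseline = {"clerk_a": 4.0, "nurse_b": 3.0, "clerk_c": 3.0}

    counts = Counter(user for user, _ in todays_accesses)
    for user, count in counts.items():
        expected = baseline.get(user, mean(baseline.values()))
        if count > 2 * expected:   # assumed threshold: more than double the usual volume
            print(f"Review needed: {user} accessed {count} records today (baseline ~{expected:.0f})")

A simple threshold like this will not catch every misuse, but it turns audit logs into something staff actually review, which is the point of the recommendation above.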


Shoring Up HealthCare.gov Security


The future of Obamacare seems more certain now that the Supreme Court has upheld subsidies for consumers who purchase policies on the federal health insurance exchange. As a result, it's more critical than ever for the federal government to ensure that personally identifiable information is adequately safeguarded on the HealthCare.gov website for the program, as well as state insurance exchanges, as they gear up for open enrollment in the fall.


In recent months, hackers have increasingly focused their attacks on government and healthcare systems. Targets of attacks have included the U.S. Office of Personnel Management and the Internal Revenue Service, as well as health insurers Anthem Inc. and Premera Blue Cross.


That's why many security experts are calling attention to the need to make certain that systems supporting the Affordable Care Act, or Obamacare, programs are secure.


"Affordable Care Act insurance exchanges are a hodgepodge of programs operated by states and the federal governments," notes privacy attorney David Holtzman, vice president of compliance at the security consulting firm CynergisTek. "With the recent news of discovery of coordinated, highly sophisticated attacks on large government operated databases, as well as incidents involving large health insurers, it stands to reason that the information systems serving as the backbone to the health insurance marketplaces are an attractive target because of their size and the sensitivity of the information they hold."


Lee Tien, a senior staff attorney at the Electronic Frontier Foundation, a civil liberties group, notes: "All large collections of sensitive personal data are at risk." When it comes to potential fraud, "healthcare data is considered more valuable on the open market," he says. "Obviously it matters how well they're protected."

Under Scrutiny

Certainly, security of the federal HealthCare.gov health insurance exchange, which facilitates the electronic health insurance marketplaces for 34 states, has been under intense scrutiny since its rollout in the fall of 2013 during the first open enrollment season for Obamacare.


Congress, as well as government watchdog agencies, including the Government Accountability Office and the Department of Health and Human Services' Office of Inspector General, have examined whether the federal health insurance exchanges - and the 16 state-operated health insurance exchanges - have in place the processes and technology to prevent breaches involving consumers' personal information, including Social Security numbers.


For instance, in April, the OIG issued a report reviewing California's health insurance exchange - Covered California - and the security controls that were in place as of June 2014. The OIG found that California had implemented security controls for its website and databases for its health insurance exchange, but the watchdog agency said more improvements were needed.


OIG determined that California had not performed a vulnerability scan in accordance with federal requirements. The OIG also said that Covered California's security plan did not meet some of the Centers for Medicare and Medicaid Services' minimum requirements for protection of marketplace systems, and that Covered California did not have security settings for some user accounts. California officials, in their response to the report, said they planned to implement the OIG's recommendations related to vulnerability scans, security plans and user account settings.


A September 2014 GAO report examining HealthCare.gov security found that CMS - the Department of Health and Human Services unit responsible for the federal insurance exchange - had not always required or enforced strong password controls, adequately restricted systems supporting HealthCare.gov from accessing the Internet, consistently implemented software patches and properly configured an administrative network.


In addition to the HealthCare.gov exchange, another related potential target for hackers is HHS' Multidimensional Insurance Data Analytics System, or MIDAS, which a federal IT budget planning document describes as a "perpetual central repository for capturing, aggregating and analyzing information on health insurance coverage."

The GAO noted in its September 2014 report that MIDAS is intended to create summary reporting and performance metrics related to the federally facilitated marketplace and other HealthCare.gov-related systems by aggregating data, including PII, collected during the plan enrollment process. GAO found, however, that at the time of its review, CMS hadn't yet approved an impact analysis of MIDAS privacy risks "to demonstrate that it has assessed the potential for PII to be displayed to users, among other risks, and taken steps to ensure that the privacy of that data is protected."


In a recent report, the Associated Press noted a variety of concerns about MIDAS, including current plans for data to be retained indefinitely. "Despite [a] poor track record on protecting the private information of Americans, [the Obama administration] continues to use systems without adequately assessing these critical components," said Sen. Orrin Hatch, R-Utah.


CMS did not immediately respond to an Information Security Media Group request for an update on the security of the MIDAS system.

Data Risks

Health insurers, as well as health insurance exchanges and their related databases, are a potential target for hackers because "any collection of data that includes Social Security numbers is particularly vulnerable," notes security expert Tom Walsh, founder of the consulting firm tw-Security.


"Healthcare was doing a good job of eliminating Social Security numbers from our systems. In the old days, the SSN was a person's member number for their insurance. It was finally getting to the point where SSNs were less frequently collected and used in healthcare," he says.


However, under Obamacare, sensitive consumer data, including Social Security numbers and income information, is used on the insurance exchanges to help individuals enroll in insurance plans and qualify for subsidies, Walsh notes. "So healthcare is back in the SSN game again - especially insurance companies."


Ray Biondo, chief information security officer at insurer Health Care Services Corp., says that the federal government has been taking action to address cyberthreats.


"We have been partnering with the Department of Homeland Security and the FBI and sharing threat information," Biondo says. "They've been collaborative and cooperative and helping us in that space."

Still, all players in the healthcare arena are anxious about potential attacks, he admits. "Everyone is worried about being next."

Playing Politics

Holtzman, the consultant, says it's important that politics don't get in the way of government agencies making the investments that are needed to shore up the security of health insurance exchange data.

"Everyone agrees that the federal and state governments should take decisive action to test existing information security safeguards on the systems that support the health insurance marketplace, and to take appropriate measures to ensure that the data, wherever it is held, is secured from the cybersecurity threat," he says.


"What concerns me is that in the long-running political debate over ACA, Congress has said that the HHS may not spend federal funds to support the development and implementation of the ACA. Perhaps it would be in the public interest to ensure that the fight over whether ACA is good policy does not prevent critical funds needed for investment in protecting the government information systems holding the personal information of millions of Americans from the cybersecurity threat."


Walsh says that protecting the health insurance exchanges also comes down to basics. "I was surprised when I read that the OPM did not encrypt data at rest. The government should lead by example and implement better security practices."


Tien of the Electronic Frontier Foundation sums up his concerns: "The OPM example shows how pathetically lax information security can be. [The government] needs to make defense a priority and spend money on it."


Data breach costs on the rise, according to annual Ponemon Institute study


Given the number and severity of publicized data breaches over the past year, it should come as little surprise that the average cost of a data breach is on the rise. According to the “2015 Cost of Data Breach Study: Global Analysis,” which was conducted by the Ponemon Institute and sponsored by IBM, the average cost of a data breach increased from $3.52 million in last year’s study to $3.79 million in this year’s edition.


While the year-over-year jump may seem small, the rise actually represents a 23 percent increase in the total cost of a data breach since 2013. The research, which included responses from personnel at 350 companies spanning 11 different countries, also found that lost business as the result of a data breach potentially has the most severe financial consequences for organizations, as these costs increased from an average of $1.33 million last year to $1.57 million in 2015. Lost business costs include abnormal turnover of customers, increased customer acquisition activities, reputation losses, and diminished goodwill.


Diana Kelley, executive security advisor for IBM Security, said one thing that really stood out to her was the root causes of data breaches examined in the study, the largest share of which (47 percent) were found to be the result of malicious or criminal attacks. The study found that the average cost per record to resolve such an attack is $170, compared with system glitches, which cost $142 per record to resolve, and human error or negligence, which costs $134 per record to correct.

“That indicates something that we’ve seen in other studies that this is organized criminal activity for data breaches,” she said. “We’re moving past the random, somebody left their laptop in a car, and we’re really looking at very targeted attacks from organized criminals.”

Kevin Beaver, an IT security consultant with Atlanta-based Principle Logic LLC, said that data breaches continue to persist on such a massive scale because many companies mistakenly believe they can just buy a piece of security technology that will take care of all of their problems.


“It doesn't work that way,” he said. “Even if you have the very best of security controls, you still have to have ongoing oversight and vulnerability testing because things are going to fall through the cracks.”


Another common issue, according to Beaver, is that companies simply place too much trust in employees and vendors.



“It's always best to err on the side of caution and put the proper controls in place so everyone, and especially the business, are setup for success. Another big issue I see is all the organizations, especially in the healthcare industry, that believe their high-level audits and policies are sufficient for minimizing their risks. It's not. Unless and until you test for - and resolve - the growing amount of security vulnerabilities on your network, you're a sitting duck waiting to be made to look bad,” said Beaver. “This is especially true to social engineering (i.e. phishing) testing. It's unbelievable how many people are still gullible and give up their network credentials or other sensitive info without question.”


Although data breaches that involve the theft of credit or debit card numbers seem to carry a greater amount of weight with the media and public in general, Kelley said the data shows that things such as protected health information (PHI) and other personal data are more coveted by hackers, as they have a longer lifespan for resale. Kelley advises companies to identify what their “crown jewels” are from a data perspective and to conduct threat assessments and risk modeling around protecting those assets.


“I think organizations need to look at the big picture. We do see evidence of more sophisticated criminal, organized attacks. On the other hand, we can’t forget all of the good security hygiene and just try and focus on what’s the next big scary attack,” said Kelley. “We have to do a very robust, layered set of security throughout our organization to include security awareness and training and monitoring. You’re looking for anywhere in that stack where there could be an exposure or there could be a vulnerability. Companies need to not just think about the big attack, but really think about a robust security model because that is going to help prevent the smaller attacks, as well as the larger attacks.”


Perhaps one of the study’s silver linings is that the involvement of a company’s board-level managers was found to help reduce costs associated with data breaches by $5.50 per record. Insurance protection was also found to reduce costs by $4.40 per record. Despite the increased awareness and involvement by senior leadership, Kelley said companies cannot completely protect against the threats posed by hackers.


“It’s important to remember that awareness and ability to stop something aren’t necessarily always aligned. If we look in the real world, we’re all very aware and highly concerned about something like cancer, but preventing it is very, very difficult,” said Kelley. “We can have the C-suite be very aware of security, but still some companies are at different levels of maturity. Attackers, they are, again, organized and sophisticated, so the level of prevention and controls you need in place to stop the attacks is very high. The fact that we still have attacks going on doesn’t mean companies aren’t putting security controls into place.”   


However, Beaver adds that while some executives may say and do all of the right things in public when it comes to their data protection efforts, the reality is some of them are just paying lip service to the issue.



“It's all about policies and related security theater to appease those not savvy enough - or politically powerful enough - to look deeper or question things further,” said Beaver.  


Conversely, Beaver said that there are a lot of companies who are taking the right approach to cybersecurity, which involves recognition by senior management of the seriousness of the issue.


“I see many organizations doing security well,” he added. “The key characteristics of well-run security are: executive acknowledgement of the challenges, ongoing financial and political support for IT and security teams, periodic and consistent security testing, and the willingness to make changes where changes need to be made - even if it's not politically favorable.”


Another bright spot in the study was that it found a correlation between organizational preparedness and a reduced financial impact of a data breach. Companies that employed some level of business continuity management (BCM) within their organization were able to reduce their costs by an average of $7.10 per compromised record.


“Companies that brought in an incident response team or had an incident response program in place were able to save $12.60 per record,” added Kelley. “The biggest takeaway is to get some kind of plan in place. Have business continuity, have an incident response plan in place and be continually detecting and monitoring activity on the network so that if a breach is occurring, you can either see the very beginning of it or you can see one in process and respond as quickly as possible to reduce the impact to the business.”
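To show how the study's per-record figures can be combined, here is a back-of-the-envelope sketch; the breach size and the mitigations assumed to apply are hypothetical, and the study's averages are not additive guarantees of what any particular breach would cost.

    records = 10_000    # hypothetical breach size

    cost_per_record = {                      # averages quoted from the 2015 study
        "malicious or criminal attack": 170,
        "system glitch": 142,
        "human error or negligence": 134,
    }
    savings_per_record = {                   # per-record reductions cited in the study
        "board-level involvement": 5.50,
        "insurance protection": 4.40,
        "business continuity management": 7.10,
        "incident response team or plan": 12.60,
    }

    cause = "malicious or criminal attack"                                    # assumed root cause
    mitigations = ["business continuity management", "incident response team or plan"]

    gross = records * cost_per_record[cause]
    offset = records * sum(savings_per_record[m] for m in mitigations)
    print(f"Estimated cost: ${gross - offset:,.0f} (gross ${gross:,.0f}, mitigations -${offset:,.0f})")

Under these assumptions, a 10,000-record criminal breach would run roughly $1.7 million gross, with preparedness measures trimming about $197,000 off that figure.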


4 HIPAA compliance areas your BAs must check


It finally looks like the feds are starting up the next phase of HIPAA audits — but there’s still time to ensure your business associates (BAs) are staying compliant. 


In preparation for the next round of audits, the Department of Health and Human Services’ (HHS) Office for Civil Rights (OCR) has begun sending out pre-audit surveys to randomly selected providers, according to healthcare attorneys from the law firm McDermott Will & Emery.

Originally, the surveys were meant to go out during the summer of 2014, but technical improvements and leadership transitions put the audits on hold until now.

Moving toward Phase 2

The OCR has sent surveys asking for organization and contact information from a pool of 550 to 800 covered entities. Based on the answers it receives, the agency will pick 350 for further auditing, including 250 healthcare providers.

The Phase 2 audits will primarily focus on covered entities’ and their BAs’ compliance with HIPAA Privacy, Security and Breach Notification standards regarding patients’ protected health information (PHI).

Since most of the audits will be conducted electronically, hospital leaders will have to ensure all submitted documents accurately reflect their compliance program since they’ll have minimal contact with the auditors.

4 vendor pitfalls

It’s not clear yet to what extent the OCR will evaluate BAs in the coming audits due to the prolonged delay. However, there are plenty of other good reasons hospital leaders need to pay attention to their vendors’ and partners’ approaches to HIPAA compliance and security.


Why?


Mainly because a lot of BAs aren’t 100% sure what HIPAA compliance entails, and often jeopardize patients’ PHI, according to Chris Bowen, founder and chief privacy and security officer at a cloud storage firm, in a recent HealthcareITNews article.


A large number of data breaches begin with a third party, so it’s important hospital leaders keep their BAs accountable by ensuring they regularly address these four areas:


  • Risk Assessments. As the article notes, research has shown about a third of IT vendors have failed to conduct regular risk analysis on their physical, administrative and technical safeguards. Ask your vendors to prove they have a risk analysis policy in place and are routinely conducting these kinds of evaluations (a simple tracking sketch follows this list).
  • System activity monitoring. Many breaches go unnoticed for months, which is why it’s crucial your BAs have continuous logging, keep those logs protected and regularly monitor systems for strange activity.
  • Managing software patches. Even the feds can struggle with this one, as seen in a recent HHS audit on the branches within the department. Keeping up with security software patches as soon as they’re released is an important part of provider and BA security. Decisions about security patching should also be documented.
  • Staff training. Bowen recommends vendors include training for secure development practices and software development lifecycles, in addition to the typical General Security Awareness training that HIPAA requires.
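Below is the tracking sketch referenced in the first item: a minimal check that flags business associates whose last documented risk analysis is missing or older than an assumed annual review interval. Vendor names, dates, and the interval are illustrative only.

    from datetime import date

    REVIEW_INTERVAL_DAYS = 365   # assumed policy: at least annual evidence of a risk analysis

    vendor_last_risk_analysis = {
        "cloud-hosting-vendor": date(2015, 3, 1),
        "billing-service": date(2013, 11, 15),
        "transcription-service": None,           # no risk analysis on file
    }

    as_of = date(2015, 7, 1)     # assumed "as of" date for the check
    for name, last_analysis in vendor_last_risk_analysis.items():
        if last_analysis is None:
            print(f"{name}: NO risk analysis on file -- request documentation")
        elif (as_of - last_analysis).days > REVIEW_INTERVAL_DAYS:
            print(f"{name}: risk analysis is stale ({last_analysis}) -- follow up")
        else:
            print(f"{name}: current")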

Lessons from the Sony Hack: The Importance of a Data Breach Response Plan


In a decision emphasizing the need for employers to focus on data security, on June 15, 2015, the U.S. District Court for the Central District of California refused to dismiss a lawsuit filed by nine former employees of Sony Pictures Entertainment who allege the company’s negligence caused a massive data breach. 


In November 2014, Sony was the victim of a cyber-attack, which has widely been reported as perpetrated by North Korean hackers in retaliation for “The Interview,” a Sony comedy parodying Kim Jong Un. According to the complaint in this case, the hackers stole nearly 100 terabytes of data, including sensitive personal information, such as financial, medical, and other personally identifiable information (“PII”), of at least 15,000 current and former Sony employees.  The hackers then posted this information on the internet and used it to threaten individual victims and their families.  The nine named plaintiffs purchased identity protection services and insurance, as well as took other measures, to protect their compromised PII.

The plaintiffs filed a class action lawsuit alleging Sony failed to implement and maintain adequate security measures to protect its employees’ PII, and then improperly waited at least three weeks to notify plaintiffs that their PII had been compromised.  The plaintiffs asserted claims of negligence, breach of implied contract, and statutory violations of California, Virginia, and Colorado law.

Sony moved to dismiss the complaint.  First, Sony argued that plaintiffs lacked standing because they had not alleged a current injury or a threatened injury that is currently impending.  The court disagreed, concluding that the allegations of increased risk of future identity theft sufficiently established certainly impending injury.

Sony then challenged the viability of each claim.  While the court dismissed certain of the claims, the court allowed the plaintiffs to proceed with their claims of negligence and violations of California’s Confidentiality of Medical Information Act and Unfair Competition Law.  Key to the court’s decision on the negligence claim were its findings that (a) the costs plaintiffs incurred related to credit monitoring, identity theft protection, and penalties resulting from frozen credit constituted a cognizable injury, and (b) an exception to the economic loss doctrine applied because the parties had a “special relationship” whereby plaintiffs had to provide their PII to Sony in order to get paid and receive benefits.

Regarding the Confidentiality of Medical Information Act claim, the court found sufficient the allegations that Sony failed to maintain the confidentiality of the plaintiff’s medical information, which Sony has admitted included HIPAA-protected health information, and failed to institute reasonable safeguards to protect that information from unauthorized use.

While it remains to be seen whether the plaintiffs will prevail on any of their theories of recovery against Sony, this matter should be a lesson to companies that have not implemented appropriate data security measures: a breach can cost far more than just the loss of proprietary information.

Employers have a duty to protect the personal sensitive information that they obtain from their employees, and the failure to take preventative measures may result in legal claims, reduction in employee morale, and loss of reputation.

Employers should begin by auditing their information technology infrastructure and network for security vulnerabilities.  Any such audit should be done under the supervision of counsel to maintain the privilege and confidentiality of the audit.  Based on that audit, employers should take steps to mitigate the vulnerabilities found to a reasonable and appropriate level given the threats to the organization.

The Sony breach, like nearly all recent breaches, had an element of social engineering. To protect against these types of attacks, employers should also train their workforces on information security best practices.  Finally, employers should be prepared to respond to breaches when they occur.  Employers should formulate and implement a breach response plan to minimize the time from the discovery of the compromise to the reporting of the incident to affected persons.

If a data breach does occur, the company should immediately execute the data breach response plan and quickly investigate the nature and scope of the data breach.  A forensic review should be conducted using an IT specialist that can trace the origins of the breach.  Employees and anyone affected should be notified so that they may take appropriate steps to prevent or limit identity theft and other damages.  Employers also should consider proactively notifying the police to work with the local cyber-crimes unit, as well as filing a civil suit against the perpetrator(s) to obtain injunctive relief and reduce further damage.  Appropriate legal counsel can assist in pursuing these options.
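One way to keep a response plan actionable is to write it down as structured data so elapsed time from discovery can be tracked against each step. The steps, owners, and target timelines in this sketch are assumptions for illustration, not regulatory deadlines, and any real plan should be built with counsel.

    from datetime import datetime, timedelta

    RESPONSE_PLAN = [
        # (step, responsible role, target time after discovery) -- all assumed internal targets
        ("Contain affected systems and preserve evidence",        "IT security lead", timedelta(hours=4)),
        ("Engage counsel and a forensic specialist",              "Privacy officer",  timedelta(hours=24)),
        ("Determine scope of data and individuals affected",      "Forensics + IT",   timedelta(days=5)),
        ("Notify affected individuals",                           "Privacy officer",  timedelta(days=30)),
        ("Notify regulators and law enforcement as appropriate",  "Legal counsel",    timedelta(days=30)),
    ]

    discovered = datetime(2015, 7, 1, 9, 0)   # hypothetical discovery time
    for step, owner, target in RESPONSE_PLAN:
        deadline = discovered + target
        print(f"{deadline:%Y-%m-%d %H:%M}  {owner:<18} {step}")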


Four Common HIPAA Misconceptions


While practices must work hard to comply with HIPAA, some are taking HIPAA compliance efforts a bit too far. That's according to risk management experts, who say there are some common compliance misconceptions that are costing practices unnecessary time and resources.

Here's what they say many practices are avoiding that they don't necessarily need to avoid, and some extra steps they say practices are taking that they don't necessarily need to take.


1. Avoiding leaving phone messages


While it's true that a phone message from your practice to a patient could be overheard by the wrong party, phone messages that contain protected health information (PHI) don't need to be strictly off limits at your practice, says Jim Hook, director of consulting services at healthcare consulting firm The Fox Group, LLC. "Many offices adopt a blanket policy of, well, 'We can't leave you any phone messages because HIPAA says we can't,' and that's really not true," he says. "You can always get consent from a patient on how they want to be communicated with."


Hook recommends asking all of your patients to sign a form indicating in what manner you are permitted to communicate with them, such as by mail, e-mail, text, and phone message. "If the patient says, 'Yes, you can call and leave me phone messages at this phone number I'm giving you,' then it's not a HIPAA violation to use that method of communication," he says.
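A tiny sketch of how such a consent form could be checked before staff leave a message follows; the field names, identifiers, and sample data are assumptions, with the default deliberately set to the most conservative option when no form is on file.

    # Documented contact preferences captured from signed consent forms -- sample data is hypothetical
    consent_on_file = {
        "patient-123": {"voicemail": True, "text": False, "email": True},
        "patient-456": {"voicemail": False, "text": False, "email": False},
    }

    def may_leave_voicemail(patient_id: str) -> bool:
        # Default to "no" when no consent form is on file for the patient.
        return consent_on_file.get(patient_id, {}).get("voicemail", False)

    print(may_leave_voicemail("patient-123"))   # True  -- patient consented on the intake form
    print(may_leave_voicemail("patient-456"))   # False -- leave only a callback request without PHI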


2. Avoiding discussing PHI


It's important to safeguard PHI as much as possible, but some practices are taking unnecessary precautions, says Michelle Caswell, senior director, legal and compliance, at healthcare risk-management consulting firm Clearwater Compliance, LLC.


"I think there's still a fear among small providers ... that they can't discuss protected health information anywhere in the [practice]," she says. "They feel that they have to almost build soundproof walls and put up bulletproof glass or soundproof glass to prevent any sort of disclosure of protected health information, and that's not what HIPAA requires at all. HIPAA allows for incidental disclosures, [which] are disclosures that happen [incidentally] around your job. So if you've got a nurse and a doctor talking, maybe at the nurses' station, and someone overhears that Mr. Smith has blood work today, that probably wouldn't be a violation because it's incidental to the job. Where else are the doctors and nurses going to talk?"


As long as you are applying "reasonable and appropriate" safeguards, Caswell says you should be in the clear.


3. Requiring unnecessary business associate agreements


HIPAA requires practices to have written agreements, often referred to as business associate agreements (BAAs), with other entities that receive or work with their PHI. Essentially, the agreements state that the business associates will appropriately safeguard the PHI they receive or create on behalf of the practice.


Still, some practices take unnecessary precautions when it comes to BAAs, says Robert Tennant, senior policy adviser of government affairs for the Medical Group Management Association. "A lot of practices are very concerned about people like janitorial services [and] plant maintenance folks, and they have them sign business associate agreements, but those folks are not business associates for the most part," says Tennant. "You may want to have them sign confidentiality agreements basically saying, 'If you do come across any information of a medical nature, protected health information, you are not permitted to look at it, copy it, keep it ...,' But, you do not need to sign a business associate agreement with anybody other than those folks that you actually give PHI to for a specific reason, like if you've got a law office or accounting office or a shredding company that is coming in to pick up PHI to destroy it."


4. Requiring unnecessary patient authorizations


While it's critical to comply with HIPAA's requirement that only those who have a valid reason to access a patient's medical record, such as treatment purposes, payment purposes, or healthcare operations, have access to it — some practices are misconstruing that rule, says Tennant. "They demand patient authorization before they transfer data to another provider for treatment purposes," he says. "I understand why they do it, but it's one of those things that … can cause delays and confusion, and even some acrimony between the patient and the provider. If it's for treatment purposes specifically, you do not need a patient authorization."


Data breach at White Plains Hospital involving emergency room patients


A security breach has been disclosed at a hospital in Westchester County.

Personal information about hundreds of emergency room patients over a two-year period was leaked to someone or some entity that shouldn't have it.

So what if you're one of those patients? And who gave away the information?

White Plains Hospital is the latest target of a data breach.

An employee working for a billing company called Medical Management LLC allegedly copied personal information, including names, dates of birth, and Social Security numbers, then gave it away to a third party.

MML handles the billing and coding for White Plains Hospital's emergency room.

"It should be held securely. Its information you should not give to certain people. I don't like giving my information out at all to anybody," said Jeffry Jones, a former patient.

The employee was fired and other hospitals in the state are affected.

Now patients at White Plains Hospital are waiting to find out if their personal information was compromised.

"We're going to have to catch the company that's doing it. Wipe them out. The hospital is great. They're making them look bad. It's not right for them to mess up our lives," said Diana Bennett, a patient.

The breach was from February 2013 to March 2015.

Now the hospital is offering identity theft protection services for anyone who may have been impacted.

Credit protection expert Adam Levine offered advice for the 1,100 people affected.

"...." Levine said.

Anyone who may have fallen victim will be notified by mail.

Those affected by the breach are also being offered identity theft protection services at no cost.

There was no indication that any medical history or treatment information was disclosed.

Victims are being advised to place a fraud alert or a security freeze on their accounts through a national credit bureau and to review all bills and account statements.



A Call For a New Accreditation Body For Health IT Privacy


As shown by breaches of personal information on innumerable individuals over the years, our approach to IT security falls short. Recent intrusions at Sony Pictures Entertainment and Anthem Health (80 million individuals), against a backdrop of earlier substantial losses of personal health information (PHI) and other IT information, again brought this deficiency to public attention. According to one estimate, almost 1 billion records were stolen via 1,500 breaches in 2014, a 78% increase from the previous year and a clear indication of a growing problem. Among personal information, health records are particular targets, bringing in $20 per record versus $1-$2 for a credit card, and surveys consistently show considerable public concern about the privacy of PHI.


In a recent commentary, David Brailer proposed that raised security standards for health information be one of four principles underlying new privacy legislation. I strongly agree and would add a specific step to apply this principle – privacy accreditation for health data custodians.


Whether the information is stored for care, insurance or research, the public lacks understanding of the complexity of their stored PHI and the large number of individuals with access to or custodial responsibility for it. There is thus a wide gap and power differential between data providers and those who hold enormous amounts of sensitive health data. This circumstance creates a need for an empowered intermediary to act on the public’s behalf, i.e. an accreditation body.

I would advocate for a new IT health privacy accreditation body. It should be a non-profit entity, jump-started by legislation and funded via fees buttressed by a congressional appropriation with a three-year sunset. It would evaluate data security measures comprehensively, in particular technical and personnel matters, including data-sharing procedures, encryption or equivalent, etc. It would then confer accreditation and as such formally interpret, maintain, apply, enforce and in certain cases set privacy standards. It would have processes similar to those of analogous entities, such as The Joint Commission, and should be adaptable to the many and constantly changing technical and procedural details involved with securing data in a shifting terrain.


Accreditation would apply to hospitals, insurance companies, health plans, research centers and others who hold at least a certain number of health records (to be determined).  The accreditation body would conduct periodic announced and unannounced site-visits and audits with graded outcomes and there would be an appeals process. To give the body teeth and similar to other entities, its accreditation should be necessary for federal funding (Medicare, NIH). Conflicts of interest within the body would be addressed by policies and by a balance of competing interests including a spectrum of relevant stakeholders (corporations, patients, healthcare professionals, researchers, privacy experts, etc.) in its Board of Directors.


At present corporate responsibility primarily governs IT security. The Office of Civil Rights provides federal enforcement and penalties via responding to complaints and state governments also play a role. However, these entities do not act as accrediting bodies. Making privacy more a part of other accreditation reviews would not provide a sufficient concentration of expertise focused on the complexities of IT security and certification in specific areas does not address the overall problem.


Perhaps the major concern for a new accreditation process is that it would saddle healthcare entities with yet another bureaucratic step and still more site visits, audits and reviews. It would likely cause dismay and considerable (appropriate) discussion. The healthcare system is burdened enough, though an additional, detailed process seems necessary to meaningfully upgrade IT security.


Also, no audit can guarantee perfect and complete security. A favorable audit could be followed by a breach. But the process, with mechanisms for self-improvement, would make such breaches far less likely. While technology can change very quickly (including between audits), accreditation reviews would determine if the data custodian has the personnel and technical capacity to keep abreast of and deal with rapid changes. Warning signs preceded the large loss at Target and a smaller breach of personal information preceded the later Anthem loss. Accreditation reviews would have noted both occurrences.


In conclusion, the privacy of health information has been considered a personal right since Hippocrates. Despite surveys showing strong concern about health privacy in the general population, our culture may or may not still be serious about its maintenance. If it is, preserving privacy will not come easily. Privacy accreditation of healthcare data custodians seems an achievable way to address this monumental and labyrinthine problem.



Physician practices be alert: you might be violating HIPAA if you produce medical records in response to state court subpoenas


Over the past months, my experiences with physician practices have made me realize that many practices do not understand how HIPAA applies to subpoenas for medical records.  More worrisome, I suspect that many practices nationwide routinely violate HIPAA when they receive a subpoena.


Here’s what I’ve observed:  Practices receive state court subpoenas that are signed by lawyers and that demand the production of medical records, and the practices automatically assume they must produce the records.  This is a dangerous assumption—the production of the records may very well violate HIPAA.


Here’s what HIPAA requires in general terms:  If a practice receives a state court subpoena for medical records that is signed by a lawyer, the practice should not produce the records unless (1) the practice also receives (a) a court order requiring production, (b) a HIPAA “qualified protective order” that’s been entered in the lawsuit, (c) a HIPAA compliant authorization from the patient that authorizes the disclosure demanded by the subpoena, or (d) certain other matters designated by HIPAA’s rule concerning subpoenas, or (2) the practice takes certain additional actions required by HIPAA’s rule for subpoenas. 

If a practice receives such a subpoena without receiving any of these “additional” items or taking these “additional” actions, the practice will likely violate HIPAA if the records are produced.
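The decision logic described above can be summarized in a simple intake-triage sketch. It is illustrative only and is not legal advice: the field names are assumptions, and its default in every case is to involve counsel before any records are produced.

    from dataclasses import dataclass

    @dataclass
    class SubpoenaIntake:
        court_order: bool = False                  # order signed by a judge or magistrate
        qualified_protective_order: bool = False   # HIPAA "qualified protective order" entered in the case
        patient_authorization: bool = False        # HIPAA-compliant authorization from the patient
        satisfactory_assurances: bool = False      # documented assurances of notice or a protective order

    def triage(s: SubpoenaIntake) -> str:
        if any([s.court_order, s.qualified_protective_order,
                s.patient_authorization, s.satisfactory_assurances]):
            return "Forward to counsel with the supporting documents before any production"
        return "Do NOT produce records -- hold the subpoena and have counsel object or obtain the missing items"

    print(triage(SubpoenaIntake()))                              # nothing accompanied the subpoena
    print(triage(SubpoenaIntake(patient_authorization=True)))    # authorization received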


Here’s what practices should do.  Because this area of HIPAA is somewhat complex and difficult for practices to navigate on their own, practices should consult with legal counsel when they receive such a subpoena.  Legal counsel can advise whether HIPAA permits the disclosure, whether the practice needs to object to the subpoena, and whether other actions should be taken.  On numerous occasions, we have reviewed such subpoenas, determined that they did not comply with HIPAA, and sent a letter objecting to the subpoena, and the practice never heard from the parties again.

