HIPAA Compliance for Medical Practices
HIPAA Compliance and HIPAA Risk Management Articles, Tips, and Updates for Medical Practices and Physicians

Where Is HIPAA Taking Physician Practices?


Introduction:

Several provisions of the Health Insurance Portability and Accountability Act of 1996, or HIPAA, were intended to encourage electronic data interchange (EDI) and safeguard the security, privacy, and confidentiality of patient health information. In the context of this act, security is the means by which confidentiality and privacy are ensured. Confidentiality defines how patient data are protected from inappropriate access, while privacy is concerned with who should have access to the patient data. This article explores how the policies stipulated by HIPAA are shaping the practice of medicine and how they will likely affect your practice in the future.

 

HIPAA Security vs Innovation:

If you're a typical small-practice physician, odds are that you view HIPAA as simply another federally mandated cost of practicing medicine, regardless of the intended outcome of the act. This position is understandable, given the cost of mandated training for you and your office staff. Furthermore, if your practice is computerized, you'll need to spend even more money on software upgrades and possibly additional training from the vendor.

HIPAA rules and regulations are complex, in part because much of compliance is open to interpretation. For example, security issues, which are predominantly in the domain of software and hardware vendors, are based on “risk assessment,” not specific technology standards. The act doesn't stipulate specific technologies or endorse nationally recognized procedures, but leaves it up to the physician practice or medical enterprise to ensure that patient health data are secure. (HIPAA's security standards take effect on April 20, 2005, for all “covered entities” except small health plans.) However, because HIPAA enforcement is complaint-driven – there are no “HIPAA Police” checking to see that your practice meets the law's requirements – differences in interpretation of the act are likely to end up in a courtroom at some point. For this reason, some experts recommend assessment of HIPAA compliance by outside counsel.

Most physicians are understandably concerned with the immediate compliance issues surrounding HIPAA and privacy and confidentiality of patient data. Even though the security standards were designed to be “technology-neutral,” the vagaries of these requirements are having a direct impact on medicine beyond the acute phase of compliance, especially in the introduction of new technologies in the clinical arena. New technologies, from wireless to tablet PCs, bring with them added functionality, potential workflow enhancements, and efficiencies – as well as new HIPAA security compliance issues.

Consider, for example, the effect of HIPAA's privacy rules on a physician contemplating the purchase of a Palm Pilot or other PDA. Even late adopters have probably observed the benefit of PDAs. Need to share patient data? Just beam it across the infrared link from one PDA to the next. Need to review patient lab data? Just touch the screen and the data are only a second away.

But it isn't that simple once HIPAA enters the picture. Now a PDA carrying patient data is a compliance concern, because HIPAA's privacy rule applies to every medium in which a patient's protected health information exists, whether printed, spoken, or electronic. Does your PDA have a login and auto-logout feature? If not, anyone could pick up your PDA and look up patient data. Consider the liability issues if you left your PDA at a coffee shop and someone picked it up and scanned through your list of patients. But with a login screen, one of the major benefits of a PDA – instant access to data – is lost.

If you use one of the wireless PDAs, such as the BlackBerry, then there are additional HIPAA-related issues: Does your PDA support the encryption of email and patient data it sends over the Internet? Is the encryption enabled? Is the level of encryption good enough for HIPAA?
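To make the encryption questions above concrete, here is a minimal sketch of what it means for patient data to be encrypted before it travels over the Internet. It assumes Python and the third-party cryptography package; the article names no particular tool, so the example is purely illustrative.

```python
# Minimal sketch: encrypt a message payload before it leaves the device.
# Assumes the third-party "cryptography" package (pip install cryptography);
# nothing here comes from a specific PDA or e-mail product.
from cryptography.fernet import Fernet

# In practice the key would be provisioned once and stored securely on the
# device, not generated per message.
key = Fernet.generate_key()
cipher = Fernet(key)

message = b"Patient J. Doe: potassium 5.9 mEq/L, please review"
ciphertext = cipher.encrypt(message)    # this is what should travel over the network
restored = cipher.decrypt(ciphertext)   # only a holder of the key can recover it

assert restored == message
```

If the device or mail client transmits only the ciphertext, losing the handset or intercepting the traffic exposes nothing readable; the HIPAA questions above amount to asking whether a step like this actually happens and how strong the underlying cipher is.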

Perhaps you've been considering adding a wireless (WiFi) LAN to your clinic or practice. You may have good reason to; wireless will allow you to carry a laptop into examining rooms for decision support and not have to worry about Ethernet cords. But considering HIPAA, is your WiFi system secure? Is the data encryption good enough? If not, will you have to buy new PCs and PDAs, or simply upgrade the operating systems? Do you need to hire a consultant? Maybe it's easier to simply string cables to each office and forget about the laptop this year. Or maybe it would be better to hold off on the computer-assisted decision support project altogether.

Paradoxically, although proponents of HIPAA once thought that it would enhance the move toward the electronic medical record (EMR), I believe that it is having the opposite effect. Because of the uncertainty surrounding HIPAA compliance and whether the legal system will be swamped with cases alleging violations of privacy, it's simply safer for small practices to stay with paper charts, and let the big medical practices deal with the inevitable lawsuits.

This brings up another cost issue: Does your insurance cover a patient suit over HIPAA? If so, how inclusive is the insurance? For example, let's say your practice regularly sends digital audio files overseas for transcription. You send the audio files and receive text documents a day later. Do you know how the patient data are handled at the transcription service? If a transcriptionist overseas decides to protest his or her low wages by posting a transcription of your patient's clinic visit openly on the Web, are you liable? Will your insurer pay? This example isn't as far-fetched as it might seem. In October 2003, a disgruntled Pakistani transcriber threatened the University of California-San Francisco over back pay.[3] She threatened to post patients' confidential files on the Internet unless she was paid more money. To show that she was serious, she sent UCSF an unencrypted email with a patient record attached.

 

HIPAA, Privacy, and the Physician:

Whereas compliance with HIPAA's upcoming security requirements is largely in the purview of vendors and the information services department in most larger medical centers, privacy concerns are usually addressed at the physician level. Consider the major privacy provisions of the act, most of which took effect in April 2003, listed in the Table.

Major Privacy Components of HIPAA, Based on Data From the DHHS.

Implementing each of these privacy components falls squarely on you and your office staff. You, your office manager, or someone else in your practice must be designated the Privacy Officer and given the responsibility of ensuring compliance with the act. If you haven't already done at least one practice walk-through of the major privacy provisions, make sure you do so.

 

 

 

HIPAA’s demands on the IT industry


We’re familiar with signing our lives away at the doctor’s office on HIPAA paperwork, but how is this policy affecting the IT industry?

Since the mid ’90s, the Health Insurance Portability and Accountability Act has regulated health insurance coverage and health care transactions. HIPAA protects patient privacy to ensure safekeeping of all medical information the patient may not wish to disclose. Long story short: HIPAA creates a higher standard to protect patient privacy and confidentiality. HIPAA holds institutions, organizations and offices responsible for protecting private patient information — and provides a framework for punishment when violators unlawfully access or share protected information.


In the past, HIPAA primarily affected hospital procedures. However, a large shift in policy created a ripple that stretched out to the IT industry. The Health Information Technology for Economic and Clinical Health Act of 2009 added technology and financial associates to the list of regulated parties. Things changed even more in 2013 when lawmakers added the Final Omnibus Rule, which significantly expanded the act's Protected Health Information regulations. This ruling greatly changed the relationship between HIPAA and the IT industry.


The rule’s provisions allowed HIPAA to administer new regulations on modern technology and the IT industry. The Final Omnibus Rule paid special attention to cloud storage, mobile devices and remote technologies that offer new ways to access patient information — and, consequently, provide more opportunities for privacy and security breaches. Formerly, a breach of a limited data set was treated as a reportable breach only if the information contained birth dates or ZIP codes. Under the Final Omnibus Rule, all breaches of limited data sets must be handled the same way, regardless of their content.


So, where does this leave the IT industry? When a cloud database administrator or independent IT consultant works directly with protected health information, that person or company automatically becomes a business associate subject to the rules and penalties of HIPAA. While health care providers and their system administrators already know HIPAA regulations well, the IT industry and outside service providers are now playing catch-up. This means the IT industry has to learn the new regulations quickly and thoroughly to ensure the rules are followed accordingly. For those still playing catch-up, or those who need a refresher course, allow us to summarize the rules of Title II:


The Privacy Rule — Gives patients more control over, and protection of, their confidential information.


The Transactions and Code Sets Rule — Keeps transactions standard throughout the industry.


The Security Rule — Updated to accommodate technological advances and the new forms of security breaches they enable.


The Unique Identifiers Rule — Standardizes and protects the communication between health care providers and insurers.


The Enforcement Rule — Includes harsh penalties for HIPAA violations.


For people working with medical and patient data on a daily basis, HIPAA's privacy and security rules directly affect both the hardware and the software used to store and send data. According to the U.S. Department of Health & Human Services, everything from Drug Enforcement Administration numbers to vendor finances to patient identities can be subject to security breaches in health care databases. With so much at risk, the IT industry must be aware of the new regulations and be prepared to provide counsel on security and backup plans.


Driven by increased demand after 2013, IT companies have come up with several HIPAA-compliant solutions for security and backup. Cloud computing offers ease of access, reliable backups and streamlined communication. Additional private cloud options were created with HIPAA regulations in mind — making sure all operations are secure, smart and compliant. With a private cloud, data is separate, safe and in an identifiable location. Only the particular client has access to the data in a private cloud, in keeping with HIPAA policy.


New regulations are always a headache for database administrators, but HIPAA might settle the score with its new rules by preventing many more problems in the future. Hopefully, stricter privacy regulations and more defensive systems will emphasize the importance of innovative, up-to-date storage centers and solutions.


Reminders on HIPAA Enforcement: Breaking Down HIPAA Rules


HIPAA enforcement is an important aspect of the HIPAA Privacy Rule, and also one that no covered entity actually wants to be a part of. However, it is essential that healthcare organizations of all sizes understand the implications of an audit from the Office for Civil Rights (OCR), and how they can best prepare.


This week, HealthITSecurity.com is breaking down the major aspects of OCR HIPAA enforcement, and what healthcare organizations and their business associates need to understand to guarantee that they keep patient data secure. Additionally, we’ll review some recent cases where the OCR fined organizations because of HIPAA violations.


What is the enforcement process?


OCR enforces HIPAA compliance by investigating any filed complaints and will conduct compliance reviews to determine if covered entities are in compliance. Additionally, OCR performs education and outreach to further underline the importance of HIPAA compliance. The Department of Justice (DOJ) also works with OCR in criminal HIPAA violation cases.


“If OCR accepts a complaint for investigation, OCR will notify the person who filed the complaint and the covered entity named in it,” according to HHS’ website. “Then the complainant and the covered entity are asked to present information about the incident or problem described in the complaint. OCR may request specific information from each to get an understanding of the facts. Covered entities are required by law to cooperate with complaint investigations.”


Sometimes OCR determines that HIPAA Privacy or Security requirements were not violated. However, when violations are found, OCR will need to obtain voluntary compliance, corrective action, and/or a resolution agreement with the covered entity:


“If the covered entity does not take action to resolve the matter in a way that is satisfactory, OCR may decide to impose civil money penalties (CMPs) on the covered entity. If CMPs are imposed, the covered entity may request a hearing in which an HHS administrative law judge decides if the penalties are supported by the evidence in the case. Complainants do not receive a portion of CMPs collected from covered entities; the penalties are deposited in the U.S. Treasury.”


During the intake and review process, OCR considers several conditions. For example, the alleged action must have taken place after the dates the Rules took effect. In the case of the Privacy Rule, the alleged incident will need to have taken place after April 14, 2003, whereas compliance with the Security Rule was not required until April 20, 2005.


The complaint must also be filed against a covered entity, and a complaint must allege an activity that, if proven true, would violate the Privacy or Security Rule. Finally, complaints must be filed within 180 days of “when the person submitting the complaint knew or should have known about the alleged violation.”
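As a quick illustration of that 180-day window, the deadline is simple date arithmetic. The dates below are hypothetical, chosen only to show the calculation.

```python
# Hypothetical illustration of the 180-day HIPAA complaint-filing window.
from datetime import date, timedelta

discovered = date(2015, 3, 1)                  # when the person knew or should have known
filing_deadline = discovered + timedelta(days=180)
print(filing_deadline)                         # 2015-08-28
```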


Recent cases of OCR HIPAA fines


One of the more recent examples of HIPAA enforcement took place in Massachusetts, when the OCR fined St. Elizabeth’s Medical Center (SEMC) $218,400 after potential HIPAA violations stemming from 2012.


The original complaint alleged that SEMC employees had used an internet-based document sharing application to store documents containing ePHI of nearly 500 individuals. OCR claimed that this was done without having analyzed the risks associated with the practice.

“OCR’s investigation determined that SEMC failed to timely identify and respond to the known security incident, mitigate the harmful effects of the security incident, and document the security incident and its outcome,” OCR explained in its report. “Separately, on August 25, 2014, SEMC submitted notification to HHS OCR regarding a breach of unsecured ePHI stored on a former SEMC workforce member’s personal laptop and USB flash drive, affecting 595 individuals.”


OCR Director Jocelyn Samuels reiterated the importance of all employees ensuring that they maintain HIPAA compliance, regardless of the types of applications they use. Staff at all levels “must follow all policies and procedures, and entities must ensure that incidents are reported and mitigated in a timely manner,” she stated.


In April of 2015, the OCR also agreed to a $125,000 settlement fine with Cornell Prescription Pharmacy (Cornell) after allegations that also took place in 2012. In that case, Cornell was accused of improperly disposing of PHI documents. Papers with information on approximately 1,600 individuals were found in an unlocked, open container on Cornell’s property.


“Regardless of size, organizations cannot abandon protected health information or dispose of it in dumpsters or other containers that are accessible by the public or other unauthorized persons,” OCR Director Samuels said in a statement, referring to the fact that Cornell is a small, single-location pharmacy in Colorado. “Even in our increasingly electronic world, it is critical that policies and procedures be in place for secure disposal of patient information, whether that information is in electronic form or on paper.”


However, not all OCR HIPAA settlements stay in the thousands-of-dollars range. In 2014, OCR fined New York and Presbyterian Hospital (NYP) and Columbia University (CU) $4.8 million following a joint breach report that was filed in September 2010.


NYP and CU were found to have violated HIPAA by exposing 6,800 patients’ ePHI when an application developer for the organizations tried to deactivate a personally-owned computer server on the network that held NYP patient ePHI. Once the server was deactivated, ePHI became accessible on internet search engines.


“In addition to the impermissible disclosure of ePHI on the internet, OCR’s investigation found that neither NYP nor CU made efforts prior to the breach to assure that the server was secure and that it contained appropriate software protections,” OCR explained in its statement. “Moreover, OCR determined that neither entity had conducted an accurate and thorough risk analysis that identified all systems that access NYP ePHI.”


Regardless of an organization’s size, HIPAA compliance is essential. Regular risk analysis and comprehensive employee training are critical to keeping covered entities up to date and patient data secure. By reviewing federal, state and local laws, healthcare organizations can work on taking the necessary steps to make changes and improve their data security measures.


Bill That Changes HIPAA Passes House


The U.S. House of Representatives on July 10 passed a bill aimed at accelerating the advancement of medical innovation that contains a controversial provision calling for significant changes to the HIPAA Privacy Rule.


The House approved the 21st Century Cures bill by a vote of 344 to 77. Among the 309-page bill's many provisions is a proposal that the Secretary of Health and Human Services "revise or clarify" the HIPAA Privacy Rule's provisions on the use and disclosure of protected health information for research purposes.


Under HIPAA, PHI is allowed to be used or disclosed by a covered entity for healthcare treatment, payment and operations without authorization by the patient. If the proposed legislation is eventually signed into law, patient authorization would not be required for PHI use or disclosure for research purposes if only covered entities or business associates, as defined under HIPAA, are involved in exchanging and using the data.


That provision - as well as many others in the bill - aims to help fuel speedier research and development of promising medical treatments and devices.


"The act says ... if you're sharing [patient PHI] with a covered entity [or a BA], you don't necessarily need the individual's consent prior to sharing - and that's something our members have been receptive too," notes Leslie Krigstein, interim vice president of public policy at the College of Healthcare Information Management Executives, an organization that represents 1,600 CIOs and CISOs.


"The complexity of consent has been a barrier [to health information sharing] ... and the language [contained in the bill] will hopefully move the conversation forward," she says.


Some privacy advocates, however, have opposed the bill's HIPAA-altering provision.


Allowing the use of PHI by researchers without individuals' consent or knowledge only makes the privacy and security of that data less certain, says Deborah Peel, M.D., founder of Patient Privacy Rights, an advocacy group.


"Researchers and all those that take our data magnify the risks of data breach, data theft, data sale and harms," she says. "Researchers are simply more weak links in the U.S. healthcare system which already has 100s of millions of weak links."

Changes Ahead?

If the legislation is signed into law in its current form, healthcare entities and business associates would need to change their policies related to how they handle PHI.


"If the bill is enacted, it will not place additional responsibilities on covered entities and business associates. Rather, it will provide them with greater flexibility to use and disclose protected health information for research," says privacy attorney Adam Greene, partner at law firm Davis Wright Tremaine. "Covered entities and business associates who seek to take advantage of these changes would need to revise their policies and procedures accordingly." For instance, some covered entities also may need to revise their notices of privacy practices if their notices get into great detail on research, Greene notes.

Other Provisions

In addition to the privacy provisions, the bill also calls for penalizing vendors of electronic health records and other health IT systems that fail to meet standards for interoperable and secure information exchange.


The bill calls for HHS to develop methods to measure whether EHRs and other health information technology are interoperable, and authorizes HHS to penalize EHR vendors with decertification of their products if their software fails to meet interoperability requirements.


In addition, the bill contains provisions for "patient empowerment," giving individuals the right to "the entirety" of their health information, including data contained in an EHR, whether structured or unstructured. Physician notes are one example of unstructured data, although they are not specifically named in the legislation.


"Healthcare providers should not have the ability to deny a patient's request for access to the entirety of such health information," the bill says.


A House source tells Information Security Media Group that the Senate has been working on an "Innovation Agenda" for the past few months calling for policies similar to those contained in the 21st Century Cures bill. House leaders say it's their goal to have a bill sent to the president's desk by the end of the year, the source says.


Data Breaches on Record Pace for 2015


Data breaches in 2015 are on pace to break records both in the number of breaches and records exposed, the San Diego-based Identity Theft Resource Center said.


In 2014, the number of U.S. data breaches tracked by ITRC hit a record high of 783, with 85,611,528 confirmed records exposed.

So far this year, as of June 30, the number of breaches captured on the ITRC report totaled 400 data incidents, one more than on June 30, 2014. Additionally, 117,576,693 records had been confirmed to be at risk.


That is significant given the findings of the IBM Cost of Data Breach Study conducted by the Ponemon Institute, which reported that the cost incurred for each lost or stolen record containing sensitive information averaged $154.

ITRC reported a significant jump of about 85% in the number of breaches in the banking sector over the same period last year. The biggest credit union breach so far this year took place at the $308 million Winston-Salem, N.C.-based Piedmont Advantage Credit Union, which notified its entire membership of 46,000 in early March that one of its laptops containing personal information, including Social Security numbers, was missing.




Year-to-date, the five industry sectors tracked by ITRC broke down as follows by percentage of breaches: business, 40.3%; medical/healthcare, 34.8%; banking/credit/financial, 10%; educational, 7.8%; and government/military, 7.3%.

By number of confirmed records exposed, the medical/healthcare sector reported 100,926,229 records breached, government/military reported 15,391,057, educational had 724,318, banking/credit/financial reported 408,377 and business had 126,712.


The ITRC 2015 Breach Report was compiled using data breaches confirmed by various media sources and/or notification lists from state governmental agencies.


Some breaches were not included in the report because they do not yet have reported statistics or remain unconfirmed, the firm said. 


The most dangerous data breach ever known


From time to time I have the depressing task to write about yet another data loss event that caused the personal details of millions of people to fall into the hands of criminals. Usually this is credit card data, along with names and email addresses. Sometimes physical addresses are included, and occasionally even more sensitive data like Social Security numbers goes along for the ride. Usually this data was collected by a large retailer that had no qualms about storing the sensitive information, but clearly neglected to properly secure it.


Stolen data is primarily used for credit card fraud, though if there's enough information available, identity theft is a definite possibility. Millions of affected people have been forced to get new credit cards, check their statements for fraudulent charges, and rework any automated payment arrangements and whatnot. It's a big pain in the ass, and frankly, it has happened far too often, especially when once should be considered more than enough.




Heartland, Target, TJX, Anthem ... we've seen some massive data breaches over the years. But none can hold a candle to the breach the U.S. government announced last week. Not even close. On a scale of one to 10, with one being the loss of credit card numbers and names, this data loss event would conservatively be a 15.


Most people aren't aware of exactly what type of information the federal government collects on its employees, especially those with security clearances. We all have some idea that government employees have relatively strict reporting requirements for financial information, and we know that federal workers with higher clearances undergo thorough background checks and must submit to interviews of both themselves and their family and friends. This is done to flag potential problems and to prevent outside agents from having undue influence over people who may have access to sensitive information and materials.


Put simply, if you have a security clearance, the government would like to know if you have a drug problem or if you are in serious debt, because a foreign interest may try to use that situation as leverage to coerce you into revealing sensitive information. In the interest of national security, these safeguards make sense.


But the true nature and scope of the information required by the government and subsequently collected by the government on an employee is massive. Take a look at Standard Form 86. This is a 127-page form that usually takes a week or more to complete and requires the entry of the applicant's Social Security number on each page. The data included on this form is not just enough for identity theft, but enough to allow a person to literally become another person. Each Standard Form 86 fully documents the life of the subject. The only thing missing is the name of your first crush, though that might be in there somewhere too.


Some 18 million people had this level of personal data -- and more, including data collected by observers -- lost to foreign agents last week. If the government collected this data to know if an employee was vulnerable to undue outside influence, then it just succeeded in closing that loop itself, having now released it into the wild. All of those vulnerabilities are now known and available for exploit to whomever stole the data, or to whomever they wish to sell that data. This is very, very bad.


I should also mention that many of those whose personal information was swept up in this data loss event were never even government employees in the first place. They may have filled out the forms and submitted applications, but they were never hired or they declined the job. This includes prospective TSA agents right on up through CIA employees -- the higher the position, the higher the clearance, the more sensitive the data that was collected and lost. Information on these peoples' infidelities, sexual fetishes, mental illnesses, criminal activities, debts, and other highly personal information is now in the hands of cyber-attackers. This is damage that cannot be undone or mitigated. We can change credit card numbers and refund fraudulent charges, but we can't change any of the personal data and intimate details of these people's lives. That's a permanent loss.

One could argue that however disastrous this data loss event is, the government had a requirement to store this data. It needed to collect and maintain this data, even if it failed to secure it. That said, this is the same government that is collecting a massive amount of data on all of us, whether we're prospective federal employees or not, via Internet and phone surveillance. If the federal government is lax enough to lose immeasurably sensitive information on its employees, how secure is the data that it has decided it needs to collect on everyone in the world?


Many people believe that the U.S. government shouldn't be collecting and storing this data in the first place, and that there's no need to maintain that data collection. This event underscores the fact that maintaining this data is not just privacy invasion on a massive scale, but it's actually dangerous. What happens when the next data loss event contains highly sensitive data on hundreds of millions of people? We can't put that cat back in the box no matter how we might try. You might think that the best way to guard against that possibility is to stop collecting that data in the first place.


243 arrested in 10 states for healthcare fraud, false claims, kickbacks, medical ID theft

The Medicare Fraud Strike Force swept through 10 states and arrested 243 people—46 of them physicians, nurses, and other licensed medical professionals—for allegedly defrauding the government out of $712 million in false Medicare and Medicaid billings, federal officials announced June 18. In addition to targeting instances of false claims and kickbacks, the strike force also uncovered evidence of medical identity theft.
Among the defendants is Mariamma Viju of Garland, Texas, an RN and the co-owner and nursing director for Dallas Home Health, Inc. A federal indictment accuses Viju and a co-conspirator of stealing patient information from Dallas-area hospitals in order to then solicit those patients for her business, as well as submitting false Medicare and Medicaid claims, and paying out cash kickbacks to beneficiaries.
In total, the scheme netted Viju $2.5 million in fraudulently obtained payments between 2008 and 2013. She was arrested June 16 and charged with one count of conspiracy to commit healthcare fraud, five counts of healthcare fraud, and one count of wrongful disclosure of individually identifiable health information.
The indictment says Viju allegedly took patient information from Baylor University Medical Center at Dallas, where she worked as a nurse until she was fired in 2012. Dallas Home Health then billed Medicare and Texas Medicaid for home health services on behalf of beneficiaries who were not homebound or otherwise eligible for covered home health services.
Viju also allegedly falsified and exaggerated patients’ health conditions to increase the amounts billed to Medicare and Medicaid, and thereby boost payments to Dallas Home Health. The indictment says she paid kickbacks to Medicare beneficiaries as well to recruit and retain them as patients of Dallas Home Health.
Viju’s co-conspirator—a co-owner of Dallas Home Health—wasn’t named in the indictment, but in a news release from the U.S. Attorney’s Office for the Northern District of Texas, that person was identified as her husband Viju Mathew. He’s a former registration specialist at Parkland Hospital in Dallas and pleaded guilty in November 2014 to one count of fraud and related activity in connection with identity theft.
Prosecutors say he used his position to obtain PHI, including names, phone numbers, birthdates, Medicare information, and government-issued health insurance claim numbers, so he could use it to contact prospective patients for his home health care business. He is due to be sentenced in August 2015.
In another case in Maryland, Harry Crawford—owner of RX Resources and Solutions—and two of his employees—Elma Myles and Matthew Hightower—are all charged with aggravated identity theft in addition to healthcare fraud and conspiracy to commit healthcare fraud.
An indictment from a federal grand jury accuses Crawford, Myles, and Hightower of fraudulently using actual names, addresses, and unique insurance identification numbers of numerous Medicaid beneficiaries to submit fraudulent claims totaling approximately $900,000 between 2010 and 2014.
The alleged scheme used Crawford’s durable medical equipment and disposable medical supply company to bill insurers for equipment and supplies that were never provided to beneficiaries, bill for amounts far in excess of the services delivered, and bill for supplies that weren’t needed and were never prescribed by a physician.
These are just two examples of the criminal fraud uncovered by the strike force.
In other cases, defendants face similar fraud and conspiracy charges for fraudulent billing schemes as well as charges for cash kickbacks, and money laundering, according to the Department of Justice (DOJ). The DOJ says more than 40 defendants are accused of defrauding the Medicare prescription drug program.
This was the largest coordinated takedown, in terms of defendants and money, in the history of the Medicare Fraud Strike Force, according to the DOJ. CMS also suspended licenses for several healthcare providers with authority granted to the agency under the Affordable Care Act.

Upcoming HIPAA audits may target financial institutions - here’s how to prepare


Much like a tornado watch, the conditions appear to be right for a coming storm: the upcoming Phase 2 HIPAA audits. The Department of Health and Human Services Office for Civil Rights (OCR) has begun verifying contact information of potential audit targets. This serves as a warning that OCR will be auditing for HIPAA compliance, which unlike the pilot audits, will target business associates, including financial institutions, as well as HIPAA-covered entities.


Government regulation is not new to financial institutions. What is new is that an additional regulator with a different perspective has been thrown into the mix. And the stakes are high, with HIPAA carrying both civil and criminal penalties, and resolution amounts tending to reflect the size of the organization being scrutinized.


Financial institutions usually enjoy a statutory exemption from HIPAA when they provide “typical” banking services such as processing payments and issuing credit, even when the financial institutions come in contact with protected health information. But, when services go beyond these recognized functions, financial institutions that create, receive, maintain, or transmit protected health information may well have become business associates with direct obligations under HIPAA (as well as contractual obligation through “business associate contracts”). Additionally, financial institutions that convert non-standard electronic HIPAA transactions (usually transactions related to health care billing and payment) into standard transactions—and vice versa—may be health care clearinghouses that are covered entities under HIPAA.


Preparation for Audits


To prepare for the next round of audits, which is described in more detail below, financial institutions that are business associates or covered entities may want to consider the following steps:

  • Verify that a current HIPAA risk analysis is in place and that the risk analysis actually identifies and categorizes risks (e.g., low, medium, high) rather than merely documenting that controls are in place or documenting the gaps in compliance with the HIPAA Security Rule (see OCR Guidance on Risk Analysis and HHS’ Security Risk Assessment Tool). This may entail establishing an inventory of information, systems, and devices (a simple risk-register sketch follows this list)
  • Document the action items identified in the risk analysis and the steps taken to address these items or establish reasonable timelines
  • Verify that policies are up to date and dated, particularly pertaining to:
    • Data breach notification
    • Risk analysis and risk management
  • Have supplemental documentation related to the above topics readily available and relatively self-explanatory (e.g., clearly labeled) such as:
    • Risk analyses and risk management plans
    • Documentation that addressable implementation specifications have been addressed
    • Documentation of investigations relating to breaches
    • A copy of any recent breach notifications
    • Breach risk assessments where notifications were not made
    • Documentation of the timelines from the discovery of a breach until the notifications of the breach were made
  • Maintain a current list of business associates and subcontractor business associates with relevant contact information (an internal audit of accounts payable may help identify business associates and is a methodology that was used by OCR’s contractors in Phase 1 audits to identify business associates)
  • Confirm that appropriate workforce members have received HIPAA training (and that training has been documented)
  • Prepare for an audit, perhaps including using an audit assessment tool, such as the DWT audit toolkit. Consider whether it is appropriate to involve legal counsel, which may extend a privilege over the preparation process.
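To illustrate the first item in the list above, here is a minimal sketch of a risk register that categorizes each risk rather than merely noting that a control exists. The structure, field names, and sample entries are assumptions for illustration only; they do not come from OCR guidance or the HHS tool.

```python
# Minimal risk-register sketch: every identified risk gets a likelihood and
# impact category plus a documented action item. Field names are illustrative.
from dataclasses import dataclass
from datetime import date

@dataclass
class Risk:
    asset: str            # system, device, or data store holding ePHI
    threat: str
    likelihood: str       # "low" | "medium" | "high"
    impact: str           # "low" | "medium" | "high"
    action_item: str
    target_date: date

register = [
    Risk("billing server", "unpatched operating system", "medium", "high",
         "apply vendor patches; enable automatic updates", date(2015, 9, 1)),
    Risk("clinic laptops", "theft of unencrypted device", "high", "high",
         "enable full-disk encryption", date(2015, 8, 15)),
]

# The documentation step: list high-impact items and their timelines.
for risk in register:
    if risk.impact == "high":
        print(f"{risk.asset}: {risk.threat} -> {risk.action_item} (due {risk.target_date})")
```

Even a spreadsheet with these columns captures the spirit of the item: risks are identified, categorized, and tied to action items with reasonable timelines.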


What to expect from the Phase 2 audits


For the first time, business associates will be included in OCR’s HIPAA audits. OCR will request a list of business associates from covered entities (and perhaps other business associates).


Phase 2 will be conducted primarily by OCR staff. Most of these audits likely will be desk audits, although some on-site audits may occur, depending on OCR resources.


As originally announced, OCR plans to audit approximately 350 covered entities and 50 business associates. To start the audit process, OCR will verify contact information – which now is underway. Then, OCR will collect relevant information through a pre-audit survey to select an appropriate sample. OCR will follow up with notifications and data requests to those selected for the audit.


As currently anticipated, Phase 2 audits will be more narrowly focused than the comprehensive audits in Phase 1. Phase 2 topics are to be based on deficiencies identified in Phase 1, including breach notification, risk analysis, and a corresponding risk management plan.

Covered entities and business associates will have about two weeks to respond to initial data requests. OCR has indicated that auditors will not seek clarification or additional data and only data submitted on time will be considered. OCR discourages submitting extraneous information. OCR will not consider policies and similar documentation created after the date of the audit request. OCR will provide a draft report to audited entities and provide an opportunity for comment prior to issuing a final report.


Projected “Round 2” of Phase 2 audits and beyond may move to device and media controls, transmission security (e.g., encryption of transmitted protected health information), Privacy Rule safeguards (e.g., governing hard copy and oral information), encryption and decryption, physical facility access controls, breach reports (e.g., to OCR), and complaint processes.


Impacts of Audits


Although OCR’s communications regarding Phase 1 audits suggested that they would not be used as a vehicle for formal enforcement, OCR has indicated that Phase 2 and future audits may be more closely tied to enforcement, where adverse findings could lead to civil monetary penalties or resolution agreements.


This alert describes OCR’s most recent information on its audit program. The information is subject to significant change as OCR rolls out Phase 2.


Four Common HIPAA Misconceptions


While practices must work hard to comply with HIPAA, some are taking HIPAA compliance efforts a bit too far. That's according to risk management experts, who say there are some common compliance misconceptions that are costing practices unnecessary time and resources.

Here's what they say many practices are avoiding that they don't necessarily need to avoid, and some extra steps they say practices are taking that they don't necessarily need to take.


1. Avoiding leaving phone messages


While it's true that a phone message from your practice to a patient could be overheard by the wrong party, phone messages that contain protected health information (PHI) don't need to be strictly off limits at your practice, says Jim Hook, director of consulting services at healthcare consulting firm The Fox Group, LLC. "Many offices adopt a blanket policy of, well, 'We can't leave you any phone messages because HIPAA says we can't,' and that's really not true," he says. "You can always get consent from a patient on how they want to be communicated with."


Hook recommends asking all of your patients to sign a form indicating in what manner you are permitted to communicate with them, such as by mail, e-mail, text, and phone message. "If the patient says, 'Yes, you can call and leave me phone messages at this phone number I'm giving you,' then it's not a HIPAA violation to use that method of communication," he says.


2. Avoiding discussing PHI


It's important to safeguard PHI as much as possible, but some practices are taking unnecessary precautions, says Michelle Caswell, senior director, legal and compliance, at healthcare risk-management consulting firm Clearwater Compliance, LLC.


"I think there's still a fear among small providers ... that they can't discuss protected health information anywhere in the [practice]," she says. "They feel that they have to almost build soundproof walls and put up bulletproof glass or soundproof glass to prevent any sort of disclosure of protected health information, and that's not what HIPAA requires at all. HIPAA allows for incidental disclosures, [which] are disclosures that happen [incidentally] around your job. So if you've got a nurse and a doctor talking, maybe at the nurses' station, and someone overhears that Mr. Smith has blood work today, that probably wouldn't be a violation because it's incidental to the job. Where else are the doctors and nurses going to talk?"


As long as you are applying "reasonable and appropriate" safeguards, Caswell says you should be in the clear.


3. Requiring unnecessary business associate agreements


HIPAA requires practices to have written agreements, often referred to as business associate agreements (BAAs), with other entities that receive or work with their PHI. Essentially, the agreements state that the business associates will appropriately safeguard the PHI they receive or create on behalf of the practice.


Still, some practices take unnecessary precautions when it comes to BAAs, says Robert Tennant, senior policy adviser of government affairs for the Medical Group Management Association. "A lot of practices are very concerned about people like janitorial services [and] plant maintenance folks, and they have them sign business associate agreements, but those folks are not business associates for the most part," says Tennant. "You may want to have them sign confidentiality agreements basically saying, 'If you do come across any information of a medical nature, protected health information, you are not permitted to look at it, copy it, keep it ....' But you do not need to sign a business associate agreement with anybody other than those folks that you actually give PHI to for a specific reason, like if you've got a law office or accounting office or a shredding company that is coming in to pick up PHI to destroy it."


4. Requiring unnecessary patient authorizations


While it's critical to comply with HIPAA's requirement that only those who have a valid reason to access a patient's medical record, such as treatment purposes, payment purposes, or healthcare operations, have access to it — some practices are misconstruing that rule, says Tennant. "They demand patient authorization before they transfer data to another provider for treatment purposes," he says. "I understand why they do it, but it's one of those things that … can cause delays and confusion, and even some acrimony between the patient and the provider. If it's for treatment purposes specifically, you do not need a patient authorization."


Think Your Practice is HIPAA Compliant? Think Again.


You may think you know HIPAA inside and out, but experts say many practices and physicians are making mistakes regarding protected health information (PHI) that could get them into big trouble with the law. Here are nine of the most common compliance missteps they say practices and physicians are making.

1. TEXTING UNENCRYPTED PHI


For most physicians, texting is an easy, convenient, and efficient way to communicate with patients and colleagues. But if a text contains unencrypted PHI, that could raise serious HIPAA problems.


"One of the big things people are doing these days is texting PHI, and people seem to be ignoring the fact that text messages can be read by anyone, they can be forwarded to anyone, [and] they're not encrypted in any fashion when they reside on a telecommunications provider's server," says Jim Hook, director of consulting services at healthcare consulting firm The Fox Group, LLC. "People really need to understand that [short message service (SMS)] text messaging is inherently nonsecure, and it's noncompliant with HIPAA."


That's not to say that texting PHI is never appropriate, it just means that physicians must find a way to do so securely. While the privacy and security rules don't provide explicit text messaging guidelines, they do state that covered entities must have "reasonable and appropriate safeguards to protect the confidentiality, availability, and integrity of protected health information," says Michelle Caswell, senior director, legal and compliance, at healthcare risk-management consulting firm Clearwater Compliance, LLC. As a result, Caswell, who formerly worked for HHS' Office for Civil Rights, says physicians must consider, "What would I put on my [smart] phone to reasonably and appropriately safeguard that information?" Most likely, the answer will be a secure messaging service with encryption, she says, adding that many inexpensive solutions are available to providers.


2. E-MAILING UNENCRYPTED PHI


Similar to text messaging, many physicians are e-mailing unencrypted PHI to patients and colleagues. As Robert Tennant, senior policy adviser of government affairs for the Medical Group Management Association, says, e-mailing is becoming ubiquitous in our society, and healthcare is no exception.


If your providers are e-mailing PHI, consider implementing a secure e-mail application; for instance, one that recognizes when content included in the e-mail contains sensitive information and therefore automatically encrypts the e-mail. Your practice could use the application to specify certain circumstances in which e-mails should be encrypted, such as the inclusion of Social Security numbers or credit card numbers. The application would then filter e-mails for that specified content and, when it finds that content, encrypt those e-mails automatically, says Caswell.


Another option is to use a secure e-mail application to set up filters to automatically encrypt e-mails sent with attachments, or encrypt e-mails when senders include a word like "sensitive" or "encrypt" in the subject line, she says. An added benefit of encrypting e-mail is if a potential breach occurs, like the theft of a laptop containing e-mails with PHI, that is not considered a reportable breach if the e-mails stored on the laptop are encrypted, says Tennant. "You don't need to go through all of the rigmarole in terms of reporting the breach to the affected individual, and ultimately, to the government," he says. "So it's sort of a get out of jail free card in that sense."
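As a rough sketch of the filtering rules described in the last two paragraphs, the snippet below decides whether an outgoing message should be encrypted. The regular expressions, the subject-line keywords, and the attachment rule are assumptions for illustration, not features of any particular product.

```python
# Illustrative content filter: flag outgoing e-mail for encryption when it
# appears to contain sensitive data. Patterns and rules are assumptions only.
import re

SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")
TRIGGER_WORDS = ("encrypt", "sensitive")

def should_encrypt(subject: str, body: str, has_attachment: bool) -> bool:
    if any(word in subject.lower() for word in TRIGGER_WORDS):
        return True
    if has_attachment:
        return True
    return bool(SSN_PATTERN.search(body) or CARD_PATTERN.search(body))

print(should_encrypt("Lab results", "SSN 123-45-6789 on file", False))  # True
print(should_encrypt("Lunch?", "Noon at the deli", False))              # False
```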


If your practice would rather prohibit the use of e-mail altogether, a great alternative might be a patient portal that enables secure messaging.


Finally, if patients insist on having PHI e-mailed to them despite the risks, get their permission in writing for you to send and receive their e-mails, says Tennant.


3. FAILING TO CONDUCT A RISK ANALYSIS


If your practice has not conducted a security risk analysis — and about 31 percent of you have not, according to our 2014 Technology Survey, Sponsored by Kareo — it is violating HIPAA. The security rule requires any covered entity creating or storing PHI electronically to perform one. Essentially, this means practices must go through a series of steps to assess potential risks and vulnerabilities to the confidentiality, integrity, and availability of their electronic protected health information (ePHI).


Though the security risk analysis requirement has been in place since the security rule was formally adopted in 2003, it's been pretty widely ignored by practices, says Hook. Part of the reason, he says, is lack of enforcement of the requirement until recently. Since conducting a security risk analysis is now an attestation requirement in the EHR incentive program, auditors are increasingly noting whether practices are in compliance.


4. FAILING TO UPDATE THE NPP


If your practice has not updated its Notice of Privacy Practices (NPP) recently, it could be violating HIPAA. The HIPAA Omnibus Rule requires practices to update these policies and take additional steps to ensure patients are aware of them, says Tennant.

Some of the required updates to the NPP include:


• Information regarding uses and disclosures that require authorization;


• Information about an individual's right to restrict certain disclosures of PHI to a health plan; and


• Information regarding an affected individual's right to be notified following a privacy or security breach.


In addition to updating the NPP, a practice must post it prominently in its facility and on the website, and have new patients sign it and offer a copy to them, says Tennant. "I'd say of every 10 practices, hospitals, dental offices I go into, nine of them don't have their privacy notice in the waiting room," he says.


5. IGNORING RECORD AMENDMENT REQUESTS


Don't hesitate to take action when patients request an amendment to information in their medical records, cautions Cindy Winn, deputy director of consulting services at The Fox Group, LLC. Under the HIPAA Privacy Rule, patients have the right to request a change to their records, and providers must act on those requests within 60 days, she says.


If you disagree with a patient's requested change, you must explain, in writing, why you are not making the requested change, says Hook. Then, share that reasoning with the patient and store a copy of it in the patient's medical record, as well as a copy of the patient's written request for the amendment.


6. NOT PROVIDING ENOUGH TRAINING


The privacy and security rules require formal HIPAA education and training of staff. Though the rules don't provide detailed guidance regarding what training is required, Hook recommends training all the members of your workforce on policies and procedures that address privacy and security at the time of hire, and at least annually thereafter.


The HIPAA Security Rule also requires practices to provide "periodic security reminders" to staff, says Caswell, adding that many practices are unaware of this. Actions that might satisfy this requirement include sending e-mails to staff when privacy and security issues come up in the news, such as information about a recent malware outbreak; or inserting a regular "security awareness" column in staff e-newsletters.

Finally, be sure to document any HIPAA training provided to staff.


7. OVERCHARGING FOR RECORD COPIES


With few exceptions, the privacy rule requires practices to provide patients with copies of their medical records when requested. It also requires you to provide access to the record in the form requested by the individual, if it is readily producible in that manner.


While practices can charge for copies of records, some practices may be getting into trouble due to the fee they are charging, says Tennant. "HIPAA is pretty clear that you can only charge a cost-based fee and most of those are set by the state, so most states have [limits such as] 50 cents a page up to maybe $1 a page ... but you can't charge a $50 handling fee or processing fee; at least it's highly discouraged," says Tennant.
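By way of illustration only, a cost-based fee works out to a simple per-page calculation with no handling or processing surcharge. The 50-cent rate below is the example quoted above; the actual cap is set by your state.

```python
# Hypothetical cost-based copy fee: per-page rate only, no handling fee.
# The 0.50 rate is the example quoted above, not a universal rule.
def copy_fee(pages: int, per_page: float = 0.50) -> float:
    return round(pages * per_page, 2)

print(copy_fee(42))   # 21.0 -- no $50 "handling fee" added on top
```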


To ensure you are following the appropriate guidelines when dealing with record copy requests, review your state's regulations and consult an attorney. Also, keep in mind that though the privacy rule requires practices to provide copies within 30 days of the request, some states require even shorter timeframes.


8. BEING TOO OPEN WITH ACCESS


If your practice does not have security controls in place regarding who can access what medical records and in what situations, it's setting itself up for a HIPAA violation. The privacy rule requires that only those who have a valid reason to access a patient's record — treatment purposes, payment purposes, or healthcare operations — should do so, says Caswell. "If none of those things exist, then a person shouldn't [access] an individual's chart."


Caswell says practices need to take steps to ensure that staff members do not participate in "record snooping" — inappropriately accessing a neighbor's record, a family member's record, or even their own record.


She recommends practices take the following precautions:

• Train staff on appropriate record access;

• Implement policies related to appropriate record access; and

• Run EHR audits regularly to determine whether inappropriate access is occurring (a simple log-review sketch follows this list).
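Here is a minimal sketch of the kind of audit the last item describes. The log format, field names, and the list of valid access reasons are assumptions for illustration; real EHR audit tooling varies by vendor.

```python
# Minimal "record snooping" check over an exported EHR access log.
# All field names and sample rows are hypothetical.
access_log = [
    {"user": "rn_jones", "patient_id": "P100", "reason": "treatment"},
    {"user": "rn_jones", "patient_id": "P999", "reason": ""},          # no documented reason
    {"user": "frontdesk", "patient_id": "P100", "reason": "payment"},
]

VALID_REASONS = {"treatment", "payment", "operations"}

flagged = [entry for entry in access_log if entry["reason"] not in VALID_REASONS]
for entry in flagged:
    print(f"Review: {entry['user']} accessed {entry['patient_id']} without a valid reason")
```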


9. RELEASING TOO MUCH INFORMATION


Similar to providing too much access to staff, some practices provide too much access to outside entities, says Caswell. For instance, they release too much PHI when responding to requests such as subpoenas for medical records, requests for immunization information from schools, or requests for information from a payer.


"If there's, say, for instance, litigation going on and an attorney says, 'I need the record from December 2012 to February 2014,' it is your responsibility to only send that amount of information and not send anything else, so sort of applying what's called the minimum necessary standard," says Caswell. "When you receive outside requests for PHI, pay close attention to the dates for which information is requested, as well as the specific information requested."

more...
No comment yet.
Scoop.it!

Why Even Small Data Breaches Hit Hard


For the past few years, major organizations have dropped the ball on cybersecurity again and again. Retailers, insurance providers, educational institutions, and even the U.S. government have all exposed inordinate amounts of their customers’ personal, financial and sometimes even medical information.


This sensitive data is often used to commit identity theft and fraud — a correlation so strong that two-thirds of identity fraud victims in 2014 had previously received a data breach notification.


It may feel like every Fortune 500 company will inevitably be breached, which could lead consumers to believe they can just sit back and wait. But that’s simply not the case. And this “data breach fatigue” is a rather dangerous mindset to sink into.


Signs of identity theft can hide in the smallest of spaces: deep within your credit file, in archived taxes and even in your medical records. Without quick action following a data breach, you may miss major red flags and end up paying the consequences only after the problem has exponentially grown.


Let’s erase this cloudy viewpoint by shifting our focus to when and where a breach really hurts: the little guys, meaning data breaches at small businesses.


Small businesses, particularly small medical practices, are major targets for cybercriminals. These organizations hold a plethora of sensitive data, while typically possessing only the bare minimum in terms of security.


On average, it takes more than 200 days for an organization to detect that it has been hacked.


A small-scale data breach also looks rather lackluster next to its brand-name counterparts. Large breaches garner widespread media attention, which drives swift action among all parties — the impacted organization, financial institutions, and consumers. With time and public awareness against them, hackers know their stolen data will soon be too hot to profit from, so they use only a small percentage of it as quickly as possible.


This attention-grabbing factor is absent in a small-scale data breach, giving hackers time to work out how to maximize their profits.


So how often do these small-scale breaches occur? Just take a look at Fighting Identity Crime’s monthly breach summaries and you’ll see a distinct pattern — small medical practices and businesses flood the list, each with a considerable amount of exposed data associated with their attack.


Many of these exposed customers may still be unaware of their vulnerability to identity theft and fraud. Meanwhile, others probably know their data was leaked but still don’t fully understand the risks associated with a small-scale data breach.


On average, the total cost of a data breach is now $3.8 million, up from $3.5 million in 2014. While a consumer’s financial institution will immediately bear this cost, it will likely impact the consumer later through indirect fees and reduced product offerings.


Four Common HIPAA Misconceptions


While practices must work hard to comply with HIPAA, some are taking HIPAA compliance efforts a bit too far. That's according to risk management experts, who say there are some common compliance misconceptions that are costing practices unnecessary time and resources.

Here's what they say many practices are avoiding that they don't necessarily need to avoid, and some extra steps they say practices are taking that they don't necessarily need to take.


1. Avoiding leaving phone messages

While it's true that a phone message from your practice to a patient could be overheard by the wrong party, phone messages that contain protected health information (PHI) don't need to be strictly off limits at your practice, says Jim Hook, director of consulting services at healthcare consulting firm The Fox Group, LLC. "Many offices adopt a blanket policy of, well, 'We can't leave you any phone messages because HIPAA says we can't,' and that's really not true," he says. "You can always get consent from a patient on how they want to be communicated with."


Hook recommends asking all of your patients to sign a form indicating in what manner you are permitted to communicate with them, such as by mail, e-mail, text, and phone message. "If the patient says, 'Yes, you can call and leave me phone messages at this phone number I'm giving you,' then it's not a HIPAA violation to use that method of communication," he says.


2. Avoiding discussing PHI

It's important to safeguard PHI as much as possible, but some practices are taking unnecessary precautions, says Michelle Caswell, senior director, legal and compliance, at healthcare risk-management consulting firm Clearwater Compliance, LLC.


"I think there's still a fear among small providers ... that they can't discuss protected health information anywhere in the [practice]," she says. "They feel that they have to almost build soundproof walls and put up bulletproof glass or soundproof glass to prevent any sort of disclosure of protected health information, and that's not what HIPAA requires at all. HIPAA allows for incidental disclosures, [which] are disclosures that happen [incidentally] around your job. So if you've got a nurse and a doctor talking, maybe at the nurses' station, and someone overhears that Mr. Smith has blood work today, that probably wouldn't be a violation because it's incidental to the job. Where else are the doctors and nurses going to talk?"


As long as you are applying "reasonable and appropriate" safeguards, Caswell says you should be in the clear.


3. Requiring unnecessary business associate agreements

HIPAA requires practices to have written agreements, often referred to as business associate agreements (BAAs), with other entities that receive or work with their PHI. Essentially, the agreements state that the business associates will appropriately safeguard the PHI they receive or create on behalf of the practice.


Still, some practices take unnecessary precautions when it comes to BAAs, says Robert Tennant, senior policy adviser of government affairs for the Medical Group Management Association. "A lot of practices are very concerned about people like janitorial services [and] plant maintenance folks, and they have them sign business associate agreements, but those folks are not business associates for the most part," says Tennant. "You may want to have them sign confidentiality agreements basically saying, 'If you do come across any information of a medical nature, protected health information, you are not permitted to look at it, copy it, keep it ...' But you do not need to sign a business associate agreement with anybody other than those folks that you actually give PHI to for a specific reason, like if you've got a law office or accounting office or a shredding company that is coming in to pick up PHI to destroy it."


4. Requiring unnecessary patient authorizations

While it's critical to comply with HIPAA's requirement that only those who have a valid reason to access a patient's medical record, such as treatment purposes, payment purposes, or healthcare operations, have access to it, some practices are misconstruing that rule, says Tennant. "They demand patient authorization before they transfer data to another provider for treatment purposes," he says. "I understand why they do it, but it's one of those things that … can cause delays and confusion, and even some acrimony between the patient and the provider. If it's for treatment purposes specifically, you do not need a patient authorization."


Think Your Practice is HIPAA Compliant? Think Again.


You may think you know HIPAA inside and out, but experts say many practices and physicians are making mistakes regarding protected health information (PHI) that could get them into big trouble with the law. Here are nine of the most common compliance missteps they say practices and physicians are making.

1. TEXTING UNENCRYPTED PHI

For most physicians, texting is an easy, convenient, and efficient way to communicate with patients and colleagues. But if a text contains unencrypted PHI, that could raise serious HIPAA problems.


"One of the big things people are doing these days is texting PHI, and people seem to be ignoring the fact that text messages can be read by anyone, they can be forwarded to anyone, [and] they're not encrypted in any fashion when they reside on a telecommunications provider's server," says Jim Hook, director of consulting services at healthcare consulting firm The Fox Group, LLC. "People really need to understand that [short message service (SMS)] text messaging is inherently nonsecure, and it's noncompliant with HIPAA."


That's not to say that texting PHI is never appropriate, it just means that physicians must find a way to do so securely. While the privacy and security rules don't provide explicit text messaging guidelines, they do state that covered entities must have "reasonable and appropriate safeguards to protect the confidentiality, availability, and integrity of protected health information," says Michelle Caswell, senior director, legal and compliance, at healthcare risk-management consulting firm Clearwater Compliance, LLC. As a result, Caswell, who formerly worked for HHS' Office for Civil Rights, says physicians must consider, "What would I put on my [smart] phone to reasonably and appropriately safeguard that information?" Most likely, the answer will be a secure messaging service with encryption, she says, adding that many inexpensive solutions are available to providers.
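
For readers who want a concrete picture of what "encrypting the message payload" means, the short Python sketch below uses the cryptography package's Fernet recipe (symmetric, AES-based encryption) to protect a message body before it leaves the device. The message text and the ad hoc key are purely illustrative assumptions; a real secure-messaging product would also handle key management, authentication, and audit logging.

  # Minimal sketch: symmetric encryption of a message body before it is sent.
  # Requires the third-party "cryptography" package (pip install cryptography).
  from cryptography.fernet import Fernet

  # In a real product the key would come from a managed key store,
  # never generated ad hoc next to the data it protects.
  key = Fernet.generate_key()
  cipher = Fernet(key)

  message = b"Lab results ready for patient MRN 000000"  # hypothetical PHI
  token = cipher.encrypt(message)  # ciphertext is safe to transmit or store
  assert cipher.decrypt(token) == message  # only a key holder can read it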


2. E-MAILING UNENCRYPTED PHI

Similar to text messaging, many physicians are e-mailing unencrypted PHI to patients and colleagues. As Robert Tennant, senior policy adviser of government affairs for the Medical Group Management Association says, e-mailing is becoming ubiquitous in our society, and healthcare is no exception.


If your providers are e-mailing PHI, consider implementing a secure e-mail application, for instance, one that recognizes when content included in the e-mail contains sensitive information and therefore automatically encrypts the e-mail. Your practice could use the application to specify certain circumstances in which e-mails should be encrypted, such as the inclusion of Social Security numbers or credit card numbers. The application would then filter e-mails for that specified content, and when it finds that content, encrypt those e-mails automatically, says Caswell.
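
As a rough sketch of the kind of content filter Caswell describes, the Python snippet below scans an outgoing message for patterns that look like Social Security or credit card numbers and flags it for encryption. The patterns, keywords, and the should_encrypt helper are illustrative assumptions, not features of any particular e-mail product.

  import re

  # Naive patterns for content that should trigger encryption; a real data-loss
  # prevention filter would use stronger detection (checksums, context, dictionaries).
  SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
  CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")
  KEYWORDS = ("sensitive", "encrypt")

  def should_encrypt(subject: str, body: str) -> bool:
      """Return True if the outgoing e-mail should be sent encrypted."""
      text = subject + "\n" + body
      if SSN_PATTERN.search(text) or CARD_PATTERN.search(text):
          return True
      # Mirror the subject-line keyword convention mentioned below.
      return any(word in subject.lower() for word in KEYWORDS)

  print(should_encrypt("Billing question", "SSN on file is 123-45-6789"))  # True
  print(should_encrypt("Lunch on Friday?", "Noon works for me."))          # False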


Another option is to use a secure e-mail application to set up filters to automatically encrypt e-mails sent with attachments, or encrypt e-mails when senders include a word like "sensitive" or "encrypt" in the subject line, she says. An added benefit of encrypting e-mail is that a potential breach, like the theft of a laptop containing e-mails with PHI, is not considered a reportable breach if the stored e-mails are encrypted, says Tennant. "You don't need to go through all of the rigmarole in terms of reporting the breach to the affected individual, and ultimately, to the government," he says. "So it's sort of a get out of jail free card in that sense."


If your practice would rather prohibit the use of e-mail altogether, a great alternative might be a patient portal that enables secure messaging.


Finally, if patients insist on having PHI e-mailed to them despite the risks, get their permission in writing for you to send and receive their e-mails, says Tennant.


3. FAILING TO CONDUCT A RISK ANALYSIS

If your practice has not conducted a security risk analysis — and about 31 percent of you have not, according to our 2014 Technology Survey, Sponsored by Kareo — it is violating HIPAA. The security rule requires any covered entity creating or storing PHI electronically to perform one. Essentially, this means practices must go through a series of steps to assess potential risks and vulnerabilities to the confidentiality, integrity, and availability of their electronic protected health information (ePHI).
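
The rule does not dictate a format for that assessment, but one common way to organize its output is a simple likelihood-times-impact risk register, sketched below; the assets, threats, and five-point scales are illustrative assumptions only.

  from dataclasses import dataclass

  @dataclass
  class Risk:
      asset: str       # where the ePHI lives
      threat: str      # what could go wrong
      likelihood: int  # 1 (rare) through 5 (almost certain)
      impact: int      # 1 (negligible) through 5 (severe)

      @property
      def score(self) -> int:
          return self.likelihood * self.impact

  register = [
      Risk("billing laptop", "theft of an unencrypted device", 3, 5),
      Risk("EHR server", "ransomware on an unpatched host", 2, 5),
      Risk("front-desk PC", "shared login with no auto-logoff", 4, 3),
  ]

  # Highest-scoring risks get remediated first.
  for risk in sorted(register, key=lambda r: r.score, reverse=True):
      print(risk.score, risk.asset, "-", risk.threat)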


Though the security risk analysis requirement has been in place since the security rule was formally adopted in 2003, it's been pretty widely ignored by practices, says Hook. Part of the reason, he says, is lack of enforcement of the requirement until recently. Since conducting a security risk analysis is now an attestation requirement in the EHR incentive program, auditors are increasingly noting whether practices are in compliance.


4. FAILING TO UPDATE THE NPP

If your practice has not updated its Notice of Privacy Practices (NPP) recently, it could be violating HIPAA. The HIPAA Omnibus Rule requires practices to update these policies and take additional steps to ensure patients are aware of them, says Tennant.

Some of the required updates to the NPP include:


• Information regarding uses and disclosures that require authorization;

• Information about an individual's right to restrict certain disclosures of PHI to a health plan; and

• Information regarding an affected individual's right to be notified following a privacy or security breach.


In addition to updating the NPP, a practice must post it prominently in its facility and on the website, and have new patients sign it and offer a copy to them, says Tennant. "I'd say of every 10 practices, hospitals, dental offices I go into, nine of them don't have their privacy notice in the waiting room," he says.


5. IGNORING RECORD AMENDMENT REQUESTS

Don't hesitate to take action when patients request an amendment to information in their medical records, cautions Cindy Winn, deputy director of consulting services at The Fox Group, LLC. Under the HIPAA Privacy Rule, patients have the right to request a change to their records, and providers must act on those requests within 60 days, she says.


If you disagree with a patient's requested change, you must explain, in writing, why you are not making the requested change, says Hook. Then, share that reasoning with the patient and store a copy of it in the patient's medical record, as well as a copy of the patient's written request for the amendment.


6. NOT PROVIDING ENOUGH TRAINING

The privacy and security rules require formal HIPAA education and training of staff. Though the rules don't provide detailed guidance regarding what training is required, Hook recommends training all the members of your workforce on policies and procedures that address privacy and security at the time of hire, and at least annually thereafter.


The HIPAA Security Rule also requires practices to provide "periodic security reminders" to staff, says Caswell, adding that many practices are unaware of this. Actions that might satisfy this requirement include sending e-mails to staff when privacy and security issues come up in the news, such as information about a recent malware outbreak; or inserting a regular "security awareness" column in staff e-newsletters.

Finally, be sure to document any HIPAA training provided to staff.


7. OVERCHARGING FOR RECORD COPIES

With few exceptions, the privacy rule requires practices to provide patients with copies of their medical records when requested. It also requires you to provide access to the record in the form requested by the individual, if it is readily producible in that manner.


While practices can charge for copies of records, some practices may be getting into trouble due to the fee they are charging, says Tennant. "HIPAA is pretty clear that you can only charge a cost-based fee and most of those are set by the state, so most states have [limits such as] 50 cents a page up to maybe $1 a page ... but you can't charge a $50 handling fee or processing fee; at least it's highly discouraged," says Tennant.


To ensure you are following the appropriate guidelines when dealing with record copy requests, review your state's regulations and consult an attorney. Also, keep in mind that though the privacy rule requires practices to provide copies within 30 days of the request, some states require even shorter timeframes.


8. BEING TOO OPEN WITH ACCESS

If your practice does not have security controls in place regarding who can access what medical records and in what situations, it's setting itself up for a HIPAA violation. The privacy rule requires that only those who have a valid reason to access a patient's record — treatment purposes, payment purposes, or healthcare operations — should do so, says Caswell. "If none of those things exist, then a person shouldn't [access] an individual's chart."


Caswell says practices need to take steps to ensure that staff members do not participate in "record snooping" — inappropriately accessing a neighbor's record, a family member's record, or even their own record.


She recommends practices take the following precautions:


• Train staff on appropriate record access;

• Implement policies related to appropriate record access; and

• Run EHR audits regularly to determine whether inappropriate access is occurring.
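
As a rough illustration of the kind of audit described in the last item above, the snippet below scans hypothetical EHR access-log entries for staff members opening their own charts. The log format and the employee-to-patient mapping are assumptions made for the example; every EHR exports audit data differently.

  # Hypothetical audit-log records: (user_id, patient_id, action, timestamp)
  access_log = [
      ("u1001", "p2001", "view_chart", "2015-08-01T09:14:00"),
      ("u1002", "p9100", "view_chart", "2015-08-01T12:30:00"),
      ("u1003", "p3003", "view_chart", "2015-08-02T22:05:00"),
  ]

  # Assumed mapping of workforce members to their own patient records.
  employee_patient_ids = {"u1002": "p9100", "u1003": "p7777"}

  def find_self_access(log, mapping):
      """Flag entries where a staff member opened his or her own chart."""
      return [entry for entry in log if mapping.get(entry[0]) == entry[1]]

  for user, patient, action, timestamp in find_self_access(access_log, employee_patient_ids):
      print("Review:", user, "accessed own record", patient, "at", timestamp)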


9. RELEASING TOO MUCH INFORMATION

Similar to providing too much access to staff, some practices provide too much access to outside entities, says Caswell. For instance, they release too much PHI when responding to requests such as subpoenas for medical records, requests for immunization information from schools, or requests for information from a payer.


"If there's, say, for instance, litigation going on and an attorney says, 'I need the record from December 2012 to February 2014,' it is your responsibility to only send that amount of information and not send anything else, so sort of applying what's called the minimum necessary standard," says Caswell. "When you receive outside requests for PHI, pay close attention to the dates for which information is requested, as well as the specific information requested."


Data Breaches Expose Nearly 140 Million Records


The latest report from the Identity Theft Resource Center (ITRC) reveals that a total of 472 data breaches were recorded through August 11, 2015, exposing more than 139 million records. The annual total includes 21.5 million records exposed in the attack on the U.S. Office of Personnel Management in June and 78.8 million health care customer records exposed at Anthem in February.

A June report by cybersecurity firm Trustwave said that of the 574 hacking incidents and data breaches the company was asked to investigate in 2014, 43% came from the retail industry, 13% from the food and beverage industry, and 12% from the hospitality industry. More striking, perhaps: 81% of victims did not discover on their own that they had been hacked. In cases where a company discovers the attack on its own, it takes about two weeks to stop it. When companies do not run their own security programs, it takes more than five months to contain the breach.


E-commerce sites were compromised in 42% of attacks and point-of-sales systems were hit in 40%. The totals were up 7% and 13%, respectively, from 2013.


The total number of data breaches increased by six in the week, according to the ITRC. The business sector accounts for about 645,000 exposed records in 184 incidents so far in 2015. That represents 39% of the incidents, but just 0.5% of the exposed records.


The medical/health care sector posted the second-largest percentage of the total breaches so far this year, 35.6% (168) out of the total of 472. The number of records exposed in these breaches totaled 109.5 million, or 78.6% of the total so far in 2015.


The number of banking/credit/financial breaches totals 45 for the year to date and involves more than 411,000 records, some 9.7% of the total number of breaches and 0.3% of the records exposed. These numbers are unchanged from the prior week.


The government/military sector has suffered 36 data breaches so far this year, just 7.7% of the total, but about 20% of the total number of records exposed. These numbers were also unchanged from the prior week.


The educational sector has seen 39 data breaches in 2015, accounting for 8.3% of all breaches for the year. Nearly 740,000 records have been exposed, about 0.5% of the total so far in 2015.

In all of 2014, ITRC tracked an annual record number of 783 data breaches, up 27.5% year over year. The previous high was 662 breaches in 2010. Since beginning to track data breaches in 2005, ITRC has counted 5,497 breaches through August 11, 2015, involving more than 818 million records. Compared with 2014, the number of data breaches is about 2.3% lower to date in 2015.


How Do HIPAA Regulations Affect Judicial Proceedings?


HIPAA regulations are designed to keep healthcare organizations compliant, ensuring that sensitive data - such as patient PHI - stays secure. Should a healthcare data breach occur, covered entities or their business associates will be held accountable, and will likely need to make adjustments to their data security approach to prevent the same type of incident from happening again.


However, there are often questions and concerns in how HIPAA regulations tie into certain judicial or administrative proceedings. For example, if there is a subpoena or search warrant issued to a hospital, is that organization obligated to supply the information? What if the information being sought qualifies as PHI? Can covered entities be held accountable if they release certain information, and then that data falls into unauthorized individuals’ control?


This week, HealthITSecurity.com will break down how judicial proceedings, and other types of legal action, could potentially be impacted by HIPAA regulations. We will discuss how PHI could possibly be disclosed, and review cases where search warrants and similar issues were affected by HIPAA.


What does HIPAA say about searches and legal inquiries?

The HIPAA Privacy Rule states that there are several permitted uses and disclosures of PHI. This does not mean that covered entities are required to disclose PHI without an individual’s permission, but healthcare organizations are permitted to do so under certain circumstances.


“Covered entities may rely on professional ethics and best judgments in deciding which of these permissive uses and disclosures to make,” the Privacy Rule explains.


The six examples of permitted uses and disclosures are the following:

  • To the Individual (unless required for access or accounting of disclosures)
  • Treatment, Payment, and Health Care Operations
  • Opportunity to Agree or Object
  • Incident to an otherwise permitted use and disclosure
  • Public Interest and Benefit Activities
  • Limited Data Set for the purposes of research, public health or health care operations.


Under the public interest and benefit activities, the Privacy Rule dictates that there are “important uses made of health information outside of the healthcare context.” Moreover, a balance must be found between individual privacy and the interest of the public.

There are several examples that relate to disclosing PHI due to types of legal action:


  • Required by law
  • Judicial and administrative proceedings
  • Law enforcement purposes


Covered entities and their business associates are permitted to disclose PHI as required by statute, regulation or court orders.

“Such information may also be disclosed in response to a subpoena or other lawful process if certain assurances regarding notice to the individual or a protective order are provided,” according to the HHS website.


For “law enforcement purposes,” HIPAA regulations state that PHI can also be disclosed to help identify or locate a suspect, fugitive, material witness, or missing person. Law enforcement can also make requests for information if they are trying to learn more information about a victim - or suspected victim. Another important aspect to understand is that a covered entity can disclose sensitive information if it believes that PHI is evidence of a crime that took place on the premises. Even if the organization does not think that a crime took place on its property, HIPAA regulations state that PHI can be disclosed “when necessary to inform law enforcement about the commission and nature of a crime, the location of the crime or crime victims, and the perpetrator of the crime.”


Essentially, covered entities and business associates must use their own judgment when determining whether a situation is appropriate for releasing PHI without an individual’s knowledge. For example, if local law enforcement wants more information from a hospital about a former patient whom it believes is dangerous, it is up to the hospital to weigh the options of releasing the information.

How have HIPAA regulations affected court rulings?

There have been several court rulings in the last year discussing HIPAA regulations and how covered entities are allowed to release PHI.


Connecticut: The Connecticut Supreme Court ruled in November 2014 that patients can sue a medical office for HIPAA negligence if it violates regulations that dictate how healthcare organizations must maintain patient confidentiality. In that case, a patient found out that she was pregnant in 2004 and asked her medical facility to not release the medical information to the child’s father. However, the organization released the patient’s information when it received a subpoena. The case claimed that the medical office was negligent in releasing the information, and that the child’s father used the information  for “a campaign of harm, ridicule, embarrassment and extortion” against the patient.


Florida: Just one month earlier, a Florida federal appeals court ruled that it is not a HIPAA violation for physician defendants to have equal access to plaintiffs’ health information. In this case, a patient sued his doctor for medical negligence. Florida law states that the plaintiff must provide a health history, including copies of all medical records the plaintiff’s experts relied upon in forming their opinions and an “executed authorization form” permitting the release of medical information. However, the plaintiff claimed the move would violate his privacy. The appeals court ruled that two instances applied in this case where HIPAA regulations state that covered entities are permitted to release PHI.


As demonstrated in these two court cases, it is not always easy for covered entities to necessarily determine on their own when they are compromising patient privacy and when they are adhering to a court order. However, by seeking appropriate counsel, healthcare organizations can work on finding a solution that meets the needs of all parties involved.


Data Breaches, Lawsuits Inescapable, but Liability Can Be Mitigated


If your organization experiences a data breach—an increasingly likely scenario—and PHI is exposed, chances are you will be hit with a lawsuit in short order.

There's not much you can do about that, just like it's impossible to prevent every criminal attack. What you can do, though, is take steps to minimize the likelihood of being found liable for damages in court, says Reece Hirsch, Esq., a partner and regulatory attorney at Morgan Lewis in San Francisco, and a BOH editorial advisory board member.

Hirsch says companies should have two things in place as part of standard policy and procedure: an evolving breach response plan and an incident response team that meets on a regular basis. While class-action suits haven't gained much traction with judges yet—except in cases of clear financial damage to consumers—most of the claims boil down to some form of alleged negligence, he says.

"Given the increasingly sophisticated cyberthreats that companies face … you cannot have perfect security and you cannot completely insulate yourself from these types of events, but what you can do is show you acted reasonably and took reasonable measures to prevent a breach and not make yourself a target," Hirsch says.

Organizations demonstrate this with a good breach response plan to show they've identified the problem, mitigated damage, notified victims, and taken further action as necessary, he says. The team should represent each department that might be affected by a breach or that has to be mobilized to interact with the public, including legal, human resources, privacy, security, IT, communications, and investor relations. Part of the team's role is to analyze risks to data, data flow, and worst-case scenarios.

"Everything needs to be encrypted, data at rest as well as data in transit, which is something HIPAA specifically points out," says Jan McDavid, Esq., the compliance officer and general legal counsel at HealthPort, an Atlanta-based healthcare services firm. McDavid, who is a regular speaker on this subject, agrees that it's essential to have proper security policies as well as dedicated staff to regularly review systems and respond to incidents.

Comprehensive risk analyses, which HIPAA requires, should not just be done after a breach to assess the extent of damages after private data is "let out the door," she says, but up front as well to identify the risks. Inevitably, though, healthcare organizations with large electronic databases will likely experience a data breach.


"Once [companies] are put on notice that something has happened, they need to immediately stop the bleeding," McDavid says. Even though public breach notification may not be required on day one, the company should immediately shut off or fix whatever happened so it can't occur again, she says.

One of the issues she sees often is that as healthcare organizations struggle to keep pace with technology, security is affected too. In the rush to automation and interoperability with limited funds available, parts of older systems and databases may get upgraded and replaced, but in the process, new vulnerabilities may be created, McDavid says. It seems organizations don't always realize how their systems interact, leading them to overlook peripheral connections that may allow access to protected systems, she adds.

Federal legislation that called for providers to implement EHRs didn't contain the funding to help facilities make the switch—those incentives came later. Many of the hospitals McDavid works with have a hodgepodge of computer systems that were installed piecemeal as the hospitals received technology funding, and that may inadvertently lead to vulnerabilities.

Taking proactive measures to have strong security policies, plans, and personnel in place goes a long way toward mitigating company liability in a class-action suit, Hirsch and McDavid say.

Lawsuits may be unavoidable


"If people are going to sue you, they're going to sue you," Hirsch says. "But [proactive preparation] will position the company much better to defend the lawsuit." And even more importantly, he adds, it may deflect some of the greatest damage to a company's reputation and image, which occurs in the "court of public opinion" and in news media reports.

McDavid agrees. "Their name becomes mud when the news is out that they've had a major breach," she says, although she believes the public has become oversaturated with the plethora of recent breaches in the news to the point that such incidents are no longer viewed as alarming or unusual.

Chris Apgar, CISSP, president of Apgar & Associates, LLC, in Portland, Oregon, and a BOH editorial advisory board member, says the breach announced by Anthem, Inc., in February 2015 actually offers a good example of how to take the right approach to a data breach.

Apgar doesn't believe the health insurer took a big hit to its reputation because it acted relatively quickly to put security experts on the case and notify consumers and law enforcement authorities about the breach as required by HIPAA security regulations. In addition, he says, Anthem had relatively good security protections; however, those protections could only slow down a sufficiently skilled hacker, not stop the breach from occurring.


By comparison, Apgar says the class-action suits against Community Health Systems, Inc., are for actual negligence in responding to a known security vulnerability. The Franklin, Tennessee-based company announced hackers accessed data of 4.5 million individuals who were referred to or received care from physicians affiliated with its system over the last five years, according to an August 18, 2015, filing with the U.S. Securities and Exchange Commission.


Anthem disclosed on February 4 that it uncovered a massive breach affecting 80 million people that had occurred two months earlier. Less than 12 hours later, an Indianapolis attorney was already filing a class-action suit against the health insurer for failure to secure customers' data, negligence, breach of contract, and failure to notify victims in a timely manner.

In the days and weeks that followed, the class-action suits started to pile up across the country—dozens of complaints argued Anthem was lax in securing members' personal data, which wasn't encrypted. Plaintiffs argued Anthem only implemented reasonable security measures after it discovered the breach January 29—more than a month after the incident occurred.

Even if it were eventually proven in court that Anthem didn't follow industry best practices to secure data or that the breach was due to negligence, the bigger question is whether the plaintiffs can demonstrate harm as a result, Apgar says.

Building up case law


Currently, legal precedent favors the defendants, but that's an evolving process too.

McDavid explains there is no established federal law that stipulates companies are liable for damages just because they experienced a data breach that exposed clients' or patients' personal information.

That's where class-action attorneys enter the picture, she says. They're trying to make case law by obtaining favorable court opinions to set a legal precedent, but it's an uphill battle, she says. Under many federal and state laws, victims have to prove they were harmed in order to win damages.

"In the majority of cases now, the courts are ruling that you cannot certify a class unless you can prove the class has damages," McDavid says. "What that means is that even if you've breached 2 million records, if you don't have any notice that any of that [data] has been misused, then in most courts right now you have no damages."

In April, a federal judge dismissed a class-action suit against Horizon Blue Cross Blue Shield of New Jersey, ruling the plaintiffs didn't demonstrate they suffered financial harm. Two company laptop computers were stolen in 2013 from the health insurer's Newark headquarters, and nearly 840,000 customers' personal information was potentially exposed.


McDavid also points to a May Pennsylvania case where a county judge dismissed a suit from 62,000 employees of the University of Pittsburgh Medical Center following a criminal breach of the hospital's payroll database. Several hundred employees were victims of tax fraud, but the judge ruled the plaintiffs didn't prove that they were all financially harmed, that the medical center was negligent in its actions, or that there was any contract holding the university liable for security breaches.

What usually happens, Hirsch explains, is that the parties reach a settlement outside of court, and that's where many of the large payouts to affected consumers or patients happen.

Finding other ways in


It's becoming increasingly common, however, for class-action attorneys to file suit for violations of state privacy and security laws or various other federal statutes, which may contain stronger protections than HIPAA, McDavid says. Arguments under those laws have been more successful at convincing courts that the victims still have legal standing to sue even if they haven't experienced actual harm.

Apgar notes that an early example of this came in 2010, when the Connecticut Attorney General's office sued Health Net of Connecticut in federal court for violations of HIPAA and state privacy protections regarding personal data. The attorney general's office alleged the health insurer failed to secure PHI and financial information prior to a 2009 data breach in which a computer disk drive was lost that contained unencrypted records on more than 500,000 Connecticut residents and 1.5 million consumers nationwide. Health Net also allegedly delayed notifying plan members and law enforcement authorities until several months after it discovered the breach.

Ultimately, the company agreed to a settlement that included the following:

  • Extended credit monitoring for affected plan members
  • Increased identity theft insurance and reimbursement for security freezes
  • An internal corrective action plan for stronger security measures
  • A $250,000 state fine
  • A $500,000 contingent payment to the state if it was established that affected individuals later became victims of identity theft or fraud


This was the first legal action taken by an attorney general since the HITECH Act in 2009 authorized state attorneys general to enforce violations of HIPAA.

Federal laws, such as the Fair Credit Reporting Act (FCRA), are also becoming an avenue for class-action attorneys. Hirsch says although it's not related to healthcare, one case winding its way through the U.S. Supreme Court—Spokeo, Inc. v. Robins—could change the legal landscape if the nation's highest court issues an opinion against the online company.

In February 2014, federal appellate judges for the 9th Circuit reversed a district court ruling that had originally dismissed plaintiff Thomas Robins' class-action suit alleging willful violations of the FCRA. He claimed Spokeo, an online information gathering service, published and marketed inaccurate personal information about him on its website, which he had no control over. While not claiming actual financial damages, he argued that because he had been unsuccessful in securing employment, he was concerned the inaccurate report was hurting his ability to obtain employment, insurance, credit, and the like.

The appellate panel found Robins did have constitutional standing to sue under the FCRA. This speaks to the same issues that are raised by victims of healthcare data breaches, who worry they will suffer financial harm from the exposure of their PHI, Hirsch says. Large technology companies urged the Supreme Court to take up an appeal of the 2014 decision, fearing it could cripple the industry by paving the way for billions of dollars in damages to consumers, he says.

In addition, there's another federal healthcare data breach suit—Smith, et al. v. Triad of Alabama—making a case for violations under the FCRA that will have big implications if the court finds the plaintiffs have legal standing for a class-action suit, McDavid says.

"They can keep it in court if the judge buys into their theory that they don't have to have damages in order to sue," she says.


Hospital Slammed With $218,000 HIPAA Fine


Federal regulators have slapped a Boston area hospital with a $218,000 HIPAA penalty after an investigation following two security incidents. One involved staff members using an Internet site to share documents containing patient data without first assessing risks. The other involved the theft of a worker's personally owned unencrypted laptop and storage device.


The Department of Health and Human Services' Office for Civil Rights says it has entered a resolution agreement with St. Elizabeth's Medical Center that also includes a "robust" corrective action plan to correct deficiencies in the hospital's HIPAA compliance program.

The Brighton, Mass.-based medical center is part of Steward Health Care System.


Privacy and security experts say the OCR settlement offers a number of valuable lessons, including the importance of the workforce knowing how to report security issues internally, as well as the need to have strong policies and procedures for safeguarding PHI in the cloud.

Complaint Filed

On Nov. 16, 2012, OCR received a complaint alleging noncompliance with HIPAA by medical center workforce members. "Specifically, the complaint alleged that workforce members used an Internet-based document sharing application to store documents containing electronic protected health information of at least 498 individuals without having analyzed the risks associated with such a practice," the OCR statement says.


OCR's subsequent investigation determined that the medical center "failed to timely identify and respond to the known security incident, mitigate the harmful effects of the security incident and document the security incident and its outcome."


"Organizations must pay particular attention to HIPAA's requirements when using internet-based document sharing applications," says Jocelyn Samuels, OCR director in the statement. "In order to reduce potential risks and vulnerabilities, all workforce members must follow all policies and procedures, and entities must ensure that incidents are reported and mitigated in a timely manner."


Separately, on Aug. 25, 2014, St. Elizabeth's Medical Center submitted notification to OCR regarding a breach involving unencrypted ePHI stored on a former hospital workforce member's personal laptop and USB flash drive, affecting 595 individuals. The OCR "wall of shame" website of health data breaches impacting 500 or more individuals says the incident involved a theft.

Corrective Action Plan

In addition to the financial penalty - which OCR says takes into consideration the circumstances of the complaint and breach, the size of the entity, and the type of PHI disclosed - the agreement includes a corrective action plan "to cure gaps in the organization's HIPAA compliance program raised by both the complaint and the breach."

The plan calls for the medical center to:


  • Conduct a "self-assessment" of workforce members' familiarity and compliance with the hospital's policies and procedures that address issues including transmission and storage of ePHI;
  • Review and revise policies and procedures related to ePHI; and
  • Revise workforce training related to HIPAA and protection of PHI.


Lessons Learned

Other healthcare organizations and their business associates need to heed some lessons from OCR's latest HIPAA enforcement action, two compliance experts say.


Privacy attorney Adam Greene of the law firm Davis Wright Tremaine notes: "The settlement indicates that OCR first learned of alleged noncompliance through complaints by the covered entity's workforce members. Entities should consider whether their employees know how to report HIPAA issues internally to the privacy and security officers and ensure that any concerns are adequately addressed. Otherwise, the employees' next stop may be complaining to the government."

The settlement also highlights the importance of having a cloud computing strategy, Greene points out. That strategy, he says, should include "policies, training and potential technical safeguards to keep PHI off of unauthorized online file-sharing services."


The enforcement action spotlights the continuing challenge of preventing unencrypted PHI from ending up on personal devices, where it may become the subject of a breach, he notes.


The case also sheds light on how OCR evaluates compliance issues, he says. "The settlement highlights that OCR will look at multiple HIPAA incidents together, as it is not clear that OCR would have entered into a settlement agreement if there had only been the incident involving online file sharing software, but took action after an unrelated second incident involving PHI ending up on personal devices."


Privacy attorney David Holtzman, vice president of compliance at security consulting firm CynergisTek, says the settlement "serves as an important reminder that a covered entity or a business associate must make sure that the organization's risk assessment takes into account any relationship where PHI has been disclosed to a contractor or vendor so as to ensure that appropriate safeguards to protect the data are in place."


The alleged violations involving the document sharing vendor, he says, "involve failure to have a BA agreement in place prior to disclosing PHI to the vendor, as well as failing to have appropriate security management processes in place to evaluate when a BA agreement is needed when bringing on a new contractor that will handle PHI."

St. Elizabeth's Medical Center did not immediately respond to an Information Security Media Group request for comment.

Previous Settlements

The settlement with the Boston-area medical center is the second HIPAA resolution agreement signed by OCR so far this year. In April, the agency OK'd an agreement with Cornell Prescription Pharmacy for an incident related to insecure disposal of paper records containing PHI. In that agreement, Cornell was fined $125,000 and also adopted a corrective action plan to correct deficiencies in its HIPAA compliance program.


The settlement with St. Elizabeth's is the 25th HIPAA enforcement action involving a financial penalty and/or resolution agreement that OCR has taken since 2008.


But privacy advocate Deborah Peel, M.D., founder of Patient Privacy Rights, says OCR isn't doing enough to crack down on organizations involved in HIPAA privacy breaches.


"Assessing penalties that low - St. Elizabeth will pay $218,400 - guarantees that virtually no organizations will fix their destructive practices," she says. "Industry views low fines as simply a cost of doing business. They'll take their chances and see if they're caught."

The largest HIPAA financial penalty to date issued by OCR was a $4.8 million settlement with New York-Presbyterian Hospital and Columbia University for incidents tied to the same 2010 breach that affected about 6,800 patients. The incidents involved unsecured patient data on a network.


Orlando Health reports data breach for 3,200 patients


Orlando Health said Thursday about 3,200 patients’ records were accessed illegally by one of its employees, who was fired during an investigation.



The hospital system said it discovered the data breach on May 27. A news release on Thursday, July 2, said it began notifying patients “today,” which would be more than 30 days after the breach was discovered.



According to the release, there was no evidence that the data was copied or used illegally, but Orlando Health reported the incident in accordance with its data breach policies.


Under Florida law, notice to victims of a data breach is required within 30 days, unless the custodian of records has determined that nobody suffered identity theft or any other financial harm.


The records included certain patients at Winnie Palmer Hospital for Women & Babies, Dr. P. Phillips Hospital and a limited number of patients treated at Orlando Regional Medical Center from January 2014 to May 2015.


Theft of patient information at health-related companies is one of the primary ways that tax refund fraud has been occurring in Florida, according to federal authorities. Thieves can use the information to submit a fake tax return in your name, claiming refunds that could prevent or delay a legitimate refund.


In the Orlando Health incident, stolen data may have included names, dates of birth, addresses, medications, medical tests and results, the last four digits of social security numbers, and other clinical information. The former employee may have also accessed insurance information in approximately 100 of those patient records.


Steve Stallard, corporate director for compliance and information security, said in a statement that Orlando Health “deeply regrets any concern or inconvenience this may cause our patients or their family members.”


The organization is providing affected patients with call center and other support, the news release said.


Orlando Health has reported other data breaches, such as a March 2014 incident where over 500 child patient records were misplaced.


Website Error Leads to Data Breach


An error in a coding upgrade for a Blue Shield of California website resulted in a breach affecting 843 individuals. The incident is a reminder to all organizations about the importance of sound systems development life cycle practices.


In a notification letter being mailed by Blue Shield of California to affected members, the insurer says the breach involved a secure website that group health benefit plan administrators and brokers use to manage information about their own plans' members. "As the unintended result of a computer code update Blue Shield made to the website on May 9," the letter states, three users who logged into their own website accounts simultaneously were able to view member information associated with the other users' accounts. The problem was reported to Blue Shield's privacy office on May 18.


Blue Shield of California tells Information Security Media Group that the site affected was the company's Blue Shield Employer Portal. "This issue did not impact Blue Shield's public/member website," the company says. When the issue was discovered, the website was promptly taken offline to identify and fix the problem, according to the insurer.


"The website was returned to service on May 19, 2015," according to the notification letter. The insurer is offering all impacted individuals free credit monitoring and identity theft resolution services for one year.


Exposed information included names, Social Security numbers, Blue Shield identification numbers, dates of birth and home addresses. "None of your financial information was made available as a result of this incident," the notification letter says. "The users who had unauthorized access to PHI as a result of this incident have confirmed that they did not retain copies, they did not use or further disclose your PHI, and that they have deleted, returned to Blue Shield, and/or securely destroyed all records of the PHI they accessed without authorization."


The Blue Shield of California notification letter also notes that the company's investigation revealed that the breach "was the result of human error on the part of Blue Shield staff members, and the matter was not reported to law enforcement authorities for further investigation."

Similar Incidents

The coding error at Blue Shield of California that led to the users being able to view other individuals' information isn't a first in terms of programming mistakes on a healthcare-sector website leading to privacy concerns.


For example, in the early weeks of the launch of HealthCare.gov in the fall of 2013, a software glitch allowed a North Carolina consumer to access personal information of a South Carolina man. The Department of Health and Human Services' Centers for Medicare and Medicaid Services said at the time that the mistake was "immediately" fixed once the problem was reported. Still, the incident raised more concerns about the overall security of the Affordable Care Act health information exchange site.


Software design and coding mistakes that leave PHI viewable on websites led to at least one healthcare entity paying a financial penalty to HHS' Office for Civil Rights.


An OCR investigation of Phoenix Cardiac Surgery P.C., with offices in Phoenix and Prescott, began in February 2009, following a report that the practice was posting clinical and surgical appointments for its patients on an Internet-based calendar that was publicly accessible.

The investigation determined the practice had implemented few policies and procedures to comply with the HIPAA privacy and security rules and had limited safeguards in place to protect patients' information, according to an HHS statement. The investigation led to the healthcare practice signing an OCR resolution agreement, which included a corrective action plan and a $100,000 financial penalty.


The corrective action plan required the physician practice, among other measures, to conduct a risk assessment and implement appropriate policies and procedures.

Measures to Take

Security and privacy expert Andrew Hicks, director and healthcare practice lead at the risk management consulting firm Coalfire, says that to avoid website-related mistakes that can lead to privacy breaches, it's important that entities implement appropriate controls as well as follow the right systems development steps.


"Organizations should have a sound systems development life cycle - SDLC - in place to assess all systems in a production environment, especially those that are externally facing," he says. "Components of a mature SDLC would include code reviews, user acceptance testing, change management, systems analysis, penetration testing, and application validation testing."


Healthcare entities and business associates need to strive for more than just HIPAA compliance to avoid similar mishaps, he notes.

"Organizations that are solely seeking HIPAA compliance - rather than a comprehensive information security program - will never have the assurance that website vulnerabilities have been mitigated through the implementation of appropriate controls," he says. "In other words, HIPAA does not explicitly require penetration testing, secure code reviews, change management, and patch management, to name a few. These concepts are fundamental to IT security, but absent from any OCR regulation, including HIPAA."

Earlier Blue Shield Breach

About a year ago, Blue Shield of California reported a data breach involving several spreadsheet reports that inadvertently contained the Social Security numbers of 18,000 physicians and other healthcare providers.


The spreadsheets submitted by the plan were released 10 times by the state's Department of Managed Health Care. In California, health plans electronically submit monthly to the state agency a roster of all physicians and other medical providers who have contracts with the insurers. Those rosters are supposed to contain the healthcare providers' names, business addresses, business phones, medical groups and practice areas - but not Social Security numbers. DMHC makes those rosters available to the public, upon request.


4 HIPAA compliance areas your BAs must check


It finally looks like the feds are starting up the next phase of HIPAA audits — but there’s still time to ensure your business associates (BAs) are staying compliant. 


In preparation for the next round of audits, the Department of Health and Human Services' (HHS) Office for Civil Rights (OCR) has begun sending out pre-audit surveys to randomly selected providers, according to healthcare attorneys from the law firm McDermott Will & Emery.

Originally, the surveys were meant to go out during the summer of 2014, but technical improvements and leadership transitions put the audits on hold until now.

Moving toward Phase 2

The OCR has sent surveys asking for organization and contact information from a pool of 550 to 800 covered entities. Based on the answers it receives, the agency will pick 350 for further auditing, including 250 healthcare providers.

The Phase 2 audits will primarily focus on covered entities’ and their BAs’ compliance with HIPAA Privacy, Security and Breach Notification standards regarding patients’ protected health information (PHI).

Since most of the audits will be conducted electronically, hospital leaders will have to ensure all submitted documents accurately reflect their compliance program since they’ll have minimal contact with the auditors.

4 vendor pitfalls

It’s not clear yet to what extent the OCR will evaluate BAs in the coming audits due to the prolonged delay. However, there are plenty of other good reasons hospital leaders need to pay attention to their vendors’ and partners’ approaches to HIPAA compliance and security.


Why?


Mainly because a lot of BAs aren’t 100% sure what HIPAA compliance entails, and often jeopardize patients’ PHI, according to Chris Bowen, founder and chief privacy and security officer at a cloud storage firm, in a recent HealthcareITNews article.


A large number of data breaches begin with a third party, so it's important hospital leaders hold their BAs accountable by ensuring they regularly address these four areas:


  • Risk assessments. As the article notes, research has shown that about a third of IT vendors have failed to conduct regular risk analyses of their physical, administrative and technical safeguards. Ask your vendors to prove they have a risk analysis policy in place and are routinely conducting these evaluations.
  • System activity monitoring. Many breaches go unnoticed for months, which is why it's crucial your BAs log continuously, keep those logs protected and regularly monitor systems for unusual activity (a minimal monitoring sketch follows this list).
  • Managing software patches. Even the feds can struggle with this one, as seen in a recent HHS audit of the department's own branches. Applying security patches as soon as they're released is an important part of provider and BA security, and patching decisions should be documented.
  • Staff training. Bowen recommends vendors include training on secure development practices and software development life cycles, in addition to the general security awareness training that HIPAA requires.
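As a concrete example of the activity-monitoring point above, the sketch below scans a hypothetical PHI access log for two simple warning signs: after-hours access and an unusually high number of records viewed by one user. The log format, field names and thresholds are illustrative assumptions, not requirements from HIPAA or recommendations from the article.

```python
# Minimal sketch of the activity monitoring described above. The log format,
# field names, and thresholds are hypothetical examples, not HIPAA requirements.

import csv
import io
from collections import Counter
from datetime import datetime

RECORDS_PER_USER_THRESHOLD = 2               # kept tiny so the demo below trips it
AFTER_HOURS_START, AFTER_HOURS_END = 20, 6   # flag access from 8 p.m. to 6 a.m.

def flag_suspicious(log_lines) -> list[str]:
    """Return alerts for after-hours access and unusually busy users.

    Expects CSV rows with columns: timestamp,user,patient_id
    """
    alerts, per_user = [], Counter()
    for row in csv.DictReader(log_lines):
        per_user[row["user"]] += 1
        hour = datetime.fromisoformat(row["timestamp"]).hour
        if hour >= AFTER_HOURS_START or hour < AFTER_HOURS_END:
            alerts.append(f"After-hours access by {row['user']} at {row['timestamp']}")
    alerts += [f"{user} viewed {count} records in this log window"
               for user, count in per_user.items()
               if count > RECORDS_PER_USER_THRESHOLD]
    return alerts

# Tiny inline sample standing in for a real, protected access log.
sample = io.StringIO(
    "timestamp,user,patient_id\n"
    "2015-06-01T09:15:00,nurse_a,1001\n"
    "2015-06-01T22:40:00,vendor_x,1002\n"
    "2015-06-01T22:41:00,vendor_x,1003\n"
    "2015-06-01T22:42:00,vendor_x,1004\n"
)
for alert in flag_suspicious(sample):
    print(alert)
```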

HIPAA Could Hurt, Not Help, Data Privacy and Security


By now, you have probably heard about the theft of more than 14 million dossiers on federal employees and the theft of the personal health information (PHI) of 80 million people from Anthem-Blue Cross. You may not have heard about many of the other computer security flaws and breaches that are reported almost daily.

Here are a few from the last couple of weeks:


• A vulnerability in Samsung's Android keyboard installed on over 600m devices worldwide could allow hackers to take full control of the smartphone or tablet.


• Security researchers have uncovered a flaw in the way thousands of popular mobile applications store data online, leaving users' personal information, including passwords, addresses, door codes, and location data, vulnerable to hackers.


• Macs older than a year are vulnerable to exploits that remotely overwrite the firmware that boots up the machine, a feat that allows attackers to control vulnerable devices from the very first instruction.


• Professor Phil Koopman, an expert who testified at one of the Toyota "sticking throttle" trials, detailed a myriad of defects in the software of the throttle control system and in Toyota's software development process. Michael Barr, another expert, cited a heavily redacted report that suggests the presence of at least 243 violations of the "Power of 10: Rules for Developing Safety-Critical Code," published in IEEE Computer in 2006 by NASA team member Gerard Holzmann.


• The Boeing 787 aircraft's electrical power control units shut down if powered without interruption for 248 days. As a result, the FAA is telling the airlines they have to do a maintenance reboot of their planes every 120 days.
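For what it is worth, the oddly specific 248-day figure is consistent with a signed 32-bit counter of 10-millisecond ticks overflowing. That is a commonly offered inference rather than anything stated in the FAA directive, and the quick calculation below only shows why the number is plausible.

```python
# Why 248 days? A signed 32-bit counter incremented every 10 milliseconds
# overflows after 2**31 ticks. The counter width and tick period are a widely
# reported inference, not details confirmed in the article or the FAA directive.

ticks = 2 ** 31                 # largest value a signed 32-bit counter can reach
seconds = ticks * 0.010         # one tick every 10 ms
print(seconds / 86_400)         # about 248.55 days
```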


I've always assumed, as I imagine that you have, that, if any organizations could be expected to use "best practices" and thereby avoid flaws and breaches, it would be Anthem, the feds, Google, Samsung, Apple, Boeing, and Toyota. The only reasonable conclusion is that impenetrable, flaw-free systems are simply not possible and this will not change any time soon. Keep that in mind during the upcoming discussion.


The government, at the behest of lawmakers, loves to tell people what to do. Feasibility and relevance are annoying details best dispensed with. Even vocal conservatives and libertarians, who should be staying out of other people's business on principle, love to tell people what to do. These folks got together in 1996 and enacted HIPAA (in full disclosure, I testified before a congressional subcommittee on this bill before it was enacted).


Among other things HIPAA tells people what to do about privacy and security of patient data, but without much evidence that they needed telling.


I always wondered:


1. Were privacy and security a huge, out-of-control problem before HIPAA?


2. What was the evidence that existing laws regarding inappropriate release of PHI were not sufficient to induce people to exercise due diligence? If they were adequate, were they being enforced? If they were inadequate could they not have been strengthened?


3. Has HIPAA helped?


4. Do billions of signed statements acknowledging privacy policies actually protect anyone's privacy?


5. If there was an incremental improvement as a result of HIPAA, was it worth the billions that have been spent?


6. Do the penalties reduce the chances of a breach?


7. And finally, is there any chance that the technical measures that are demanded will be effective, given the state of the art?


The approach to the first six questions has basically been one of "don't ask, don't tell," so we will never be able to judge whether the whole thing was worth the trouble. The answer to the last question, based on the material presented in the introduction, is: no. The technical expectations embodied in HIPAA are little more than someone's dream. There is no evidence that even the most capable, best-resourced organizations in the country are capable of satisfying them (which doesn't mean they shouldn't try). A great deal of time and money could be saved or redirected to patient care if a more realistic approach were taken toward privacy and security. The magnitude and prevalence of breaches have been growing steadily. As it stands, HIPAA may actually be harmful, because it distracts attention and diverts resources away from actions that might actually improve privacy and security.


State agency HIPAA security gaffe puts patient data on the Internet


A Texas state agency has come forward to notify its Medicaid recipients that due to security shortfalls, their Social Security numbers and protected health information became accessible on the Internet.


The Texas Department of Aging and Disability Services, the state agency responsible for administering support and services for aging individuals and people with disabilities, announced June 11 a data breach following the "unintentional release" of personal data. The breach affected 6,600 of its Medicaid recipients, state officials said, compromising their names, dates of birth, addresses, Social Security numbers, Medicaid numbers, and clinical diagnoses and treatment information.


According to the agency notice, the department was notified on April 21, 2015, that patient information was available via the Internet. Officials provided no additional details on the incident. As of publication time, they had not responded to Healthcare IT News' inquiries about what occurred and whether a third-party vendor was involved.


In the notice, there were no apologies issued from department officials over the incident, but they did indicate they had "strengthened" Web-app security and policies "in an effort to prevent such a breach from occurring again."


To date, nearly 135 million people have had their protected health information compromised in reportable HIPAA breaches, according to data from the Office for Civil Rights, the HHS division responsible for enforcing HIPAA. In this tally, only HIPAA breaches involving 500 or more individuals are counted.


In Texas, specifically, since the HIPAA breach notification rule went into effect in 2009, nearly 3.6 million people have had their protected health information compromised. One of the biggest HIPAA violators in the state has been the University of Texas MD Anderson Cancer Center, with officials reporting three HIPAA breaches since 2012, impacting nearly 35,000 individuals.


The HealthTexas Provider Network, which is affiliated with Baylor Scott & White Health, has also reported three HIPAA breaches since 2011, including a case of hacking, unauthorized access and theft of an unencrypted laptop.

Cameron's curator insight, July 2, 2015 5:59 PM

The article describes a situation where health information was exposed on the Internet because of security shortfalls in the systems of a Texas state agency.

Medical information is something many people, especially those who rarely visit a doctor, forget about, yet it is among the most identifying information there is about a person. The breach in Texas exposed Social Security numbers along with patient diagnoses and treatment information. A Social Security number in the wrong hands can ruin a life; once an identifier that fundamental is stolen, there is not much you can do to get it back. In health communications class we have seen how much sensitive information insurers and medical facilities hold on people. The Internet just turns that into an even bigger sea to fish personal data out of.


Bill Would Clarify HIPAA Privacy Rules


Rep. Doris Matsui (D-Calif.), a member of the House Energy and Commerce Health Subcommittee, has introduced legislation to “elevate and formalize agency guidance on HIPAA privacy rules,” particularly as it pertains to patients with mental illness.


The Including Families in Mental Health Recovery Act (H.R. 2690) has been endorsed by the American Psychological Association, American Psychiatric Association, National Council for Behavioral Health, and National Disability Rights Network, among other stakeholder organizations.


“Healthcare providers and administrators have long lacked clarity on HIPAA rules and thus have been cautious to share information with family members and caregivers of patients,” said Matsui in a written statement. “This lack of clarity creates significant challenges to patients, their doctors and family. Sharing the right information with the right family and caregivers can help a patient, while still protecting their privacy. Assisting family and caregivers with being involved in a patient’s care can be of the utmost importance, and can even mean life or death.”


Although the Department of Health and Human Services' Office for Civil Rights issued guidance on the topic in February 2014, Matsui argues that "better understanding and awareness of the guidance will give providers the confidence to practice discretion in delicate situations, to best determine whether it is in an individual's best interest to share information with family members and caregivers on a case-by-case basis."


According to John Snook, executive director of the Treatment Advocacy Center, HIPAA privacy rules were never intended to prevent people from receiving necessary medical care. "But we hear from families every day who are kept in the dark about their loved one's treatment because of confusion and uncertainty around the requirements of HIPAA," Snook said. "We applaud Congresswoman Matsui for seeking a solution that will safeguard necessary confidentiality while ensuring families can share and receive critical information during a psychiatric crisis."


Rusty Selix, executive director of the Mental Health Association of California and of the California Council of Community Mental Health Agencies, added: "Government officials, healthcare providers and administrators have long lacked clarity and thus have been cautious to allow sharing of information that they fear might violate the HIPAA privacy law in regards to sharing health information with family members and caregivers of patients. This legislation will provide the education to eliminate the lack of clarity and give providers and administrators the confidence to share information with family members whose support is needed in crisis situations."


Businesses taking more than 100 days to contain data breaches, finds study


In its global security report for 2015, information security provider Trustwave said it takes businesses 86 days on average to detect a data breach and 111 days on average to contain it, measured from the date of intrusion.


Trustwave's report was based on 574 data breach cases it investigated last year. It said that businesses that spot data breaches themselves contain those breaches faster than where they are told about the intrusions by people or bodies external to the organisation, such as customers, regulators or law enforcement. Breaches were not detected by victim organisations in 81% of the cases.


"Victims that don’t detect the compromise themselves don’t become aware of a breach until later," the report said. "As a result, they simply cannot respond to contain it as quickly as victims that detect the breach themselves. So it stands to reason that victims that didn’t detect the breach themselves endured incidents nearly a month longer in terms of the median in 2014. Breaches detected by an external party lasted from one to 1,692 days from intrusion to containment, with a median of 154 days (27 days more than in 2013)."


Trustwave said that 43% of the 574 data breach cases it investigated last year concerned retailers. Nearly half of all the investigations (49%) concerned the theft of personal data and payment card information. The report said that 40% of cases involved the loss of data at point-of-sale terminals. However, it said US retailers are more likely to be exposed to data breach cases at POS terminals because of their "lagging adoption" of 'chip and pin' technology that is used by UK banks.


The report also highlighted common security vulnerabilities it had identified in mobile technologies, networks and applications and found that many businesses are still using basic passwords.

"Administrators should consider enforcing a length of at least 10 characters," Trustwave said. "As proof, passwords with eight characters, for example, can be cracked within a day using brute-force techniques with technology easily available to attackers. We estimate that the same techniques and technology would crack a 10-character password in 591 days (19.5 months)."


Complying with the HIPAA Nondisclosure Rule


Under the HIPAA Omnibus Rule, patients can request a restriction on disclosure of PHI to a health plan if they pay out of pocket, in full, for the service. Practices must agree to such a request unless they are required by law to bill that health plan (as is the case with some Medicaid plans). A minimal illustration of how billing software might enforce this rule appears after the tips below.

During a session at the Medical Group Management Association 2014 Annual Conference, Loretta Duncan, senior medical practice consultant with malpractice insurer the State Volunteer Mutual Insurance Company in Brentwood, Tenn., shared some of her compliance tips:


• If the service the patient does not want disclosed is bundled with something else, explain that the patient may have to pay more out of pocket than expected.

• Make sure that communication is tight between all staff and departments regarding nondisclosure.

• Document your new nondisclosure policies and procedures.

• Be careful when e-prescribing, as pharmacies may bill to the insurance plan before the patient has a chance to let the pharmacy know that the information should not be disclosed.
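As noted above, here is a minimal illustration of how billing software might enforce the basic restriction rule before a claim goes out. The data structure and field names are invented for the example; this is not a description of any particular practice-management product.

```python
# Hypothetical pre-claim check: a visit paid in full out of pocket, with a
# restriction requested, must not be billed to the health plan unless the
# practice is legally required to bill that plan (e.g. some Medicaid plans).
# The dataclass and field names are invented for this example.

from dataclasses import dataclass

@dataclass
class Encounter:
    patient_id: str
    paid_in_full_by_patient: bool
    restriction_requested: bool
    plan_requires_billing_by_law: bool

def may_submit_claim(enc: Encounter) -> bool:
    """Return True only if sending this encounter to the health plan is allowed."""
    if enc.restriction_requested and enc.paid_in_full_by_patient:
        return enc.plan_requires_billing_by_law
    return True

# Self-pay visit with a restriction request: the claim must be held back.
visit = Encounter("pt-001", paid_in_full_by_patient=True,
                  restriction_requested=True, plan_requires_billing_by_law=False)
print("Submit claim" if may_submit_claim(visit) else "Do not bill the health plan")
```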
