HIPAA Compliance for Medical Practices
HIPAA Compliance and HIPAA Risk management Articles, Tips and Updates for Medical Practices and Physicians

HIT vendors rely on security standards that don't meet HIPAA requirements


Health IT vendors often fail to protect electronic patient information in accordance with HIPAA, even when they and their provider clients believe they're in compliance with the law, according to a new article by Dan Schroeder, an attorney with Habif, Arogeti & Wynne in Atlanta.


Writing for the Health Law eSource, the monthly e-zine of the American Bar Association's Health Law Section, Schroeder points out that while the potential security risks of health IT companies are "very high," many of them are falling short on HIPAA compliance. For example, they're not conducting a risk analysis of potential threats and vulnerabilities regarding the data, a fundamental HIPAA requirement.
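A risk analysis of this kind boils down to enumerating threats and vulnerabilities and scoring each by likelihood and impact. As a purely illustrative sketch (the threat names and 1-5 scores below are hypothetical examples, not drawn from any HIPAA guidance), a minimal risk register can rank threats by likelihood × impact:

```python
# Minimal illustrative risk register: rank threats by likelihood x impact.
# All threat names and scores here are hypothetical, not HIPAA guidance.

def rank_risks(register):
    """Return (threat, score) pairs sorted by likelihood * impact, highest first."""
    scored = [(t["threat"], t["likelihood"] * t["impact"]) for t in register]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

register = [
    {"threat": "stolen unencrypted laptop", "likelihood": 4, "impact": 5},  # 1-5 scales
    {"threat": "phishing email to staff",   "likelihood": 5, "impact": 3},
    {"threat": "misconfigured file share",  "likelihood": 2, "impact": 4},
]

for threat, score in rank_risks(register):
    print(f"{score:>2}  {threat}")
```

Even a toy register like this makes the point of the requirement: until threats are enumerated and scored, an organization has no defensible basis for deciding which safeguards to fund first.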


Health IT vendors and the providers who use them are expected to come under increased scrutiny, particularly over the next year, according to one attorney with the U.S. Department of Health and Human Services Office for Civil Rights. Both the Office of Inspector General and OCR have announced their intention to target cloud vendors and other business associates to ensure that patient data is adequately protected pursuant to HIPAA requirements.


Some vendors erroneously rely on alternative security standards as evidence that they adequately protect patient information. For instance, many health IT companies believe that obtaining a Service Organization Control (SOC) 1 Report--also known as an SSAE 16--is sufficient to comply with HIPAA. SOC 1 Reports, which are prepared by a certified public accountant in accordance with guidelines from the American Institute of Certified Public Accountants (AICPA), attest to a company's internal controls. However, they apply only to financial reporting, such as debits and credits.


"A basic Internet search uncovers numerous HIT companies that offer up SOC 1 reports as evidence that they have fulfilled their HIPAA responsibilities, even though AICPA standards explicitly restrict the report from being used to address operational and compliance risks [e.g., security, privacy, integrity and availability risks]," he warns.

Technical Dr. Inc.'s insight:

Contact Details :
inquiry@technicaldr.com or 877-910-0004
- The Technical Doctor Team


Thousands of hospitals making simple cyber security error, exposing devices


Drug infusion pumps that can be manipulated from afar, defibrillators that can be programmed to deliver random shocks, refrigerators whose temperature settings can be reset: these are some of the cybersecurity problems uncovered by Scott Erven, head of information security for healthcare facility operator Essentia Health.


It took Erven's team only half an hour to find another healthcare organization that was exposing information about 68,000 systems, including at least 488 cardiology systems, 332 radiology systems and 32 pacemakers, according to Wired Magazine.


"Now we know all the targeted info and we know that systems that are publicly connected to the internet are vulnerable to the exploit," Erven told Wired. "We can exploit them with no user interaction… [then] pivot directly at the medical devices that you want to attack."


The problem stems from poorly configured settings in the Server Message Block (SMB) protocol that allow information like computer IDs to be shared publicly instead of only with select staff. And Erven said thousands of other healthcare organizations around the globe are making the same mistake.


Computer viruses exploiting the information can then be sent to hospitals via spam emails. Worst of all, if the computer ID contains a doctor's name, as it sometimes does, that information can be used to target individual patients, the article says. 
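At its root, the misconfiguration described above is a service answering on a public interface when it should only be reachable internally. As a rough, hypothetical sketch (not Erven's actual methodology), an organization can check from a machine outside its own network whether SMB's default port is reachable at all:

```python
# Hypothetical sketch: check whether a host answers on SMB's default port (445).
# A connection succeeding from the public internet suggests the service is exposed.
import socket

def port_open(host, port=445, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Run from outside the organization's network, a `True` result for port 445 is a red flag that SMB is publicly reachable and its configuration needs review; a real assessment would go further and examine what the service actually discloses.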


While shocking, news of poor cybersecurity in the med tech and healthcare industries shouldn't be "news" anymore. On June 23, Medtronic ($MDT) said that it, along with two other large medical device manufacturers, discovered an "unauthorized intrusion" into its systems last year that could be traced back to hackers in Asia. The company also disclosed that it lost an unspecified number of patient records from its diabetes unit in a separate incident, but does not know what type of information was included in the records.


The FDA has taken notice and experts say it will soon start rejecting devices that aren't secure. In addition, growing concerns from patients could jolt companies and hospitals into action. A fictional cyber attack on the TV show Homeland and increased media attention have brought the issue to life.


Security tips from the health IT pros | Healthcare IT News


As anyone who's ever worked in IT security can attest, the job is no walk in the park. New threats, compliance mandates, vulnerabilities and updates are constant. But with strong leadership, and a culture of compliance and responsibility to match, many healthcare organizations have shown it can be done right -- and well.

Beth Israel Deaconess Medical Center's Chief Information Officer John Halamka, MD, said for this kind of career, it's a matter of first understanding that "a CIO has limited authority but infinite accountability." You have to ask, "How do you reduce risk to the point where government regulators and, more importantly, patients will say, 'What you have done is reasonable?'" he said. This involves thinking about how to encrypt every device and how to protect the data center from both internal and external attacks.

"Much of what I have to do is meet with my business owners and ask, 'What are the risks? Reputational risks? Patient privacy breach risks? Data integrity risks?' We're never going to be perfect," he added. "But we can put in place what I call a 'multilayer defense.'"

Another fundamental piece of doing privacy and security right? No surprise here: get your risk analysis done -- and done properly.

"This is the single most important document as part of the OCR investigation," said Lynn Sessions, partner at BakerHostetler, who focuses on healthcare privacy. "(OCR is) asking for the current one; they are asking for two, three, five years back. They want to see the evolution of what was going on from a risk analysis standpoint at your institution to see if you were appreciating the risk."

This includes showing the safeguards your organization has put in place from technical, physical and administrative standpoints, explained Sessions. Things such as staff training and education, penetration tests, and cable locks or trackers for unencrypted devices all matter.

Time to encrypt

"Encrypt; encrypt; encrypt," said Sessions. Encryption is a safe harbor for the HIPAA breach notification requirements, but that still fails to motivate some.

"(Physical theft and loss) is the biggest hands-down problem in healthcare that we are seeing," said Suzanne Widup, senior analyst on the Verizon RISK team, discussing the 2014 annual Verizon breach report released in April. "It really surprises me that this is still such a big problem ... other industries seem to have gotten this fairly clearly."

According to OCR data, theft and loss of unencrypted laptops and devices account for the lion's share of HIPAA privacy and security breaches, nearing 60 percent. (Hacking accounts for some 7 percent, and unauthorized disclosure accounts for 16 percent.)

"Pay attention to encryption, for any devices that can leave the office," said former OCR deputy director for health information privacy Susan McAndrew at HIMSS14 this past February.

Of course, healthcare breach numbers are going to be slightly higher because the federal government has mandated specific HIPAA privacy and security breach notification requirements for organizations, but that has no bearing on the reality that these organizations still fail to implement basic encryption practices, Widup pointed out.

Sessions conceded that cost is a concern. "At a time where reimbursements are going down and technology costs are going up with the advent of the electronic health record, there are competing priorities within a healthcare organization of where they can spend their money."

A 2011 Ponemon Institute report estimated full disk encryption costs at around $232 per user, per year, on average, a number representing the total cost of ownership. And that number could go as high as $399 per user, per year, the data suggest.

Kaiser Permanente Chief Security Officer and Technology Risk Officer Jim Doggett, however, said encryption presents a challenge not only because of costs but also because of the data itself. "The quantity of data is huge," he told Healthcare IT News.

The 38-hospital health system encrypts data on endpoint devices in addition to sensitive data in transit, said Doggett, who currently leads a 300-person technology risk management team in charge of 273,000 desktop computers, 65,000 laptops, 21,700 smartphones and 21,000 servers. And don't forget the health data of some 9 million Kaiser members Doggett and his team are responsible for.

"This kind of scale presents unique challenges, and calls for the rigor and vigilance of not only the technology teams but of every staff member across Kaiser Permanente," he added. 
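The safe-harbor logic behind "encrypt; encrypt; encrypt" can be sketched as a simple decision rule. This is an illustrative simplification, not legal advice or an official encoding of the rule: lost data that was properly encrypted generally does not trigger breach notification, while breaches of unencrypted data affecting 500 or more individuals face heavier reporting. The 500-individual threshold comes from the article; everything else below is a hypothetical sketch:

```python
# Illustrative sketch of HIPAA breach-notification safe-harbor logic.
# Greatly simplified for demonstration; not legal advice.

def notification_scope(device_encrypted, individuals_affected):
    """Classify a lost/stolen-device incident under a simplified model."""
    if device_encrypted:
        # Encryption is a safe harbor: lost ciphertext is not a reportable breach.
        return "no notification required (safe harbor)"
    if individuals_affected >= 500:
        # Large breaches (500+ individuals) carry additional HHS reporting.
        return "notify individuals and HHS; large-breach reporting applies"
    return "notify affected individuals; include in annual HHS log"
```

The asymmetry is the point Sessions and Widup make above: a stolen encrypted laptop is an inconvenience, while the same laptop unencrypted can mean a reportable breach, OCR scrutiny, and public disclosure.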


The HHS/OCR Hit List for HIPAA Audits


As the HHS Office for Civil Rights analyzes breach reports for vulnerabilities, it has learned lessons on areas where covered entities should pay particular attention to their HIPAA compliance efforts. With OCR hoping soon to launch a permanent random HIPAA Audit program, the agency has reiterated six core ways to avoid common types of breaches, which will be among the targeted focus areas of audits.




Privacy and security experts: mHealth requires a new approach | mHealthNews


The proliferation of mobile devices in healthcare, from smartphones and tablets to the clinical devices themselves, is forcing healthcare executives to take a new approach to privacy and security.

Gone is the "security cop" approach, in which staff and employees are simply told what they can and can't use and do. Instead, we're seeing a "business enablement" approach, in which privacy and security concerns are woven into the workflow.

The reasoning behind this, says Jim Doggett, Kaiser Permanente's senior vice president, chief security officer and chief technology risk officer, is that cybercrime is an industry now, and the old method of "do it my way or else" won't work any more. With new ways of delivering healthcare must come new ways of protecting it.

"We're a bit out of alignment," Doggett said during a recent presentation at the HIMSS Media Privacy and Security Forum. "We're still solving yesterday's problems when we need to be solving today's and tomorrow's problems."

To wit: Doggett said he wanted to determine how to best implement a new policy on privacy and security. He tailed a physician during a normal workday, and watched the man log on and off and back onto various systems "maybe 50 times." Doggett said he realized the doctor wasn't going to adopt any new privacy and security rule that added to his workload, and would in fact welcome something that improved it.

The answer: Don't just establish a policy and enforce it; work with doctors, nurses and other staff members to see how it can best be implemented.

That was the thinking prevalent during the first day of the two-day forum, being held in San Diego. Healthcare is already changing in profound ways, so privacy and security methods have to be woven into those changes. If mHealth and telemedicine are going to improve healthcare delivery over the coming years, privacy and security platforms must be developed that enhance those methods rather than pushing people away or hindering adoption.

The takeaway for mHealth enthusiasts during the first day of the conference is that privacy and security have to become more fluid – rigid rules just won't work any more – and more mindful of the fact that sensitive data is moving in and out of the enterprise in more ways and on more devices.

Mobile devices and social media "are really big areas of compliance concern," said Iliana L. Peters, senior advisor for HIPAA compliance and enforcement with the U.S. Health and Human Services Department's Office for Civil Rights. She said too many healthcare providers aren't taking this seriously. "They neglect to acknowledge where their data is or the risk to that data."

Encryption of data has to become the norm, rather than a suggested policy.

"If your entity is not encrypting, it should be," she said.

And doctors and nurses have to be made to understand that protection of sensitive data is "a part of efficient healthcare." Michael Allred, Intermountain Healthcare's information security consultant and identity and access team manager, said clinicians are the toughest to educate and may be frustrated with privacy and security efforts, but one breach could cost them and their institution much in terms of reputation and money.


Key privacy rule could fall to accountable care push | Vital Signs | The healthcare business blog from Modern Healthcare


History may look back on last week as an inflection point for privacy and technology in the healthcare industry.

That's because what happened makes it possible that a federal privacy rule long regarded as a bulwark will become a casualty of the push toward accountable care, patient-centered medical homes and other population health-oriented care plans.

If the considered rule change happens, proponents of these care plans could gain broader access to the medical records of patients in drug and alcohol abuse programs without those patients' consent. That would help healthcare providers afford those patients better-coordinated, higher-quality and more cost-efficient care, these proponents say.

Opponents of the rule change warn, however, that without the law's current stringent consent requirements, drug and alcohol abuse patients will avoid seeking treatment out of concern that their stigmatizing and/or illegal activity will be exposed, a situation the rule, created in the 1970s, sought to avoid.

“I think what will happen is you'll see some people who will be in substance abuse treatment either won't get it or will stop confiding as much as they do in their therapist,” said Jim Pyles, a Washington privacy lawyer who testified last week on behalf of the American Psychoanalytic Association and in favor of maintaining a stringent federal privacy rule covering these patient records.

Here are three actions last week around which federal technology privacy policy may turn and why:

A federal regulatory advisory panel last Tuesday accepted recommendations from its privacy workgroup that would put off until 2017 the introduction of some narrow and largely voluntary privacy protection criteria under the electronic health-record incentive payment program of the American Recovery and Reinvestment Act of 2009. The policy recommendations are for technology to protect the privacy of behavioral health patient information. The federal privacy workgroup, the Privacy and Security Tiger Team, has been looking at certain privacy protection technology since 2010. But it has not recommended that the feds put a regulatory stake in the ground, telling developers of electronic health-record systems and information exchange systems that they should add this technology to their own systems, or encouraging healthcare providers to incorporate the technology in their workflows.

On Wednesday, the Substance Abuse and Mental Health Services Administration, in a day-long listening session, heard conflicting testimony on whether it should consider modifying the federal privacy rule, 42 CFR Part 2, covering the transmission and sharing of medical records of many drug and alcohol abuse patients.

Many “general” healthcare providers aren't using substance-abuse treatment data because they don't have the technology to help them handle it efficiently and in compliance with the law.

But if SAMHSA continues its unflagging support for the special rule for handling substance abuse information, it may force technology developers to incorporate privacy-protecting technology into their systems, and induce providers to use it, affording better protection for all healthcare data, privacy advocates say. If the rule is weakened, however, that technology may never be rolled out.

Finally, on Thursday came the news that privacy advocate Joy Pritts, the first chief privacy officer at the Office of the National Coordinator for Health Information Technology at HHS, would be stepping down after 4 ½ years on the job.

Pritts praised an early implementation of data segmentation technology for behavioral health demonstrated by EHR developer Cerner Corp. at this year's Health Information and Management Systems Society, adding her hope that “other vendors follow that lead.” The concern, expressed by several privacy advocates bemoaning her July departure, is that her successor— unknown at this pivotal moment—might not be as stalwart an advocate for patients' rights and data segmentation technology as Pritts has been.

Development and adoption of the technical capabilities to affix so-called "meta-data tags" to patient records -- which would also aid interoperability and research as well as privacy protection -- was urged by the President's Council of Advisors on Science and Technology in 2010 and by JASON, a group of top scientists working for the Agency for Healthcare Research and Quality, this April.

SAMHSA, itself, sponsored one of six ONC “Data Segmentation for Privacy” pilot projects to test the technology. But the agency is under considerable pressure to ease regulatory restrictions on the flow of this data, rather than hold firm and press the industry to adopt technology that will help providers comply with the consent provisions of 42 CFR Part 2.
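Data segmentation of this kind amounts to tagging individual record entries with sensitivity labels and filtering on patient consent before disclosure. As a purely hypothetical sketch (the tag names and record shape below are invented for illustration, not drawn from the ONC pilot projects):

```python
# Hypothetical sketch of metadata-tag-based data segmentation for privacy.
# Entries tagged "substance-abuse" are withheld unless the patient has
# consented, mirroring in spirit the consent provisions of 42 CFR Part 2.

def disclosable(entries, consented_tags):
    """Return entries whose tags are all either general or covered by consent."""
    return [e for e in entries
            if all(t == "general" or t in consented_tags for t in e["tags"])]

record = [
    {"note": "annual physical",       "tags": ["general"]},
    {"note": "opioid treatment plan", "tags": ["substance-abuse"]},
    {"note": "lab panel",             "tags": ["general"]},
]

# Without consent, the substance-abuse entry is segmented out of the disclosure.
shared = disclosable(record, consented_tags=set())
```

The policy question SAMHSA faces is exactly which side of this filter gets relaxed: loosening the rule widens what flows without consent, while holding firm pressures EHR vendors to build tagging and filtering like this into their systems.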

Either way, SAMHSA's decision will likely have an impact on privacy protections that reaches well beyond the current scope of the rule itself.


Security: healthcare's fixer-upper | Healthcare IT News

Healthcare's all about the patients, right? Earning their trust so they return for annual checkups, delivering high-quality care while respecting their medical privacy at the highest level. But far too often there's a disconnect -- the idea that care ends when the patient exits the building or a diagnosis is made, the idea that clinical deals with clinical and information technology deals with IT. That's often not the case in this digital age. Lines are blurred, and what happens in one area can have serious implications for another -- especially when it comes to patient privacy.

Healthcare organizations are charged with safekeeping some of the most personal and sensitive information on individuals who come to receive care. That bout of depression you had in your early 20s, the sexually transmitted infection you were treated for last year, blood tests of every ilk, cancer diagnoses, medical procedures, HIV statuses, psychiatric disorders, every medication you've ever been prescribed, administered vaccinations, Social Security numbers, dates of birth, demographics, where you live, insurance details, even payment information. Healthcare organizations are gold mines of data. Valuable data. And, traditionally, protecting that data hasn't been the industry's strong suit.

Since 2009, when the HIPAA breach notification requirements took effect, nearly 1,000 large data breaches -- those involving 500 individuals or more -- have been reported to the Department of Health and Human Services, affecting almost 32 million people.

In addition to the breaches reported by covered entities and business associates themselves, the Office for Civil Rights, the HHS division responsible for enforcing HIPAA, has received nearly 95,000 privacy and security complaints over the handling of health data since 2003. That's a number meriting a reevaluation of how healthcare does privacy and security.

Complex nature 

Of course, the reasons why many organizations have reported egregious privacy and security failings are not always one-dimensional. Oftentimes, data breaches are the result of mistakes by well-intentioned people governed by poor policies and paltry staff training, and sometimes it's the other way around. Frequently, it's a matter of unencrypted devices being stolen or lost, but with a low probability that the data has actually been compromised.


Health Organizations Across the U.S. Report Data Breaches - iHealthBeat


Several data breaches and security issues recently occurred at health care providers and entities across the U.S.

Details of Mont. Department of Public Health and Human Services Breach

On Thursday, a spokesperson for the Montana Department of Public Health and Human Services announced that hackers have had access to a computer server since July 2013 but that there is no evidence that individuals' personal health data have been stolen, the AP/San Francisco Chronicle reports.

The data breach was identified by employees in the agency's IT department on May 15, at which time the server was shut down and the police were contacted. DPHHS spokesperson Jon Ebelt said that a forensic review did not show any evidence that information was stolen. He said that he does not know how many individuals' data are stored on the server but that it is used to store information on those served by the agency, including:

  • Addresses;
  • Clinical information;
  • Dates of birth;
  • Names; and
  • Social Security numbers.

Health department officials are mailing letters to those whose information could have been on the server with instructions on how to monitor their credit. The agency purchased additional security software and is reviewing its policies and procedures to avoid another breach (Volz, AP/San Francisco Chronicle, 5/29).

Details of Denver-Based Department of Veterans Affairs Hospital Breach

A Department of Veterans Affairs hospital in Denver last week reported that two pulmonary laboratory computers containing data from tests on 239 patients were stolen, according to a VA spokesperson, the Denver Post reports (Phillips, Denver Post, 5/27).

The computers were stored on portable carts located inside a locked room in the hospital's pulmonary lab. Officials reported the computers as stolen to local law enforcement in Denver and Aurora, the VA inspector general's office and the local VA police. Officials have launched a criminal investigation and affected patients are being contacted and given information for credit monitoring (Ferrugia, ABC News Denver, 5/27).

Details of New Hampshire Hospital Breach

Officials at Manchester, N.H.-based Elliot Hospital and local law enforcement are investigating the theft of four computers that contain the personal health data of 1,213 patients, WMUR New Hampshire reports.

The computers were stolen from an employee's car in March while they were being transported to a disposal site.

Hospital officials said the computers had auto-archived three spreadsheets and 20 emails, and included patient information such as:

  • Billing codes;
  • Dates of service; and
  • Names.

Patients whose information was stored on the computers are being contacted by the hospital (Sexton, WMUR New Hampshire, 5/24).

Details of Oregon, Ohio-Based Hospital Breach

Officials at ProMedica Bay Park Hospital are notifying more than 500 patients of an internal security breach of protected health information, WNWO reports.

Officials said that an internal investigation showed that between April 1, 2013, and April 1, 2014, a hospital employee accessed records of patients who were outside of the employee's care. The information accessed included patients':

  • Attending physicians;
  • Dates of birth;
  • Diagnoses;
  • Medications; and
  • Names.

The employee was immediately fired and the hospital is offering all affected patients a one-year membership for identity theft protection services (Mack, WNWO, 5/28).

Details of Humana Data Breach

Last month, Humana notified 2,962 Atlanta-area members about a security breach that could have compromised their personal information, Modern Healthcare reports.

In a statement, Humana said the breach occurred when an unencrypted USB drive and an encrypted laptop were stolen from an employee's car. The data on the USB drive included members':

  • Medical record information;
  • Names; and
  • Social Security numbers (Herman, Modern Healthcare, 5/27).
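Several of the incidents above trace back to patient identifiers sitting in plain spreadsheets on portable media. As a hypothetical illustration (real data-loss-prevention tooling does far more), even a simple pattern scan can flag files containing SSN-like values before they are copied to a USB drive or leave a secured network:

```python
# Hypothetical sketch: flag text containing SSN-like values (e.g. 123-45-6789)
# before a file leaves the network. Real DLP tooling is far more thorough.
import re

SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def contains_ssn(text):
    """Return True if any SSN-formatted value appears in the text."""
    return bool(SSN_PATTERN.search(text))

# An export row like those described in the breaches above (fabricated example).
row = "Doe, Jane, 1970-01-01, 123-45-6789, diabetes follow-up"
```

A check this crude produces false positives and misses unformatted numbers, but it illustrates the larger point of these breach reports: the data at risk is usually easy to find, which means it is also easy to find before it is lost.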




Staff blunder leads to HIPAA breach | Healthcare IT News

A Pennsylvania-based hospital is notifying nearly 2,000 patients of a HIPAA breach after an employee accessed and transmitted patients' protected health data outside of the hospital's secure information network. 
 
After conducting an internal investigation, the 551-bed Penn State Milton S. Hershey Medical Center on Friday notified 1,801 patients that their names, medical record numbers, medical lab tests and results, and visit dates could have been accessed by an unauthorized person or entity due to an employee mistake, according to a hospital notice.
Officials discovered that a Penn State Hershey clinical laboratory technician, who was authorized to work with protected health information, accessed patient data via an unsecured USB device through his home network rather than the hospital network. He also transmitted patient data via his personal email to two Penn State physicians.
 
The breach was discovered by hospital officials on April 11. 
 
"Penn State Hershey considers patient privacy and confidentiality to be of the utmost importance and chose to notify patients of this incident out of an abundance of caution," read a June 6 public notice. "To decrease the likelihood of similar circumstances occurring in the future, Penn State Hershey is increasing education efforts with employees, focusing on the essential responsibility of all staff to safeguard patient health information at all times and follow proper practices for doing so." 
 
This is the first large HIPAA breach Penn State Hershey has reported to the Department of Health and Human Services. 
 
As Mayo Clinic's Mark Parkulo pointed out in an interview with Healthcare IT News late last month, employee education is one of the most important pieces of doing patient privacy right. "Some of it is a real education issue," noted Parkulo, who is vice chair of Mayo's Meaningful Use Coordinating Group. "A number of providers and other people don't understand that with typical unencrypted email, you're not even sure exactly what locations it's going to, or whether it could be intercepted."
 
Because of that, in addition to the standard education at employee orientation, Mayo tries to reach employees multiple times per year with education sessions, whether through grand rounds, online, email or even the CEO.
 
To date, more than 31.6 million people have had their protected health information compromised in a HIPAA privacy or security breach -- counting only breaches involving 500 individuals or more -- according to data from HHS.
 
The Office for Civil Rights, the HHS division responsible for investigating HIPAA breaches, has levied more than $25.1 million in fines against covered entities and business associates found to have violated privacy and security rules.

In-Depth: Consumer health and data privacy issues beyond HIPAA


“The issue of consumer generated health data is one that is near and dear to my heart,” Federal Trade Commission Commissioner Julie Brill told attendees at an event focused on the protection of such health data earlier this month. “…Big picture, consumer generated health information is proliferating, not just on the web but also through connected devices and the internet of things.” As Brill noted, these are “health data flows that are occurring outside of HIPAA and outside of any medical context, and therefore outside of any regulatory regime that focuses specifically on health information.” That’s why it falls to the FTC to oversee privacy-related concerns for consumer generated health data.

“I was at the Consumer Electronics Show in January and was really wowed by much that I saw,” Brill said. “Some of the devices that I saw were particularly focused on health and the quantified life. One in particular that struck me was [Rest Device's] Mimo, a onesie developed to measure the heart beats, respiration rate, and other vital signs of an infant or newborn. It could send information to an app, to the parent’s mobile device. Think about the benefits to any parent who is worried about SIDS or might want to get their baby to sleep better or get themselves to sleep better.”

Brill also noted the rise of step counting devices, the trend of some physicians Googling their patients, and what she called an ongoing ethical debate about physicians “friending” their patients on Facebook. “There is also the now infamous example of companies that are generating their own health data about their customers with respect to their purchases, like Target did with its pregnancy predictor score,” she recalled.

As Nicholas Terry, Professor of Law and Co-Director of the Hall Center for Law and Health at the Indiana University Robert H. McKinney School of Law, explained in a must-read commentary that he submitted to the FTC in response to its discussion of privacy for consumer generated health data, none of this health data is protected by HIPAA or HITECH.

“At root such patient curation of health data bespeaks autonomy and is symbolic of patient ownership of the data,” Terry writes. “However, it fails to take into account one practical limitation -- the canonical version of the record will remain in the provider’s control -- and one legal limitation -- that only the provider-curated copy is protected by HIPAA-HITECH. In contrast, the patient-curated ‘copy’ attracts little meaningful privacy protection. Well-meaning privacy advocates should think carefully before promoting this autonomy-friendly ‘control’ model until data protection laws (not to mention patient education as to good data practices) catch up with patient curated data.”

Brill said that legislation like HIPAA and HITECH shows that the US believes health data is sensitive and deserving of special protection, but “then the question becomes, though, if we do have a law that protects health information but only in certain contexts, and then the same type of information or something very close to it is flowing outside of those silos that were created a long time ago, what does that mean? Are we comfortable with it? And should we be breaking down the legal silos to better protect that same health information when it is generated elsewhere?”

During the FTC event the agency seemed to take pains not to point to specific examples of wrongdoing, but the commissioner and other participants in the forum did raise relevant examples of products for the sake of discussion.

“We recently read about one insurance company, Aetna, that has developed an app for its beneficiaries to use,” Brill said during her opening remarks. “It will allow their users to set goals and track their progress on all sorts of health indicators: weight, exercise, things like that. I think it’s wonderful that Aetna set this up, it’s great, but I don’t know precisely what they are doing with this information. We’ve looked at the terms of service. It just raises interesting questions: To what extent could this information be used for rating purposes? We all know under the FCRA it ought not to be, but what are the rules of the road here?”

Brill also pointed to a company called Blue Chip Marketing, which she said mines social media feeds and other databases to help recruit patients to clinical trials: It doesn’t “work with doctors or hospitals, instead it surfed social media, searched cable TV subscriptions, and got lots of information that allowed it to infer whether consumers were obese, potentially had diabetes, potentially had other conditions, and then offer to them to join a clinical trial. Some consumers would think that’s great – yes, I’d like to be part of a clinical trial, but others were really shocked when they were contacted by the company or others working with them. They asked: what makes you think I’m obese? Or how did you know I was a diabetic? [This raises] really interesting issues.”

In Terry’s commentary to the FTC, he points to a recent paper by McKinsey called The ‘Big Data’ Revolution in Healthcare, which identified four primary data pools driving big data in healthcare: activity/claims and cost data, clinical data, pharmaceutical R&D data, and patient behavior and sentiment data. Terry also described three types of health data that big data companies use to create proxies for HIPAA-protected data: “‘laundered’ HIPAA data, patient-curated information and medically inflected data.”

Terry also rightly recognizes the rise of quantified-self tools as an important consideration in the discussion. “A similarly dichotomous result is likely as the medically quantified self develops,” Terry writes. “The quantified-self movement concentrates on personal collection and curation of inputs and performance. Obviously, health, wellness and medically inflected data will likely comprise a large proportion of such data. A similar, if less formal, scenario is emerging around health and wellness apps on smartphones and connected domestic appliances such as scales and blood pressure cuffs. Smartphones are crammed with sensors for location, orientation, sound and pictures that add richness to data collection. And there is ongoing and explosive growth in the medical apps space that seeks to leverage such sensors… These processes will in most cases lead to medically inflected data that exists outside of the HIPAA-HITECH protected zone.”


Class action lawsuit filed against UPMC over data breach

Class action lawsuit filed against UPMC over data breach | HIPAA Compliance for Medical Practices | Scoop.it

PITTSBURGH —A class action lawsuit filed in federal court targets both the University of Pittsburgh Medical Center (UPMC) and a payroll software company it uses called the Ultimate Software Group, seeking damages and protection for employees affected by a breach of confidential data.

The lawsuit claims the "defendants had a duty to protect the private, highly sensitive, confidential personal and financial information and the tax documents" and "failed to safeguard and prevent vulnerabilities from being taken advantage of."


In April, UPMC Vice President of Privacy and Information Security John Houston told Pittsburgh's Action News 4, "We are going to give all of our employees the opportunity to sign up for a credit monitoring service. We're underwriting the cost for that for one year."

This lawsuit seeks a court injunction forcing 25 years' worth of identity theft insurance, credit restoration services, and credit and bank monitoring services.

Some UPMC employees interviewed on the streets of the city's Oakland section expressed concern about identity thieves.

"They're going to wait one year, they're going to wait two years, they're going to wait three years, and they could come back. I could be affected by a job I took in college, which is sort of scary," said Allisandra Supinski.

"I feel comfortable with the one year that I have. If I look into it more, I may change my mind," said Amy Hoffman.

"As long as you are with UPMC, they should cover us. As long as we work there for them, we should be able to get protected," said Rodreda Tate.

The employee named as the lead plaintiff, Alice Patrick, works as a dialysis clinician at UPMC McKeesport, according to the lawsuit. Both the attorneys who filed the lawsuit on her behalf and representatives of UPMC declined to be interviewed for this story.




Upcoming Webinar: Hear from Health & Human Services, Avoid the Biggest HIPAA Mistakes | Business Wire

Upcoming Webinar: Hear from Health & Human Services, Avoid the Biggest HIPAA Mistakes | Business Wire | HIPAA Compliance for Medical Practices | Scoop.it
May 14, 2014 09:08 AM Eastern Daylight Time
CLEARWATER, Fla.--(BUSINESS WIRE)--
 

WHAT:

FairWarning, Inc., the inventor and KLAS Category Leader in Patient Privacy Monitoring1, will host an industry-wide webinar titled “Straight from the Source: HHS Tools for Avoiding Some of the Biggest HIPAA Mistakes,” featuring Laura Rosas, Senior Advisor, Office of the Chief Privacy Officer, Health & Human Services. In 2014, covered entities can expect to receive an inquiry letter covering the most frequent problem areas in the HIPAA pilot audits. The Security Risk Assessment is required by both the HIPAA Security Rule and the CMS EHR Incentive Program, also known as Meaningful Use. Health & Human Services released a downloadable Security Risk Assessment Tool this past March to help covered entities evaluate their security position and identify areas needing improvement. The time is now to identify the weakest areas and take action to improve prior to an audit.

During this webinar, Ms. Rosas will walk through this tool to help attendees avoid some of the biggest HIPAA mistakes, including tips and recommendations for getting the most from the self-assessment.

WHEN & WHERE:

Tuesday, May 20, 2014

2:00 Eastern / 11:00 Pacific

Broadcast via Webex, register at:

https://fairwarningevents.webex.com/ec0701l/eventcenter/enroll/register.do?formId=0&formType=0&loadFlag=1&siteurl=fairwarningevents&confId=1749154321

About FairWarning, Inc.

FairWarning®’s mission is to lead the industry expansion of trust in Electronic Health Records, empowering care providers to grow their reputation for protecting confidentiality, scale their digital health initiatives and comply with complex Federal and state privacy laws such as HIPAA. By partnering with FairWarning, care providers are able to direct their focus on delivering the best patient outcomes possible while receiving expert, sustainable and affordable privacy and compliance solutions. Customers consider FairWarning® privacy auditing solutions essential for compliance with healthcare privacy regulations such as ARRA HITECH privacy and meaningful use criteria, HIPAA, UK and EU Data Protection, California SB 541 and AB 211, Texas HB 300, and Canadian provincial healthcare privacy law. For more information on FairWarning® visit http://www.FairWarning.com or email Solutions@FairWarning.com.

1 2013 Best in KLAS: Software & Services report, January, 2014. © 2014 KLAS Enterprises, LLC. All rights reserved. www.KLASresearch.com

Contacts

FairWarning, Inc.
Sadie Peterson, 727-576-6700 Ext. 119
Sadie@FairWarning.com


Largest-ever HIPAA settlement rings in at $4.8 million, should be a lesson to providers sharing computer networks, feds announce

Largest-ever HIPAA settlement rings in at $4.8 million, should be a lesson to providers sharing computer networks, feds announce | HIPAA Compliance for Medical Practices | Scoop.it

New York Presbyterian Hospital and Columbia University have entered into the largest-ever government settlement over an electronic data breach, totaling $4.8 million, the Department of Health and Human Services announced Wednesday. 

The breach occurred when a Columbia University physician and computer application developer attempted to deactivate a server he personally owned, which was on a data network shared with New York Presbyterian, according to HHS. The two organizations operate jointly as New York Presbyterian Hospital/Columbia University Medical Center.

Because “technical safeguards” were lacking, deactivating the server allowed personal health information of about 6,800 patients to be accessed through public Internet search engines, HHS explained. The providers reported the breach in 2010, after someone found the personal information of a deceased loved one on the Web.

The settlement should be cautionary for joint healthcare providers that both are covered by Health Insurance Portability and Accountability Act provisions, said Christina Heide, acting deputy director for health information privacy at the HHS Office of Civil Rights.

“When entities participate in joint compliance arrangements, they share the burden of addressing the risks to protected health information,” she said. “Our cases against NYP and CU should remind healthcare organizations of the need to make data security central to how they manage their information systems.”

New York Presbyterian's share of the settlement totaled about $3.3 million, and Columbia's came to $1.4 million. Both have agreed to a “substantive corrective action plan,” including risk analysis and management, HHS noted.


Hospitals 'very sloppy' about security efforts

Hospitals 'very sloppy' about security efforts | HIPAA Compliance for Medical Practices | Scoop.it

Healthcare facilities are constantly in danger of being hacked and having data stolen, but two researchers have found that many hospitals themselves leak valuable information online.


The data leaks result from network administrators enabling Server Message Block, or SMB, which, when configured a certain way, broadcasts the data externally, researchers Scott Erven, head of information security for Essentia Health, and Shawn Merdinger, an independent healthcare security researcher and consultant, shared in a recent Wired article.


SMB is a protocol used by administrators to quickly identify, locate and communicate with computers and equipment connected to an internal network, according to the article. Erven and Merdinger found that hospitals misconfigure the SMB service, which allows outsiders to see it. 
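The exposure Erven and Merdinger describe can be checked from outside the network with nothing more than a TCP connection attempt to the standard SMB port (445). A minimal sketch, assuming you are authorized to test the host in question (the host name below is hypothetical):

```python
import socket

def smb_port_exposed(host: str, port: int = 445, timeout: float = 3.0) -> bool:
    """Return True if the SMB port accepts connections from this vantage point.

    An SMB port reachable from outside the internal network suggests the
    misconfiguration discussed above: the service is visible to outsiders.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (hypothetical host); only scan systems you are authorized to test:
# if smb_port_exposed("edge.example-hospital.org"):
#     print("SMB reachable externally -- review firewall and edge config")
```

A reachable port is only a first signal; whether the service actually leaks data depends on its share and authentication settings.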


Security issues at healthcare facilities are nothing new, and the SMB protocol vulnerability is just another problem to add to a growing list of ways information can be compromised.


"It goes to show that healthcare [organizations are] very sloppy in configuring their external edge networks and are not really taking security seriously," Erven told Wired.


He added that the problems can occur because of too much focus on HIPAA compliance--which causes providers to pay too little attention to testing and securing their systems.


With a spike in HIPAA fines possible, healthcare facilities may be even more focused on compliance with those standards than on properly securing their networks.


To that end, even a recent White House report pointed out that HIPAA compliance might not be enough to ensure privacy in the electronic age.


Tracking confidential data a major worry in healthcare security

Tracking confidential data a major worry in healthcare security | HIPAA Compliance for Medical Practices | Scoop.it

Uncertainty about where sensitive and confidential data is located causes more worry for security pros than hackers or malicious employees, according to a new survey from the Ponemon Institute.

The report, based on a poll of 1,587 IT security practitioners in 16 countries, focuses on the state of data-centric security, which it describes as a security policy that follows data wherever it is replicated, copied or integrated.


OCR attorney predicts spike in HIPAA fines

OCR attorney predicts spike in HIPAA fines | HIPAA Compliance for Medical Practices | Scoop.it

The Office for Civil Rights' crackdown on HIPAA violations over the past year will "pale in comparison" to the next 12 months, a U.S. Department of Health and Human Services attorney recently told an American Bar Association conference.

Jerome B. Meites, OCR chief regional counsel for the Chicago area, said that the office wants to send a strong message through high-impact cases, according to Data Privacy Monitor.

The Office for Civil Rights has been levying fines to make healthcare entities take notice: nine settlements since June 1, 2013, have totaled more than $10 million. That includes a record $4.8 million fine announced in May against New York-Presbyterian Hospital and Columbia University.




"Knowing what's in the pipeline, I suspect that that number will be low compared to what's coming up," Meites said in the article.

The OCR has said that when it resumes HIPAA audits this fall, the investigations will have a narrow focus and there will be fewer onsite visits. Meites told the American Bar Association that the OCR still has to decide which organizations it will select for an audit from a list of 1,200 candidates: 800 healthcare providers, health plans or clearinghouses, plus 400 of their business associates.

A report last December from the Office of Inspector General criticized the OCR's enforcement of the HIPAA provisions, including inadequate focus on system and data security.

Meanwhile, the number of breaches on the U.S. Department of Health and Human Services' "wall of shame" topped 1,000 this month, with at least 34 breaches so far in June. The records of nearly 31.7 million people have been exposed since federal reporting was mandated in September 2009.






iOS changes will address HIPAA risk | Healthcare IT News

iOS changes will address HIPAA risk | Healthcare IT News | HIPAA Compliance for Medical Practices | Scoop.it

Imagine if almost everyone walking into your hospital – patients, doctors, visitors, salespeople – was carrying an active homing beacon, which broadcast, unencrypted, their presence and repeatedly updated exact location to anyone who chose to listen.

[See also: Where will HIT security be in 3 years?]

That's where things stand today, courtesy of the mobile MAC address signal (it stands for media access control), a unique ID coming from every smartphone, tablet and wearable device.

But not for long, given upcoming changes to how Apple products will handle MAC address broadcasts –  a move almost certain to be copied by Google's Android.

[See also: 'Troubling disconnect' between mobile security threats and protections in place]

Apple's iOS 8 change, focusing initially on how MAC addressing interacts with Wi-Fi scans, will shift to using "randomly, locally administered" MAC addresses. The result, according to Apple: "The MAC address used for Wi-Fi scans may not always be the device's real – universal – address." (That description is on page 18 of an Apple PDF, available here.)

As a practical matter, using this kind of a randomized bogus address approach will make tracking people via mobile devices impossible or, at best, impractical, depending on the level of randomization used and how often – if ever – the true MAC address is broadcast.
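The "randomly, locally administered" addresses Apple describes follow a long-standing Ethernet convention: in the first octet, one bit marks the address as locally assigned rather than burned in by a manufacturer, and another distinguishes unicast from multicast. A minimal sketch of how such a randomized address could be generated (illustrative only, not Apple's actual implementation):

```python
import secrets

def random_local_mac() -> str:
    """Generate a locally administered, unicast MAC address.

    In the first octet: setting bit 0x02 marks the address as locally
    administered (not manufacturer-assigned); clearing bit 0x01 keeps
    it unicast.
    """
    octets = bytearray(secrets.token_bytes(6))
    octets[0] = (octets[0] | 0x02) & 0xFE
    return ":".join(f"{b:02x}" for b in octets)
```

Because such an address carries no vendor prefix and changes between scans, a passive listener cannot link successive sightings back to one device.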

It will still be months before Apple releases this new version of its mobile OS publicly (it's now solely with developers), weeks and maybe months before most consumers will upgrade and longer still before others – especially Google's Android – mimic the move.

That means that, for now, this security and privacy risk is still a very real threat.

The risk is twofold. First, there is the potential for a renegade member of the hospital's staff to track people. Second, there exists the possibility that hospital visitors could wirelessly track other hospital visitors.

In the first scenario, tracking doctors and other hospital staff is less of a concern: they can just as easily be tracked the instant they log into the hospital's local area network, making the MAC address broadcast unnecessary. In the second, any visitor carrying a mobile device is a potential victim of cyberthieves or stalkers.

The security risk is that a specific MAC address would be tracked over time, showing all travel activity within the hospital. Retail offers a great example of the risk: Retailers work with vendors who have contracts with lots of other retailers. This allows those companies to create – and to then sell – detailed reports of every store and mall and parking lot that a MAC address visits. By overlaying it with purchase records, that address can be associated with specific purchases. If those purchases used a payment card or loyalty card, that MAC address can then be associated with a specific person.
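The retail-style tracking described above needs nothing more than a log of sightings keyed by MAC address. A toy sketch (every address, location and timestamp below is invented) shows why a stable identifier is enough to build a movement profile, and why per-scan randomization breaks it:

```python
from collections import defaultdict

# Illustrative only: the kind of sighting log a location-analytics vendor
# could assemble from broadcast MAC addresses. All values are hypothetical.
sightings = defaultdict(list)

def record(mac: str, location: str, timestamp: str) -> None:
    """Append one observed (timestamp, location) pair for a MAC address."""
    sightings[mac].append((timestamp, location))

# A stable MAC links every visit into a single movement profile:
record("a4:5e:60:01:02:03", "oncology-wing", "2014-06-01T09:00")
record("a4:5e:60:01:02:03", "pharmacy", "2014-06-01T09:40")
record("a4:5e:60:01:02:03", "parking-garage", "2014-06-08T10:15")

# With per-scan randomization, each sighting arrives under a different key,
# so no single profile accumulates for the device.
```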


Big Data, My Data - iHealthBeat

Big Data, My Data - iHealthBeat | HIPAA Compliance for Medical Practices | Scoop.it

"The routine operation of modern health care systems produces an abundance of electronically stored data on an ongoing basis," Sebastian Schneeweis writes in a recent New England Journal of Medicine Perspective.

Is this abundance of data a treasure trove for improving patient care and growing knowledge about effective treatments? Is that data trove a Pandora's black box that can be mined by obscure third parties to benefit for-profit companies without rewarding those whose data are said to be the new currency of the economy? That is, patients themselves?

In this emerging world of data analytics in health care, there's Big Data and there's My Data ("small data"). Who most benefits from the use of My Data may not actually be the consumer.

Big focus on Big Data. Several reports published in the first half of 2014 talk about the promise and perils of Big Data in health care. The Federal Trade Commission's study, titled "Data Brokers: A Call for Transparency and Accountability," analyzed the business practices of nine "data brokers," companies that buy and sell consumers' personal information from a broad array of sources. Data brokers sell consumers' information to buyers looking to use those data for marketing, managing financial risk or identifying people. There are health implications in all of these activities, and the use of such data generally is not covered by HIPAA. The report discusses the example of a data segment called "Smoker in Household," which a company selling a new air filter for the home could use to target-market to an individual who might seek such a product. On the downside, without the consumers' knowledge, the information could be used by a financial services company to identify the consumer as a bad health insurance risk.

"Big Data and Privacy: A Technological Perspective," a report from the President's Office of Science and Technology Policy, considers the growth of Big Data's role in helping inform new ways to treat diseases and presents two scenarios of the "near future" of health care. The first, on personalized medicine, recognizes that not all patients are alike or respond identically to treatments. Data collected from a large number of similar patients (such as digital images, genomic information and granular responses to clinical trials) can be mined to develop a treatment with an optimal outcome for the patients. In this case, patients may have provided their data based on the promise of anonymity but would like to be informed if a useful treatment has been found. In the second scenario, detecting symptoms via mobile devices, people wishing to detect early signs of Alzheimer's Disease in themselves use a mobile device connecting to a personal coach in the Internet cloud that supports and records activities of daily living: say, gait when walking, notes on conversations and physical navigation instructions. For both of these scenarios, the authors ask, "Can the information about individuals' health be sold, without additional consent, to third parties? What if this is a stated condition of use of the app? Should information go to the individual's personal physicians with their initial consent but not a subsequent confirmation?"

The World Privacy Foundation's report, titled "The Scoring of America: How Secret Consumer Scores Threaten Your Privacy and Your Future," describes the growing market for developing indices on consumer behavior, identifying over a dozen health-related scores. Health scores include the Affordable Care Act Individual Health Risk Score, the FICO Medication Adherence Score, various frailty scores, personal health scores (from WebMD and OneHealth, whose default sharing setting is based on the user's sharing setting with the RunKeeper mobile health app), Medicaid Resource Utilization Group Scores, the SF-36 survey on physical and mental health and complexity scores (such as the Aristotle score for congenital heart surgery). WPF presents a history of consumer scoring beginning with the FICO score for personal creditworthiness and recommends regulatory scrutiny on the new consumer scores for fairness, transparency and accessibility to consumers.

At the same time these three reports went to press, scores of news stories emerged discussing the Big Opportunities Big Data present. The June issue of CFO Magazine published a piece called "Big Data: Where the Money Is." InformationWeek published "Health Care Dives Into Big Data," Motley Fool wrote about "Big Data's Big Future in Health Care" and WIRED called "Cloud Computing, Big Data and Health Care" the "trifecta."

Well-timed on June 5, the Office of the National Coordinator for Health IT's Roadmap for Interoperability was detailed in a white paper, titled "Connecting Health and Care for the Nation: A 10-Year Vision to Achieve an Interoperable Health IT Infrastructure." The document envisions the long view for the U.S. health IT ecosystem enabling people to share and access health information, ensuring quality and safety in care delivery, managing population health, and leveraging Big Data and analytics. Notably, "Building Block #3" in this vision is ensuring privacy and security protections for health information. ONC will "support developers creating health tools for consumers to encourage responsible privacy and security practices and greater transparency about how they use personal health information." Looking forward, ONC notes the need for "scaling trust across communities."

Consumer trust: going, going, gone? In the stakeholder community of U.S. consumers, there is declining trust between people and the companies and government agencies with whom people deal. Only 47% of U.S. adults trust companies with whom they regularly do business to keep their personal information secure, according to a June 6 Gallup poll. Furthermore, 37% of people say this trust has decreased in the past year. Who's most trusted to keep information secure? Banks and credit card companies come in first place, trusted by 39% of people, and health insurance companies come in second, trusted by 26% of people. 

Trust is a basic requirement for health engagement. Health researchers need patients to share personal data to drive insights, knowledge and treatments back to the people who need them. PatientsLikeMe, the online social network, launched the Data for Good project to inspire people to share personal health information imploring people to "Donate your data for You. For Others. For Good." For 10 years, patients have been sharing personal health information on the PatientsLikeMe site, which has developed trusted relationships with more than 250,000 community members.

On the bright side, there is tremendous potential for My Data to join other peoples' data to drive better health for "Me" and for public health. On the darker side, there is also tremendous financial gain to be made by third-party data brokers to sell people's information in an opaque marketplace of which consumers have no knowledge. Individuals have the most to gain from the successful use of Big Data in health. But people also have a great deal to lose if that personal information is used against them unwittingly.

Deven McGraw, a law partner in the health care practice of Manatt, Phelps & Phillips, recently told a bipartisan policy forum on Big Data in health care, "If institutions don't have a way to connect and trust one another with respect to the data that they each have stewardship over, we won't have the environment that we need to improve health and health care." This is also true for individual consumers when it comes to privacy rights over personal health data.




Will Healthcare Ever Take IT Security Seriously?

Will Healthcare Ever Take IT Security Seriously? | HIPAA Compliance for Medical Practices | Scoop.it

CIO - In the years since the HITECH Act, the number of reported healthcare data breaches has been on the rise - partly because organizations have been required to disclose breaches that, in the past, would have gone unreported and partly because healthcare IT security remains a challenge.

Recent research from Experian suggests that 2014 may be the worst year yet for healthcare data breaches, due in part to the vulnerability of the poorly assembled Healthcare.gov.


Hacks and other acts of thievery get the attention, but the root cause of most healthcare data breaches is carelessness: Lost or stolen hardware that no one bothered to encrypt, protected health information emailed or otherwise exposed on the Internet, paper records left on the subway and so on.

What will it take for healthcare to take data security seriously?

Healthcare IT So Insecure It's 'Alarming'

Part of the problem is that healthcare information security gets no respect; at most healthcare organizations, security programs are immature at best, thanks to scarce funding and expertise. As a result, the majority of reported data breaches are, in fact, avoidable events.

[Related: Healthcare IT Security Is Difficult, But Not Impossible]

The recent SANS Health Care Cyber Threat Report underscores this point all too well. Threat intelligence vendor Norse, through its global network of honeypots and sensors, discovered almost 50,000 unique malicious events between September 2012 and October 2013, according to the SANS Institute, which analyzed Norse's data and released its report on Feb. 19. The vast majority of affected institutions were healthcare providers (72 percent), followed by healthcare business associates (10 percent) and payers (6 percent).

SANS uses the words "alarming" and "troubling" often in its analysis. "The sheer volume of IPs detected in this targeted sample can be extrapolated to assume that there are, in fact, millions of compromised health care organizations, applications, devices and systems sending malicious packets from around the globe," writes Senior SANS Analyst and Healthcare Specialist Barbara Filkins.

[ Tips: How to Prevent (and Respond to) a Healthcare Data Breach ]

More than half of that malevolent traffic came from network-edge devices such as VPNs (a whopping 33 percent), firewalls (16 percent) and routers (7 percent), suggesting "that the security devices and applications themselves were either compromised ... or that these 'protection' systems are not detecting malicious traffic coming from the network endpoints inside the protected perimeter," Filkins writes, noting that many vulnerabilities went unnoticed for months. Connected endpoints such as radiology imaging software and digital video systems also accounted for 17 percent of malicious traffic.

Norse executives say this stems from a disconnect between compliance and regulation. Simply put, says Norse CEO Sam Glines, "There is no best practice applied." Many firewall devices with a public-facing interface, for example, still use the factory username and password. The same is true of many surveillance cameras and network-attached devices such as printers - the default passwords for which can be obtained not through hacking but through a simple Internet search. "It's just not good enough in today's market."
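The factory-credential problem Glines describes can be audited with nothing more sophisticated than a comparison against published default logins. A hedged sketch, with every device name and credential below invented for illustration:

```python
# Factory default logins are published by vendors and trivially searchable;
# flagging devices that still use them is a basic audit step. All names and
# credentials here are hypothetical examples.
FACTORY_DEFAULTS = {
    ("admin", "admin"),
    ("admin", "password"),
    ("root", "root"),
}

def flag_default_credentials(devices):
    """Return names of devices whose configured login is a known factory default."""
    return [
        d["name"]
        for d in devices
        if (d["username"], d["password"]) in FACTORY_DEFAULTS
    ]

inventory = [
    {"name": "edge-firewall-1", "username": "admin", "password": "admin"},
    {"name": "lobby-camera-3", "username": "ops", "password": "Xk9#q2!"},
]
```

In practice the default list would come from a maintained database rather than a hard-coded set, but the audit logic is this simple.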

The United States would do well to heed the European Union's data breach laws, Glines says, as they take a "categorically different" approach and include specific language about what is and isn't compliant. This could include, for example, specific policies for managing anything connected to an IP address or basic password and access control management measures, he says.

Mobile Health Security Especially Suspect

In the absence of such regulation, though, patient privacy is a myth. Data is shared freely in a hospital setting, for starters, and clinical systems favor functionality over privacy, so much so that privacy and security are often an afterthought in the development lifecycle. This is especially true in the growing mobile health marketplace, which largely places innovation before security.

[Related: Healthcare.gov Still has Major Security Problems, Experts Say]

Harold Smith discovered this all too quickly in December. After Happtique, a mobile health application certification group, released its first list of approved applications, Smith, the CEO of software firm Monkton Health, decided to check out a couple apps.

It wasn't pretty. He installed one on a jailbroken iPhone and, in less than a minute, pulled electronic protected health information (ePHI) from a plain text, unencrypted HTML5 file. He also found that this data - specifically, blood glucose levels - was being sent over HTTP, not HTTPS. "That was the first hint that something was wrong," he says. "That's a pretty big 'Security 101' thing to miss."
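The transport mistake Smith found, ePHI leaving the device over HTTP rather than HTTPS, can be guarded against with a one-line scheme check before any request is made. A minimal sketch (the URL is hypothetical):

```python
from urllib.parse import urlsplit

def require_https(url: str) -> str:
    """Refuse to return any endpoint that would carry health data over plain HTTP."""
    if urlsplit(url).scheme != "https":
        raise ValueError(f"refusing to send health data over non-TLS URL: {url}")
    return url

# Example (hypothetical endpoint):
# endpoint = require_https("https://api.example.org/v1/glucose")
```

A check like this belongs in the networking layer so that no individual feature can accidentally downgrade to HTTP.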

A second app, which Smith tested a few days later, also stored ePHI in unencrypted, plain text files. Though this app uses HTTPS, Smith notes in his blog that it sends usernames and passwords in plain text. "That was another big problem," he says.

[Slideshow: 12 Tips to Prevent a Healthcare Data Breach]

Happtique suspended its application certification program in light of Smith's findings, but the application developers themselves (as well as healthcare IT news sites and blogs) glossed over the issue. This irked Smith: "As someone who develops mobile health software, if someone tells me they've found a vulnerability, I get to them right away."

At the very least, Smith says, mobile health applications need a PIN screen and data encryption. The bigger issue, though, is the tendency for developers to treat mobile apps like desktop apps. That doesn't work in the "whole new Wild West" of mobile development, Smith says, where databases aren't encrypted and passwords need to be hashed. Five years after the release of the Apple SDK, he adds, people are still trying to figure it all out.
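Smith's point that passwords need to be hashed can be illustrated with the Python standard library alone. A sketch (not drawn from any specific app) of salted, deliberately slow password hashing with PBKDF2:

```python
# Salted, slow password hashing with PBKDF2 (Python standard library).
import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=200_000):
    """Return (salt, digest); store both, never the plain-text password."""
    salt = salt or os.urandom(16)          # random per-user salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, digest, iterations=200_000):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("guess", salt, digest))                          # False
```

The high iteration count and per-user salt are what make stolen hashes expensive to reverse; sending or storing the password itself in plain text, as the apps Smith tested did, defeats all of it.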

Is Red Tape to Blame for Poor Healthcare IT Security?

The same is true of healthcare's privacy and security regulations - which, to put it mildly, are conflicting. Sharon Klein, head of the privacy, security and data protection practice at the law firm Pepper Hamilton, notes that, in the United States, there are 47 different sets of (inconsistent) data breach regulations and multiple regulatory frameworks.

[ Study: Healthcare Industry CIOs, CSOs Must Improve Security ]

If there are overarching standards, they come from the National Institute of Standards and Technology, Klein says, noting the Office for Civil Rights and Department of Health and Human Services have "consistently" used NIST standards. At the same time, other agencies are getting involved:

  • The Federal Trade Commission emphasizes privacy by design in the collection, transmission, use, retention and destruction of data;
  • The Food and Drug Administration's guidance on cybersecurity in medical devices and hospital networks pinpoints data confidentiality, integrity and availability; and
  • The Federal Communications Commission, in the wake of weak 802.11 wireless security, has issued disclaimers regarding text messaging and geolocation with implications for clinical communications.

[ Related: Solving Healthcare's Big Data Analytics Security Conundrum ]

Given the regulatory inconsistencies, Klein says it's best to document everything you're doing and conduct vigorous training and awareness programs for all staff. "Minimum necessary" policies - which limit who gets to see which data and, critically, which change as an individual employee's role evolves - can eliminate unnecessary security holes, as can the appropriate de-identification of data.
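A "minimum necessary" policy of the kind Klein describes can be enforced in code as a per-role field filter. A minimal sketch - the roles, field names, and record are invented for illustration, not drawn from any regulation:

```python
# "Minimum necessary" sketch: each role sees only the record fields it needs.
# Roles and field names are illustrative only.

ROLE_FIELDS = {
    "billing": {"patient_id", "insurance_id", "charges"},
    "nurse": {"patient_id", "vitals", "medications"},
    "researcher": {"vitals"},  # de-identified: no patient_id
}

def minimum_necessary(record, role):
    """Return only the fields this role is permitted to see."""
    allowed = ROLE_FIELDS.get(role, set())  # unknown role sees nothing
    return {k: v for k, v in record.items() if k in allowed}

record = {"patient_id": "P-100", "insurance_id": "INS-7", "charges": 420.0,
          "vitals": {"bp": "120/80"}, "medications": ["lisinopril"]}
print(minimum_necessary(record, "researcher"))  # {'vitals': {'bp': '120/80'}}
```

Keeping the role-to-field mapping in one table also makes Klein's other advice easier: the policy is documented in one place and changes with the employee's role, not with each application screen.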

Software developers have additional priorities. If anything is regulated, isolate it, Klein says, and make sure you disclose to consumers what data you are obtaining, what you intend to do with it, which third parties will have access to it and whom to contact if there is an issue. Startups want to be first to market, she admits, but in the process - as Smith found - they can put security on the back burner, only to scramble to fill the gaps once vulnerabilities are discovered.

Balancing Healthcare IT Security and Accessibility

Experts largely agree that a cogent approach to health data security must balance security and accessibility, whether it's patients, physicians or third parties who want the data. This is especially important as the healthcare industry emphasizes more widespread health information exchange as part of a larger goal to provide more coordinated care.

"Security has for a long time been an afterthought. Now it has to be part of the build," says Glines - adding that, if it isn't, an app simply shouldn't be released.

Smith suggests that developers and security professionals hack iOS apps, as he did, and see for themselves how easy it is. Then, he says, they should ask, "If it's not that difficult, and [I'm] storing all that data on the phone, what can I do beyond what the OS offers?"

As it turns out, there are a "whole litany of things" that application developers can do, even in an ever-changing field. Specifically, Smith points to viaForensics' secure mobile development best practices, which apply to iOS and Android.

Given the findings of the Norse and SANS Institute study, Glines says it's worth having two conversations. One is with network administrators about the "basic blocking and tackling" work, such as actually changing default device passwords, which can bring about "simple, powerful change." The other is with executive staff about the implications of lax security - a conversation unfortunately made easier in the wake of the Target breach, which it turns out stemmed from systemic failures and not a single point of attack.

Regulators won't cut you any slack if a breach occurs, Klein says, especially if you knew vulnerabilities existed and didn't fix them. Under the new HIPAA Omnibus Rule, which went into effect in September 2013, firms face fines of up to $1.5 million in the event of the "willful neglect" of security issues.

Glines says boardrooms are beginning to shift their security mentality, but this will take time to trickle down. "In the next eight to 12 months, we will continue to see more front-page news" about data breaches, he says.




Time for hard HITECH reboot | Healthcare IT News


So, you dropped a huge chunk of change on a new IT system. Now you are frustrated and have buyer’s remorse. The “installation requirements” are very complex. And although you just bought it, the system already needs to be updated. It only seems to run one application – “EHR version 1.0,” but you want to do many other things too. EHR v1 is not very user friendly and sure makes you do your work differently. And there is more. This expensive new system doesn’t seem to connect to anything. Sure, there is a basic email application available, but you also want to look for information and get what you need when and where you need it. Isn’t that what the Internet enabled years ago?


If this is the metaphorical world of HITECH, here’s to giving Karen DeSalvo, the new national coordinator for Health IT, all the support she needs to do a full and hard HITECH reboot. More than $30 billion has been spent. And while it is understandable that many HIT outcomes are still unfulfilled, the path forward seems murky. EHR adoption has surged, but much of what has been broken about health IT in the United States still remains. And the leverage of the HITECH funds is dwindling fast.

Now there is yet another independent report, this time from the JASON group which, like the report from the President’s Council of Advisors on Science and Technology before it, suggests the need for a major architecting effort for health IT nationally. The Government Accountability Office reports that there is a lack of strategy, prioritized actions, and milestones in HITECH. HIT interoperability is recognized as being limited at multiple levels. As a result, the benefits of HIT that depend on a combination of adoption, interoperability, and health information exchange as table stakes remain elusive.



Meanwhile, political resistance has reached a fever pitch. Entrenched interests are back on top and aggressively lobbying to advance their respective positions. Very few hospitals and providers have achieved Stage 2 of meaningful use, Stage 3 concepts have received substantial push-back, EHR certification is under fire, and ICD-10 has been pushed back yet another time. Dr. DeSalvo has initiated a reorganization of the HIT Policy and the HIT Standards Committees, but has her work cut out for her. So, here we offer our unsolicited top 10 list of HITECH reboot priorities:

10) Divide and conquer - Separate healthcare provider adoption angst and frustration from EHR vendor complaints about standards implementation and certification. Although these two things are frequently conflated during political push-back, they should be addressed very differently…

9) Meaningful relief - Providers need to be left alone for a while. They were already under incredible strain from many non-HIT related pressures. HITECH added to these pressures (necessarily) by fostering EHR adoption. But meaningful use added much more strain through sometimes aspirational criteria that demand workflow and process changes well beyond simply adopting an EHR. Give providers meaningful relief from many of these new business requirements. It is not clear that there are incentives to sustain them after HITECH and the infrastructure needs attention before many are viable.

8) Double down on interoperability - Don’t give in, on the other hand, to push back against standards and certification. In fact, the entire standards, implementation guidance, and certification process needs a boost to achieve just a strategic sample of the transactions needed in health. There needs to be a broader, more inclusive, standards process. The ONC Standards & Interoperability Framework has good ideas, but there are many more needs than ONC alone can promote. There are also needs for broader standardization and specification of technologies beyond just data and messages. Constructively re-engage the industry to help make this happen.

7) Certify more, not less – Certify more systems, not fewer. Hang more government incentives on certified HIT. Use CLIA, the DoD EHR investment, TRICARE, all federal contracts and grants – whatever it takes. Standards, interoperability specifications, and certification that includes conformance testing is the recognized path to interoperability. Certification needs to be specific enough for robust conformance testing, and interoperability certification needs to be applied to all HIT participants, not just EHRs. Identifying business outcomes, even when incentives are aligned, will not suffice for interoperability. But by all means make certification as minimally burdensome as possible by focusing only on interoperability and security.



Medical kiosks raise security flags | Healthcare IT News


As the Cleveland Clinic adds its prestigious name to the hospital groups that have embraced next-generation medical kiosks – groups that include Metro Health, Miami Children's Hospital, Kaiser Permanente, Central Ohio Primary Care and Nationwide Children's Hospital – healthcare IT executives are wrestling with the powerful pros and cons of such a move.

[See also: Ohio hospital launches kiosk pilot.]

These kiosks have compartments that software can open to release various medical instruments (blood pressure cuff, thermometer, stethoscope, scale, pulse oximeter, otoscope, dermascope, etc.), which a nurse can help the patient use. The kiosk transmits the data to a cloud controlled by a kiosk vendor (in the case of the Cleveland Clinic, it's a vendor called HealthSpot) and then establishes a live video stream with a physician for a consultation. The kiosks can also accept payment card transactions.

The advantages of such systems are extensive, allowing far more patients to be seen. The kiosks can be placed in malls, houses of worship, schools, community centers and other locations that can be far more convenient for patients, especially in rural areas. The video streaming also offers the potential for efficiency improvements, where a doctor can hop onto the network whenever he/she has a free 10-15 minutes and see a patient.

"A doctor can jump on the laptop and get on the network if needed," said Christopher Soska, the Cleveland Clinic's chief of operations for community hospitals and family health centers. "The convenience for the patients is the biggest factor. Remember that the true value of these units is not within a medical office building. It's being out in the market in retail, in community centers, in churches. These patients may not have the means to get to our care providers."

[See also: Telehealth gives Miami docs global reach.]

That convenience, though, comes with inevitable IT security and privacy risks. How well are those highly-sensitive video streams protected? What about that payment card information? Is that highly profitable data going to attract cyber thieves? All of that data being gathered is being sent to the vendor's cloud. How secure is that cloud? Will cloud neighbors present a threat?

What are the privacy and marketing issues?

HealthSpot CEO Steve Cashman, for example, said his company has yet to decide how it will use – and potentially make money from – that information. That means that the hundreds of patients who have already given information to these kiosks did so with no guarantees about how the data would ultimately be used. Patients are presented with a terms and conditions page, but HealthSpot refused to share its wording, saying the phrasing was proprietary. Yet the terms and conditions page is open to anyone who goes to a mall and walks into one of the kiosks.

How could that ultra-sensitive medical data be used?

"We understand the long-term value of that data," Cashman said. "If we're successful in building this out, we in theory could have the largest amount of data in the entire healthcare (community). We ultimately have to decide who that data is valuable to."

Among the privacy issues is how patients regard the kiosks. When patients see the Cleveland Clinic name, for example, they might apply the trust they have in the Cleveland Clinic brand to the kiosk. And if anything were to happen to the data – regardless of whether it's the fault of Cleveland Clinic – the patient is likely to blame the trusted brand that made them comfortable using the kiosk in the first place.

Cleveland's Soska avoided discussing data use limitations and security issues, saying some of those details might have been covered in the group's contract with the vendor. It raises the question of how much patients themselves know the limits – and the exposures – before they use the kiosks.

Kathy Jobes is the chief information security officer at the 125-year-old, 12-hospital, $5.6 billion Sentara Healthcare enterprise. Jobes said that her organization is not, at the moment, exploring such advanced healthcare kiosks, but she'd have plenty of security concerns if it were.

These kiosks "are very exciting and they look very sexy, but it all comes with a price," Jobes said. "If it's not carefully thought through, there could be a negative ending for the organization. I can definitely see where it's very exciting, but I would have a lot of questions."

For example, she said, the streaming video component could be problematic from a security perspective. "Streaming video is (security) difficult. It presents a lot of challenges," Jobes said. "It's not just the encryption issues, but the authentication is challenging at best for streaming video."

The authentication risk, Jobes said, is that an identity thief could impersonate a prior user of the kiosk. With different medical personnel looking at the video, there is likely to be no one who would recognize that patient by sight. Not only could such a thief potentially learn intimate medical details about the patient – as the physician reveals details accessed in the patient's file – but the impersonation could also be used to steal prescriptions.
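One common mitigation for the impersonation risk Jobes describes is short-lived, single-use session tokens, so a later kiosk visitor cannot resume a previous patient's session. A hypothetical sketch - this is not HealthSpot's actual design, and the names are invented:

```python
# Sketch: expiring, revocable kiosk session tokens, so a session cannot be
# resumed by the next person at the machine. Illustrative design only.
import secrets
import time

class KioskSessions:
    TTL = 300  # seconds a session stays valid after it starts

    def __init__(self):
        self._sessions = {}

    def start(self, patient_id):
        """Issue an unguessable token bound to one patient visit."""
        token = secrets.token_urlsafe(16)
        self._sessions[token] = (patient_id, time.monotonic())
        return token

    def validate(self, token):
        """Return the patient id if the token is live, else None."""
        entry = self._sessions.get(token)
        if entry is None:
            return None
        patient_id, started = entry
        if time.monotonic() - started > self.TTL:
            del self._sessions[token]  # expired: purge it
            return None
        return patient_id

    def end(self, token):
        self._sessions.pop(token, None)  # invalidate on logout or walk-away

sessions = KioskSessions()
token = sessions.start("patient-42")
print(sessions.validate(token))  # patient-42 while the session is live
sessions.end(token)
print(sessions.validate(token))  # None: the token cannot be replayed
```

Expiry and explicit invalidation address session takeover at the kiosk itself; they do nothing about the separate problem of verifying that the person who started the session is who they claim to be.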

Another concern that Jobes had about such data-hungry kiosks is how the data would be destroyed when the relationship ended. Would an agreement allow for some data-retention in case patients access the kiosk through a different hospital group? Is the vendor allowed to use data in aggregate and to be able to sell it? Are there restrictions on whether the data can be placed on mobile devices or thumb drives, if an executive wanted to work on material at home or while on the road? Also, with relentless backups or backups in multiple locations, is it even possible to guarantee that every last bit of data will be destroyed, even if a contract requires that? If it's in the cloud, for example, who knows how many automated backups of that server are being executed?

"I absolutely would require upfront the foundational agreement that very explicitly set forth the scope of how the data could be used and how the data would be destroyed," Jobes said.

The cost of the HealthSpot kiosks to the hospital includes an implementation fee of about $15,000, a multiyear support/maintenance agreement of about $1,000 a month and a portion of the appointment fee (often around $12 goes to the vendor), Cashman said. "If somebody is not insured and they just want to pay, we do offer a $59 cash pay option," he said, adding that they are preparing to accept PayPal soon.

Cleveland's Soska said these kinds of next-generation kiosks have strong potential to get more patients seen more quickly and more efficiently. But they're not for all ailments. "This is not for emergent care. Chest pains still need to go to the emergency room," Soska said.


FTC Urges Congress To Better Protect Consumer Health Data



On Tuesday, the Federal Trade Commission released a report recommending that Congress pass legislation to make data broker practices more transparent and give consumers more control over their personal health information, the Los Angeles Times' "Technology Now" reports (Faturechi, "Technology Now," Los Angeles Times, 5/27).

Report Details

The report is based on an FTC investigation launched in 2012 involving nine major data broker companies that aggregate consumer information from various online and offline sources with little oversight (Ryan, National Journal, 5/27). Those companies were:

  • Acxiom;
  • CoreLogic;
  • Datalogix;
  • eBureau;
  • ID Analytics;
  • Intelius;
  • PeekYou;
  • Rapleaf; and
  • Recorded Future (Kerr, AP/ABC News, 5/28).
Report Findings

The report found that data brokers are collecting large amounts of personal information, including medical data, about "nearly every U.S. consumer" and are operating with a "fundamental lack of transparency" (National Journal, 5/27).

Specifically, FTC noted that brokers commonly:

  • Fail to verify data;
  • Offer risk-mitigation services that let companies check consumers' information before providing services;
  • Store data indefinitely, which could increase the risk of data theft; and
  • Sort data into potentially sensitive categories such as "cholesterol focus" and "diabetes interest."

In addition, the report found that brokers commonly analyze data to "predict which consumers are likely to use brand-name medicine, order prescriptions by mail, research medications online or respond to pharmaceutical advertisements" (Tahir, "Vital Signs," Modern Healthcare, 5/27).

FTC noted that such practices could increase the risk of data breaches and raise privacy concerns (National Journal, 5/27).

Recommendations

In the report, FTC called on Congress to enact legislation that would:

  • Allow consumers to access and correct their data or opt out of the data collection ("Vital Signs," Modern Healthcare, 5/27);
  • Build a public, central database with details on the data brokers and the information they collect; and
  • Require brokers to disclose the names and categories of their data sources (AP/ABC News, 5/27).

The report concluded, "Allowing consumers the ability to exercise control over the use of sensitive information is particularly important," adding, "For categories that some consumers might find sensitive and others may not (e.g., visually impaired, balding, overweight), having access to this data, along with the ability to suppress the use of it for marketing, will improve the transparency of data broker practices and allow consumers to control uses of the data about which they care the most" (Slabodkin, Health Data Management, 5/28).

Reaction

Sen. Jay Rockefeller (D-W.Va.), who has previously introduced legislation that would increase transparency among data brokers, in a statement said, "With the release of today's report, which is supported by Democratic and Republican FTC commissioners, our conclusion is stronger than ever -- big data practices pose risks of consumer harm, including discrimination based on financial, health and other personal information." He added, "Congress can no longer put off action on this important issue" (National Journal, 5/27).


Omnibus HIPAA in action: The good, bad and scary | Government Health IT



2014 is the first financial year after the HIPAA Final Rule, and healthcare privacy has transformed in ways that are good, bad, and downright scary.

The good: Higher awareness
On the positive side, the total number of data breaches in healthcare has declined slightly, according to the Fourth Annual Benchmark Study on Patient Privacy and Data Security by Ponemon Institute. Clearly, healthcare organizations are aware of the requirements, and their data security budgets and systems are catching up.

On the forensics side, for instance, we've seen a strengthening of networks and locally stored data by hospitals and other healthcare organizations, with a greater use of applications that monitor networks. Lots of data breaches have been avoided.

The bad: Bigger, badder breaches
The news isn’t all rosy, however. Ponemon also found that 90 percent of respondents had at least one data breach over the past two years, while 38 percent have had more than five data breaches in the same period. Clearly, threats to patient data remain high.

More complex information systems and business relationships are leading to larger, more complex breaches. Ironically, the data on the internal systems of HIPAA covered entities is now much better protected, but with so much data in the cloud or shared with business associates, large amounts of information have become less well protected.

There are two big issues with cloud storage. First, organizations and users fail to realize that when they look at their email or a shared folder, they don't know where that data is actually located. People assume it is well protected, but it may not be. Second is the amount of data kept in the cloud. The low cost of cloud storage means many people use email as their storage system instead of using folders on a local file system. While their computer is in a highly protected work environment, their email is in the cloud, protected by only an email address and password. When the dam breaks, a huge amount of data is exposed at once.

Outsourcing to business associates also creates vulnerabilities. Healthcare organizations outsource work, such as claims processing, to cut costs and become more competitive. Unfortunately, those vendors are also competing with each other on costs, leading some to overstate the data security they provide.

The scary: Threats you can’t predict
The unpredictable threats come from the newest developments in the healthcare ecosystem, and from the computing infrastructure itself. In the healthcare market, health information exchanges are one of the big unknowns. In fact, the Ponemon study found that one-third of organizations surveyed have no plans to become a member of an HIE, and 72 percent are not confident, or only somewhat confident, in the security and privacy of patient data shared on HIEs.

Security problems in the exchanges will arise because multiple levels of outsourcing, contracting, and subcontracting make them so opaque. When security incidents happen, organizations may not know for sure who is responsible for detecting or addressing the breach. And unlike platforms that have been around longer, we have not yet seen all the bugs and vulnerabilities in exchanges.

3 tips to protect health data
Despite unpredictability and greater complexity, organizations can still protect their patients’ privacy and secure their data with some common-sense strategies:

1. Don’t cut security costs. Reputable cloud storage companies have tools available for logging, monitoring, and controlling or restricting data access. But because organizations move to the cloud to save money, they’re not inclined to spend on these security add-ons and plugins, which they tend to see as optional bells and whistles. Paying for security will pay off in the long run.
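The logging and monitoring those tools provide can also be approximated at the application layer. A minimal sketch - the class and names are illustrative, not any vendor's API - in which every read and write of stored data leaves an audit trail:

```python
# Sketch of application-level access logging: every read and write of a
# stored object is recorded with who, what, and when. Illustrative only.
import datetime

class AuditedStore:
    def __init__(self):
        self._data = {}
        self.audit_log = []

    def put(self, key, value, user):
        self._data[key] = value
        self._record(user, "put", key)

    def get(self, key, user):
        self._record(user, "get", key)  # log even successful, routine reads
        return self._data[key]

    def _record(self, user, action, key):
        self.audit_log.append({
            "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "user": user, "action": action, "key": key,
        })

store = AuditedStore()
store.put("claim-17", {"amount": 120.0}, user="processor-a")
store.get("claim-17", user="processor-b")
print(len(store.audit_log))  # 2 entries: one put, one get
```

An audit trail like this is what lets an organization answer the later questions - who touched the data, and when - instead of discovering after a breach that no one was recording access at all.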

2. Don’t assume your security investment will protect you 100 percent. This may seem counter-intuitive to the first step, but it’s impossible to anticipate every threat. Organizations should assume that their security will fail, and go back to basic questions:
• Can we divest ourselves of data?
• Can we decrease our vulnerability?
• Can we take data off line?
• Can we limit the number of people with access?

3. Communicate, communicate, communicate. Organizations need to figure out the lines of communication before an incident happens. The more communication and transparency a covered entity has, both internally and to the regulatory agencies and the public, the better off it will be. Roles and responsibilities need to be clearly defined ahead of time, key decision-makers need to be in the meetings, and all stakeholders need to be in the loop.

And when the worst does happen, organizations need to go the extra mile in making things right for the individuals affected. If they don’t, consumers and patients may look elsewhere for other healthcare providers.
 


New York-Presbyterian And Columbia Hospitals To Pay Record HIPAA Settlement - Food, Drugs, Healthcare, Life Sciences - United States


On May 7, 2014, the US Department of Health and Human Services Office of Civil Rights (OCR) announced settlements with two New York-based hospitals totaling $4.8 million for violations of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy and Security Rules. The settlements related to the hospitals' failure to secure the electronic protected health information (ePHI) of thousands of patients held on their networks and are the latest example of OCR's increased enforcement action.

The two hospitals, New York-Presbyterian Hospital (Presbyterian) and Columbia University (Columbia), which participate in a joint arrangement allowing Columbia faculty members to serve as attending physicians at Presbyterian, were the subject of investigation following their submission of a joint breach report to OCR in September 2010. As part of their joint arrangement, the hospitals operate a shared data network, administered by employees of both entities, which links to Presbyterian patient information systems containing ePHI. The breach occurred when a physician employed by Columbia attempted to deactivate a personal computer server that was on the shared network and contained Presbyterian patient ePHI. The improper deactivation of the server resulted in ePHI being accessible through Internet search engines. Presbyterian and Columbia reported the disclosure of the ePHI of 6,800 individuals, including patient status, vital signs, medications, and laboratory results.

As part of its investigation, OCR also determined that neither hospital had conducted a thorough risk analysis to identify all systems accessing the shared data network and that neither hospital had an adequate risk management plan to address the potential threats to ePHI. Christina Heide, Acting Deputy Director of Health Information Privacy for OCR, noted that entities participating in joint compliance arrangements "share the burden of addressing the risks to protected health information," and that the cases against the hospitals should "remind health care organizations of the need to make data security central to how they manage their information systems."

Presbyterian has paid OCR a settlement of $3.3 million, while Columbia has paid $1.5 million. In addition to the monetary penalties, both hospitals agreed to substantive corrective action plans, which include requirements for the hospitals to undertake a risk analysis, develop a risk management plan, revise policies and procedures, and complete staff training.

OCR's settlements with Presbyterian and Columbia come one week after the agency announced settlements with two health care entities totaling close to $2 million for violations of the Privacy and Security Rules. The two companies, Concentra Health Services and QCA Health Plan, Inc., were the subject of separate OCR investigations initiated following reports of breaches of ePHI by the entities to OCR. Both breaches were the result of the thefts of unencrypted laptops containing ePHI. Concentra agreed to pay OCR $1.725 million and to adopt a corrective action plan to ensure that sufficient protections are put into place to safeguard ePHI. QCA agreed to a fine of $250,000 and to provide OCR with a risk management plan including additional risk-limiting security measures to secure QCA's ePHI.

OCR has substantially increased its HIPAA enforcement efforts in recent years. The Health Information Technology for Economic and Clinical Health Act (HITECH), as implemented by the Omnibus HIPAA Rule issued on January 25, 2013 (available at 78 Fed. Reg. 5566), increased the potential civil monetary penalties that OCR could impose on Covered Entities — health care providers, health plans, and health care clearinghouses — and their Business Associates — entities that create, receive, maintain or transmit Protected Health Information for or on behalf of Covered Entities — for violating HIPAA. The Director of the OCR, Leon Rodriguez, has been quoted as saying the Omnibus Rule strengthened OCR's ability to "vigorously enforce the HIPAA privacy and security protections, regardless of whether the information is being held by a health plan, a health care provider or one of their business associates."

In order to mitigate the risk of a potential breach, it is critical that Covered Entities and their Business Associates conduct a thorough risk analysis and develop risk management plans to address the potential threats and hazards to the security of ePHI. The risk analysis should be reviewed and updated frequently to account for changes in technology and/or new risks, and risk management plans should be modified accordingly. Covered Entities and their Business Associates should also implement policies and procedures addressing workforce member access to databases and network security and should ensure that all employees and workforce members with access to ePHI are properly trained on the policies and procedures. As OCR's latest settlement indicates, failure to take these steps can result in severe financial penalties.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.




Is your client’s wellness plan fully HIPAA compliant?

By Melissa A. Winn
May 9, 2014

Employers that offer an outcome-based wellness program are required by federal law to also offer a reasonable alternative standard (RAS), such as an educational class or health program. But advisers and employers need to know that the RAS must continue to be offered annually, even to employees who continuously fail to meet the desired health outcome.


Under the Health Insurance Portability and Accountability Act of 1996, all outcome-based wellness programs (those that offer a reward under a group health plan to individuals who attain or maintain a specific health outcome, such as not smoking) must also offer an RAS for obtaining the reward. This can include allowing employees to complete a smoking cessation program to earn the reward or to avoid a surcharge on their premium. But HIPAA rules also require employers to offer the RAS annually and to allow employees to qualify for the reward through the RAS even if they fail to meet the health outcome, such as quitting smoking.

“Even if a participant continues to fail to meet the desired health outcome … [like] smoking cessation, healthy cholesterol level, healthy BMI … year after year, the participant must be able to continue obtaining the reward, avoiding any surcharge, by completing an appropriate RAS,” say attorneys Amy Ciepluch and Sarah Fowles of the Milwaukee, Wis.-based law firm Quarles & Brady.

Completion of the program results in receiving the reward or avoiding the premium surcharge, regardless of whether the employee has stopped smoking or achieved a healthier BMI or cholesterol level. And the next year, the employer must offer the employee the same opportunity to complete the program (and possibly fail) to avoid the surcharge, the lawyers say in a blog posted this week on the subject.

Compliance for the RAS does not end there, however. Ciepluch and Fowles add that the RAS must also meet the following HIPAA requirements:

  • If the RAS is completion of an educational program, the employer must make the educational program available to the individual or assist the individual in finding such a program. The program must be free for the individual.
  • The time commitment must be reasonable.
  • If the RAS is a diet program, the employer must pay any membership or participation fee but is not required to pay for the cost of food.
  • If an individual’s personal physician states that a plan standard is not medically appropriate for the individual, the plan must provide an RAS that accommodates the physician’s recommendations.
  • If the RAS is another outcome-based wellness program, it must comply with the outcome-based wellness program rules.
  • The RAS cannot be a requirement to meet a different level of the same standard without providing an individual with additional time to comply with the RAS.
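The reward-qualification rule the attorneys describe — the participant earns the reward either by meeting the health outcome or by completing the RAS, and the RAS path must be re-offered every year — can be sketched in code. This is only an illustration of the logic, not legal guidance, and all names here (`Participant`, `met_outcome`, `completed_ras`, `earns_reward`) are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Participant:
    """Hypothetical record of one employee's wellness-program year."""
    met_outcome: bool    # e.g., passed the nicotine test or hit the BMI target
    completed_ras: bool  # finished the reasonable alternative standard

def earns_reward(p: Participant) -> bool:
    """A participant qualifies for the reward (or avoids the surcharge)
    by meeting the health outcome OR by completing the RAS -- even if
    they have failed the outcome in every prior year. The plan must
    re-offer the RAS annually; this check is therefore applied each year."""
    return p.met_outcome or p.completed_ras

# A smoker who again fails the outcome but completes a cessation
# program still qualifies for that year's reward.
print(earns_reward(Participant(met_outcome=False, completed_ras=True)))   # True
# Only a participant who neither meets the outcome nor completes the
# RAS may be denied the reward or charged the surcharge.
print(earns_reward(Participant(met_outcome=False, completed_ras=False)))  # False
```

The point of the sketch is that the two paths are a true logical OR: a plan that conditions the reward on eventually achieving the outcome, rather than merely completing the RAS, would violate the rule described above.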

Notice of the availability of an RAS must be provided in all plan materials describing the terms of an outcome-based wellness program and any disclosure that an individual did not satisfy an initial outcome-based standard.

Increasingly, Ciepluch and Fowles say, they are seeing wellness program designs that offer participants a “menu” of options for obtaining a specific health plan reward or avoiding a surcharge. While some of the methods in these designs are outcome-based and others are participatory and/or activity-based, offering employees more choice and flexibility, the attorneys caution employers to have the plans reviewed for HIPAA compliance.

Additionally, they say, some wellness programs provide rewards in cash, gift cards, or other tangible goods and do not tie the rewards to a group health plan, thereby avoiding regulation as an outcome-based wellness program.

But these programs should also be reviewed for compliance with other applicable laws, they add.


