HIPAA Compliance for Medical Practices
HIPAA Compliance and HIPAA Risk management Articles, Tips and Updates for Medical Practices and Physicians

The HHS/OCR Hit List for HIPAA Audits

As the HHS Office for Civil Rights analyzes breach reports for vulnerabilities, it has learned lessons on areas where covered entities should pay particular attention to their HIPAA compliance efforts. With OCR hoping soon to launch a permanent random HIPAA Audit program, the agency has reiterated six core ways to avoid common types of breaches, which will be among the targeted focus areas of audits.


Privacy and security experts: mHealth requires a new approach | mHealthNews

The proliferation of mobile devices in healthcare, from smartphones and tablets to the clinical devices themselves, is forcing healthcare executives to take a new approach to privacy and security.

Gone is the "security cop" approach, in which staff and employees are simply told what they can and can't use and do. Instead, we're seeing a "business enablement" approach, in which privacy and security concerns are woven into the workflow.

The reasoning behind this, says Jim Doggett, Kaiser Permanente's senior vice president, chief security officer and chief technology risk officer, is that cybercrime is an industry now, and the old method of "do it my way or else" won't work any more. With new ways of delivering healthcare must come new ways of protecting it.

"We're a bit out of alignment," Doggett said during a recent presentation at the HIMSS Media Privacy and Security Forum. "We're still solving yesterday's problems when we need to be solving today's and tomorrow's problems."

To wit: Doggett said he wanted to determine how to best implement a new policy on privacy and security. He tailed a physician during a normal workday, and watched the man log on and off and back onto various systems "maybe 50 times." Doggett said he realized the doctor wasn't going to adopt any new privacy and security rule that added to his workload, and would in fact welcome something that improved it.

The answer: Don't just establish a policy and enforce it; work with doctors, nurses and other staff members to see how it can best be implemented.

That was the thinking prevalent during the first day of the two-day forum, being held in San Diego. Healthcare is changing so much as it is, so privacy and security methods have to be woven into those changes. If mHealth and telemedicine are going to improve healthcare delivery over the coming years, develop privacy and security platforms that enhance those methods, rather than pushing people away or hindering adoption.

The takeaway for mHealth enthusiasts during the first day of the conference is that privacy and security have to become more fluid – rigid rules just won't work any more – and mindful of the fact that sensitive data is moving in and out of the enterprise in more ways and on more devices.

Mobile devices and social media "are really big areas of compliance concern," said Iliana L. Peters, senior advisor for HIPAA compliance and enforcement with the U.S. Health and Human Services Department's Office for Civil Rights. She said too many healthcare providers aren't taking this seriously. "They neglect to acknowledge where their data is or the risk to that data."

Encryption of data has to become the norm, rather than a suggested policy.

"If your entity is not encrypting, it should be," she said.

And doctors and nurses have to be made to understand that protection of sensitive data is "a part of efficient healthcare." Michael Allred, Intermountain Healthcare's information security consultant and identity and access team manager, said clinicians are the toughest to educate and may be frustrated with privacy and security efforts, but one breach could cost them and their institution much in terms of reputation and money.


Key privacy rule could fall to accountable care push | Vital Signs | The healthcare business blog from Modern Healthcare

History may look back on last week as an inflection point for privacy and technology in the healthcare industry.

That's because what happened makes it possible that a bulwark federal privacy rule will become a casualty of the push to accountable care, patient-centered medical homes and other population-health oriented care plans.

If the considered rule change happens, proponents of these care plans could have broader access to the medical records of patients in drug and alcohol abuse programs without those patients' consent. That access will help healthcare providers afford those patients better coordinated, higher-quality and more cost-efficient care, these proponents say.

Opponents of the rule change warn, however, that without the law's current stringent consent requirements, drug and alcohol abuse patients will avoid seeking treatment out of concern that their stigmatizing and/or illegal activity will be exposed, a situation the rule, created in the 1970s, sought to avoid.

“I think what will happen is you'll see some people who will be in substance abuse treatment either won't get it or will stop confiding as much as they do in their therapist,” said Jim Pyles, a Washington privacy lawyer who testified last week on behalf of the American Psychoanalytic Association and in favor of maintaining a stringent federal privacy rule covering these patient records.

Here are three actions last week around which federal technology privacy policy may turn and why:

A federal regulatory advisory panel last Tuesday accepted recommendations from its privacy workgroup that would put off until 2017 the introduction of some narrow and largely voluntary privacy protection criteria under the electronic health-record incentive payment program of the American Recovery and Reinvestment Act of 2009. The policy recommendations are for technology to protect the privacy of behavioral health patient information. The federal privacy workgroup, the Privacy and Security Tiger Team, has been looking at certain privacy protection technology since 2010. But it has not recommended that the feds put a regulatory stake in the ground, telling developers of electronic health-record systems and information exchange systems that they should add this technology to their own systems, or encouraging healthcare providers to incorporate the technology in their workflows.

On Wednesday, the Substance Abuse and Mental Health Services Administration, in a day-long listening session, heard conflicting testimony on whether it should consider modifying the federal privacy rule, 42 CFR Part 2, covering the transmission and sharing of medical records of many drug and alcohol abuse patients.

Many “general” healthcare providers aren't using substance-abuse treatment data because they don't have the technology to help them handle it efficiently and in compliance with the law.

But if SAMHSA continues its unflagging support for the special rule for handling substance abuse information, it may force technology developers to incorporate privacy-protecting technology into their systems, and induce providers to use it, affording better protection for all healthcare data, privacy advocates say. If the rule is weakened, however, that technology may never be rolled out.

Finally, on Thursday came the news that privacy advocate Joy Pritts, the first chief privacy officer at the Office of the National Coordinator for Health Information Technology at HHS, would be stepping down after 4 ½ years on the job.

Pritts praised an early implementation of data segmentation technology for behavioral health demonstrated by EHR developer Cerner Corp. at this year's Healthcare Information and Management Systems Society conference, adding her hope that “other vendors follow that lead.” The concern, expressed by several privacy advocates bemoaning her July departure, is that her successor – unknown at this pivotal moment – might not be as stalwart an advocate for patients' rights and data segmentation technology as Pritts has been.

Development and adoption of the technical capabilities to affix so-called “meta-data tags” to patient records – which would aid in interoperability and research as well as privacy protection – was urged by the President's Council of Advisors on Science and Technology in 2010 and by JASON, a group of top scientists working for the Agency for Healthcare Research and Quality, this April.
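As a toy illustration of the data-segmentation idea, the sketch below tags record fields with sensitivity labels and withholds 42 CFR Part 2-covered fields from outbound copies unless the patient has consented. The field names, labels and consent flag are invented for illustration; real implementations follow standards such as HL7's Data Segmentation for Privacy (DS4P), which this does not attempt to reproduce.

```python
# Toy sketch of metadata tagging for data segmentation.
# Field names and the consent flag are hypothetical, not any real standard.

# Sensitivity labels ("meta-data tags") attached to each field of a record.
SENSITIVITY = {
    "name": "normal",
    "diagnosis": "normal",
    "substance_abuse_tx": "part2",   # 42 CFR Part 2-covered information
}

def outbound_copy(record, part2_consent=False):
    """Return a copy safe to share: Part 2 fields only with patient consent.
    Untagged fields default to "part2" so the sketch fails closed."""
    return {
        field: value
        for field, value in record.items()
        if SENSITIVITY.get(field, "part2") != "part2" or part2_consent
    }

record = {
    "name": "Jane Doe",
    "diagnosis": "hypertension",
    "substance_abuse_tx": "opioid use disorder, in treatment",
}

print(outbound_copy(record))                      # Part 2 field withheld
print(outbound_copy(record, part2_consent=True))  # full record shared
```

The design choice worth noting is the fail-closed default: a field with no tag is treated as restricted, which mirrors the consent-first posture of the Part 2 rule.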

SAMHSA itself sponsored one of six ONC “Data Segmentation for Privacy” pilot projects to test the technology. But the agency is under considerable pressure to ease regulatory restrictions on the flow of this data, rather than hold firm and press the industry to adopt technology that will help providers comply with the consent provisions of 42 CFR Part 2.

Either way, SAMHSA's decision will likely have an impact on privacy protections far wider than the current scope of that rule.


Security: healthcare's fixer-upper | Healthcare IT News

Healthcare's all about the patients, right? Earning their trust so they return for annual checkups, delivering high-quality care while respecting their medical privacy at the highest level. But far too often, there's a disconnect – the idea that the care ends when the patient exits the building or a diagnosis is made, the idea that clinical deals with clinical and information technology deals with IT. But, that's not often the case in this digital age. Lines are blurred, and what happens in one area can have serious implications for another – especially when it comes to patient privacy. 

Healthcare organizations are charged with safekeeping some of the most personal and sensitive information on individuals who come to receive care. That bout of depression you had in your early 20s, the sexually transmitted infection you were treated for last year, blood tests of every ilk, cancer diagnoses, medical procedures, HIV statuses, psychiatric disorders, every medication you've ever been prescribed, administered vaccinations, Social Security numbers, dates of birth, demographics, where you live, insurance details, even payment information. Healthcare organizations are gold mines of data. Valuable data. And, traditionally, protecting said data hasn't been the industry's strong suit.

Since 2009, when the HIPAA breach notification requirements took effect, nearly 1,000 large data breaches – those involving 500 individuals or more – have been reported to the Department of Health and Human Services, affecting almost 32 million people.

In addition to the breaches reported by covered entities and business associates themselves, the Office for Civil Rights, the HHS division responsible for enforcing HIPAA, has received nearly 95,000 privacy and security complaints over the handling of health data since 2003. That's a number meriting a reevaluation of how healthcare does privacy and security.

Complex nature 

Of course, the reasons behind why many organizations have reported egregious privacy and security failings are not always one-dimensional. Oftentimes, data breaches are the result of mistakes by well-intentioned people governed by poor policies and paltry staff training, and sometimes it's the other way around. Frequently, it's a matter of unencrypted devices being stolen or lost, but there's low probability the data has actually been compromised.


Health Organizations Across the U.S. Report Data Breaches - iHealthBeat

Several data breaches and security issues recently occurred at health care providers and entities across the U.S.

Details of Mont. Department of Public Health and Human Services Breach

On Thursday, a spokesperson for the Montana Department of Public Health and Human Services announced that hackers have had access to a computer server since July 2013 but that there is no evidence that individuals' personal health data have been stolen, the AP/San Francisco Chronicle reports.

The data breach was identified by employees in the agency's IT department on May 15, at which time the server was shut down and the police were contacted. DPHHS spokesperson Jon Ebelt said that a forensic review did not show any evidence that information was stolen. He said that he does not know how many individuals' data are stored on the server but that it is used to store information on those served by the agency, including:

  • Addresses;
  • Clinical information;
  • Dates of birth;
  • Names; and
  • Social Security numbers.

Health department officials are mailing letters to those whose information could have been on the server with instructions on how to monitor their credit. The agency purchased additional security software and is reviewing its policies and procedures to avoid another breach (Volz, AP/San Francisco Chronicle, 5/29).

Details of Denver-Based Department of Veterans Affairs Hospital Breach

A Department of Veterans Affairs hospital in Denver last week reported that two pulmonary laboratory computers containing data from tests on 239 patients were stolen, according to a VA spokesperson, the Denver Post reports (Phillips, Denver Post, 5/27).

The computers were stored on portable carts located inside a locked room in the hospital's pulmonary lab. Officials reported the computers as stolen to local law enforcement in Denver and Aurora, the VA inspector general's office and the local VA police. Officials have launched a criminal investigation and affected patients are being contacted and given information for credit monitoring (Ferrugia, ABC News Denver, 5/27).

Details of New Hampshire Hospital Breach

Officials at Manchester, N.H.-based Elliot Hospital and local law enforcement are investigating the theft of four computers that contain the personal health data of 1,213 patients, WMUR New Hampshire reports.

The computers were stolen from an employee's car in March while they were being transported to a location to be disposed of.

Hospital officials said the computers had auto-archived three spreadsheets and 20 emails, and included patient information such as:

  • Billing codes;
  • Dates of service; and
  • Names.

Patients whose information was stored on the computers are being contacted by the hospital (Sexton, WMUR New Hampshire, 5/24).

Details of Oregon, Ohio-Based Hospital Breach

Officials at ProMedica Bay Park Hospital are notifying more than 500 patients of an internal security breach of protected health information, WNWO reports.

Officials said that an internal investigation showed that between April 1, 2013, and April 1, 2014, a hospital employee accessed records of patients who were outside of the employee's care. The information accessed included patients':

  • Attending physicians;
  • Dates of birth;
  • Diagnoses;
  • Medications; and
  • Names.

The employee was immediately fired and the hospital is offering all affected patients a one-year membership for identity theft protection services (Mack, WNWO, 5/28).
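Insider snooping like this is typically caught by proactive access auditing: comparing EHR access logs against treatment relationships and flagging lookups of patients outside an employee's care. A minimal sketch of that idea, with a hypothetical log format and care-team mapping (not any particular EHR's audit API):

```python
# Toy access-log audit: flag record lookups by employees who have no
# treatment relationship to the patient. Log format and care-team
# mapping are invented for illustration.

# patient_id -> set of employee_ids with a legitimate treatment relationship
care_team = {
    "pt-001": {"emp-10", "emp-11"},
    "pt-002": {"emp-12"},
}

access_log = [
    {"employee": "emp-10", "patient": "pt-001"},  # within care team
    {"employee": "emp-10", "patient": "pt-002"},  # outside care team
    {"employee": "emp-12", "patient": "pt-002"},  # within care team
]

def suspicious_accesses(log, care_team):
    """Return log entries where the employee is not on the patient's care team.
    Unknown patients default to an empty team, so every access is flagged."""
    return [
        entry for entry in log
        if entry["employee"] not in care_team.get(entry["patient"], set())
    ]

flagged = suspicious_accesses(access_log, care_team)
print(flagged)  # only the emp-10 lookup of pt-002 is flagged
```

Real patient-privacy monitoring products layer pattern detection (VIP lookups, family-member snooping, bulk exports) on top of this basic relationship check.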

Details of Humana Data Breach

Last month, Humana notified 2,962 Atlanta-area members about a security breach that could have compromised their personal information, Modern Healthcare reports.

In a statement, Humana said the breach occurred when an unencrypted USB drive and an encrypted laptop were stolen from an employee's car. The data on the USB drive included members':

  • Medical record information;
  • Names; and
  • Social Security numbers (Herman, Modern Healthcare, 5/27).


Staff blunder leads to HIPAA breach | Healthcare IT News

A Pennsylvania-based hospital is notifying nearly 2,000 patients of a HIPAA breach after an employee accessed and transmitted patients' protected health data outside of the hospital's secure information network. 
After conducting an internal investigation, the 551-bed Penn State Milton S. Hershey hospital on Friday notified 1,801 patients that their names, medical record numbers, medical lab tests and results, and visit dates could have been accessed by an unauthorized person or entity due to an employee mistake, according to a hospital notice.
Officials discovered that a Penn State Hershey clinical laboratory technician, who was authorized to work with protected health information, accessed patient data via an unsecured USB device through his home network rather than the hospital network. He also transmitted patient data via his personal email to two Penn State physicians.
The breach was discovered by hospital officials on April 11. 
"Penn State Hershey considers patient privacy and confidentiality to be of the utmost importance and chose to notify patients of this incident out of an abundance of caution," read a June 6 public notice. "To decrease the likelihood of similar circumstances occurring in the future, Penn State Hershey is increasing education efforts with employees, focusing on the essential responsibility of all staff to safeguard patient health information at all times and follow proper practices for doing so." 
This is the first large HIPAA breach Penn State Hershey has reported to the Department of Health and Human Services. 
As Mayo Clinic's Mark Parkulo pointed out in an interview with Healthcare IT News late last month, employee education proves one of the most important pieces to doing patient privacy right. "Some of it is a real education issue," Parkulo, who is the vice chair of Mayo's Meaningful Use Coordinating Group, noted. "A number of providers and other people don't understand that typical unencrypted email; you're not even sure exactly what locations it's going to, whether it could be intercepted or not."
Because of that, in addition to the standard education at employee orientation, Mayo tries to reach employees multiple times per year with education sessions, whether through grand rounds, online, by email or even via the CEO.
To date, more than 31.6 million people have had their protected health information compromised in a HIPAA privacy or security breach -- only in breaches involving 500 individuals or more -- according to data from HHS. 
The Office for Civil Rights, the HHS division responsible for investigating HIPAA breaches, has levied more than $25.1 million in fines against covered entities and business associates found to have violated privacy and security rules.

In-Depth: Consumer health and data privacy issues beyond HIPAA

“The issue of consumer generated health data is one that is near and dear to my heart,” Federal Trade Commission Commissioner Julie Brill told attendees at an event focused on the protection of such health data earlier this month. “…Big picture, consumer generated health information is proliferating, not just on the web but also through connected devices and the internet of things.” As Brill noted, this kind of health data involves “health data flows that are occurring outside of HIPAA and outside of any medical context, and therefore outside of any regulatory regime that focuses specifically on health information.” That’s why it falls to the FTC to oversee privacy-related concerns for consumer generated health data.

“I was at the Consumer Electronics Show in January and was really wowed by much that I saw,” Brill said. “Some of the devices that I saw were particularly focused on health and the quantified life. One in particular that struck me was [Rest Device's] Mimo, a onesie developed to measure the heart beats, respiration rate, and other vital signs of an infant or newborn. It could send information to an app, to the parent’s mobile device. Think about the benefits to any parent who is worried about SIDS or might want to get their baby to sleep better or get themselves to sleep better.”

Brill also noted the rise of step counting devices, the trend of some physicians Googling their patients, and what she called an ongoing ethical debate about physicians “friending” their patients on Facebook. “There is also the now infamous example of companies that are generating their own health data about their customers with respect to their purchases, like Target did with its pregnancy predictor score,” she recalled.

As Nicholas Terry, Professor of Law and Co-Director of the Hall Center for Law and Health, Indiana University Robert H. McKinney School of Law, explained in a must-read commentary that he submitted to the FTC in response to its discussion of privacy for consumer generated health data, none of this health data is protected by HIPAA or HITECH.

“At root such patient curation of health data bespeaks autonomy and is symbolic of patient ownership of the data,” Terry writes. “However, it fails to take into account one practical limitation—the canonical version of the record will remain in the provider’s control—and one legal limitation—that only the provider-curated copy is protected by HIPAA-HITECH. In contrast, the patient-curated ‘copy’ attracts little meaningful privacy protection. Well-meaning privacy advocates should think carefully before promoting this autonomy-friendly ‘control’ model until data protection laws (not to mention patient education as to good data practices) catch up with patient curated data.”

Brill said that legislation like HIPAA and HITECH shows that the US believes health data is sensitive and deserving of special protection, but “then the question becomes, though, if we do have a law that protects health information but only in certain contexts, and then the same type of information or something very close to it is flowing outside of those silos that were created a long time ago, what does that mean? Are we comfortable with it? And should we be breaking down the legal silos to better protect that same health information when it is generated elsewhere?”

During the FTC event, the agency seemed to take pains not to point to specific examples of wrongdoing, but the commissioner and other participants in the forum did raise relevant examples of products for the sake of discussion.

“We recently read about one insurance company, Aetna, that has developed an app for its beneficiaries to use,” Brill said during her opening remarks. “It will allow their users to set goals and track their progress on all sorts of health indicators: weight, exercise, things like that. I think it’s wonderful that Aetna set this up, it’s great, but I don’t know precisely what they are doing with this information. We’ve looked at the terms of service. It just raises interesting questions: To what extent could this information be used for rating purposes? We all know under the FCRA it ought not to be, but what are the rules of the road here?”

Brill also pointed to a company called Blue Chip Marketing, which she said mines social media feeds and other databases to help recruit patients to clinical trials: It doesn’t “work with doctors or hospitals, instead it surfed social media, searched cable TV subscriptions, and got lots of information that allowed it to infer whether consumers were obese, potentially had diabetes, potentially had other conditions, and then offer to them to join a clinical trial. Some consumers would think that’s great – yes, I’d like to be part of a clinical trial, but others were really shocked when they were contacted by the company or others working with them. They asked: what makes you think I’m obese? Or how did you know I was a diabetic? [This raises] really interesting issues.”

In Terry’s commentary to the FTC, he points to a recent paper by McKinsey called The ‘Big Data’ Revolution in Healthcare, which identified four primary data pools driving big data in healthcare: activity/claims and cost data, clinical data, pharmaceutical R&D data, and patient behavior and sentiment data. Terry also described three types of health data that big data companies use to create proxies for HIPAA-protected data: “‘laundered’ HIPAA data, patient-curated information and medically inflected data.”

Terry also rightly recognizes the rise of quantified self tools as an important consideration in the discussion. “A similarly dichotomous result is likely as the medically quantified self develops,” Terry writes. “The quantified-self movement concentrates on personal collection and curation of inputs and performance. Obviously, health, wellness and medically inflected data will likely comprise a large proportion of such data. A similar, if less formal, scenario is emerging around health and wellness apps on smartphones and connected domestic appliances such as scales and blood pressure cuffs. Smartphones are crammed with sensors for location, orientation, sound and pictures that add richness to data collection. And there is ongoing and explosive growth in the medical apps space that seeks to leverage such sensors… These processes will in most cases lead to medically inflected data that exists outside of the HIPAA-HITECH protected zone.”


Class action lawsuit filed against UPMC over data breach

PITTSBURGH —A class action lawsuit filed in federal court targets both the University of Pittsburgh Medical Center (UPMC) and a payroll software company it uses called the Ultimate Software Group, seeking damages and protection for employees affected by a breach of confidential data.

The lawsuit claims the "defendants had a duty to protect the private, highly sensitive, confidential personal and financial information and the tax documents" and "failed to safeguard and prevent vulnerabilities from being taken advantage of."

In April, UPMC Vice President of Privacy and Information Security John Houston told Pittsburgh's Action News 4, "We are going to give our employees the opportunity to sign up for a credit monitoring service. We're underwriting the cost for that for one year."

This lawsuit seeks a court injunction forcing 25 years' worth of identity theft insurance, credit restoration services, and credit and bank monitoring services.

Some UPMC employees interviewed on the streets of the city's Oakland section expressed concern about identity thieves.

"They're going to wait one year, they're going to wait two years, they're going to wait three years, and they could come back. I could be affected by a job I took in college, which is sort of scary," said Allisandra Supinski.

"I feel comfortable with the one year that I have. If I look into it more, I may change my mind," said Amy Hoffman.

"As long as you are with UPMC, they should cover us. As long as we work there for them, we should be able to get protected," said Rodreda Tate.

The employee named as the lead plaintiff, Alice Patrick, works as a dialysis clinician at UPMC McKeesport, according to the lawsuit. Both the attorneys who filed the lawsuit on her behalf and representatives of UPMC declined to be interviewed for this story.


Upcoming Webinar: Hear from Health & Human Services, Avoid the Biggest HIPAA Mistakes | Business Wire

May 14, 2014 09:08 AM Eastern Daylight Time


FairWarning, Inc., the inventor and KLAS Category Leader in Patient Privacy Monitoring1, will host an industry-wide webinar titled “Straight from the Source: HHS Tools for Avoiding Some of the Biggest HIPAA Mistakes” featuring Laura Rosas, Senior Advisor, Office of the Chief Privacy Officer, Health & Human Services. In 2014, covered entities can expect to receive an inquiry letter covering the most frequent problem areas in the HIPAA pilot audits. The Security Risk Assessment is required by both the HIPAA Security Rule and the CMS EHR Incentive Program, also known as Meaningful Use. Health & Human Services released a downloadable Security Risk Assessment Tool this past March to help covered entities evaluate their security position and identify areas needing improvement. The time is now to identify the weakest areas and take action to improve prior to an audit.

During this webinar, Ms. Rosas will walk through this tool to help attendees avoid some of the biggest HIPAA mistakes, including tips and recommendations for getting the most from the self-assessment.


Tuesday, May 20, 2014

2:00 Eastern / 11:00 Pacific

Broadcast via Webex, register at:


About FairWarning, Inc.

FairWarning®’s mission is to lead the industry expansion of trust in Electronic Health Records empowering care providers to grow their reputation for protecting confidentiality, scale their digital health initiatives and comply with complex Federal and state privacy laws such as HIPAA. By partnering with FairWarning, care providers are able to direct their focus on delivering the best patient outcomes possible while receiving expert, sustainable and affordable privacy and compliance solutions. Customers consider FairWarning® privacy auditing solutions essential for compliance with healthcare privacy regulations such as ARRA HITECH privacy and meaningful use criteria, HIPAA, UK and EU Data Protection, California SB 541 and AB 211, Texas HB 300, and Canadian provincial healthcare privacy law. For more information on FairWarning® visit http://www.FairWarning.com or email Soultions@FairWarning.com.

1 2013 Best in KLAS: Software & Services report, January, 2014. © 2014 KLAS Enterprises, LLC. All rights reserved. www.KLASresearch.com


FairWarning, Inc.
Sadie Peterson, 727-576-6700 Ext. 119


Largest-ever HIPAA settlement rings in at $4.8 million, should be a lesson to providers sharing computer networks, feds announce

New York Presbyterian Hospital and Columbia University have entered into the largest-ever government settlement over an electronic data breach, totaling $4.8 million, the Department of Health and Human Services announced Wednesday. 

The breach occurred when a Columbia University physician and computer application developer attempted to deactivate a server he personally owned, which was on a data network shared with New York Presbyterian, according to HHS. The two organizations operate jointly as New York Presbyterian Hospital/Columbia University Medical Center.

Because “technical safeguards” were lacking, deactivating the server allowed personal health information of about 6,800 patients to be accessed through public Internet search engines, HHS explained. The providers reported the breach in 2010, after someone found the personal information of a deceased loved one on the Web.

The settlement should be cautionary for joint healthcare providers that both are covered by Health Insurance Portability and Accountability Act provisions, said Christina Heide, acting deputy director for health information privacy at the HHS Office of Civil Rights.

“When entities participate in joint compliance arrangements, they share the burden of addressing the risks to protected health information,” she said. “Our cases against NYP and CU should remind healthcare organizations of the need to make data security central to how they manage their information systems.”

New York Presbyterian's share of the settlement totaled about $3.3 million, and Columbia's came to $1.4 million. Both have agreed to a “substantive corrective action plan,” including risk analysis and management, HHS noted.

No comment yet.

HIPAA Enforcement: Leadership Changes

HIPAA Enforcement: Leadership Changes | HIPAA Compliance for Medical Practices | Scoop.it

As the Department of Health and Human Services' Office for Civil Rights ramps up its enforcement of HIPAA with costly settlements and a new round of compliance audits, the agency is in a state of leadership transition. Susan McAndrew, a long-time OCR leader in HIPAA enforcement, has retired, and OCR Director Leon Rodriguez may be departing soon.

McAndrew, whose official title was OCR deputy director for health information privacy, but whom some insiders at OCR called "the mother of HIPAA," retired from federal service on May 2. "Sue was instrumental in spearheading the development and implementation of health information privacy policy and enforcement at HHS," an OCR spokeswoman tells Information Security Media Group.

Meanwhile, Rodriguez, who was nominated by President Obama last December to become director of U.S. Citizenship and Immigration Services, an agency of the Department of Homeland Security, is awaiting a full Senate vote to confirm his nomination to that post.

The Senate judiciary committee in March held a hearing on Rodriguez's nomination. On April 3, committee chair Sen. Patrick Leahy, D-Vt., reported the outcome of the hearing to the Senate as "favorable," and the nomination was placed on the Senate Executive Calendar for 2014. But no date for a vote on Rodriguez's nomination has yet been listed on the Senate calendar.

The OCR spokeswoman tells ISMG that there is "no update to share at this time on director Rodriguez' confirmation."

Privacy Leadership

Commenting on the recent retirement of McAndrew, the spokeswoman says: "Her vision and leadership have been essential in moving OCR's work forward to keep pace with the advances of health information technology.

"McAndrew worked each day to move the department in a direction where consumers' right to the privacy of their health information dovetails with common sense standards for the health care industry to follow. She leaves a deep and lasting legacy, and her presence will be greatly missed."

McAndrew could not be reached for comment.

The attorney played a critical role in crafting HIPAA policies and enforcement activities, including the agency's first round of compliance audits that were conducted under the 2012 pilot program.

"Sue has been the guiding force behind the development and implementation of the HIPAA privacy, security and breach notification rules as well as the audit program," says David Holtzman, a former OCR senior adviser who left the agency in December to join security consulting firm CynergisTek. "The [OCR] deputy director plays a significant role in the development of regulatory policy and enforcement strategy."

Filling Positions

Christina Heide, OCR's senior adviser for health information privacy policy, is serving as acting deputy director for OCR's Health Information Privacy Division, the OCR spokeswoman says. Heide will be responsible for leading decision-making on enforcement, policy, and strategy.

Heide, an attorney, has worked with the HIPAA program at HHS since August 1999 and serves as the senior adviser for policy matters.

If Rodriguez is confirmed as director of U.S. Citizenship and Immigration Services, the HHS secretary will appoint a new director of OCR. That means the appointment could be made by Sylvia Mathews Burwell, who has been nominated by Obama to replace Kathleen Sebelius, who resigned last month as HHS secretary. Burwell is slated to face a second round of Senate finance committee confirmation hearings this week.

In the meantime, OCR is also adding to its enforcement staff. Last week, OCR posted notices that it's recruiting for several positions in its regional offices, including Kansas City, Missouri; Boston and Denver.

For example, the Kansas City job's primary duties include, "investigating complaints, conducting compliance reviews, and providing technical assistance and outreach to health and human services institutions, agencies or organizations which are covered entities to ensure compliance with civil rights laws and regulations and with the privacy of protected health information under HIPAA."

No comment yet.

IE flaw ushers risky new era for XP use | Healthcare IT News

IE flaw ushers risky new era for XP use | Healthcare IT News | HIPAA Compliance for Medical Practices | Scoop.it

When even the Department of Homeland Security is warning against using Internet Explorer, it's a safe bet its security flaws are serious. But for many healthcare providers -- notably those still running on Windows XP -- IE's recently-exposed vulnerabilities won't be fixed by Microsoft.

Microsoft confirmed this week that versions 6 through 11 of Internet Explorer "are susceptible to a newly discovered vulnerability, and that cyberattackers have already exploited the flaw."

It pledged to release a fix for the so-called "zero day" threat. But not for computers still running on Windows XP. On April 8, after years of warning, Microsoft stopped delivering technical assistance or software updates for the nearly 13-year-old operating system.

"These include security updates that can help protect your PC from harmful viruses, spyware, and other malicious software, which can steal your personal information," wrote officials at the Redmond, Wash. giant.

Windows XP was first released way back in 2001, but security experts guess that 15 to 25 percent of the world's PCs still run on the system. It's a safe bet that includes an untold number of machines at physician practices and small hospitals nationwide.

This serious security gap for Internet Explorer is just the first of many vulnerabilities that will be left unfixed from here on in for any provider using XP. One tech writer called it "the first sign of the XPocalypse."

Sergio Galindo, general manager of the infrastructure business unit at computer security firm GFI Software, says his company has been working with many small- and medium-sized clients to help them prepare for the end of XP support.

"With 20 percent of our customers still running Windows XP, it still holds a good portion of our attention," he said.

Healthcare organizations are particularly vulnerable.

"For those healthcare providers that fall under HIPAA, having a Windows XP machine as part of your business practice may put your compliance at risk," said Galindo.

Computers running XP will continue to work, of course, "but with greater and greater risk," he said. Still, despite the fact that this wide-open vulnerability "has been widely communicated," there still exists an "'it won't happen to me' syndrome" on the part of many XP users, said Galindo. But now more than ever, he said, "it is highly likely that an unprotected system will be impacted by a virus, worm or malware."

In the short term, there are steps that can be taken to put up at least an adequate defense against the risks posed by Internet Explorer.

David Harley, senior research fellow at IT security company ESET, suggested setting IE's Active Scripting and ActiveX to "prompt." It's "mildly irritating," he admitted, "but seems to reduce the attack surface if you actually disallow it on prompt unless you know you need it."

But "the simplest route is just to set IE security levels to 'high,' or use Enhanced Protected Mode in IE versions that support it," he added. "If you're using XP, you should probably be setting IE security level to 'high' already, as a way of generally decreasing the attack surface on an unsupported OS."

Longer term, however, the fact remains that Windows XP machines are at extremely high risk for hacking and data breaches; whatever the cost of upgrading to a newer operating system, it could be far eclipsed by the price of a HIPAA settlement.

For those practices still running on XP, Galindo suggests incremental steps. First, "make sure that information is archived properly," he said.

Next? Even though Microsoft's current OS is Windows 8.1, Galindo suggests a smaller leap to Windows 7.

"The problem is that Microsoft has moved on to Windows 8, which involves a different interface," he said. "Where possible, 7 is solid and is most like XP. Moving to 8 involves more training and adapting to a new interface -- this will involve some time for users to get used to it. I'm not sure that time is well spent at this point."


No comment yet.

Transcriptionist Breach Affects 15,000

Transcriptionist Breach Affects 15,000 | HIPAA Compliance for Medical Practices | Scoop.it

A breach involving the posting of information about 15,000 Boston Medical Center patients on a transcription firm's unsecured website serves as a reminder of the importance of monitoring the security practices of all business associates.

Boston Medical Center was notified on March 4 by another healthcare provider that MDF Transcription Services and its subcontractors "had incorrectly posted BMC physician office visit notes to the MDF website without password protection," a Boston Medical Center spokeswoman tells Information Security Media Group. "We immediately informed MDF and its subcontractors of this error and the website was removed from the Internet on the same day. We take our responsibility to maintain our patients' privacy very seriously and have notified all individuals who were affected by this vendor error."

As a result of the incident, physician notes "could have potentially been accessed by non-authorized individuals," she says. The information potentially exposed on the site includes names, addresses, medical information and medications. "We have no reason to believe that this led to the misuse or inappropriate accessing of any patient information," she says. "At this time, we have no evidence that any patient information was accessed by anyone other than medical personnel and administrative staff."

A number of Boston Medical Center physicians had used the transcription services company for several years, the spokeswoman says. Physicians routinely record audio notes about patient visits and then have these audio notes transcribed so they can be added to electronic medical records, she explains.

"Several physicians at BMC utilized MDF to transcribe their notes. Once transcribed, these notes were made accessible to physicians by MDF through an online site administered by subcontractors of MDF," she says. "Unfortunately in this instance the information was not password protected by MDF and its subcontractors."

The hospital is working with MDF and its subcontractors to determine the duration of the information exposure, the spokeswoman says.

As a result of the incident, Boston Medical Center has terminated its relationship with MDF. "BMC has rigorous contracting standards in place to protect patient privacy and any organization that works with BMC must be in full compliance with those standards," the spokeswoman says. "Failure to meet those standards in any way will result in immediate termination of the contract."

MDF could not be reached for comment. Boston Medical Center declined to identify the subcontractors involved in the incident.

Business Associate Challenges

Security expert Brian Evans, principal consultant at Tom Walsh Consulting, says that many transcription services firms are aware of HIPAA's requirements but not always effective in carrying them out.

"In working with business associates that include transcription services, I'm finding that they are fully aware of their compliance obligations but lack the funding, staffing and security experience to adequately address them," he says.

"Unfortunately, business associates have not had as much time as covered entities to prepare for and meet their new compliance obligations. As a result, business associates, especially the smaller ones, are woefully behind in meeting their compliance requirements of the HIPAA security and privacy rules which include breach prevention tasks and technologies," he says.

Evans recalls a similar breach involving another transcription service. "I was involved with a data breach incident several years ago where the local transcription services company outsourced work to another company in Tennessee who then outsourced to an individual in India who posted actual patient data on his website," he says.

When covered entities work with transcription services firms, Evans says, they should ask the companies "specifically how they are protecting the confidentiality, integrity and availability of your patient data. I would also ask them to demonstrate their compliance with the HIPAA Security Rule."

Juggling BAs

Many large healthcare organizations, such as Boston Medical Center - a 496-bed academic medical center - might have hundreds of business associates, so managing these vendors can be difficult, Evans acknowledges.

"Despite greater investments in compliance efforts overall, the Boston Medical Center incident suggests that healthcare organizations have made limited progress in identifying or reducing business associate risk," Evans says. "The primary reason behind this is the sheer volume and diversity of business associates for any one organization."

Every business associate poses some form or level of risk, he says. "As a result, business associate risk is higher than most realize because a majority of this risk is not identified or reported. Consequently, potentially serious and costly compliance issues fly under the radar of senior management."

Under the HIPAA Omnibus Rule which went into effect last year, business associates are directly liable for HIPAA compliance. Like covered entities, business associates are subject to OCR enforcement actions, including penalties ranging up to $1.5 million per HIPAA violation.

Tips for Managing BAs

While managing dozens, if not hundreds, of business associates - including transcription services firms - can be a challenge, Evans says covered entities should take several steps to ensure compliance of their vendors.

"Consider taking a tiered approach to assessing and managing business associate risk to allocate your limited resources to the highest exposure areas," he says. "By employing a tiered risk management model, you can direct the most intensive compliance resources to areas of greatest exposure, allowing for broader coverage without increasing the overall resource investment in risk management.

"When business associates handle sensitive or regulated data, it is imperative that some form of written agreement specifies what is expected. But contracts and agreements alone are weak controls unless compliance can be verified."
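As an illustration of the tiered model Evans describes, a covered entity might bucket each business associate by exposure before deciding how deep its review goes. The thresholds, criteria and vendor names in this Python sketch are hypothetical, not part of Evans' guidance:

```python
def assign_tier(phi_records: int, remote_access: bool) -> str:
    """Bucket a business associate by exposure (thresholds are illustrative)."""
    if phi_records >= 100_000:
        return "high"    # on-site assessment plus annual audit evidence
    if phi_records >= 1_000 or remote_access:
        return "medium"  # security questionnaire plus periodic spot checks
    return "low"         # contract terms and an annual attestation

# Hypothetical vendor inventory: (PHI records handled, remote access to systems)
vendors = {
    "transcription-svc": (250_000, True),
    "billing-portal": (40_000, True),
    "shredding-co": (200, False),
}
for name, (records, remote) in sorted(vendors.items()):
    print(name, assign_tier(records, remote))
```

The point of the tiers is resource allocation: the "high" bucket absorbs the intensive oversight, while "low" vendors are covered by contract language alone.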

The most effective way to reduce the rate of compliance failures at business associates is the combined use of risk assessments; contracts/agreements; due diligence; audit tools and other technologies; and careful oversight monitoring, he says. "Direct compliance with all of the safeguards and documentation requirements of the HIPAA Security Rule is your mandate, and your customers, patients and auditors are going to begin asking you to show them, not just tell them, that you are in good standing," he says.

Additionally, Evans suggests covered entities designate a specific individual or team to coordinate the oversight activities for significant business associate relationships, and, as necessary, involve other operational areas, such as audit and information technology, in the monitoring process. "The extent of oversight of a particular business associate will depend on the potential risks and the scope and magnitude of the relationship," he says. Results of oversight activities should be periodically reported to senior management or a designated committee, he advises. "Identified non-compliance issues or weaknesses should be documented and promptly addressed," he adds.

The revelation of the breach at Boston Medical Center comes on the heels of distributed denial-of-service attacks against Boston Children's Hospital (see DDoS Assault on Boston Children's Hospital).

No comment yet.

iOS changes will address HIPAA risk | Healthcare IT News

iOS changes will address HIPAA risk | Healthcare IT News | HIPAA Compliance for Medical Practices | Scoop.it

Imagine if almost everyone walking into your hospital – patients, doctors, visitors, salespeople – was carrying an active homing beacon, which broadcast, unencrypted, their presence and repeatedly updated exact location to anyone who chose to listen.

[See also: Where will HIT security be in 3 years?]

That's where things stand today, courtesy of the mobile MAC address signal (it stands for media access control), a unique ID coming from every smartphone, tablet and wearable device.

But not for long, given upcoming changes to how Apple products will handle MAC address broadcasts –  a move almost certain to be copied by Google's Android.

[See also: 'Troubling disconnect' between mobile security threats and protections in place]

Apple's iOS 8 change, focusing initially on how MAC addressing interacts with Wi-Fi scans, will shift to using "randomly, locally administered" MAC addresses. The result, according to Apple: "The MAC address used for Wi-Fi scans may not always be the device's real – universal – address." (That description appears on page 18 of an Apple iOS 8 PDF.)

As a practical matter, using this kind of a randomized bogus address approach will make tracking people via mobile devices impossible or, at best, impractical, depending on the level of randomization used and how often – if ever – the true MAC address is broadcast.
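Apple has not published its randomization algorithm, but the general technique rests on the "locally administered" bit defined for MAC addresses: setting bit 1 of the first octet marks the address as locally assigned rather than vendor-issued, and clearing bit 0 keeps it unicast. A minimal Python sketch of that idea (illustrative only, not Apple's implementation):

```python
import random

def random_private_mac() -> str:
    """Return a randomized, locally administered, unicast MAC address.

    In the first octet, bit 1 (U/L) = 1 marks the address as locally
    administered, and bit 0 (I/G) = 0 marks it as unicast.
    """
    octets = [random.randint(0, 255) for _ in range(6)]
    octets[0] = (octets[0] | 0b00000010) & 0b11111110
    return ":".join(f"{o:02x}" for o in octets)

print(random_private_mac())  # a different address on every call
```

Because the locally administered bit signals "not a vendor-assigned address," a randomized value never collides with the device's burned-in identity, which is what breaks the link a tracker relies on.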

It will still be months before Apple releases this new version of its mobile OS publicly (it's now solely with developers), weeks and maybe months before most consumers will upgrade and longer still before others – especially Google's Android – mimic the move.

That means that, for now, this security privacy risk is still a very real threat.

The risk is twofold. First, there is the potential for a renegade member of the hospital's staff to track people. Second, there exists the possibility that hospital visitors could wirelessly track other hospital visitors.

With the first scenario, this is not as much of a concern for tracking doctors and other hospital staff, as they could just as easily be tracked the instant they log into the hospital's local area network, so the MAC address broadcast is not necessary. With visiting cyberthieves or stalkers, anyone with a mobile device is a potentially tracked victim.

The security risk is that a specific MAC address would be tracked over time, showing all travel activity within the hospital. Retail offers a great example of the risk: Retailers work with vendors who have contracts with lots of other retailers. This allows those companies to create – and to then sell – detailed reports of every store and mall and parking lot that a MAC address visits. By overlaying it with purchase records, that address can be associated with specific purchases. If those purchases used a payment card or loyalty card, that MAC address can then be associated with a specific person.

No comment yet.

Big Data, My Data - iHealthBeat

Big Data, My Data - iHealthBeat | HIPAA Compliance for Medical Practices | Scoop.it

"The routine operation of modern health care systems produces an abundance of electronically stored data on an ongoing basis," Sebastian Schneeweis writes in a recent New England Journal of Medicine Perspective.

Is this abundance of data a treasure trove for improving patient care and growing knowledge about effective treatments? Is that data trove a Pandora's black box that can be mined by obscure third parties to benefit for-profit companies without rewarding those whose data are said to be the new currency of the economy? That is, patients themselves?

In this emerging world of data analytics in health care, there's Big Data and there's My Data ("small data"). Who most benefits from the use of My Data may not actually be the consumer.

Big focus on Big Data. Several reports published in the first half of 2014 talk about the promise and perils of Big Data in health care. The Federal Trade Commission's study, titled "Data Brokers: A Call for Transparency and Accountability," analyzed the business practices of nine "data brokers," companies that buy and sell consumers' personal information from a broad array of sources. Data brokers sell consumers' information to buyers looking to use those data for marketing, managing financial risk or identifying people. There are health implications in all of these activities, and the use of such data generally is not covered by HIPAA. The report discusses the example of a data segment called "Smoker in Household," which a company selling a new air filter for the home could use to target-market to an individual who might seek such a product. On the downside, without the consumers' knowledge, the information could be used by a financial services company to identify the consumer as a bad health insurance risk.

"Big Data and Privacy: A Technological Perspective," a report from the President's Office of Science and Technology Policy, considers the growth of Big Data's role in helping inform new ways to treat diseases and presents two scenarios of the "near future" of health care. The first, on personalized medicine, recognizes that not all patients are alike or respond identically to treatments. Data collected from a large number of similar patients (such as digital images, genomic information and granular responses to clinical trials) can be mined to develop a treatment with an optimal outcome for the patients. In this case, patients may have provided their data based on the promise of anonymity but would like to be informed if a useful treatment has been found. In the second scenario, detecting symptoms via mobile devices, people wishing to detect early signs of Alzheimer's disease in themselves use a mobile device connecting to a personal coach in the Internet cloud that supports and records activities of daily living: say, gait when walking, notes on conversations and physical navigation instructions. For both of these scenarios, the authors ask, "Can the information about individuals' health be sold, without additional consent, to third parties? What if this is a stated condition of use of the app? Should information go to the individual's personal physicians with their initial consent but not a subsequent confirmation?"

The World Privacy Foundation's report, titled "The Scoring of America: How Secret Consumer Scores Threaten Your Privacy and Your Future," describes the growing market for developing indices on consumer behavior, identifying over a dozen health-related scores. Health scores include the Affordable Care Act Individual Health Risk Score, the FICO Medication Adherence Score, various frailty scores, personal health scores (from WebMD and OneHealth, whose default sharing setting is based on the user's sharing setting with the RunKeeper mobile health app), Medicaid Resource Utilization Group Scores, the SF-36 survey on physical and mental health and complexity scores (such as the Aristotle score for congenital heart surgery). WPF presents a history of consumer scoring beginning with the FICO score for personal creditworthiness and recommends regulatory scrutiny on the new consumer scores for fairness, transparency and accessibility to consumers.

At the same time these three reports went to press, scores of news stories emerged discussing the Big Opportunities Big Data present. The June issue of CFO Magazine published a piece called "Big Data: Where the Money Is." InformationWeek published "Health Care Dives Into Big Data," Motley Fool wrote about "Big Data's Big Future in Health Care" and WIRED called "Cloud Computing, Big Data and Health Care" the "trifecta."

Well-timed on June 5, the Office of the National Coordinator for Health IT's Roadmap for Interoperability was detailed in a white paper, titled "Connecting Health and Care for the Nation: A 10-Year Vision to Achieve an Interoperable Health IT Infrastructure." The document envisions the long view for the U.S. health IT ecosystem enabling people to share and access health information, ensuring quality and safety in care delivery, managing population health, and leveraging Big Data and analytics. Notably, "Building Block #3" in this vision is ensuring privacy and security protections for health information. ONC will "support developers creating health tools for consumers to encourage responsible privacy and security practices and greater transparency about how they use personal health information." Looking forward, ONC notes the need for "scaling trust across communities."

Consumer trust: going, going, gone? In the stakeholder community of U.S. consumers, there is declining trust between people and the companies and government agencies with whom people deal. Only 47% of U.S. adults trust companies with whom they regularly do business to keep their personal information secure, according to a June 6 Gallup poll. Furthermore, 37% of people say this trust has decreased in the past year. Who's most trusted to keep information secure? Banks and credit card companies come in first place, trusted by 39% of people, and health insurance companies come in second, trusted by 26% of people. 

Trust is a basic requirement for health engagement. Health researchers need patients to share personal data to drive insights, knowledge and treatments back to the people who need them. PatientsLikeMe, the online social network, launched the Data for Good project to inspire people to share personal health information imploring people to "Donate your data for You. For Others. For Good." For 10 years, patients have been sharing personal health information on the PatientsLikeMe site, which has developed trusted relationships with more than 250,000 community members.

On the bright side, there is tremendous potential for My Data to join other peoples' data to drive better health for "Me" and for public health. On the darker side, there is also tremendous financial gain to be made by third-party data brokers to sell people's information in an opaque marketplace of which consumers have no knowledge. Individuals have the most to gain from the successful use of Big Data in health. But people also have a great deal to lose if that personal information is used against them unwittingly.

Deven McGraw, a law partner in the health care practice of Manatt, Phelps & Phillips, recently told a bipartisan policy forum on Big Data in health care, "If institutions don't have a way to connect and trust one another with respect to the data that they each have stewardship over, we won't have the environment that we need to improve health and health care." This is also true for individual consumers when it comes to privacy rights over personal health data.

No comment yet.

Will Healthcare Ever Take IT Security Seriously?

Will Healthcare Ever Take IT Security Seriously? | HIPAA Compliance for Medical Practices | Scoop.it

CIO - In the years since the HITECH Act, the number of reported healthcare data breaches has been on the rise - partly because organizations have been required to disclose breaches that, in the past, would have gone unreported and partly because healthcare IT security remains a challenge.

Recent research from Experian suggests that 2014 may be the worst year yet for healthcare data breaches, due in part to the vulnerability of the poorly assembled Healthcare.gov.


Hacks and other acts of thievery get the attention, but the root cause of most healthcare data breaches is carelessness: Lost or stolen hardware that no one bothered to encrypt, protected health information emailed or otherwise exposed on the Internet, paper records left on the subway and so on.

What will it take for healthcare to take data security seriously?

Healthcare IT So Insecure It's 'Alarming'

Part of the problem is that healthcare information security gets no respect; at most healthcare organizations, security programs are immature at best, thanks to scarce funding and expertise. As a result, the majority of reported data breaches are, in fact, avoidable events.

[Related: Healthcare IT Security Is Difficult, But Not Impossible]

The recent SANS Health Care Cyber Threat Report underscores this point all too well. Threat intelligence vendor Norse, through its global network of honeypots and sensors, discovered almost 50,000 unique malicious events between September 2012 and October 2013, according to the SANS Institute, which analyzed Norse's data and released its report on Feb. 19. The vast majority of affected institutions were healthcare providers (72 percent), followed by healthcare business associates (10 percent) and payers (6 percent).

SANS uses the words "alarming" and "troubling" often in its analysis. "The sheer volume of IPs detected in this targeted sample can be extrapolated to assume that there are, in fact, millions of compromised health care organizations, applications, devices and systems sending malicious packets from around the globe," writes Senior SANS Analyst and Healthcare Specialist Barbara Filkins.

[ Tips: How to Prevent (and Respond to) a Healthcare Data Breach ]

More than half of that malevolent traffic came from network-edge devices such as VPNs (a whopping 33 percent), firewalls (16 percent) and routers (7 percent), suggesting "that the security devices and applications themselves were either compromised ... or that these 'protection' systems are not detecting malicious traffic coming from the network endpoints inside the protected perimeter," Filkins writes, noting that many vulnerabilities went unnoticed for months. Connected endpoints such as radiology imaging software and digital video systems also accounted for 17 percent of malicious traffic.

Norse executives say this stems from a disconnect between compliance and regulation. Simply put, says Norse CEO Sam Glines, "There is no best practice applied." Many firewall devices with a public-facing interface, for example, still use the factory username and password. The same is true of many surveillance cameras and network-attached devices such as printers - the default passwords for which can be obtained not through hacking but through a simple Internet search. "It's just not good enough in today's market."
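A first-pass audit for the factory-credential problem Glines describes can be as simple as checking an inventory of devices against known vendor defaults. A hypothetical Python sketch; the credential list and device names below are illustrative, not drawn from any specific vendor:

```python
# Hypothetical sample of factory defaults; a real audit would use the
# published default-credential lists for each vendor and model.
DEFAULT_CREDENTIALS = {
    ("admin", "admin"),
    ("admin", "password"),
    ("root", "root"),
}

def uses_default_credentials(username: str, password: str) -> bool:
    """Flag a device whose login still matches a known factory default."""
    return (username, password) in DEFAULT_CREDENTIALS

# Audit a (hypothetical) inventory of network-edge devices.
inventory = [
    ("firewall-01", "admin", "admin"),
    ("printer-3f", "svc_print", "N0t-a-default!"),
]
flagged = [name for name, user, pw in inventory
           if uses_default_credentials(user, pw)]
print(flagged)
```

This is exactly the check attackers automate with public default-password lists, which is why running it yourself first is cheap insurance.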

The United States would do well to heed the European Union's data breach laws, Glines says, as they take a "categorically different" approach and include specific language about what is and isn't compliant. This could include, for example, specific policies for managing anything connected to an IP address or basic password and access control management measures, he says.

Mobile Health Security Especially Suspect

In the absence of such regulation, though, patient privacy is a myth. Data is shared freely in a hospital setting, for starters, and clinical systems favor functionality over privacy, so much so that privacy and security are often an afterthought in the development lifecycle. This is especially true in the growing mobile health marketplace, which largely places innovation before security.

[Related: Healthcare.gov Still has Major Security Problems, Experts Say]

Harold Smith discovered this all too quickly in December. After Happtique, a mobile health application certification group, released its first list of approved applications, Smith, the CEO of software firm Monkton Health, decided to check out a couple apps.

It wasn't pretty. He installed one on a jailbroken iPhone and, in less than a minute, pulled electronic protected health information (ePHI) from a plain text, unencrypted HTML5 file. He also found that this data - specifically, blood glucose levels - was being sent over HTTP, not HTTPS. "That was the first hint that something was wrong," he says. "That's a pretty big 'Security 101' thing to miss."
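The transport mistake Smith flagged - ePHI sent over HTTP rather than HTTPS - is cheap to guard against in client code by refusing to transmit at all unless the endpoint scheme is HTTPS. A minimal sketch (the endpoint URL is hypothetical):

```python
from urllib.parse import urlparse

def assert_secure_endpoint(url):
    """Refuse to transmit ePHI unless the endpoint uses HTTPS."""
    scheme = urlparse(url).scheme.lower()
    if scheme != "https":
        raise ValueError("refusing to send ePHI over insecure scheme: %r" % scheme)
    return url

assert_secure_endpoint("https://api.example.com/glucose")  # passes through
try:
    assert_secure_endpoint("http://api.example.com/glucose")
except ValueError as exc:
    print(exc)  # refusing to send ePHI over insecure scheme: 'http'
```

A guard like this doesn't replace certificate validation or encryption at rest, but it makes the "Security 101" mistake Smith describes fail loudly in development instead of silently in production.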

A second app, which Smith tested a few days later, also stored ePHI in unencrypted, plain text files. Though this app uses HTTPS, Smith notes in his blog that it sends usernames and passwords in plain text. "That was another big problem," he says.

[Slideshow: 12 Tips to Prevent a Healthcare Data Breach]

Happtique suspended its application certification program in light of Smith's findings, but the application developers themselves (as well as healthcare IT news sites and blogs) glossed over the issue. This irked Smith: "As someone who develops mobile health software, if someone tells me they've found a vulnerability, I get to them right away."

At the very least, Smith says, mobile health applications need a PIN screen and data encryption. The bigger issue, though, is the tendency for developers to treat mobile apps like desktop apps. That doesn't work in the "whole new Wild West" of mobile development, Smith says, where databases go unencrypted and passwords that should be hashed are stored in the clear. Five years after the release of the Apple SDK, he adds, people are still trying to figure it all out.
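Salted hashing is the standard fix for the plain-text credentials Smith found. A sketch using only the Python standard library (the iteration count is illustrative; real apps should follow current OWASP guidance):

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor; tune to current guidance

def hash_password(password, salt=None):
    """Derive a salted PBKDF2 hash; store (salt, digest), never the password itself."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    """Re-derive from the candidate password and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

The app only ever stores the salt and digest; even if its local files are pulled from a jailbroken phone, as Smith did, the passwords themselves are not exposed.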

Is Red Tape to Blame for Poor Healthcare IT Security?

The same is true of healthcare's privacy and security regulations - which, to put it mildly, are conflicting. Sharon Klein, head of the privacy, security and data protection practice at the law firm Pepper Hamilton, notes that, in the United States, there are 47 different sets of (inconsistent) data breach regulations and multiple regulatory frameworks.

[ Study: Healthcare Industry CIOs, CSOs Must Improve Security ]

If there are overarching standards, they come from the National Institute of Standards and Technology, Klein says, noting the Office for Civil Rights and Department of Health and Human Services have "consistently" used NIST standards. At the same time, other agencies are getting involved:

  • The Federal Trade Commission emphasizes privacy by design in the collection, transmission, use, retention and destruction of data;
  • The Food and Drug Administration's guidance on cybersecurity in medical devices and hospital networks pinpoints data confidentiality, integrity and availability; and
  • The Federal Communications Commission, in the wake of weak 802.11 wireless security, has issued disclaimers regarding text messaging and geolocation with implications for clinical communications.

[ Related: Solving Healthcare's Big Data Analytics Security Conundrum ]

Given the regulatory inconsistencies, Klein says it's best to document everything you're doing and conduct vigorous training and awareness programs for all staff. "Minimum necessary" policies, which limit who gets to see which data and, critically, change as an individual employee's role evolves, can eliminate unnecessary security holes, as can the appropriate de-identification of data.
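Klein's "minimum necessary" principle can be expressed directly in code: each role maps to the smallest set of record fields it needs, and everything else is filtered out. A sketch (the roles and field names are invented for illustration, not drawn from any standard):

```python
# Minimal "minimum necessary" policy: each role sees only the fields it needs.
ROLE_FIELDS = {
    "physician":  {"name", "diagnosis", "medications", "lab_results"},
    "billing":    {"name", "insurance_id", "procedure_codes"},
    "front_desk": {"name", "appointment_time"},
}

def minimum_necessary_view(record, role):
    """Return only the record fields the given role is permitted to see."""
    allowed = ROLE_FIELDS.get(role, set())  # unknown roles see nothing
    return {k: v for k, v in record.items() if k in allowed}

record = {"name": "Jane Doe", "diagnosis": "T2DM",
          "insurance_id": "ABC123", "appointment_time": "09:30"}
print(minimum_necessary_view(record, "front_desk"))
# {'name': 'Jane Doe', 'appointment_time': '09:30'}
```

Because the policy lives in one table, updating it when an employee's role changes is a data edit, not a code change - which is what makes the "change as roles evolve" requirement practical.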

Software developers have additional priorities. If anything is regulated, isolate it, Klein says, and make sure you disclose to consumers what data you are obtaining, what you intend to do with it, which third parties will have access to it and whom to contact if there is an issue. Startups want to be first to market, she admits, but in the process - as Smith found - they can put security on the back burner, only to scramble to fill the gaps once vulnerabilities are discovered.

Balancing Healthcare IT Security and Accessibility

Experts largely agree that a cogent approach to health data security must balance security and accessibility, whether it's patients, physicians or third parties who want the data. This is especially important as the healthcare industry emphasizes more widespread health information exchange as part of a larger goal to provide more coordinated care.

"Security has for a long time been an afterthought. Now it has to be part of the build," says Glines - adding that, if it isn't, an app simply shouldn't be released.

Smith suggests that developers and security professionals hack iOS apps, as he did, and see for themselves how easy it is. Then, he says, they should ask, "If it's not that difficult, and [I'm] storing all that data on the phone, what can I do beyond what the OS offers?"

As it turns out, there are a "whole litany of things" that application developers can do, even in an ever-changing field. Specifically, Smith points to viaForensics' secure mobile development best practices, which apply to iOS and Android.

Given the findings of the Norse and SANS Institute study, Glines says it's worth having two conversations. One is with network administrators about the "basic blocking and tackling" work, such as actually changing default device passwords, which can bring about "simple, powerful change." The other is with executive staff about the implications of lax security - a conversation unfortunately made easier in the wake of the Target breach, which it turns out stemmed from systemic failures and not a single point of attack.

Regulators won't cut you any slack if a breach occurs, Klein says, especially if you knew vulnerabilities existed and didn't fix them. Under the new HIPAA Omnibus Rule, which went into effect in September 2013, firms face fines of up to $1.5 million in the event of the "willful neglect" of security issues.

Glines says boardrooms are beginning to shift their security mentality, but this will take time to trickle down. "In the next eight to 12 months, we will continue to see more front-page news" about data breaches, he says.

Time for hard HITECH reboot | Healthcare IT News

So, you dropped a huge chunk of change on a new IT system. Now you are frustrated and have buyer's remorse. The "installation requirements" are very complex. And although you just bought it, the system already needs to be updated. It seems to run only one application – "EHR version 1.0" – but you want to do many other things too. EHR v1 is not very user friendly and sure makes you do your work differently. And there is more. This expensive new system doesn't seem to connect to anything. Sure, there is a basic email application available, but you also want to look for information and get what you need when and where you need it. Isn't that what the Internet enabled years ago?

If this is the metaphorical world of HITECH, here's to giving Karen DeSalvo, the new national coordinator for Health IT, all the support she needs to do a full and hard HITECH reboot. More than $30 billion has been spent. And while it is understandable that many HIT outcomes remain unfulfilled at this stage, the path forward seems murky. EHR adoption has surged, but much of what has been broken about health IT in the United States remains so. And the leverage of the HITECH funds is dwindling fast.

Now there is yet another independent report, this time from the JASON group which, like the report from the President's Council of Advisors on Science and Technology before it, suggests the need for a major architecting effort for health IT nationally. The Government Accountability Office reports that HITECH lacks a strategy, prioritized actions, and milestones. HIT interoperability is recognized as being limited at multiple levels. As a result, the benefits of HIT that depend on a combination of adoption, interoperability, and health information exchange as table stakes remain elusive.

Meanwhile political resistance has reached a fever pitch. Entrenched interests are back on top and aggressively lobbying to advance their respective positions. Very few hospitals and providers have achieved Stage 2 of meaningful use, Stage 3 concepts have received substantial push-back, EHR certification is under fire, and ICD-10 has been pushed back yet another time. Dr. DeSalvo has initiated a reorganization of the HIT Policy and the HIT Standards Committees, but still has her work cut out for her. So, here we offer our unsolicited top 10 list of HITECH reboot priorities:

10) Divide and conquer - Separate healthcare provider adoption angst and frustration from EHR vendor complaints about standards implementation and certification. Although these two things are frequently conflated during political push-back, they should be addressed very differently…

9) Meaningful relief - Providers need to be left alone for a while. They were already under incredible strain from many non-HIT related pressures. HITECH added to these pressures (necessarily) by fostering EHR adoption. But meaningful use added much more strain through sometimes aspirational criteria that demand workflow and process changes well beyond simply adopting an EHR. Give providers meaningful relief from many of these new business requirements. It is not clear that there are incentives to sustain them after HITECH and the infrastructure needs attention before many are viable.

8) Double down on interoperability - Don’t give in, on the other hand, to push back against standards and certification. In fact, the entire standards, implementation guidance, and certification process needs a boost to achieve just a strategic sample of the transactions needed in health. There needs to be a broader, more inclusive, standards process. The ONC Standards & Interoperability Framework has good ideas, but there are many more needs than ONC alone can promote. There are also needs for broader standardization and specification of technologies beyond just data and messages. Constructively re-engage the industry to help make this happen.

7) Certify more, not less – Certify more systems, not fewer. Hang more government incentives on certified HIT. Use CLIA, the DoD EHR investment, TRICARE, all federal contracts and grants – whatever it takes. Standards, interoperability specifications, and certification that includes conformance testing are the recognized path to interoperability. Certification needs to be specific enough for robust conformance testing, and interoperability certification needs to be applied to all HIT participants, not just EHRs. Identifying business outcomes, even when incentives are aligned, will not suffice for interoperability. But by all means make certification as minimally burdensome as possible by focusing only on interoperability and security.

Medical kiosks raise security flags | Healthcare IT News

As the Cleveland Clinic adds its prestigious name to the hospital groups that have embraced next-generation medical kiosks – groups that include Metro Health, Miami Children's Hospital, Kaiser Permanente, Central Ohio Primary Care and Nationwide Children's Hospital – healthcare IT executives are wrestling with the powerful pros and cons of such a move.

[See also: Ohio hospital launches kiosk pilot.]

These kiosks have compartments that software can open to release various medical instruments (blood pressure cuff, thermometer, stethoscope, scale, pulse oximeter, otoscope, dermascope, etc.), which a nurse can help the patient use. The kiosk transmits the data to a cloud controlled by a kiosk vendor (in the case of the Cleveland Clinic, it's a vendor called HealthSpot) and then establishes a live video stream with a physician for a consultation. The kiosks can also accept payment card transactions.

The advantages of such systems are extensive, allowing far more patients to be seen. The kiosks can be placed in malls, houses of worship, schools, community centers and other locations that can be far more convenient for patients, especially in rural areas. The video streaming also offers the potential for efficiency improvements, where a doctor can hop onto the network whenever he/she has a free 10-15 minutes and see a patient.

"A doctor can jump on the laptop and get on the network if needed," said Christopher Soska, the Cleveland Clinic's chief of operations for community hospitals and family health centers. "The convenience for the patients is the biggest factor. Remember that the true value of these units is not within a medical office building. It's being out in the market in retail, in community centers, in churches. These patients may not have the means to get to our care providers."

[See also: Telehealth gives Miami docs global reach.]

That convenience, though, comes with inevitable IT security and privacy risks. How well are those highly-sensitive video streams protected? What about that payment card information? Is that highly profitable data going to attract cyber thieves? All of that data being gathered is being sent to the vendor's cloud. How secure is that cloud? Will cloud neighbors present a threat?

What are the privacy and marketing issues?

HealthSpot CEO Steve Cashman, for example, said his company has yet to decide how it will use – and potentially make money from – that information. That means that the hundreds of patients who have already given information to these kiosks did so with no guarantees about how the data would ultimately be used. Patients are presented with a terms and conditions page, but HealthSpot refused to share its wording, saying the phrasing was proprietary. Yet the terms and conditions page is open to anyone who goes to a mall and walks into one of the kiosks.

How could that ultra-sensitive medical data be used?

"We understand the long-term value of that data," Cashman said. "If we're successful in building this out, we in theory could have the largest amount of data in the entire healthcare (community). We ultimately have to decide who that data is valuable to."

Among the privacy issues is how patients regard the kiosks. When patients see the Cleveland Clinic name, for example, they might apply the trust they have for the Cleveland Clinic brand to the kiosk. And if anything were to happen to the data – regardless of whether it's the fault of Cleveland Clinic – the patient is likely to blame the trusted brand that made them comfortable using the kiosk initially.

Cleveland's Soska avoided discussing data use limitations and security issues, saying some of those details might have been covered in the group's contract with the vendor. It raises the question of how well patients themselves know the limits – and the exposures – before they use the kiosks.

Kathy Jobes is the chief information security officer at the 125-year-old, 12-hospital, $5.6 billion Sentara Healthcare enterprise. Jobes said that her organization is not, at the moment, exploring such advanced healthcare kiosks, but she'd have plenty of security concerns if it were.

These kiosks "are very exciting and they look very sexy, but it all comes with a price," Jobes said. "If it's not carefully thought through, there could be a negative ending for the organization. I can definitely see where it's very exciting, but I would have a lot of questions."

For example, she said, the streaming video component could be problematic from a security perspective. "Streaming video is (security) difficult. It presents a lot of challenges," Jobes said. "It's not just the encryption issues, but the authentication is challenging at best for streaming video."

The authentication risk, Jobes said, is to prevent an identity thief from impersonating a prior user of the kiosk. With different medical personnel looking at the video, there is likely to be no one who would recognize that patient by sight. Not only could such a thief potentially learn intimate medical details about the patient – as the physician reveals details accessed in the patient's file – but it could also be used to steal prescriptions.

Another concern that Jobes had about such data-hungry kiosks is how the data would be destroyed when the relationship ended. Would an agreement allow for some data-retention in case patients access the kiosk through a different hospital group? Is the vendor allowed to use data in aggregate and to be able to sell it? Are there restrictions on whether the data can be placed on mobile devices or thumb drives, if an executive wanted to work on material at home or while on the road? Also, with relentless backups or backups in multiple locations, is it even possible to guarantee that every last bit of data will be destroyed, even if a contract requires that? If it's in the cloud, for example, who knows how many automated backups of that server are being executed?

"I absolutely would require upfront the foundational agreement that very explicitly set forth the scope of how the data could be used and how the data would be destroyed," Jobes said.

The cost of the HealthSpot kiosks to the hospital is about a $15,000 implementation fee, a multiyear support/maintenance agreement for about $1,000/month and a portion of the appointment fee (often around $12 goes to the vendor), Cashman said.  "If somebody is not insured and they just want to pay, we do offer a $59 cash pay option," he said, adding that they are preparing to accept PayPal soon.

Cleveland's Soska said these kinds of next-generation kiosks have strong potential to get more patients seen more quickly and more efficiently. But they're not for all ailments. "This is not for emergent care. Chest pains still need to go to the emergency room," Soska said.

FTC Urges Congress To Better Protect Consumer Health Data

On Tuesday, the Federal Trade Commission released a report recommending that Congress pass legislation to make data broker practices more transparent and give consumers more control over their personal health information, the Los Angeles Times' "Technology Now" reports (Faturechi, "Technology Now," Los Angeles Times, 5/27).

Report Details

The report is based on an FTC investigation launched in 2012 involving nine major data-broker companies that aggregate consumer information from various online and off-line sources with little oversight (Ryan, National Journal, 5/27). Those companies were:

  • Acxiom;
  • CoreLogic;
  • Datalogix;
  • eBureau;
  • ID Analytics;
  • Intelius;
  • PeekYou;
  • Rapleaf; and
  • Recorded Future (Kerr, AP/ABC News, 5/28).

Report Findings

The report found that data brokers are collecting large amounts of personal information, including medical data, about "nearly every U.S. consumer" and are operating with a "fundamental lack of transparency" (National Journal, 5/27).

Specifically, FTC noted that brokers commonly:

  • Fail to verify data;
  • Offer risk-mitigation services that let companies check consumers' information before providing services;
  • Store data indefinitely, which could increase the risk of data theft; and
  • Sort data into potentially sensitive categories such as "cholesterol focus" and "diabetes interest."

In addition, the report found that brokers commonly analyze data to "predict which consumers are likely to use brand-name medicine, order prescriptions by mail, research medications online or respond to pharmaceutical advertisements" (Tahir, "Vital Signs," Modern Healthcare, 5/27).

FTC noted that such practices could increase the risk of data breaches and raise privacy concerns (National Journal, 5/27).


In the report, FTC called on Congress to enact legislation that would:

  • Allow consumers to access and correct their data or opt out of the data collection ("Vital Signs," Modern Healthcare, 5/27);
  • Build a public, central database with details on the data brokers and the information they collect; and
  • Require brokers to disclose the sources and categories of their data (AP/ABC News, 5/27).

The report concluded, "Allowing consumers the ability to exercise control over the use of sensitive information is particularly important," adding, "For categories that some consumers might find sensitive and others may not (e.g., visually impaired, balding, overweight), having access to this data, along with the ability to suppress the use of it for marketing, will improve the transparency of data broker practices and allow consumers to control uses of the data about which they care the most" (Slabodkin, Health Data Management, 5/28).


Sen. Jay Rockefeller (D-W.Va.), who has previously introduced legislation that would increase transparency among data brokers, in a statement said, "With the release of today's report, which is supported by Democratic and Republican FTC commissioners, our conclusion is stronger than ever -- big data practices pose risks of consumer harm, including discrimination based on financial, health and other personal information." He added, "Congress can no longer put off action on this important issue" (National Journal, 5/27).

Omnibus HIPAA in action: The good, bad and scary | Government Health IT

2014 is the first financial year after the HIPAA Final Rule, and healthcare privacy has transformed in ways that are good, bad, and downright scary.

The good: Higher awareness
On the positive side, the total number of data breaches in healthcare has declined slightly, according to the Fourth Annual Benchmark Study on Patient Privacy and Data Security by Ponemon Institute. Clearly, healthcare organizations are aware of the requirements, and their data security budgets and systems are catching up.

On the forensics side, for instance, we've seen a strengthening of networks and locally stored data by hospitals and other healthcare organizations, with a greater use of applications that monitor networks. Lots of data breaches have been avoided.

The bad: Bigger, badder breaches
The news isn’t all rosy, however. Ponemon also found that 90 percent of respondents had at least one data breach over the past two years, while 38 percent have had more than five data breaches in the same period. Clearly, threats to patient data remain high.

More complex information systems and business relationships are leading to larger, more complex breaches. Ironically, the data on the internal systems of HIPAA covered entities is now much better protected, but with so much data in the cloud or shared with business associates, large amounts of information have become less well protected.

There are two big issues with cloud storage. First, organizations and users fail to realize that when they look at their email or share folder, they don't know where that data is actually located. People assume it is well protected, but it may not be. Second is the amount of data kept in the cloud. The low cost of cloud storage means many people use email as their storage system instead of using folders on a local file system. While their computer is in a highly protected work environment, their email is in the cloud, protected by only an email address and password. When the dam breaks, there's a huge amount of data.

Outsourcing to business associates also creates vulnerabilities. Healthcare organizations outsource work, such as claims processing, to cut costs and become more competitive. Unfortunately, those vendors are also competing with each other on costs, leading some to overstate the data security they provide.

The scary: Threats you can’t predict
The unpredictable threats come from the newest developments in the healthcare ecosystem, and from the computing infrastructure itself. In the healthcare market, health information exchanges are one of the big unknowns. In fact, the Ponemon study found that one-third of organizations surveyed have no plans to become a member of an HIE, and 72 percent are not confident or only somewhat confident in the security and privacy of patient data shared on HIEs.

Security problems in the exchanges will arise because multiple levels of outsourcing, contracting, and subcontracting make them so opaque. When security incidents happen, organizations may not know for sure who is responsible for detecting or addressing the breach. And unlike platforms that have been around longer, we have not yet seen all the bugs and vulnerabilities in exchanges.

3 tips to protect health data
Despite unpredictability and greater complexity, organizations can still protect their patients’ privacy and secure their data with some common-sense strategies:

1. Don’t cut security costs. Reputable cloud storage companies have tools available for logging, monitoring, and controlling or restricting data access. But because organizations move to the cloud to save money, they’re not inclined to spend on security add-ons and plugins. People tend to see add-ons as optional, and companies are not buying the bells and whistles when they move to the cloud. Paying for security will pay off in the long run.

2. Don’t assume your security investment will protect you 100 percent. This may seem counter-intuitive to the first step, but it’s impossible to anticipate every threat. Organizations should assume that their security will fail, and go back to basic questions:
• Can we divest ourselves of data?
• Can we decrease our vulnerability?
• Can we take data off line?
• Can we limit the number of people with access?

3. Communicate, communicate, communicate. Organizations need to figure out the lines of communication before an incident happens. The more communication and transparency a covered entity has, both internally and to the regulatory agencies and the public, the better off it will be. Roles and responsibilities need to be clearly defined ahead of time, key decision-makers need to be in the meetings, and all stakeholders need to be in the loop.

And when the worst does happen, organizations need to go the extra mile in making things right for the individuals affected. If they don't, consumers and patients may take their business to other healthcare providers.

New York-Presbyterian And Columbia Hospitals To Pay Record HIPAA Settlement - Food, Drugs, Healthcare, Life Sciences - United States

On May 7, 2014, the US Department of Health and Human Services Office of Civil Rights (OCR) announced settlements with two New York-based hospitals totaling $4.8 million for violations of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy and Security Rules. The settlements related to the hospitals' failure to secure the electronic protected health information (ePHI) of thousands of patients held on their networks and are the latest example of OCR's increased enforcement action.

The two hospitals, New York-Presbyterian Hospital (Presbyterian) and Columbia University (Columbia), which participate in a joint arrangement allowing Columbia faculty members to serve as attending physicians at Presbyterian, were the subject of investigation following their submission of a joint breach report to OCR in September 2010. As part of their joint arrangement, the hospitals operate a shared data network, administered by employees of both entities, which links to Presbyterian patient information systems containing ePHI. The breach occurred when a physician employed by Columbia attempted to deactivate a personal computer server that was on the shared network and contained Presbyterian patient ePHI. The improper deactivation of the server resulted in ePHI being accessible through Internet search engines. Presbyterian and Columbia reported the disclosure of the ePHI of 6,800 individuals, including patient status, vital signs, medications, and laboratory results.

As part of its investigation, OCR also determined that neither hospital had conducted a thorough risk analysis to determine all systems accessing the shared data network and that neither had an adequate risk management plan to address the potential threats to ePHI. Christina Heide, Acting Deputy Director of Health Information Privacy for OCR, noted that entities participating in joint compliance arrangements "share the burden of addressing the risks to protected health information," and that the cases against the hospitals should "remind health care organizations of the need to make data security central to how they manage their information systems."

Presbyterian has paid OCR a settlement of $3.3 million, while Columbia has paid $1.5 million. In addition to the monetary penalties, both hospitals agreed to substantive corrective action plans, which include requirements for the hospitals to undertake a risk analysis, develop a risk management plan, revise policies and procedures, and complete staff training.

OCR's settlements with Presbyterian and Columbia come one week after the agency announced settlements with two health care entities totaling close to $2 million for violations of the Privacy and Security Rules. The two companies, Concentra Health Services and QCA Health Plan, Inc., were the subject of separate OCR investigations initiated following reports of breaches of ePHI by the entities to OCR. Both breaches were the result of the thefts of unencrypted laptops containing ePHI. Concentra agreed to pay OCR $1.725 million and to adopt a corrective action plan to ensure that sufficient protections are put into place to safeguard ePHI. QCA agreed to a fine of $250,000 and to provide OCR with a risk management plan including additional risk-limiting security measures to secure QCA's ePHI.

OCR has substantially increased its HIPAA enforcement efforts in recent years. The Health Information Technology for Economic and Clinical Health Act (HITECH), as implemented by the Omnibus HIPAA Rule issued on January 25, 2013 (available at 78 Fed. Reg. 5566), increased the potential civil monetary penalties that OCR could impose on Covered Entities — health care providers, health plans, and health care clearinghouses — and their Business Associates — entities that create, receive, maintain or transmit Protected Health Information for or on behalf of Covered Entities — for violating HIPAA. The Director of the OCR, Leon Rodriguez, has been quoted as saying the Omnibus Rule strengthened OCR's ability to "vigorously enforce the HIPAA privacy and security protections, regardless of whether the information is being held by a health plan, a health care provider or one of their business associates."

In order to mitigate the risk of a potential breach, it is critical that Covered Entities and their Business Associates conduct a thorough risk analysis and develop risk management plans to address the potential threats and hazards to the security of ePHI. The risk analysis should be reviewed and updated frequently to account for changes in technology and new risks, and risk management plans should be modified accordingly. Covered Entities and their Business Associates should also implement policies and procedures addressing workforce member access to databases and network security, and should ensure that all employees and workforce members with access to ePHI are properly trained on those policies and procedures. As OCR's latest settlements indicate, failure to take these steps can result in severe financial penalties.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.

Is your client’s wellness plan fully HIPAA compliant?

By Melissa A. Winn
May 9, 2014

Employers that offer an outcome-based wellness program are required by federal law to also offer a reasonable alternative standard (RAS), such as an educational class or health program. Advisers and employers need to know that the RAS must continue to be offered annually, even to employees who repeatedly fail to meet the desired health outcome.

Under the Health Insurance Portability and Accountability Act of 1996 (HIPAA), any outcome-based wellness program, meaning one that offers a reward under a group health plan to individuals who attain or maintain a specific health outcome such as not smoking, must also offer an RAS for obtaining the reward. This can include allowing employees to complete a smoking cessation program to earn the reward or avoid a premium surcharge. HIPAA rules also require employers to offer the RAS annually and to allow employees to qualify for the reward through the RAS even when they fail to meet the health outcome, such as quitting smoking.

“Even if a participant continues to fail to meet the desired health outcome … [like] smoking cessation, healthy cholesterol level, healthy BMI … year after year, the participant must be able to continue obtaining the reward, avoiding any surcharge, by completing an appropriate RAS,” say attorneys Amy Ciepluch and Sarah Fowles of the Milwaukee-based law firm Quarles & Brady.

Completion of the program results in receiving the reward or avoiding the premium surcharge, regardless of whether the employee has stopped smoking or achieved a healthier BMI or cholesterol level. And the next year, the employer must offer the employee the same opportunity to complete the program (and possibly fail) to avoid the surcharge, the lawyers say in a blog posted this week on the subject.
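The reward rule described above can be sketched in a few lines: a participant qualifies by either meeting the health outcome or completing the RAS, and the determination is made fresh each plan year. The function and field names below are illustrative only, not from any real benefits system:

```python
# Hedged sketch of the outcome-based reward rule: the reward is earned
# (or the surcharge avoided) by meeting the outcome OR completing the
# reasonable alternative standard (RAS), evaluated each plan year.
# Names are hypothetical examples.

def earns_reward(met_outcome: bool, completed_ras: bool) -> bool:
    return met_outcome or completed_ras

# A smoker who fails the outcome every year but completes the cessation
# program each year still earns the reward every year.
years = [{"met_outcome": False, "completed_ras": True} for _ in range(3)]
results = [earns_reward(**y) for y in years]
print(results)  # [True, True, True]
```

The key point the attorneys make is that the OR branch never expires: an employer cannot cut off the RAS path after repeated failures to meet the outcome.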

Compliance obligations for the RAS do not end there, however. Ciepluch and Fowles add that the RAS must also meet the following HIPAA requirements:

  • If the RAS is completion of an educational program, the employer must make the educational program available to the individual or assist the individual in finding such a program. The program must be free for the individual.
  • The time commitment must be reasonable.
  • If the RAS is a diet program, the employer must pay any membership or participation fee but is not required to pay for the cost of food.
  • If an individual’s personal physician states that a plan standard is not medically appropriate for the individual, the plan must provide an RAS that accommodates the physician’s recommendations.
  • If the RAS is another outcome-based wellness program, it must comply with the outcome-based wellness program rules.
  • The RAS cannot be a requirement to meet a different level of the same standard without providing an individual with additional time to comply with the RAS.

Notice of the availability of an RAS must be provided in all plan materials describing the terms of an outcome-based wellness program, and in any disclosure stating that an individual did not satisfy an initial outcome-based standard.

Increasingly, Ciepluch and Fowles say, they are seeing wellness program designs that offer participants a “menu” of options for obtaining a specific health plan reward or avoiding a surcharge. While some methods in these designs are outcome-based and others are participatory and/or activity-based, giving employees more choice and flexibility, the attorneys caution employers to have the plans reviewed for HIPAA compliance.

Additionally, they say, some wellness programs provide rewards in cash, gift cards, or other tangible goods, and do not connect the rewards with a group health plan, thus avoiding regulation as an outcome-based wellness program.

But, these programs should also be reviewed for compliance with other applicable laws, they add.

Columbia University, NY hospital to pay $4.8 million HIPAA fine

Columbia University and an affiliated health care entity, New York-Presbyterian Hospital (NYP), have reached the largest HIPAA settlement to date, bringing resolution to a breach investigation.

The organizations will pay the Department of Health and Human Services' Office for Civil Rights $4.8 million to avoid being found in violation of Health Insurance Portability and Accountability Act of 1996 (HIPAA) privacy and security rules.

According to an HHS announcement released last week, the organizations have also agreed to implement a corrective action plan, which will include risk analysis, development of a risk management plan, staff training, and updating organizational policies and procedures.

HHS began investigating Columbia and NYP after the entities notified the agency of a breach in September 2010 by filing a joint report. The investigation centered on the exposure of the electronic protected health information (ePHI) of 6,800 people, which included data about patient status, vital signs, medications and laboratory results, the HHS release said.

The organizations are affiliated in that Columbia University faculty members serve as attending physicians at one of NYP's facilities, New York-Presbyterian Hospital/Columbia University Medical Center.

The breach occurred when a Columbia University physician tried to deactivate a computer server, which left the data of NYP patients accessible through a simple online search, HHS revealed.

“The investigation revealed that the breach was caused when a physician employed by CU, who developed applications for both NYP and CU, attempted to deactivate a personally-owned computer server on the network containing NYP patient ePHI,” the release said. “Because of a lack of technical safeguards, deactivation of the server resulted in ePHI being accessible on internet search engines.  The entities learned of the breach after receiving a complaint by an individual who found the ePHI of the individual's deceased partner, a former patient of NYP, on the internet.”

Under the settlement terms, New York-Presbyterian will pay the bulk of the HIPAA fine, $3.3 million, while Columbia agreed to pay the remaining $1.5 million. The deal comes just after Humana subsidiary Concentra agreed to pay $1.7 million to settle with HHS over potential HIPAA violations.

Your EHR Vendor Isn’t Certified – How Should You Approach MU Stage 2? | EMR and HIPAA

A recent study conducted by Wells Fargo Securities stated “Over 700 EHR vendors had solutions certified for Stage 1, but at this point about 40 have been certified for Stage 2. While there is still time, we believe 300-500 vendors will ultimately disappear from the government program.”

We talked about the possibility of many EHR vendors not being 2014 certified in our interview with John Squire. This is a real possibility for many EHR vendors. It will be interesting to see which ones choose not to tell their customers they won’t be ready until it’s too late to switch EHRs. I think that will say something about the company.

Allscripts has put out a whitepaper that looks at some of the Meaningful Use Stage 2 challenges and what you should do to make sure you’re ready. It covers:

  • Where to begin with Meaningful Use Stage 2
  • The new requirements for Stage 2 attestation
  • Technology upgrade and replacement considerations
  • Meaningful Use reporting
  • Transitioning to population health management

I find the idea of using MU Stage 2 as a way to get ready for population health pretty interesting. I know this is a challenge for an organization that’s overwhelmed by the day-to-day demands of working in healthcare.

Considering the abysmal Meaningful Use Stage 2 numbers that were released, it seems that many organizations could benefit from the Meaningful Use Stage 2 help this whitepaper provides. I’d be interested to hear whether people think that MU Stage 2 does help their organization move toward population health management. Is that a reasonable goal you can work on alongside MU Stage 2? It reminds me of those doing CDI (clinical documentation improvement) projects alongside their ICD-10 work.

Columbia Medical Center, Hospital To Pay $4.8M Fine for Data Breach - iHealthBeat

New York-Presbyterian Hospital and Columbia University Medical Center have agreed to pay the HHS Office for Civil Rights a $4.8 million joint settlement over a 2010 data breach, Healthcare IT News reports (McCann, Healthcare IT News, 5/8).

Background on Data Breach

Employees at both organizations manage a shared data network and network firewall, according to an OCR statement. CUMC faculty members serve as attending physicians at New York-Presbyterian (Goedert, Health Data Management, 5/8).

On Sept. 27, 2010, the two entities submitted a joint data breach report after they received a complaint from an individual who found a deceased partner's patient records on the Internet (Conn, Modern Healthcare, 5/7).

Following an investigation, HHS determined that the medical records of about 6,800 of New York-Presbyterian's patients were accessible through online search engines. HHS noted that the hospital was not aware of the breach prior to the complaint (AP/Sacramento Bee, 5/7).

The breach occurred after a physician from CUMC deactivated a server on Presbyterian Hospital's internal data network.

The compromised patient records included:

  • Lab reports;
  • Medications;
  • Patient status; and
  • Vital signs (Health Data Management, 5/8).

Details of Settlement

New York-Presbyterian Hospital has agreed to pay $3.3 million and CUMC has agreed to pay $1.5 million. The joint settlement is the largest HIPAA monetary fine to date, Healthcare IT News reports (Healthcare IT News, 5/7).

According to an HHS statement, each entity also has agreed to develop a "substantive corrective action plan" that includes:

  • Creating a risk management plan;
  • Providing progress reports;
  • Revising policies and procedures;
  • Implementing staff training; and
  • Undertaking a risk analysis (Modern Healthcare, 5/7).

However, the entities did not admit liability in the breach and are not liable for related civil money fines under the settlement, Health Data Management reports. In addition, OCR said the settlements were not a concession by the agency that the entities had violated HIPAA (Health Data Management, 5/8).


Rachel Seeger, OCR's senior health information privacy outreach specialist, said, "The message here is to get your house in order" (Healthcare IT News, 5/8).

Meanwhile, Presbyterian Hospital spokesperson Doug Levy on Wednesday said there was no evidence, either at the time of the data breach or since, that any of the medical records were accessed or used inappropriately.

Levy noted that the hospital is committed to handling patient privacy and medical records with the "greatest respect and integrity" and is taking additional corrective measures as required under its agreement (AP/Sacramento Bee, 5/7).

