Showing posts with label health data.

Thursday, November 21, 2019

Consumer DNA Testing May Be the Biggest Health Scam of the Decade; Gizmodo, November 20, 2019

Ed Cara, Gizmodo; Consumer DNA Testing May Be the Biggest Health Scam of the Decade

"This test, as well as many of those offered by the hundreds of big and small DNA testing companies on the market, illustrates the uncertainty of personalized consumer genetics.

The bet that companies like 23andMe are making is that they can untangle this mess and translate their results back to people in a way that won’t cross the line into deceptive marketing while still convincing their customers they truly matter. Other companies have teamed up with outside labs and doctors to look over customers’ genes and have hired genetic counselors to go over their results, which might place them on safer legal and medical ground. But it still raises the question of whether people will benefit from the information they get. And because our knowledge of the relationship between genes and health is constantly changing, it’s very much possible the DNA test you take in 2020 will tell you a totally different story by 2030."

Tuesday, September 17, 2019

Artificial intelligence in medicine raises legal and ethical concerns; The Conversation, September 4, 2019

The Conversation; Artificial intelligence in medicine raises legal and ethical concerns

"The use of artificial intelligence in medicine is generating great excitement and hope for treatment advances.

AI generally refers to computers’ ability to mimic human intelligence and to learn. For example, by using machine learning, scientists are working to develop algorithms that will help them make decisions about cancer treatment. They hope that computers will be able to analyze radiological images and discern which cancerous tumors will respond well to chemotherapy and which will not.

But AI in medicine also raises significant legal and ethical challenges. Several of these are concerns about privacy, discrimination, psychological harm and the physician-patient relationship. In a forthcoming article, I argue that policymakers should establish a number of safeguards around AI, much as they did when genetic testing became commonplace."
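
The excerpt stays high-level, but the workflow it describes is concrete: extract numeric features from radiological images, then train a classifier to predict which tumors will respond to treatment. A minimal sketch of that idea in Python with scikit-learn (the features, data, and model choice here are all hypothetical, not anyone's actual clinical model):

```python
# A toy version of the pipeline the excerpt describes: a classifier
# trained on image-derived tumor features to predict chemotherapy
# response. All data is synthetic and the feature names are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical features extracted from radiological images:
# tumor diameter, mean voxel intensity, texture heterogeneity.
X = rng.normal(size=(500, 3))
# Synthetic labels: did the tumor respond to chemotherapy?
y = (0.8 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500)) > 0

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```

The legal and ethical questions in the article attach exactly here: the training data is patient data, and the model's errors land on patients.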

Sunday, February 17, 2019

With fitness trackers in the workplace, bosses can monitor your every step — and possibly more; The Washington Post, February 16, 2019

Christopher Rowland, The Washington Post; With fitness trackers in the workplace, bosses can monitor your every step — and possibly more



[Kip Currier: This article--and case study about the upsides and downsides of employers' use of personal health data harvested from their employees' wearable devices--is a veritable "ripped from the headlines" gift from the Gods for an Information Ethics professor's discussion questions for students this week!... 
What are the ethics issues? 
Who are the stakeholders? 
What ethical theory/theories would you apply/not apply in your analysis and decision-making?
What are the risks and benefits presented by the issues and the technology? 
What are the potential positive and negative consequences?  
What are the relevant laws and gaps in law?
Would you decide to participate in a health data program, like the one examined in the article? Why or why not?

And for all of us...spread the word that HIPAA does NOT cover personal health information that employees VOLUNTARILY give to employers. It's ultimately your decision what to do, but we all need to be aware of the pertinent facts so we can make the most informed decisions.
See the full article and the excerpt below...]   


"Many consumers are under the mistaken belief that all health data they share is required by law to be kept private under a federal law called HIPAA, the Health Insurance Portability and Accountability Act. The law prohibits doctors, hospitals and insurance companies from disclosing personal health information.


But if an employee voluntarily gives health data to an employer or a company such as Fitbit or Apple — entities that are not covered by HIPPA’s [sic] rules — those restrictions on disclosure don’t apply, said Joe Jerome, a policy lawyer at the Center for Democracy & Technology, a nonprofit in Washington. The center is urging federal policymakers to tighten up the rules.

“There’s gaps everywhere,” Jerome said.

Real-time information from wearable devices is crunched together with information about past doctors’ visits and hospitalizations to get a health snapshot of employees...

Some companies also add information from outside the health system — social predictors of health such as credit scores and whether someone lives alone — to come up with individual risk forecasts."
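
To make "crunched together" concrete, here is a deliberately toy risk-scoring function in Python. Every field, weight, and threshold is invented for illustration; the point is only to show the mechanics of fusing wearable readings with claims history and outside social signals, and why each input is sensitive:

```python
# Hypothetical per-employee risk forecast combining wearable data,
# claims history, and outside "social predictors of health".
# Weights and thresholds are invented for illustration only.
def risk_forecast(employee: dict) -> float:
    score = 0.0
    # Wearable signal: penalize a shortfall below 8,000 daily steps.
    score += 0.02 * max(0, 8000 - employee["avg_daily_steps"]) / 100
    # Claims data: past hospitalizations.
    score += 0.5 * employee["hospitalizations_last_year"]
    # Social predictors, as described in the article.
    score += 0.3 if employee["lives_alone"] else 0.0
    score += 0.4 if employee["credit_score"] < 600 else 0.0
    return round(score, 2)

print(risk_forecast({
    "avg_daily_steps": 4500,
    "hospitalizations_last_year": 1,
    "lives_alone": True,
    "credit_score": 580,
}))  # -> 1.9
```

Note that nothing in this computation is covered by HIPAA once the employee has handed the inputs over voluntarily, which is the gap Jerome describes above.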

Thursday, December 20, 2018

HHS Seeks Feedback Regarding HIPAA Rules; Lexology, December 18, 2018

Thursday, October 25, 2018

Hackers Are Breaking into Medical Databases to Protect Patient Data; The Scientist, October 1, 2018

Catherine Offord, The Scientist; Hackers Are Breaking into Medical Databases to Protect Patient Data

"The first few times Ben Sadeghipour hacked into a computer, it was to access the video games on his older brother’s desktop. “He would usually have a password on his computer, and I would try and guess his password,” Sadeghipour tells The Scientist. Sometimes he’d guess right. Other times, he wouldn’t. “So I got into learning about how to get into computers that were password protected,” he says. “At the time, I had no clue that what I was doing was considered hacking.”

The skills he picked up back then would become unexpectedly useful later in life. Sadeghipour now breaks into other people’s computer systems as a profession. He is one of thousands of so-called ethical hackers working for HackerOne, a company that provides services to institutions and businesses looking to test the security of their systems and identify vulnerabilities before criminals do."

Monday, April 2, 2018

Machine learning as a service: Can privacy be taught?; ZDNet, April 2, 2018

Robin Harris, ZDNet; Machine learning as a service: Can privacy be taught?

"Machine learning is one of the hottest disciplines in computer science today. So hot, in fact, that cloud providers are doing a good and rapidly growing business in machine-learning-as-a-service (MLaaS).

But these services come with a caveat: all the training data must be revealed to the service operator. Even if the service operator does not intentionally access the data, someone with nefarious motives may. Or there may be legal reasons to preserve privacy, such as with health data.

In a recent paper, “Chiron: Privacy-preserving Machine Learning as a Service,” Tyler Hunt of the University of Texas and others present a system that preserves privacy while enabling the use of cloud MLaaS."
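
The caveat is easiest to see in code. In a conventional MLaaS workflow, the client's training data reaches the provider in plaintext, and only policy stops the operator (or an intruder) from reading it. A toy sketch of that baseline (the endpoint and records below are hypothetical, not any real provider's API):

```python
# Toy illustration of the MLaaS privacy caveat: the operator of a
# conventional training endpoint necessarily sees the raw data.
import json

def mlaas_train(records: list) -> dict:
    """Stand-in for a cloud training endpoint (hypothetical)."""
    payload = json.dumps(records)
    print("operator sees:", payload)  # could just as easily be logged or copied
    return {"model_id": "m-123"}      # training itself omitted

# Health records a client might want a model trained on (made up).
patient_records = [
    {"age": 54, "blood_pressure": 141, "diagnosis": "hypertension"},
    {"age": 37, "blood_pressure": 118, "diagnosis": "healthy"},
]
mlaas_train(patient_records)
```

Chiron's answer, per the paper, is to run the training code inside trusted hardware (Intel SGX enclaves), so the provider hosts the computation without ever being able to read the plaintext training data.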

Thursday, October 27, 2016

The FCC just passed sweeping new rules to protect your online privacy; The Washington Post, October 27, 2016

Brian Fung, The Washington Post; The FCC just passed sweeping new rules to protect your online privacy:
"Federal regulators have approved unprecedented new rules to ensure broadband providers do not abuse their customers' app usage and browsing history, mobile location data and other sensitive personal information generated while using the Internet.
The rules, passed Thursday in a 3-to-2 vote by the Federal Communications Commission, require Internet providers, such as Comcast and Verizon, to obtain their customers' explicit consent before using or sharing that behavioral data with third parties, such as marketing firms.
Also covered by that requirement are health data, financial information, Social Security numbers and the content of emails and other digital messages. The measure allows the FCC to impose the opt-in rule on other types of information in the future, but certain types of data, such as a customer's IP address and device identifier, are not subject to the opt-in requirement. The rules also force service providers to tell consumers clearly what data they collect and why, as well as to take steps to notify customers of data breaches."