Published April 11, 2025
As health care systems increasingly rely on artificial intelligence and vast digital records, the potential for unintended harm grows alongside the promise of improved care. A new concept, “iatrogenic data trauma,” seeks to define and address that harm. It describes the distress and damage patients may experience when the collection, storage or use of their personal data leads to depersonalization, privacy loss or discrimination.
In a concept analysis published in CIN: Computers, Informatics, Nursing, nursing PhD student Erica Smith, MBA, RN, CHDA, and Clinical Professor Darryl Somayaji, PhD, RN, CNS, CCRC, used Walker and Avant’s framework to explore this emerging phenomenon through a review of academic and digital literature from multiple disciplines. They synthesized current thinking from fields like data justice, digital rights and health informatics to define iatrogenic data trauma as a sense of powerlessness and depersonalization when individuals become aware that their health data – often collected with good intentions – has caused or could cause harm.
“This is especially relevant in today's data-driven care landscape because health care systems are experiencing exponential growth in utilizing artificial intelligence to improve health outcomes,” Somayaji explains. “While this is a promising approach to health care, significant risks of unintended harm may occur to patients if data is mishandled or misinterpreted.”
This trauma can result from several factors. Some stem from how health care organizations gather social determinants of health data – such as housing, income or personal history – that, while essential for identifying disparities, can also be stigmatizing or sensitive. When patients don’t know how their data will be used, or when they discover it was shared without their consent, the result may be broken trust, reluctance to seek care or feelings of exclusion, according to the researchers.
“In our rush to digitize health care, we’ve overlooked how collecting sensitive patient data without trauma-informed approaches can cause real harm,” Smith says. “Health care professionals and researchers must recognize that every data point represents a human story, and our methods of data collection and use can either honor that humanity or inadvertently become a source of distress and distrust.”
Model cases described in the analysis paint a clearer picture: a grieving widow is startled by questions about her marital and financial status; an older man receives mental health outreach he never consented to, based on a single survey answer; a pregnant woman is denied benefits with no explanation. These scenarios illustrate the unintended consequences that can arise when well-meaning data collection efforts are not implemented with care, transparency or sensitivity.
In their analysis, Smith and Somayaji describe five defining themes of iatrogenic data trauma.
While the concept is still emerging, the researchers say, the need for action is clear. They advocate for trauma-informed approaches to data practices, or “universal precautions,” that prioritize transparency, consent and person-centered care. They also call for the development of tools to measure the prevalence and impact of data trauma and for deeper exploration of patients’ lived experiences through qualitative research.
As data becomes even more deeply embedded in health care delivery, health care professionals must weigh both its potential to improve patient care and its capacity to cause harm. This work challenges health care leaders, educators and informatics professionals to expand their ethical lens and ensure that technology enhances care without compromising dignity, trust or equity.
By SARAH GOLDTHRITE