The Social Health Authority (SHA) holds data on 27 million patients across 9,500 health facilities, and counting. But from pregnancy history to HIV status, your most private information is only as safe as the weakest hospital server.
When Saumu Juma recently fell ill, she went to the nearest hospital. As a registered member of the Social Health Authority (SHA), she was asked for her national ID and a verification code sent to her phone. Within minutes, her profile was pulled up, and the attendant cleared her to see the doctor.
What Saumu could not see was the digital trail behind that moment. She has been visiting that hospital for years, and her file includes sensitive personal health data collected routinely during care. When told about the risks, she was unbothered. “So, what if they know I have whatever diseases?” she said and shrugged.
But Saumu’s quiet acceptance does not eliminate the risks. Kenya’s digital health systems now hold data on nearly 27 million people. As artificial intelligence becomes more embedded in healthcare and as health data grows in commercial and strategic value, the question of who protects that data, and who is harmed when protections fail, is becoming urgent. For women, the stakes are especially high.
Women interact with the health system more than men, particularly through antenatal care, childbirth, family planning and child health visits. A gendered analysis of Kenya’s health sector by Financial Sector Deepening (FSD) confirms that women have higher rates of outpatient visits and hospital admissions than men.
Every visit leaves a digital record containing sensitive details such as pregnancy history, HIV status, contraception use and fertility decisions. If systems fail, data is misused or access controls are weak, women face real risk. In many communities, exposure of such information can affect relationships, employment and personal safety.
Before, the patient knew only the doctor had that file. Today the doctor keys in information and the patient doesn’t know who else will see it – Dr David Kariuki, CEO, KMPDC
A healthcare system that collects, stores and analyses vast amounts of patient data needs strong safeguards and oversight. Those protections, experts say, still lag behind the pace of digitisation.
Patient information once lived in paper files at a single health facility. Digitisation changed that. Now it moves across departments, facilities and systems.
“Before, the patient knew only the doctor had that file,” said Dr David Kariuki, CEO of the Kenya Medical Practitioners and Dentists Council (KMPDC). “Today the doctor keys in information, and the patient doesn’t know who else will see it.”
Dr Kariuki maintains that the ethical obligation around patient confidentiality remains clear. Long before the Data Protection Act, health laws governed consent and confidentiality. “The information you receive from someone,” he said, “you will never use it in any way that will harm them.”
KMPDC emphasises data minimisation and role-based access. Registration staff should only see basic identifiers. Laboratory staff should only view test requests. Only clinicians directly involved in care should access full records. Systems are also expected to maintain audit trails showing who accessed what information, when, and why.
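The role-based model KMPDC describes can be sketched in code. The following is a minimal illustration only, with hypothetical roles, field names and record structure; it is not SHA's or any facility's actual implementation:

```python
from datetime import datetime, timezone

# Hypothetical role-to-field mapping illustrating data minimisation:
# each role sees only the fields needed for its task.
ROLE_PERMISSIONS = {
    "registration": {"patient_id", "name", "date_of_birth"},
    "laboratory":   {"patient_id", "test_requests"},
    "clinician":    {"patient_id", "name", "date_of_birth",
                     "test_requests", "diagnosis", "medications"},
}

AUDIT_LOG = []  # records who accessed what information, when, and why

def access_record(user, role, record, reason):
    """Return only the fields the role may see, and log the access."""
    allowed = ROLE_PERMISSIONS.get(role, set())
    AUDIT_LOG.append({
        "user": user,
        "role": role,
        "patient_id": record["patient_id"],
        "fields": sorted(allowed & record.keys()),
        "reason": reason,
        "time": datetime.now(timezone.utc).isoformat(),
    })
    # Filter the record down to the permitted fields only.
    return {k: v for k, v in record.items() if k in allowed}

record = {
    "patient_id": "P-001",
    "name": "Jane Doe",
    "date_of_birth": "1990-01-01",
    "test_requests": ["HIV test"],
    "diagnosis": "confidential",
    "medications": ["drug A"],
}

# Registration staff see basic identifiers; the diagnosis stays hidden,
# and the lookup leaves an entry in the audit trail.
view = access_record("staff42", "registration", record, "patient check-in")
```

The point of the sketch is the pairing Dr Kariuki's council expects: the permission check that narrows what each role can view, and the audit entry written on every access so that later review can show who saw what, when, and why.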
But Dr Kariuki acknowledged a critical gap: patients are often unaware of these safeguards. “The biggest challenge is getting all these initiatives known to the patient,” he said.
SHA handles data for nearly 27 million users with more than 9,500 health facilities onboarded
Poor data quality adds another layer of risk. Collette Akwana, a High Court advocate and data privacy and AI professional, says missing medication records or incorrect medical history can compound over time. Yet many patients are never shown what has been captured about them.
“I am yet to find a hospital that turns the screen to the patient and asks, ‘Have I captured this information correctly?’” she said.
At a recent data privacy conference in Mombasa under the theme ‘Trust the Data, Drive the Future’, Geoffrey Kones, Deputy Director of ICT at SHA, said their systems now handle data for nearly 27 million users across formal and informal sectors, with more than 9,500 health facilities onboarded. “This means we have quite a lot [of data] in terms of responsibility,” Kones said.
Compliance, Kones stressed, is non-negotiable. SHA grounds its approach in Article 31 of the Constitution on privacy, the Data Protection Act, access to information laws, and cybersecurity frameworks, he said. SHA has embedded data protection into systems covering registration, claims, payments and contracting, and conducts frequent data protection impact assessments, audits, testing and system upgrades.
“We balance innovation with regulation,” Kones said. “We cannot have technology that doesn’t allow us to comply with the laws of this country.”
Key gaps included absence of clear protocols for reporting data breaches, weak guidance on consent and patient rights
Still, it is difficult for patients to know whether health facilities are actually complying. An analysis by Strathmore University found that national electronic medical record standards, developed since 2010, did not fully incorporate data protection principles. Key gaps included the absence of clear protocols for reporting data breaches, weak guidance on consent and patient rights, limited provisions for privacy by design, and no requirements for data protection impact assessments or the appointment of data protection officers.
Under Kenya’s Data Protection Act and Regulations, organisations that process personal data are required to register as data controllers or processors. While the Office of the Data Protection Commissioner maintains a public register of data controllers and processors, the absence of sector-specific reporting makes it difficult to assess how many of Kenya’s thousands of health facilities are formally registered and compliant. Efforts to get comment from the Office of the Data Protection Commissioner on enforcement and complaints were unsuccessful.
The commercial value of health data sharpens these concerns. Irungu Houghton, Executive Director of Amnesty International Kenya, said that health data gives access to the most private information: “our vulnerabilities, the medicine we use,” while holding significant commercial and strategic worth.
Globally, health datasets are increasingly used to train AI systems, inform pharmaceutical research, shape insurance pricing and drive targeted advertising, often without patients’ knowledge, consent or control. Studies show insurers rely on predictive models built on health data to assess risk and costs, while commercial actors use reproductive and behavioural data to drive targeted advertising. Women’s health data, particularly around fertility, maternal care, chronic illness and lifestyle health services, is especially valuable within a multi-billion-dollar global health market.
Kenya and the US signed a Health Cooperation Framework, whose data-sharing elements triggered public concern
Irungu pointed to international data-sharing arrangements involving Kenya, raising concerns about who ultimately benefits. “Not just governments,” he argued, “but pharmaceutical companies who know how to use the data.”
Last year, Kenya and the US signed a Health Cooperation Framework whose data-sharing elements triggered public concern. The provisions allowing the US access to Kenyan health data have been challenged in court and sparked renewed scrutiny of Kenya’s health data governance. What troubled Irungu most was what was missing: no visible data protection impact assessment, no public consultation, and no clear explanation of what data would be shared, how it would be anonymised, or when it would be deleted.
The Ministry of Health said any data shared would fully comply with the Data Protection Act, the Digital Health Act 2023, and all relevant laws. But Irungu warned that without enforcement, these protections remain theoretical.
Kenya’s digital health infrastructure expanding faster than the public’s understanding of its implications
“Data protection is an ecosystem,” he said. “Clinics, processors, and institutions all have duties of care, but ultimately, the government is the main duty bearer through the Office of the Data Protection Commissioner.”
As AI grows more embedded in healthcare, accountability questions become harder to ignore. When a data breach leads to stigma, when an AI system flags a woman as high-risk, or when sensitive reproductive information is accessed without consent, who answers for the harm? Technology providers, hospitals, regulators and the state all sit within this chain, yet responsibility is often spread too thin.
Kenya’s digital health infrastructure is expanding faster than the public’s understanding of its implications. The information collected today will shape not only medical treatment, but also social outcomes, economic opportunities and personal autonomy for years to come.
For women, whose bodies, reproductive choices and caregiving roles generate some of the most sensitive data in the system, that is not an abstract concern. It is the digital trail behind every hospital visit, every verification code, every record pulled up in minutes. Saumu may not be worried. But the system that holds her data should be.
This article is part of the Gender+AI Reporting Fellowship, with support from the Africa Women’s Journalism Project (AWJP) in partnership with DW Akademie. The journalist used AI tools as research aids to review and summarise relevant policy and research documents and extract key statistics. All analysis, editorial decisions and final wording were done by the reporter, in line with Willow Health’s editorial standards.
This article was first published by Willow Health Media on April 29, 2026.