Recent research from Professor Maryam Mehrnezhad at the Information Security Department, Royal Holloway, University of London, and a team of researchers reveals widespread privacy, security and regulatory failings in female-oriented health technologies (also known as FemTech). The researchers’ comprehensive analysis demonstrates how current practices leave sensitive health information vulnerable, while highlighting an urgent need for reform across technical, legal and social dimensions of digital healthcare.
The rapidly expanding field of female-oriented health technologies encompasses a wide range of digital solutions designed to address women’s health needs. Known collectively as “FemTech,” these technologies include everything from fertility tracking apps and period monitoring tools to connected devices for reproductive health monitoring and intimate wellness products. The sector is experiencing explosive growth, with projections suggesting it will reach a market value exceeding $103 billion by 2030.
Three recent studies from Professor Mehrnezhad at Royal Holloway, University of London and her team of collaborators examine different but interconnected aspects of privacy and security concerns in FemTech, revealing troubling patterns in how these technologies handle sensitive personal health information. Together, the research provides a comprehensive picture of the challenges facing this emerging healthcare sector.
The first study, “Caring for Intimate Data in Fertility Technologies,” conducted by Professor Mehrnezhad and Professor Teresa Almeida, focused specifically on privacy practices in fertility tracking apps, or fertility tech, the most popular category of FemTech. The researchers introduced the concept of “differential vulnerabilities” in FemTech, recognizing that different populations face varying degrees and types of privacy and security risk, depending on their exposure and susceptibility to harm and their capacity to respond. In the context of FemTech, these vulnerabilities can manifest in several ways.
The researchers argued that fertility tracking data could, in certain regions, be used to identify and prosecute individuals seeking reproductive healthcare services such as abortion. It could also reveal infertility or pregnancy status to employers or other third parties. This could cause discrimination, social stigma and distress to the user.
To explore this further, these researchers analyzed privacy practices in fertility apps in two main ways. First, they looked at how apps ask users for permission to collect their data. They downloaded 30 of the most popular fertility apps from the Google Play Store and examined what happened when opening each app for the first time.
The results were concerning. Of the 30 apps studied, 12 (40%) showed no privacy-related information at all when first opened, and 13 apps (43%) buried privacy information within sign-up pages or terms and conditions, making it hard for users to understand what they were agreeing to. Only five apps had a dedicated privacy notice screen, but even these were problematic – four of them only gave users an ‘Accept’ option with no way to decline data collection.
Mehrnezhad and Almeida also studied the tracking behavior of these apps using static and dynamic analysis tools. The findings showed that most fertility apps were not following proper privacy practices: many started collecting data before obtaining proper permission, used multiple tracking tools without clear disclosure, and failed to give users real choices about their data privacy. The researchers also discovered that many apps contained multiple trackers – pieces of third-party code that collect information about users.
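The static-analysis side of such an audit can be illustrated with a minimal sketch: scan the strings extracted from a decompiled app for domains belonging to known tracking SDKs. This is not the researchers’ actual tooling, and the tracker list below is a small illustrative sample rather than a real signature database.

```python
# Minimal sketch of the static-analysis idea: look for known tracker
# domains in strings pulled from a decompiled app. The domain list is
# an illustrative sample, not the researchers' signature set.
KNOWN_TRACKER_DOMAINS = {
    "graph.facebook.com": "Facebook SDK",
    "app-measurement.com": "Google Analytics for Firebase",
    "api.mixpanel.com": "Mixpanel",
}

def find_trackers(app_strings):
    """Return sorted tracker names whose domains appear in the app's strings."""
    hits = set()
    for s in app_strings:
        for domain, name in KNOWN_TRACKER_DOMAINS.items():
            if domain in s:
                hits.add(name)
    return sorted(hits)

# Example: strings that might appear in a decompiled fertility app
sample = [
    "https://graph.facebook.com/v12.0/activities",
    "https://api.mixpanel.com/track",
    "https://example-fertility-app.com/api/cycle",  # hypothetical first-party API
]
print(find_trackers(sample))  # ['Facebook SDK', 'Mixpanel']
```

Real audits combine this kind of signature matching with dynamic analysis – observing the network traffic an app actually generates at runtime – which is how the researchers could tell that data collection began before consent.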
The second study, “Bluetooth security analysis of general and intimate health IoT devices and apps: the case of FemTech,” conducted by Professor Mehrnezhad, Stephen Cook, and Dr Ehsan Toreini, examined technical vulnerabilities in FemTech devices. The researchers tested 21 Internet of Things devices, analyzing their security practices when communicating with companion mobile apps. This is the largest study of its kind in the field.
The researchers set up two different ways to test the security of FemTech devices. The first setup used three small computers (BBC micro:bits) to monitor all the information being sent between the FemTech devices and their apps. Think of these computers as security cameras, each watching a different channel that Bluetooth devices use to communicate.
Using this setup, the researchers could tell whether the information being sent was protected. They saved the data and used software tools to look for security weaknesses.
For their second test setup, the researchers created what’s called a “Man-in-the-Middle” environment, where they set up a ‘middleman’ between the FemTech device and its app. This allowed the researchers to see all information passing between them.
Using this setup, the research team tried three main types of attacks. First, connection hijacking, where they took over the connection between a device and its app. Second, denial of service, where they blocked devices from working properly by flooding them with too many connection requests. Third, battery draining, where they prevented devices from going into power-saving mode, causing their batteries to drain quickly.
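The battery-draining attack works because a Bluetooth Low Energy peripheral normally spends most of its time asleep between connection events; an attacker who keeps the link active forces continuous radio use. A back-of-the-envelope model makes the effect concrete – the current figures below are illustrative assumptions, not measurements from the study.

```python
# Back-of-the-envelope model of a battery-draining attack on a BLE
# peripheral. Current and capacity figures are illustrative assumptions,
# not values reported by the researchers.
BATTERY_MAH = 40.0   # small coin-cell-class battery
SLEEP_MA = 0.005     # deep-sleep current
ACTIVE_MA = 8.0      # current with the radio active in a connection

def battery_hours(duty_cycle_active: float) -> float:
    """Battery life in hours for a given fraction of time spent active."""
    avg_ma = duty_cycle_active * ACTIVE_MA + (1 - duty_cycle_active) * SLEEP_MA
    return BATTERY_MAH / avg_ma

normal = battery_hours(0.01)   # device mostly asleep: hundreds of hours
attacked = battery_hours(1.0)  # attacker prevents sleep entirely: ~5 hours
print(f"normal: {normal:.0f} h, under attack: {attacked:.0f} h")
```

Even under these modest assumptions, preventing sleep cuts battery life by roughly two orders of magnitude, which is consistent with the rapid drain the researchers observed.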
These tests revealed serious security problems. For example, during the battery-draining attack, one device’s battery level dropped from 96% to 35% in just one hour. Furthermore, the majority of devices used outdated versions of Bluetooth. More concerning was the finding that 20 out of 21 devices used the “Just Works” pairing method, the least secure option available for Bluetooth connections. This method provides no protection against man-in-the-middle attacks, in which malicious actors can intercept communications between the device and app. The researchers successfully demonstrated several attacks exploiting these vulnerabilities. All of these vulnerabilities were reported to the vendors and, where possible, the researchers contributed to fixing them.
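To illustrate what “Just Works” means at the protocol level, here is a minimal sketch – not the researchers’ tooling – that classifies the likely pairing method from a captured Security Manager Protocol Pairing Request, whose field layout is defined in the Bluetooth Core Specification. The sample bytes are invented for illustration.

```python
# Sketch: classify the likely BLE pairing method from a captured SMP
# Pairing Request PDU (opcode 0x01). Field layout per the Bluetooth
# Core Specification; the sample capture below is invented.
IO_CAPS = {
    0x00: "DisplayOnly", 0x01: "DisplayYesNo", 0x02: "KeyboardOnly",
    0x03: "NoInputNoOutput", 0x04: "KeyboardDisplay",
}

def classify_pairing(pdu: bytes) -> str:
    assert pdu[0] == 0x01, "not an SMP Pairing Request"
    io_cap, oob, auth_req = pdu[1], pdu[2], pdu[3]
    mitm_requested = bool(auth_req & 0x04)  # MITM flag: AuthReq bit 2
    if oob:
        return "Out of Band"
    # With no MITM protection requested, or no usable I/O on the device,
    # pairing falls back to "Just Works": unauthenticated and open to
    # man-in-the-middle interception.
    if not mitm_requested or IO_CAPS[io_cap] == "NoInputNoOutput":
        return "Just Works"
    return "Passkey Entry / Numeric Comparison (depends on both devices' I/O)"

# Illustrative capture: NoInputNoOutput device, no OOB data, MITM flag unset
sample_pdu = bytes([0x01, 0x03, 0x00, 0x01, 0x10, 0x00, 0x00])
print(classify_pairing(sample_pdu))  # Just Works
```

Many FemTech devices have no screen or buttons (an I/O capability of NoInputNoOutput), which is one structural reason so many of them end up on the weakest pairing method.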
The third study, “Mind the FemTech gap: regulation failings and exploitative systems,” conducted by Professor Mehrnezhad, Dr Thyla Van Der Merwe, and Professor Michael Catt, examined how current data protection regulations fail to adequately protect FemTech users. The researchers analyzed the General Data Protection Regulation (GDPR), the Swiss Federal Act on Data Protection, and the UK and EU Medical Device regulations, finding significant gaps in how these frameworks address the unique nature of intimate health information.
The study identified several distinct categories of data collected by FemTech applications, including information about users’ names and demographics, contact details, lifestyle factors, menstrual cycles, pregnancy status, nursing activities, reproductive organs, sexual activities, medical information, physical symptoms, and emotional states. Beyond user data, these applications often collect device/phone data, such as storage, contacts, camera/microphone access, location, and various sensors. Furthermore, they collect information about others, including data about babies and children, social media profiles, and partner information. The researchers found that this complex mix of personal and health data creates unique challenges for privacy protection.
The researchers’ analysis of privacy practices in 10 FemTech products revealed widespread GDPR violations. These devices included a smart breast pump, IoT fertility trackers, pelvic floor exercisers and sex toys, period and menopause management tech, general female health devices, and even a connected water bottle. Their examination found that only one app presented valid consent mechanisms meeting GDPR standards. The majority of apps either bundled privacy notices with terms of service, presented users with “tracking walls” that require them to accept tracking before accessing the service, or provided no privacy-related content at all.
Similar to the other studies, this study also found that these apps had multiple trackers, with their websites often tracking users before any engagement with cookie notices. In addition, their privacy-content analysis showed that many apps did not include references to or descriptions of FemTech-related data in their privacy policies. The researchers also identified a critical disconnect between GDPR and medical device regulations: although a higher level of safeguarding is expected for products that record personal health data – even when an app falls outside current medical device definitions – this protection is not properly enforced in practice.
The research highlights how the intersection of health and medical solutions and user data creates particularly complex privacy challenges in FemTech. The researchers advocate for the development of domain-specific and sectorial regulations that specifically address FemTech data, with particular attention to the needs of marginalized user groups. They suggest that protecting citizens’ security, privacy, and safety in this space requires both better enforcement of existing regulations and the creation of new frameworks that explicitly acknowledge and accommodate the unique risks of these technologies.
A significant issue identified across all three studies was the miscategorization of FemTech products in app stores. Many apps containing medical records or other highly sensitive health information were categorized as “Health & Fitness” rather than “Medical,” potentially allowing them to avoid stricter oversight. This categorization issue reflects broader challenges in regulating technologies that blur the lines between wellness tools and medical devices.
The researchers provided specific recommendations for addressing these issues. Professor Mehrnezhad and her colleagues suggest that developers be more upfront about the data they collect. They should be careful not to miscategorize the technology to bypass privacy guidelines, be clear about its usage and purposes, and state how data is shared with third parties. The researchers also noted the importance of ensuring that modern privacy principles, such as privacy by design and minimal data collection, are taken into account. Professor Mehrnezhad highlights that the security, privacy, and safety issues of FemTech put not only women and girls at risk, but the entire population, as these products deal with health and medical aspects of our lives such as family planning.
The researchers conclude that addressing these challenges requires a coordinated approach. Without significant reform, users of FemTech products remain vulnerable to privacy breaches and potential misuse of their intimate health information.