By Ruoxin Su, 3 November 2025
If you use a period-tracking app, a digital contraceptive, or a pregnancy planner, you are part of the “FemTech” revolution. These female-oriented technologies—a burgeoning market estimated to be worth over $97 billion by 2030—promise to do what health systems have often failed to do: take women’s health seriously. By gathering user-entered data, FemTech promises to correct historical, systemic biases in a medical landscape often criticized as “male-by-default”, empowering users with enhanced control and decision-making over their own bodies.
But what is the cost of this empowerment?
This promise is shadowed by significant legal and ethical concerns. The data-intensive nature of these apps—which often collect intimate details about our menstrual cycles, sexual health, and pregnancies—creates profound risks. These include privacy intrusions, data exploitation, digital surveillance, and the reinforcement of societal inequalities.
Much of the legal discussion around FemTech has focused on data protection, particularly the EU’s General Data Protection Regulation (GDPR). However, a GDPR-focused analysis is not enough. The risks posed by FemTech are not just about data mismanagement; they are potential interferences with rights that are fundamental to human dignity and bodily autonomy. This requires a fundamental rights analysis.
A recent paper by HALL researcher Ruoxin Su, “FemTech Concerns: A Fundamental Rights Perspective through the ECHR”, published in Human Rights Review by Springer Nature, uses the European Convention on Human Rights (ECHR, or “Convention”) and the case law of the Strasbourg Court (ECtHR, or “the Court”) as a lens to critically examine this industry. The central question is: Can the ECHR framework adequately protect women from the novel risks posed by the growing range of FemTech products? The paper finds that while the Convention provides an indispensable normative framework, its structural and interpretive limitations render it insufficient on its own.
1. The Core Risks of FemTech
Before applying the ECHR, the paper first systematically identifies the primary concerns discussed in the existing literature. These include, but go far beyond, simple data privacy concerns:
- Pervasive Privacy Intrusion: The FemTech business model is often built on sharing data. Investigations show many popular apps embed third-party trackers to collect and transmit intimate user data to external entities like Facebook and Google, often for marketing. The US Federal Trade Commission’s 2021 action against the Flo app for breaching its privacy commitments is a prime example.
- Serious Health Risks: What happens when the app is wrong? In 2018, the Natural Cycles app, certified as a contraceptive in the EU, was reported to Swedish authorities after 37 users reported unintended pregnancies. Inaccurate algorithmic predictions can mislead users and have severe consequences for their reproductive lives.
- Threats to Reproductive Freedom: The data collected by FemTech apps can be “weaponized”. We see the rise of “menstrual surveillance”, where employers incentivize staff to use tracking apps, creating risks of pregnancy-related discrimination. In jurisdictions with restrictive laws, this data could even be co-opted as legal evidence in prosecutions of those seeking abortions.
- Reinforcing Discrimination: Far from being empowering, many apps reinforce patriarchal norms, centering motherhood as a woman's primary role. Insurers could misuse this data to inflate premiums for those perceived as infertile. Furthermore, the industry’s feminine-centric design often excludes transgender and non-binary individuals, deepening existing health inequities.
2. Mapping FemTech Harms to the ECHR
This research maps these risks onto the ECHR’s binding guarantees, primarily Articles 8 (right to private life) and 14 (prohibition of discrimination).
Article 8: The Right to Private Life, Data, and Bodily Integrity
When addressing the challenges posed by FemTech, Article 8 is the core provision. The case law of the ECtHR provides a powerful, but incomplete, set of legal tools.
- Data Protection: The Court has long held that information about one’s health is a key element of private life and is highly sensitive. In cases like Z. v. Finland, the Court emphasized that respecting the confidentiality of health data is a “vital principle”. In Y.G. v. Russia, the Court affirmed the state’s positive obligation to protect individuals from privacy breaches, finding that a failure to investigate the sale of a database containing health information was a violation of Article 8. This principle clearly applies when FemTech apps leak or sell sensitive health data.
- Physical and Reproductive Integrity: The notion of “private life” under Article 8 protects the right to self-determination in reproductive matters, including decisions “to become and not to become a parent”. Inaccurate FemTech apps that lead to unwanted pregnancies (like the Natural Cycles cases) directly interfere with this right. The Court’s case law on cyber-surveillance in domestic violence (e.g., Buturugă v. Romania) also provides a parallel for “menstrual surveillance” by partners or employers, recognizing digital abuse as a form of harm the state must investigate.
Article 14: Prohibition of Discrimination
The discriminatory risks of FemTech directly engage Article 14 (read in conjunction with Article 8).
The Court has adopted a critical stance against gender stereotyping. In Carvalho Pinto de Sousa Morais v. Portugal, the Court found that a domestic court’s assumption that sexuality was not important for a 50-year-old mother reflected a traditional (and discriminatory) idea linking female sexuality only to childbearing. This reasoning can be applied directly to FemTech apps that perpetuate patriarchal norms in digital form by framing motherhood as women’s primary role.
Furthermore, the Court held in Jurčić v. Croatia that discrimination based on pregnancy constitutes sex discrimination. This provides a strong basis to challenge the misuse of FemTech data by employers or insurers that results in adverse treatment.
A more detailed analysis of other ECtHR cases can be found in the full text of the paper.
3. The Core Argument: Why the ECHR Is Not Enough
While the ECHR provides the essential language to articulate these harms, the central argument of this paper is that this rights-based approach is insufficient on its own. Its effectiveness is constrained by three core limitations.
The “Margin of Appreciation”
In ethically sensitive areas with no clear European consensus—such as reproductive rights and abortion—the Court frequently grants states a wide “margin of appreciation”. It defers to national authorities, leading to weaker and less harmonized protection. This judicial deference means the ECHR does not currently guarantee a substantive, harmonized right to reproductive healthcare, creating a significant gap that impacts the entire context in which FemTech operates.
The State-Centric Model
The ECHR is designed to regulate the relationship between the State and the individual. An individual harmed by a FemTech app cannot bring a case directly against the company at the ECtHR.
Instead, they must bring a case against their State for failing to provide an adequate regulatory framework or effective remedy (the “positive obligations” doctrine). This is an indirect and painfully slow mechanism, and the development of case law inherently lags behind technological innovation. By the time a case is decided, the technology and business models have already evolved, rendering the judgment less effective.
The Gaps in Parallel Regulations
The ECHR's insufficiency is highlighted by the fact that even more specific EU regulations, like the Medical Devices Regulation (MDR) and the GDPR, are failing to govern the industry.
Many FemTech developers evade the MDR’s strict requirements by classifying their products as “lifestyle” or “wellness” apps, not medical devices, even when they are used for functions like contraception. And as the UK’s Information Commissioner’s Office (ICO) found in a 2023 report, the GDPR’s principles of consent and transparency are systematically undermined in practice, leaving users vulnerable.
4. A Path Forward: The ECHR as a “Normative Anchor”
This paper concludes that the ECHR framework, while insufficient as a standalone solution, is nonetheless the normative anchor for the entire regulatory ecosystem. We cannot and should not abandon the fundamental rights approach. Instead, we must use it to build a multidimensional, whole-system governance model. The principles enshrined in the ECHR—privacy, autonomy, non-discrimination—must not be siloed at the ECtHR. They must be actively used to guide the interpretation and enforcement of all other specific regulations, from the GDPR and MDR to new initiatives like the European Health Data Space (EHDS). Only by ensuring that the enduring rights to privacy, health, and equality infuse every layer of governance—from product design to regulatory oversight—can we hope to close the accountability gaps and ensure the FemTech revolution genuinely serves the health and well-being of all women.
(Image source: Mewburn Ellis)