
The Indian Legal Landscape: A Safe Haven For FemTech Applications To Exploit User Data and Breach Privacy


Female healthcare technology (colloquially referred to as “FemTech”) is a lucrative industry in the West, and these apps, which claim to “empower” women and give them greater autonomy over their bodies and overall well-being, are increasingly doing just the opposite[i]. Users of these applications often inadvertently trade their privacy for convenience and what they perceive as “professional” medical guidance.

In India, issues of female healthcare and reproduction remain contentious, and applications of this nature are almost unheard of outside the urban population. Nonetheless, this has not stopped a few domestic players[ii] from making a mark in this relatively obscure market. Popular FemTech applications are rapidly gaining traction among urban women in India, and this should worry the regulatory authorities but, most importantly, the women using these applications[iii]. Moreover, the algorithmically derived insights into a woman’s body have largely proven inaccurate, and are rarely based on medical consultation or even built by healthcare professionals[iv]. This article explores the legal lacunae in the regulations governing this domain, and attempts to help women make better choices with these applications, so that they are not coaxed into compromising their privacy or misled about how their personal data is processed by these platforms.

FemTech: Origin and Boom

The word itself was coined in 2013[v] by the founder and CEO of the globally successful menstrual health app Clue[vi]. Since then, tech companies around the world have come up with their own versions of similar applications.

A survey[vii] found this genre of application to be the fourth most popular among adults, and the second most popular among adolescent females, in the “health apps” category. Flo[viii], arguably the most popular women’s health app in the world, ranks at number 46 in the “health & fitness” category and holds a 4.8 rating from 16,33,319 reviews on the Play Store in India. The up-and-coming MedTech sector is expected to make great strides by the year 2025[ix]. In India, where the size of this industry is unknown and its growth erratic, the market is nevertheless estimated to be worth $310 million[x] and to attract investments worth more than $9 billion by 2024[xi].

The Information Technology Act, its Rules, and India’s ambitious National Digital Health Mission

The Information Technology Act, 2000 (the “IT Act”) prescribes, under the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (the “Rules”), a framework for the overall management of sensitive personal data or information (“SPDI”)[xii]. The Rules did indeed kick off the conversation about the urgency of a robust data privacy law to govern SPDI in India. Rule 4 requires a body corporate to disclose its privacy policy and its practices for the collection, reception, possession, storage, and overall management of the data acquired from the data principal. However, the Rules lack the contours, and are too removed from practical application, to deal with mobile applications of this nature or with the regulatory action necessitated by their whimsical policy terms. Under the current legal regime, it is extremely easy for FemTech applications to frame their policy terms in a manner that keeps them in a legal grey area, and easier still to practise and propagate those privacy policies unconditionally in the absence of a regulatory authority specifically designated for the protection of SPDI.

India’s National Digital Health Mission (NDHM)[xiii], a new Government initiative to digitalize the health system, improve medical efficacy, and standardize health infrastructure, offers a promising future for MedTech[xiv]. However, the NDHM is unlikely to have any influence on applications based purely on reproductive health, menstrual tracking, prenatal and postpartum welfare, and sexual wellness. Moreover, the error-ridden insights often found in FemTech apps[xv] stand at odds with the medical rigour the NDHM is intended to offer. Not much can be said about the NDHM’s impact on healthcare apps, but given the general consensus over data misuse and inefficiency on the part of government agencies, people are more likely to trust private actors with their data. In that case, FemTech apps are here to stay.

Where does the Personal Data Protection Bill figure in the debate of FemTech’s data exploitation?

In India, where an effective personal data protection law is all but non-existent and there is no statutory regulatory authority for the protection of SPDI, applications of this nature get off virtually scot-free vis-à-vis data collection and usage.

However, this may change with the introduction of the Personal Data Protection Bill, 2019 (the “Bill”), the need for which was first recognized by the Apex Court in its landmark judgment on privacy[xvi]. In that case, the Supreme Court, besides recognizing the right to privacy as a fundamental right enforceable against the state and its instrumentalities, also stressed the importance of “informational privacy” and the urgency of legislative intervention to enforce the right to privacy against private entities as well[xvii].

Not long after the judgment, the Government of India proposed the aforementioned draft statute, which, if enacted, would abrogate the provision for compensation for failure to protect data under the IT Act[xviii]. The Bill also proposes to introduce India’s very first Data Protection Authority, with the object of preserving “the interests of data principals, prevent any misuse of personal data, ensure compliance with the provisions of this Act, and promote awareness about data protection,” among other responsibilities[xix]. Besides retaining aspects of the Rules’[xx] definition of “personal information”, the Bill’s definition will also cover any inference drawn from the data collected for the purpose of profiling[xxi].

Compared to the Rules[xxii], the Bill also proposes eight new categories of SPDI[xxiii]. Notably, the introduction of “health data”, covering the data principal’s past, present, and future physical and mental state, offers clearer and wider coverage than “medical records and history”[xxiv]. This could put these applications in a tight spot, inevitably making them reconsider their management of users’ health data and the apparent brokerage of aggregated data to unauthorized parties.

Additionally, the Bill also sets the principles based on which personal data can be processed by the data fiduciary, namely[xxv]:

  1. Processing can only happen for a specific, clear, and lawful purpose;
  2. Personal data must be processed fairly and reasonably, in a manner that ensures the privacy of the data principal;
  3. Personal data of the data principal must only be processed in a way the principal would reasonably expect it to be used, having regard to the purpose, context, and circumstances in which such data is collected;
  4. Personal data may be processed only to the extent necessary for the purpose of processing;
  5. A notice must be provided to the data principal at the time of collection of such data;
  6. The fiduciary must ensure the data processed is complete, accurate, not misleading, and updated, having regard to the purpose for which the data is collected in the first place; and
  7. A fiduciary can only store personal data for as long as reasonably required to satisfy the purpose for which it is collected in the first place.

The draft Bill also mandates a comprehensive “privacy by design” policy[xxvi], which many players in this industry claim and boast to have, but very few actually enforce. Adopting both privacy by design and privacy by default would help these applications build a more thorough privacy safety net, creating a win-win scenario for both the users and the creators of such applications. The Digital Information Security in Healthcare Act (DISHA)[xxvii], if combined with the Bill, would bolster health data privacy, anonymity, confidentiality, and privacy accountability. The Bill, if enacted, would indeed offer a promising foundation for a robust policy regime, as well as an authority to uphold accountability, in an era where user privacy has receded and the scope for data exploitation has become boundless.

FemTech applications’ resounding affirmations of women’s well-being ring hollow considering how these applications ultimately commodify deeply vulnerable and sensitive data, especially when users approach them seeking reassurance and medical guidance. Big tech[xxviii], employers, health insurers, and data brokers[xxix] are all cashing in on women’s menstruation, sex lives, conception and childbirth, struggles to conceive or to avoid conceiving, and menopause; you name it, they are likely after it, without you even knowing[xxx]. Companies that trade away their users’ privacy under the guise of promoting corporate wellness and interoperability, by inviting third-party sponsors who could offer better deals or alternatives based on aggregated data, undermine data privacy to an extent that users cannot comprehend and law enforcement cannot determine. With the growing popularity of FemTech and allied applications, the need for a robust personal data protection law is direr than ever. Nonetheless, these applications must not be regulated out of business; the playing field must be levelled without making women trade their privacy for convenience and medical guidance.


Setting aside the technical and regulatory aspects of FemTech applications’ privacy terms, the present policy machinery of these applications enables and reinforces the dated sexist cliché of a woman being dramatic, dysfunctional, and difficult during a certain period of the month. This is discouraging both for female tech entrepreneurs and for users concerned about their employers’ and insurers’ access to their SPDI.

As India rapidly adapts to an age in which everything is digitized[xxxi] and women lead the way in tech at an exceptional rate[xxxii], we need to rethink and rebuild the inherently sexist business models that some tech companies and their privacy terms embody, in order to restore female confidence and develop technology that emancipates women far beyond what the present system allows.

Article Citation:

Bedotroyi Gupta, The Indian Legal Landscape: A Safe Haven For FemTech Applications To Exploit User Data and Breach Privacy, Metacept-Communicating the Law, accessible at


[i] No Body’s Business But Mine: How Menstruation Apps Are Sharing Your Data (Sep. 9, 2019),

[ii] Women’s tracker app Maya Receives Funding From Google’s Rajan Anandan (Oct. 13, 2016),

[iii] Megha Rajagopalan, Period Tracker Apps Used By Millions Of Women Are Sharing Incredibly Sensitive Data With Facebook (Sep. 9, 2019),

[iv] Michelle L. Moglia, WHNP, MS; Henry V. Nguyen, FNP, MS; Kathy Chyjek, MD; Katherine T. Chen, MD, MPH; Paula M. Castaño, MD, MPH, Evaluation of Smartphone Menstrual Cycle Tracking Applications Using an Adapted APPLICATIONS Scoring System (2016), Vol. 127, Issue 6.

[v] Femtech & IP,

[vi] Clue,

[vii] Supra note 4.

[viii] Flo,

[ix] Frost & Sullivan, FemTech – Time for a digital revolution in the women’s health market,

[x] Vikas Bagaria, Has the Femtech Industry Really Matured In The Past Decade? (Dec. 20, 2019), Entrepreneur INDIA,

[xi] Vikas Bagaria, Femtech in 2021: Trends and Opportunities in Women’s Health Technology (Jan. 14, 2021), Entrepreneur INDIA,

[xii] The Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules 2011, u/s 87(2) and r/w § 43A of the Information Technology Act, 2000.


[xiv] How India’s National Digital Health Mission Is Set To Revolutionize Healthcare (Aug. 17, 2020),

[xv] Supra note 4.

[xvi] Justice K.S. Puttaswamy (Retd.) and Anr. v. Union of India and Ors. (2017), W.P. (Civil) No. 494/2012.

[xvii] Id.: “…while legitimate aims of the state, such as the protection of the revenue may intervene to permit a disclosure to the state, the state must take care to ensure that the information is not accessed by a private entity. The decision in Canara Bank has thus important consequences for recognising informational privacy.”

[xviii] The Information Technology Act, 2000, § 43A.

[xix] The Personal Data Protection Bill, 2019, § 49(1), (2) & (3), Ch. IX.

[xx] Supra note 12, Rule 2(i): “‘Personal information’ means any information that relates to a natural person, which, either directly or indirectly, in combination with other information available or likely to be available with a body corporate, is capable of identifying such person.”

[xxi] The Personal Data Protection Bill, 2019, § 3(28).

[xxii] Supra note 12, Rule 3, “Sensitive Personal Data or Information”.

[xxiii] The Personal Data Protection Bill, 2019, § 3(21), (26), (36)(iv), (vii), (viii), (ix), (x), (xi).

[xxiv] Supra note 12, Rule 3(v).

[xxv] The Personal Data Protection Bill, 2019, §§ 4, 5, 6, 7, 8 & 9.

[xxvi] The Personal Data Protection Bill, 2019, § 22(1).


[xxviii] Facebook reportedly gets deeply personal info, such as ovulation times and heart rate, from some apps (Feb. 22, 2019),

[xxix] What Are ‘Data Brokers,’ and Why Are They Scooping Up Information About You? (Mar. 27, 2018),

[xxx] Is your pregnancy app sharing your intimate data with your boss? (Apr. 11, 2019),

[xxxi] Digital India: Technology to transform a connected nation (Mar. 27, 2019),

[xxxii] Katy Ring, Women in Tech: India Leads the way ahead,

