Corporate Greed Case Study: GoodRx & Its Impact on Public Trust
TL;DR: Digital health platform GoodRx, which millions of Americans trust for prescription discounts, systematically violated its own privacy promises by sharing users’ most sensitive health information with advertising giants like Facebook and Google. GoodRx compiled data on users’ medications for conditions like erectile dysfunction and heart disease, then used that information to target them with ads, turning their health vulnerabilities into a commodity. This was done for years without user knowledge or consent, culminating in a federal lawsuit and a settlement that exposed a corporate culture prioritizing profit over privacy.
This article delves into the legal documents that lay bare the company’s actions. What follows is an illustration of how deregulation and profit-maximization incentives in a neoliberal capitalist system can lead to the systemic erosion of public trust and personal privacy. Read on to understand the full scope of the misconduct and the structural failures that enabled it.
Table of Contents
- Introduction: The Betrayal of Trust
- Inside the Allegations: A System of Deception
- Regulatory Loopholes: A System Designed for Failure
- Profit-Maximization at All Costs: The Neoliberal Mandate
- The Economic Fallout: A Slap on the Wrist
- Public Health Risks: The High Cost of Data Exploitation
- The PR Machine: An Apology After the Fact
- Corporate Accountability Fails the Public
- Pathways for Reform: A Model for a Broken System
- This Is the System Working as Intended
- Conclusion: Profit Over People
- Frivolous or Serious Lawsuit?
Introduction: The Betrayal of Trust
GoodRx built its brand on the promise of making healthcare more affordable and accessible. Since January 2017, some 55.4 million consumers have used its platform to search for discounts on prescription medications, trusting the company with intimate details about their health. GoodRx explicitly promised its users it would never share their personal health information with advertisers or other third parties, a pledge that formed the bedrock of its relationship with the public.
That promise was comprehensively broken.
For years, GoodRx engaged in a calculated betrayal of its users’ trust.
GoodRx systematically funneled sensitive data—including information on prescription drugs for specific health conditions, personal contact information, and unique identifiers—to massive advertising platforms like Facebook, Google, and Criteo. This was not an accident or a data leak in the conventional sense; it was a deliberate business practice designed to monetize the very information users provided in confidence.
The company actively exploited this private, personal data. GoodRx used the information it sent to Facebook to build targeted advertising campaigns. It compiled lists of users who had purchased medications for specific conditions and then targeted those same users with ads on Facebook and Instagram related to their ailments, turning their health struggles into a marketing opportunity.
Inside the Allegations: A System of Deception
The U.S. government’s complaint against GoodRx details a multi-year effort to mislead consumers and exploit their data. The company assured users that any sharing of personal medical data would be rare and only for services they requested, and that third parties would be bound by federal privacy standards. In reality, the company operated a system that made a mockery of these assurances.
GoodRx integrated tracking tools, known as pixels and Software Development Kits (SDKs), from Facebook, Google, and others into its websites and mobile apps.
These tools were configured to collect and transmit vast amounts of user data. GoodRx created custom data events with revealing names like “Drug Name” and “Drug Category,” ensuring that when this information was sent to Facebook, it included the specific medication a user searched for and the health condition it treated.
The data sharing was extensive. It included medication names such as Lipitor, the associated health condition (such as high cholesterol), the quantity and dosage, the user’s location, and even the user’s full name, email address, and phone number.
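To make the mechanism concrete, the sketch below shows roughly what a custom pixel event of this kind can look like in a page’s front-end code. It is an illustration only, not GoodRx’s actual code: the event name, field names, and values are hypothetical stand-ins patterned on the “Drug Name” and “Drug Category” events described in the complaint, and the sketch assumes the standard `fbq` tracking function that Meta’s pixel script defines in the browser once it has loaded.

```typescript
// Illustrative sketch only -- not GoodRx's code. Assumes the Meta (Facebook)
// pixel script has already been loaded on the page, which defines the global
// `fbq` tracking function used below.
declare function fbq(
  command: "trackCustom",
  eventName: string,
  parameters?: Record<string, string>
): void;

// Hypothetical handler for a prescription-discount search. The event name and
// fields mirror the kinds of custom events described in the complaint
// ("Drug Name", "Drug Category"); the values passed in are invented examples.
function reportDrugSearch(drugName: string, drugCategory: string): void {
  // Each call transmits the searched medication and its associated health
  // condition to the advertising platform's servers.
  fbq("trackCustom", "Drug Name", { drug: drugName, category: drugCategory });
}

reportDrugSearch("Lipitor", "High Cholesterol");
```

Because calls like this ride along with the pixel’s cookies and other browser identifiers, the receiving platform can tie the health detail back to a specific, targetable advertising profile.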
GoodRx also shared this information from its telehealth service, HeyDoctor, disclosing details about treatments users sought for conditions ranging from urinary tract infections to high cholesterol.
Timeline of Misconduct
| Date/Period | Event |
| --- | --- |
| October 2017 | GoodRx’s privacy policy promises never to provide advertisers or third parties with information revealing a personal health condition. |
| 2017 – 2020 | GoodRx integrates tracking pixels from Facebook, Google, and Criteo, sharing user prescription data, health conditions, and personal contact information without consent. |
| April 2019 | GoodRx acquires the telehealth service HeyDoctor. |
| April – Sept. 2019 | HeyDoctor’s homepage displays a “HIPAA Secure. Patient Data Protected.” seal, even though GoodRx is not a HIPAA-covered entity. |
| August 2019 | GoodRx runs a targeted ad campaign on Facebook. It uploads lists of users who purchased specific drugs (like Lisinopril and Atorvastatin) and targets them with ads related to those medications. |
| Feb. 25, 2020 | A consumer watchdog publicly reports on GoodRx’s data-sharing practices. Internally, the company’s CTO acknowledges the need to “strengthen our policies and procedures.” |
| Feb. 28, 2020 | GoodRx issues a public statement admitting it was “not living up to our own standards” and promises to stop sharing health information with Facebook. |
| April – Nov. 2020 | Despite its public promise, GoodRx’s Facebook pixel continues to transmit health information for thousands of users. |
| February 2023 | The U.S. government and GoodRx reach a settlement, resulting in a court order and a $1.5 million civil penalty. |
This timeline reveals a pattern of behavior that was not only deceptive but also continued even after the company was publicly exposed. It highlights a corporate structure where privacy was an afterthought, and the systems for data exploitation were deeply embedded in its marketing operations.
Regulatory Loopholes: A System Designed for Failure
The GoodRx case is a textbook example of how corporate innovation outpaces regulation in a neoliberal economy. The legal framework meant to protect health information, primarily the Health Insurance Portability and Accountability Act (HIPAA), was designed for traditional healthcare entities like doctors’ offices and insurance companies. It created a significant gray area for new-era digital health companies like GoodRx.
GoodRx exploited this loophole. It was not a “covered entity” under HIPAA, meaning it was not legally bound by the law’s stringent privacy and security rules. Yet it signaled to users that it was, by displaying a “HIPAA Secure” seal on its HeyDoctor homepage, an act the government deemed false and deceptive. This is a classic tactic of regulatory arbitrage, where a company projects an image of compliance with a respected standard while operating in a space where that standard does not legally apply.
The Federal Trade Commission (FTC) had to rely on its broader authority under the FTC Act, which prohibits unfair and deceptive practices, and the more obscure Health Breach Notification Rule (HBNR). This reactive approach demonstrates a systemic failure of proactive regulation. In a system that prioritizes deregulation and “permissionless innovation,” rules are written only after significant harm has occurred, leaving millions of consumers vulnerable in the interim.
The GoodRx case shows that without updated, explicit rules for the digital health industry, companies will interpret the absence of regulation as a license to operate as they please.
Profit-Maximization at All Costs: The Neoliberal Mandate
At its core, GoodRx’s misconduct was driven by the foundational logic of neoliberal capitalism: the maximization of profit above all other considerations. The company’s primary business model is to earn fees when consumers use its coupons to purchase medications. The alleged data sharing represents a secondary, parasitic revenue stream built on the exploitation of user trust.
The details from the government’s complaint show a corporate culture laser-focused on growth and marketing effectiveness, with privacy as a secondary concern.
GoodRx marketing employees created the custom data-sharing events without any formal review or approval process. The company lacked sufficient formal policies or compliance programs to govern how sensitive health data could be shared until after its practices were exposed. This lack of internal controls is not merely an oversight; it is the predictable outcome of a system that incentivizes growth at any cost.
Furthermore, GoodRx permitted the third parties it shared data with, including Facebook, to use that sensitive health information for their own business purposes, such as research and ad optimization. This demonstrates a complete disregard for the sanctity of its users’ data.
The information was fed into the machinery of Big Tech to be profited from indefinitely. This business practice reflects a worldview where every piece of human interaction is a potential asset to be collected, analyzed, and monetized.
The Economic Fallout: A Slap on the Wrist
The settlement required GoodRx to pay a $1.5 million civil penalty. For a corporation that operates on a national scale and handles data from over 55 million consumers, this figure is widely seen as a minor cost of doing business. It is a financial penalty so small that it fails to serve as a meaningful deterrent to either GoodRx or other companies engaging in similar practices.
This outcome highlights a fundamental weakness in corporate accountability under late-stage capitalism. Fines are often treated as a predictable expense, factored into the cost-benefit analysis of engaging in unethical or illegal behavior. If the profit generated from exploiting user data far exceeds the potential penalty, the practice is seen as a rational business decision. The $1.5 million fine does little to disrupt this calculus.
The true economic harm is borne by consumers. The unauthorized disclosure of an individual’s health conditions can lead to stigma, discrimination, and tangible economic consequences, potentially affecting their ability to secure employment, housing, or insurance. While these harms are difficult to quantify, they are real and lasting. The legal system, in this case, prioritized a quick settlement with a manageable fine over a process that could have led to more substantial justice for the millions whose privacy was violated.
Public Health Risks: The High Cost of Data Exploitation
The actions of GoodRx pose a significant threat to public health that extends far beyond the individuals whose data was shared. The company’s betrayal of trust creates a chilling effect, discouraging people from using digital health tools that could otherwise provide valuable support and access to care. When patients fear that their searches for medications for mental health, substance addiction, or reproductive health will be shared with advertisers, they may choose to forgo seeking information or care altogether.
This erosion of trust pollutes the entire digital health ecosystem. The government’s complaint notes that the unauthorized disclosures revealed extremely intimate details about users, including information related to chronic physical or mental health conditions, life expectancy, disability status, and sexual health. The fear that such information could be exposed without consent undermines the very foundation of patient privacy.
In a healthcare system already fraught with barriers, digital tools offer a promise of convenience and accessibility. However, when companies in this space demonstrate a willingness to exploit patient data for profit, they sabotage that promise. The long-term public health consequence is a populace that is more hesitant, more suspicious, and less likely to engage with technologies that have the potential to improve health outcomes. GoodRx’s actions damaged the fragile trust between patients and the digital platforms they are increasingly encouraged to use.
The PR Machine: An Apology After the Fact
GoodRx’s response to being publicly exposed follows a familiar corporate crisis management playbook. Only after a consumer watchdog published an article in February 2020 detailing its data-sharing practices did the company take public action. Its leadership acknowledged internally that its policies needed to be strengthened, with the Chief Technology Officer admitting, “What we do not have is the data we are sharing by partner along with its business purpose.”
Publicly, GoodRx issued a carefully worded apology, stating it was “not living up to our own standards” and that “for this we are truly sorry.” Such statements, coming only after being caught, are a hallmark of corporate spin. They are designed to quell public anger and create the appearance of accountability without admitting to the full scope of the misconduct or its systemic nature.
The most damning fact is that even after this public mea culpa, the deception continued. The company’s Facebook pixel kept transmitting health information for several thousand users for months, between April and November 2020. This demonstrates that the public apology was more about reputation management than a genuine commitment to reform, a common tactic in a capitalist system where public perception is another asset to be managed for financial gain.
Corporate Accountability Fails the Public
The final settlement in the GoodRx case illustrates the profound limits of corporate accountability in the American legal system. A key provision of the stipulated order is that GoodRx “neither admits nor denies any of the allegations in the Complaint.” This legal maneuver allows the company to avoid ever having to formally take responsibility for the harm it caused, a standard feature of settlements that prioritizes expediency over justice.
No individual executives were held personally liable. The penalty was a civil fine paid by the corporation, allowing the decision-makers who designed and approved these systems to escape consequences. This reinforces a two-tiered justice system where corporate entities can absorb financial penalties as a business expense, while the individuals in charge remain insulated from accountability.
The settlement’s focus is forward-looking, requiring the implementation of a privacy program and future audits. While these measures are necessary, they do little to rectify the past harm. The public is left with a resolution where a company was caught engaging in years of deceptive practices, paid a relatively small fine, admitted no wrongdoing, and saw none of its leaders held responsible. This is not a failure of the system; it is the system functioning exactly as designed to protect corporate interests.
Pathways for Reform: A Model for a Broken System
The injunctive terms of the settlement, while born from corporate misconduct, offer a clear blueprint for meaningful regulatory reform. The court order permanently bans GoodRx from sharing health information with third parties for advertising purposes. This should not be a punishment for one company; it should be the baseline, legally mandated standard for the entire digital health industry.
The order also requires GoodRx to obtain “Affirmative Express Consent” from users before sharing health data for any other purpose. This consent must be clear, unambiguous, and separate from broad terms of service agreements. This reform directly counters the common corporate practice of burying consent in lengthy, unreadable legal documents that no ordinary consumer can be expected to understand.
Finally, the mandate for a comprehensive privacy program, overseen by a designated executive and subject to independent, third-party audits for 20 years, provides a model for genuine corporate governance. These are the types of structural safeguards that prevent privacy from becoming a casualty of profit-seeking. The tragedy is that these common-sense protections were only implemented after a federal lawsuit, rather than being required by law from the outset.
This Is the System Working as Intended
It is tempting to view the GoodRx case as an example of a good company gone bad or a system that failed. This perspective is a dangerous misreading of the situation. The actions of GoodRx are not an aberration from the norms of neoliberal capitalism; they are a direct and predictable product of it.
In an economic system that structurally prioritizes shareholder value and relentless growth, data is a resource to be extracted and exploited. Deregulation creates the space for this exploitation, and weak enforcement ensures the financial penalties for getting caught are merely a cost of doing business. GoodRx did what the system incentivized it to do: it found a legally ambiguous area, monetized an asset it had access to, and prioritized financial gain over its ethical obligations to its users.
The outcome—a settlement with no admission of guilt and a modest fine—is also the system working as intended. It resolves the legal conflict without fundamentally challenging the power of the corporation or the underlying economic logic that produced the harm. This case is not a story of failure, but a clear window into the routine operations of late-stage capitalism, where public trust is just one more externality to be managed on the balance sheet.
Conclusion: Profit Over People
The GoodRx scandal is more than a story about one company’s privacy violations. It is a distressing indictment of a political and economic system that has consistently chosen to protect corporate interests over the well-being and privacy of ordinary people. Millions of Americans, seeking to navigate a complex and costly healthcare system, placed their trust in a company that saw their health conditions not as a vulnerability to be protected, but as a data point to be sold.
The case reveals the urgent need for a fundamental rebalancing of power. It demands robust, proactive regulations for the digital age, not reactive lawsuits that come years after the damage is done. It calls for penalties that serve as a true deterrent, not just a business operating expense, and it requires genuine accountability for the executives who make these decisions.
Until these structural failures are addressed, we can expect to see this story repeat itself. Companies will continue to push the ethical and legal boundaries in their quest for profit, and the public will continue to pay the price. The GoodRx case serves as a powerful reminder that in the absence of strong protections, the logic of profit over people will always prevail.
Frivolous or Serious Lawsuit?
This was an unequivocally serious and legitimate lawsuit. As detailed in the federal government’s comprehensive complaint, the harm flowed from a deliberate, multi-year campaign to deceive consumers and misuse their most private information.
The federal government’s investment of resources and the detailed, 20-year-long compliance requirements imposed by the settlement underscore the severity of the corporate misconduct. This case represented a significant and necessary legal action to address a profound breach of public trust and to establish a crucial precedent for the rapidly growing digital health industry.
There is a press release on the FTC’s website about this from early 2023 if you’re interested in checking it out: https://www.ftc.gov/news-events/news/press-releases/2023/02/ftc-enforcement-action-bar-goodrx-sharing-consumers-sensitive-health-info-advertising
đź’ˇ Explore Corporate Misconduct by Category
Corporations harm people every day — from wage theft to pollution. Learn more by exploring key areas of injustice.
- 💀 Product Safety Violations — When companies risk lives for profit.
- 🌿 Environmental Violations — Pollution, ecological collapse, and unchecked greed.
- 💼 Labor Exploitation — Wage theft, worker abuse, and unsafe conditions.
- 🛡️ Data Breaches & Privacy Abuses — Misuse and mishandling of personal information.
- 💵 Financial Fraud & Corruption — Lies, scams, and executive impunity.
NOTE:
This website is facing massive amounts of headwind trying to procure the lawsuits relating to corporate misconduct. We are being pimp-slapped by a quadruple whammy:
- The Trump regime's reversal of the laws & regulations meant to protect us means victims are no longer filing lawsuits over shit that was previously illegal.
- Donald Trump's defunding of regulatory agencies has caused the frequency of enforcement actions to drop severely. What's more, the quality of the enforcement actions that remain has also plummeted.
- The GOP's insistence on cutting healthcare funding for millions of Americans in order to give their billionaire donors additional tax cuts has recently shut the government down. This government shutdown has further hampered the aforementioned defunded agencies' ability to crack down on evil-doers. Donald Trump has since threatened to make these agency shutdowns permanent on account of them being “democrat agencies”.
- My access to the LexisNexis legal research platform got revoked. This isn't related to Trump or anything, but it still hurts, as I'm now forced to scrounge through public sources to find legal documents. Sadge.
All four of these factors are severely limiting my ability to access stories of corporate misconduct.
Due to this, I have temporarily decreased the number of articles published each day from 5 down to 3, and I will also be publishing articles from previous years, as I was fortunate enough to download a butt load of EPA documents back in 2022 and 2023 to make YouTube videos with... This also means that you'll be seeing many more environmental violation stories going forward :3
Thank you for your attention to this matter,
Aleeia (owner and publisher of www.evilcorporations.com)
Also, can we talk about how ICE has a $170 billion annual budget, while the EPA-- which protects the air we breathe and water we drink-- barely clocks $4 billion? Just something to think about....