The Black Box That Decides Who Gets a Home
February 2026 | Oliver v. Navy Federal Credit Union, No. 24-1656 (4th Cir. 2026)
The Institution Behind the Denials
Navy Federal Credit Union is not a distant Wall Street bank. It was built specifically for the people who serve in the United States military and their families. It now counts 13 million members and offers mortgage products including conventional purchase loans, rate-and-term refinances, cash-out refinances, VA-backed mortgages, and home equity lines of credit. It is, by any measure, the largest credit union in the country.
Credit unions are supposed to be different from commercial banks. They are member-owned. Their profits stay inside the organization rather than flowing to shareholders. The entire premise is that a credit union serves its community rather than extracting from it. Navy Federal’s community is people who took an oath to serve the country. Many of those people are Black and Latino veterans.
The case now working its way through federal court asks a straightforward question: Did Navy Federal build a system that systematically directed those same members toward denial while directing White members toward approval? The answer, according to the complaint filed in the Eastern District of Virginia, is yes. The mechanism, according to the complaint, is a proprietary algorithm that the company keeps secret.
The Algorithm Nobody Outside Navy Federal Can See
Every mortgage applicant at Navy Federal starts by filling out a Uniform Residential Loan Application, the same standardized form used across most of the mortgage industry. That form collects name, Social Security number, citizenship status, property address, employment history, income, assets, liabilities, military service history, and more. This information, along with other undisclosed inputs, feeds into Navy Federal’s underwriting system.
The complaint describes the process as “semi-automated.” Human loan officers play a role, but a proprietary algorithm drives a critical portion of the decision. Navy Federal has not disclosed what that algorithm actually measures, how it weights different factors, or at what point human discretion enters the process. The complaint argues that some of the data fields collected by the application form “can be proxies for race”: geography, employment type, asset types, and relationship history with the institution can all correlate with race in ways that embed historical discrimination into a formula that appears neutral on its surface.
This is the core of the legal theory. The plaintiffs are not alleging that Navy Federal executives sat in a room and said, “Deny the Black applicants.” They are alleging something more structurally dangerous: that Navy Federal built and continues to run a system that produces racially discriminatory outcomes while hiding the mechanism inside a proprietary black box that nobody outside the company is allowed to inspect.
The NCUA, the federal regulator that oversees credit unions, analyzed mortgage lending data across the credit union industry in 2020 and 2021. Its economists found that “the estimated denial likelihood was higher for minority applicants by about 1.2 to 1.9 times the rate of White borrowers.” The NCUA issued its own caveat: because the public mortgage dataset excludes credit scores, work history, and bankruptcy history, the disparity alone cannot prove discrimination. But the disparity is real and documented at the federal level.
CNN’s 2023 investigation went further. Reporters specifically analyzed Navy Federal’s data rather than the industry overall. They found the same gap, concentrated in one institution serving 13 million military families. CNN acknowledged the same limitation: the public dataset available to outside researchers is stripped of key credit-risk variables that Navy Federal itself holds and will not share. The company possesses the data that could exonerate or condemn its algorithm. It has chosen to keep that data private.
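The scale of that gap can be made concrete with a back-of-the-envelope calculation. The rates below are illustrative stand-ins for the figures quoted in the court record ("more than 75%" approval for White applicants, "less than 50%" for Black applicants); the exact rates are not public.

```python
# Illustrative approval rates, standing in for the figures quoted in the
# court record for 2022 conventional purchase mortgages at Navy Federal.
white_approval = 0.75   # "more than 75%" of White borrowers approved
black_approval = 0.50   # "less than 50%" of Black borrowers approved

# Regulators typically compare DENIAL rates, as the NCUA analysis did.
white_denial = 1 - white_approval   # 0.25
black_denial = 1 - black_approval   # 0.50

# Disparity ratio: how many times more likely a Black applicant is to be
# denied than a White applicant, under these illustrative rates.
disparity = black_denial / white_denial
print(f"Denial disparity ratio: {disparity:.1f}x")   # → 2.0x
```

Even taking the boundary values at face value, the ratio lands at 2.0, consistent with the "nearly double" characterization later in this article and at or above the top of the NCUA's industry-wide 1.2–1.9 band.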
The Non-Financial Ledger
The nine named plaintiffs in this case are not statistics. They are real people who went through the process of trying to buy a home, submitted the required paperwork, waited, and received a denial or an offer with worse terms than similarly situated White applicants received. Each of them came to Navy Federal as a member. The institution was built to serve them.
Consider what a mortgage denial costs somebody who is not wealthy. It does not just mean renting instead of owning. It means not building equity while property values increase. It means paying someone else’s mortgage through rent. It means not having a stable asset to borrow against in an emergency, to pass to children, or to sell at retirement. The racial wealth gap in the United States is not abstract. It is constructed, one denied application at a time, across decades of policies and practices that prevented Black and Latino families from accumulating the wealth that homeownership generates.
The named plaintiffs had varying financial profiles. One earned several hundred thousand dollars per year and still faced adverse terms. Another listed $80,000 in income. Six of the nine who were denied subsequently obtained mortgages from other lenders, demonstrating that the denial was specific to Navy Federal’s process, not to the applicants’ creditworthiness in the broader market. Navy Federal’s algorithm said no. The market said yes.
One named plaintiff, Constantina Batchelor, was not denied outright. She was approved, but at a higher interest rate than similarly situated White applicants received. Over the life of a 30-year mortgage, that is not a minor inconvenience: a single percentage point on a $300,000 loan adds up to tens of thousands of dollars in additional interest. That is money extracted from a minority borrower’s household that goes directly to the institution and ultimately functions as a penalty for race.
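That "tens of thousands of dollars" figure follows directly from the standard fixed-rate amortization formula. The rates below are hypothetical, chosen one percentage point apart for illustration; the court record does not disclose the actual rates involved.

```python
def monthly_payment(principal: float, annual_rate: float, years: int = 30) -> float:
    """Standard fixed-rate mortgage payment: P * r(1+r)^n / ((1+r)^n - 1)."""
    r = annual_rate / 12      # monthly interest rate
    n = years * 12            # total number of payments
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

# Hypothetical rates one percentage point apart; the actual rates in the
# case are not public.
loan = 300_000
base = monthly_payment(loan, 0.065)    # roughly $1,896/month at 6.5%
bumped = monthly_payment(loan, 0.075)  # roughly $2,098/month at 7.5%

# Extra paid over the full 360-payment term: roughly $72,000.
extra_cost = (bumped - base) * 360
```

Even under conservative assumptions, a one-point rate difference on a typical loan compounds into a five-figure penalty over the loan's term.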
The proposed class extends well beyond these nine individuals. It encompasses all minority mortgage applicants to Navy Federal from 2018 to the present whose applications were denied, approved at worse terms, or processed more slowly than comparable White applications. The population is enormous. Navy Federal is the country’s largest credit union. The harm, if the algorithm operates as alleged, has been distributed across thousands or tens of thousands of households that were trying to build wealth and were instead turned away or charged extra for the privilege of borrowing.
None of this has been adjudicated. Navy Federal denies the allegations. But the court record makes one thing clear: the company has not offered any public explanation of why its approval rates diverge so dramatically by race. The algorithm remains secret. The weights remain undisclosed. The data that could resolve the question sits inside Navy Federal’s servers, unavailable to the families who may have been harmed by whatever the algorithm actually does.
Legal Receipts
The following passages are drawn directly from the court record in Oliver v. Navy Federal Credit Union, No. 24-1656 (4th Cir. Feb. 9, 2026). These are the court’s characterizations of the complaint’s allegations and the district court’s reasoning.
Navy Federal uses a “semi-automated underwriting process” for all loan applicants, which results in discrimination against “African Americans, Latinos, Native Americans, and other racial minorities.” That process involves collecting certain forms of data from every applicant, some of which “can be proxies for race.” Navy Federal then “runs the data . . . through its proprietary underwriting algorithm.” What “variables [are] used” by that algorithm, “and the weight those variables are given, is entirely up to Navy Federal,” and “Navy Federal maintains secrecy” over what those variables and weights are.
Fourth Circuit Majority Opinion, characterizing the complaint — Oliver v. Navy Federal, No. 24-1656 (4th Cir. 2026), at 4

“More than 75% of the White borrowers who applied for a new conventional home purchase mortgage in 2022” were approved by Navy Federal, but “less than 50% of Black borrowers” were approved.

Fourth Circuit, citing CNN Report — Oliver v. Navy Federal, No. 24-1656 (4th Cir. 2026), at 22 (Richardson, J., concurring in part and dissenting in part)

The complaint alleges that Navy Federal “requires every applicant to fill out” a single form that collects various categories of information that “can be proxies for race.” The complaint further alleges that Navy Federal “runs the data from the application . . . through its proprietary underwriting algorithm”—singular—”to determine a person’s creditworthiness” and decide “whether to lend to a particular borrower and on what terms.” Finally, the complaint alleges that this “at-least semi-automated underwriting process”—again, singular—generates “a uniquely discriminatory result.”

Fourth Circuit Majority Opinion — Oliver v. Navy Federal, No. 24-1656 (4th Cir. 2026), at 16–17

Does Navy Federal use a single algorithm as part of its process for evaluating every loan applicant regardless of the underlying product? If so, does that algorithm—as opposed to some other variable(s)—produce the disparate impacts based on race that are alleged in the complaint? If so, is Navy Federal’s use of that algorithm justified by some interest that would defeat all the applicants’ claims?

Fourth Circuit Majority, identifying the common questions at stake — Oliver v. Navy Federal, No. 24-1656 (4th Cir. 2026), at 17–18

“Of course, discovery in this case might show that the [complaint’s] allegations . . . are false” and that Navy Federal employs a process that is neither uniform nor results in disparate impacts based on a loan applicant’s race. But . . . that is not the question at this stage.

Fourth Circuit Majority Opinion — Oliver v. Navy Federal, No. 24-1656 (4th Cir. 2026), at 18–19

The district court stated “there [were] too many moving parts” and cited “apples, oranges, grapefruits and bananas, I mean, you’ve got so many different categories of applicants.”

Fourth Circuit, summarizing the lower court’s oral ruling — Oliver v. Navy Federal, No. 24-1656 (4th Cir. 2026), at 14–15

The majority of the Fourth Circuit disagreed with the district court’s dismissal of the injunctive class. The panel ruled that plaintiffs had stated enough, on the face of the complaint, to make a prima facie showing that a common question exists: does Navy Federal’s single algorithm, applied to all applicants, produce the racial disparity the data shows? That question cannot be answered without discovery. The district court killed the case before the plaintiffs could gather a single document from Navy Federal to find out.
Societal Impact Mapping
Environmental Degradation
The environmental dimension of algorithmic mortgage discrimination is indirect but real. When Black and Latino families are systematically denied homeownership in particular areas, the resulting concentration of renters rather than owners in those neighborhoods correlates with lower political power, less civic investment, and a reduced ability to fight industrial siting decisions. Research on environmental justice consistently finds that communities of color are more likely to be located near toxic facilities, in flood plains, and in areas with degraded air quality. Homeownership is a mechanism for community stability and political voice. Denial of homeownership across a class of military families compounds existing environmental inequalities by keeping targeted communities in lower-leverage positions when fighting polluters, highway expansions, or industrial development.
Public Health
Housing stability is one of the strongest predictors of health outcomes. Families who are denied mortgages and remain renters face greater housing instability, more frequent moves, and higher rates of overcrowding. Each of these factors correlates with worse physical and mental health outcomes: higher rates of stress-related illness, more limited access to primary care tied to stable addresses, worse outcomes for children in school, and greater susceptibility to respiratory illness tied to inadequate housing conditions. A Navy Federal mortgage denial is not just a financial loss. It is a health event whose consequences extend across the lifetime of the applicant and, through housing instability, across the childhoods of their children.
Economic Inequality
The racial wealth gap between Black and White households in the United States is structural, durable, and directly tied to homeownership patterns. Black households have been excluded from wealth-building through homeownership by explicit redlining, by discriminatory lending, by racially restrictive covenants, and by other government and private policies across the twentieth century. Those exclusions compound across generations because wealth builds on itself. A family that could not buy a home in 1960 could not build equity. A family that cannot build equity cannot help their children with down payments. A family whose children cannot receive intergenerational housing wealth has to start from zero in a market where everyone else has a head start.
If Navy Federal’s algorithm is producing what the complaint alleges, it is functioning as a private continuation of that historical pattern: a system that takes the inputs of racial geography, employment type, and institutional relationship history and produces denial at a rate nearly double that seen for White applicants, while keeping the mechanism secret from regulators, researchers, and the applicants themselves.
The 13 million members of Navy Federal include an enormous portion of the active-duty military and veteran population, a group that is disproportionately Black and Latino compared to the general population. Systematic denial of homeownership to those members is not an abstraction. It is a transfer of wealth: from military families who served the country to an institution that charges them more or denies them entirely while approving their White counterparts at a dramatically higher rate.
What Now?
The Fourth Circuit’s February 9, 2026 ruling returns the injunctive class action claims to the district court for further proceedings. The majority held that the district court acted prematurely in killing the Rule 23(b)(2) class before any discovery had occurred. The damages class under Rule 23(b)(3) remains dismissed, at least at this stage, because the individual differences among class members were too apparent on the face of the complaint to support a damages class without further factual development.
What happens next depends on discovery. The single most important question is whether Navy Federal will be compelled to open its algorithm to scrutiny. If the court allows discovery to proceed on the injunctive class, plaintiffs’ lawyers will be able to demand: the algorithm’s source code and documentation; the variables and their weights; internal communications about algorithm design and testing; demographic outcome data broken down by product, geography, and applicant characteristics; and any internal audits or equity reviews the company may have conducted.
- Consumer Financial Protection Bureau (CFPB): Primary federal enforcer of the Equal Credit Opportunity Act and the Fair Housing Act’s lending provisions. The CFPB has authority to examine Navy Federal’s lending data, subpoena internal documents, and pursue civil enforcement actions. Contact the CFPB’s Office of Fair Lending and Equal Opportunity.
- National Credit Union Administration (NCUA): The federal regulator of credit unions, including Navy Federal. The NCUA already published a 2022 report documenting racial disparities across the credit union industry. Advocates can petition the NCUA to conduct a targeted examination of Navy Federal’s underwriting system and to require algorithmic audits as a condition of operation.
- Department of Justice Civil Rights Division: Has authority to pursue pattern-or-practice investigations of discriminatory lending under the Fair Housing Act. A DOJ investigation would give federal investigators access to Navy Federal’s internal data without waiting for civil discovery.
- Department of Housing and Urban Development (HUD): Enforces the Fair Housing Act independently. HUD can accept complaints from individual applicants who believe they were discriminated against and can refer cases to DOJ for prosecution.
- State Attorneys General (California and Florida): The complaint also alleges violations of California and Florida state civil rights statutes. State attorneys general in both states have independent authority to investigate and prosecute financial discrimination under state law, with potentially broader consumer protection tools than federal agencies.
Corporate Leadership and Board Accountability
The court record does not name individual Navy Federal executives responsible for the algorithm’s design or the company’s decision to keep it secret. The company is led by its executive management and governed by a board of directors. Any accountability campaign should target those decision-makers directly: the CEO, the Chief Risk Officer, the Chief Technology Officer, and the board members who have authorized and overseen the underwriting system while maintaining secrecy about its function.
- File a complaint: If you or someone you know was denied a mortgage by Navy Federal or received worse terms than expected, file a complaint with the CFPB at consumerfinance.gov/complaint, with HUD at hud.gov/program_offices/fair_housing_equal_opp/online-complaint, and with your state attorney general’s office. Every documented complaint builds the evidentiary record for enforcement agencies and the plaintiffs’ legal team.
- Support the plaintiffs: The legal team includes attorneys at DiCello Levitt LLP, Tycko & Zavareei LLP, Ben Crump Law, and Milberg Coleman Bryson Phillips Grossman. Follow the case docket at PACER (Case No. 1:23-cv-01731, E.D. Va.) to track developments.
- Demand algorithmic audits: Contact your congressional representatives and senators and demand that they push the NCUA and CFPB to require algorithmic transparency and regular third-party bias audits for all mortgage underwriting systems used by federally regulated financial institutions. Several states have already passed algorithmic accountability laws. Federal legislation remains absent.
- Know your rights: If you are a Navy Federal member and have had a mortgage application denied, request the specific reasons for denial in writing. You are entitled to this under the Equal Credit Opportunity Act. Collect and preserve all communications with the institution. Contact a fair housing attorney in your state.
- Support fair housing organizations: Organizations including the National Fair Housing Alliance (nationalfairhousing.org) and local fair housing councils in your area conduct testing, advocacy, and direct support for people facing housing discrimination. They need funding and volunteers.
The source document for this investigation is attached below.