Will a different kind of “Black Box” bring Supplier.io down?

Posted on March 4, 2025



QUESTION:

Regarding the lawsuit against American Airlines and Supplier.io, what impact will the AI “black box” behind supplier selection have on the case outcome? For example, while Supplier.io may not have overtly intended to participate in discriminatory acts on the part of the Airline, what if its black box algorithms were a factor?

RESPONSE:

The impact of AI “black box” algorithms on the lawsuit filed by the American Alliance for Equal Rights (AAER) against American Airlines and Supplier.io—initiated on February 11, 2025, in the U.S. District Court for the Northern District of Texas—could significantly shape the case’s outcome, particularly regarding supplier selection and allegations of racial discrimination under 42 U.S.C. § 1981. As of March 3, 2025, the case remains unresolved, with no public ruling or detailed discovery available, so any analysis is predictive, based on legal principles, the complaint’s claims, and the role of opaque AI systems like Supplier.io’s. Here’s how the black box issue might play out if Supplier.io’s algorithms contributed to discriminatory outcomes despite no overt intent to discriminate:

The Black Box Problem in Context

A “black box” in AI refers to algorithms whose decision-making processes are opaque—inputs go in, outputs come out, but the “why” and “how” remain unclear to users and even developers. In this case, Supplier.io’s platform connects American Airlines with suppliers, allegedly enforcing a diversity program that excludes non-diverse (e.g., white male-owned) businesses. The AAER claims Supplier.io rejects applicants who don’t fit American’s diversity criteria, but Supplier.io insists it’s a neutral tool, not setting the rules. If its algorithms are black-box systems, they could muddy the waters—did the AI independently amplify discrimination, or did it just follow American’s explicit instructions?
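To make the distinction concrete, here is a minimal sketch in Python contrasting a transparent rule-based filter with an opaque learned scorer. Everything below is hypothetical: the field names, categories, and weights are invented for illustration and do not reflect Supplier.io’s actual system.

```python
# Hypothetical contrast between a transparent rule and an opaque model.
# Field names, categories, and weights are invented for illustration;
# nothing here reflects Supplier.io's actual system.

def rule_based_filter(supplier: dict, required_categories: set) -> bool:
    """Transparent pass-through: a rejection is always explainable as
    'your certification is not on the client's list'."""
    return supplier["certification"] in required_categories

def black_box_score(features: list, weights: list) -> float:
    """Opaque scorer: the output emerges from layers of learned weights.
    No single weight 'means' anything on its own, so 'why was I rejected?'
    has no simple answer. That is the black box problem in miniature."""
    layer = features
    for matrix in weights:
        layer = [max(0.0, sum(a * b for a, b in zip(layer, row)))
                 for row in matrix]
    return layer[0]

# The rule is auditable at a glance; the scorer is not.
print(rule_based_filter({"certification": "MBE"}, {"MBE", "WBE"}))   # True
print(black_box_score([0.4, 0.9], [[[0.2, -1.1], [0.7, 0.3]], [[1.5, -0.6]]]))
```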

Potential Impact on the Case Outcome

  1. Liability if Algorithms Amplify Bias
    Even if Supplier.io didn’t intend to discriminate, a black box algorithm could still factor into liability if it independently skewed supplier selection beyond American’s stated policy. For example:
    • Unintended Bias: If Supplier.io’s AI was trained on historical data favoring diverse suppliers (e.g., past certifications or spend patterns), it might “learn” to prioritize those traits, rejecting non-diverse applicants more aggressively than American’s rules required. This mirrors cases like Amazon’s 2018 resume-screening AI, which unintentionally downgraded women due to biased training data.
    • Legal Hook: Under § 1981, which guarantees equal contracting rights regardless of race, courts could hold Supplier.io liable if its black box outputs directly caused discriminatory rejections of AAER’s Members A and B. Strictly speaking, § 1981 requires purposeful discrimination rather than mere disparate impact, but a program that sorts suppliers by race on its face could supply that element, and post-Students for Fair Admissions v. Harvard (2023), race-conscious policies face heightened scrutiny. (The selection-rate arithmetic typically used to screen for disparate impact is sketched after this list.)
  2. Defense: Following Orders vs. Independent Action
    Supplier.io’s claim—“we don’t set criteria or restrict registration”—could hold if its AI is a transparent pass-through for American’s rules. But a black box complicates this:
    • Opaque Decision-Making: If the algorithm’s logic isn’t auditable (e.g., no one can explain why Member A was rejected beyond “diversity category mismatch”), Supplier.io might struggle to prove it didn’t overstep. In FoxMeyer’s 1998 SAP suit, SAP dodged full liability by arguing misuse, not defects—but if Supplier.io’s black box hides defects, that defense weakens.
    • Discovery Pressure: AAER could demand insight into the algorithm (weights, training data, decision trees). Courts increasingly push for transparency in AI cases (e.g., the COMPAS sentencing debates), and a refusal to disclose could invite an adverse inference or shift the burden to Supplier.io.
  3. Causation and Responsibility
    The black box could blur who’s at fault—American’s policy or Supplier.io’s execution:
    • Direct Causation: If evidence shows Supplier.io’s AI rejected applicants solely per American’s explicit criteria (e.g., a hardcoded “minority-owned” filter), American takes the hit, and Supplier.io’s just a tool—like Andersen Consulting in FoxMeyer’s SAP flop. No black box magic needed.
    • Algorithmic Overreach: But if the AI’s opacity masks unintended exclusions (e.g., rejecting viable suppliers due to correlated non-diversity traits like location or size), Supplier.io might share liability. EEOC guidance on AI in employment (2024) warns vendors about disparate impact from black-box tools, suggesting courts might extend this reasoning to procurement.
  4. Judicial Trends in 2025
    By 2025, courts are grappling with AI accountability amid a DEI backlash (e.g., American dropping DEI hiring in 2024). The black box could sway outcomes:
    • Skepticism of Tech: Conservative-leaning judges (common in Texas) might distrust opaque AI, especially if it’s seen as enforcing “woke” policies. A lack of explainability could hurt Supplier.io, even without intent.
    • Precedent: Cases like Fearless Fund (settled 2024) struck down race-based grants, but focused on policy setters, not tech providers. Supplier.io’s fate may hinge on proving its AI didn’t freelance beyond American’s intent.
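
If AAER pursues a statistical theory, the arithmetic behind a first-pass disparate impact screen is simple. Below is a minimal sketch, using invented numbers, of the “four-fifths rule” selection-rate comparison the EEOC applies in employment; whether a court would import it into supplier selection is an open question.

```python
# Hypothetical numbers illustrating the EEOC "four-fifths rule" screen,
# a common first-pass test for disparate impact. Not data from this case.

applicants = {
    "diverse-certified": {"applied": 200, "approved": 150},
    "non-diverse":       {"applied": 180, "approved": 45},
}

rates = {g: v["approved"] / v["applied"] for g, v in applicants.items()}
highest = max(rates.values())

for group, rate in rates.items():
    ratio = rate / highest
    flag = "potential disparate impact" if ratio < 0.8 else "within four-fifths"
    print(f"{group}: selection rate {rate:.2f}, ratio {ratio:.2f} -> {flag}")
```

Here the non-diverse selection rate (0.25) is only a third of the diverse rate (0.75), well under the 0.8 threshold, which is the kind of statistical showing a plaintiff would pair with evidence of intent.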

Hypothetical Scenarios

  • Scenario 1: Clean Hands: Discovery reveals Supplier.io’s AI is a simple rule-based system (not a true black box), rejecting only what American explicitly flags. Outcome: American liable, Supplier.io off the hook—black box irrelevant.
  • Scenario 2: Hidden Bias: The AI is a neural network with murky outputs, rejecting Members A and B for reasons beyond the stated diversity criteria (e.g., zip code proxies for race; see the sketch after these scenarios). Outcome: Supplier.io could face joint liability if AAER ties the skewed outcomes to race, with the opacity itself counting against it.
  • Scenario 3: Stalemate: Supplier.io resists algorithm disclosure, citing trade secrets (a la COMPAS). Outcome: Court might infer fault or dismiss Supplier.io if American’s policy alone suffices as the violation.
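
The zip-code mechanism in Scenario 2 is easy to demonstrate. The sketch below, on synthetic data with invented names (scikit-learn assumed), trains a toy model with the protected attribute deliberately excluded; the model still reproduces the historical disparity through a correlated proxy feature.

```python
# Hypothetical illustration of "proxy" bias: a model trained without the
# protected attribute can still reproduce its effect through a correlated
# feature (here, a fake zip-code indicator). Synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
protected = rng.integers(0, 2, n)              # 1 = hypothetical protected class
zip_proxy = (protected ^ (rng.random(n) < 0.1)).astype(float)  # ~90% correlated
quality = rng.random(n)                        # a legitimate feature

# Biased historical labels: approval depended partly on protected status.
approved = ((0.6 * quality + 0.4 * protected
             + 0.1 * rng.random(n)) > 0.5).astype(int)

X = np.column_stack([quality, zip_proxy])      # protected attribute excluded
model = LogisticRegression().fit(X, approved)
pred = model.predict(X)

for g in (0, 1):
    print(f"group {g}: predicted approval rate {pred[protected == g].mean():.2f}")
# The gap persists: the model "learned" the bias through the zip proxy.
```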

Why It Matters

The black box could be a game-changer if it turns Supplier.io from a passive vendor into an active discriminator—echoing FoxMeyer’s claim that SAP’s R/3 wasn’t just misused but unfit. Without transparency (e.g., explainable AI methods), Supplier.io risks guilt by obscurity. Yet, American’s overt diversity rules might overshadow this—courts could see Supplier.io as a scapegoat unless AAER nails down algorithmic overreach.
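
One family of explainable AI methods worth flagging here is permutation importance: shuffle each input feature in turn and measure how much the model’s accuracy drops. A minimal, self-contained sketch on synthetic data with hypothetical feature names (scikit-learn assumed):

```python
# Minimal sketch of permutation importance, one common "explainable AI"
# audit technique. Synthetic data and hypothetical feature names only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(1)
n = 1000
quality = rng.random(n)                          # a legitimate feature
zip_proxy = rng.integers(0, 2, n).astype(float)  # stand-in proxy feature

# Labels that lean heavily on the proxy.
y = ((0.3 * quality + 0.7 * zip_proxy) > 0.75).astype(int)

X = np.column_stack([quality, zip_proxy])
model = RandomForestClassifier(random_state=0).fit(X, y)

imp = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for name, score in zip(["quality", "zip_proxy"], imp.importances_mean):
    print(f"{name}: mean accuracy drop when shuffled: {score:.3f}")
# If zip_proxy dominates, an auditor has a documentable reason to suspect
# the feature is standing in for a protected attribute.
```

An audit along these lines is roughly what “insight into the algorithm” in discovery would aim to produce.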

Prediction

As of now, Supplier.io’s liability odds are low (say, 30%) unless AAER uncovers black-box smoking guns—e.g., training data or outputs showing bias beyond American’s brief. The case will likely hinge on discovery, which typical federal timelines would put in late 2025. If the AI is a true black box and influenced rejections independently, Supplier.io is in hotter water; if not, American carries the load. Want me to monitor for updates or simulate a ruling?

TODAY’S TAKEAWAY: Even if Supplier.io is not found liable, what impact will this have on a market already edgy about the AI mystery? Here is the link to a Procurement Insights post from last week on the Black Box fear in the C-Suite – https://bit.ly/4kqT64c


Posted in: Commentary