Arnobio Morelix / Silicon Valley-based leader working at the intersection of technology, economics, and policy. His work has been featured widely in national and global media, including the New York Times, the Economist, the Wall Street Journal, and the BBC, among others. He also serves as senior advisor at the Global Entrepreneurship Network and advisory board member at the Business Angel Minority Association.

*This column is adapted from the book Rebooted: An Uncommon Guide to Radical Success and Fairness in the New World of Life, Death, and Tech.


In software development, the term Technical Debt describes quick-and-dirty code written on the fly that does its job in the short term but is unwieldy and inefficient to scale in the long term. It is a familiar concept among developers.

It is time we also become familiar with its shadow twin: Fairness Debt.

Just like its technical counterpart, we incur Fairness Debt when we build systems that work for our current situation and user base today, but that have unintended consequences lurking beneath the surface as we continue to deploy the same solutions tomorrow.

One way to incur Fairness Debt is by optimizing our systems and algorithms for a particular performance metric without adding constraints (e.g., optimizing how much time users spend on an app without regard for other dimensions, like their well-being). Data scientists and technologists make these optimization choices deliberately and often, even if naively.
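To make the difference concrete, here is a minimal sketch of unconstrained versus constrained metric optimization. All policy names, numbers, and the well-being threshold are invented for illustration; they do not come from any real company's system.

```python
# Toy illustration: choosing among candidate feed policies.
# Each policy: (name, avg engagement in minutes/day, well-being score 0-1).
# All values are made up for illustration.
policies = [
    ("gentle_nudges", 12, 0.9),
    ("infinite_scroll", 95, 0.4),
    ("autoplay_everything", 240, 0.1),
]

def pick_unconstrained(policies):
    """Optimize engagement alone: more minutes is always better."""
    return max(policies, key=lambda p: p[1])

def pick_constrained(policies, min_wellbeing=0.7):
    """Optimize engagement subject to a well-being floor (the constraint)."""
    eligible = [p for p in policies if p[2] >= min_wellbeing]
    return max(eligible, key=lambda p: p[1])

print(pick_unconstrained(policies)[0])  # autoplay_everything
print(pick_constrained(policies)[0])    # gentle_nudges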

But optimization often carries a Fairness Debt when taken to its natural conclusion. A 2015 Google Ventures post, for example, suggests optimizing for the amount of time users spend watching videos on your app. While at first this is a perfectly rational way to focus engineering efforts, it can get out of control when usage becomes excessive to the detriment of the user. As a friend managing AI products at Amazon put it, “It is OK when a company is trying to get a user to go from spending seven to eight minutes a day on their app. It is a whole different game when some users are at risk of going to seven or eight hours a day.”

At first, Fairness Debts are not paid by the company but by users or society. But once they get big enough, they bite all of us collectively. The backlash against the companies producing smartphones and apps optimized to capture attention—and the real headwinds these businesses are facing from both user and regulatory standpoints—is evidence that these debts can be postponed but not forgiven.

Going beyond “attention optimization,” imagine a more sinister situation: a Fintech company optimizing only for the profitability of the loans it approves. It is easy to imagine ending up with something like algorithmic redlining—effectively a techie twist on the old-school practice of discriminating against loan applicants based on the neighborhoods they live in. Researchers at UC Berkeley have found evidence of exactly that in algorithmic lending. Their research shows Fintech lenders overcharging Latino and African American mortgage refinance borrowers a combined $765 million yearly, partially due to impermissible discrimination.
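The mechanism behind algorithmic redlining can be sketched in a few lines. In this toy example (all ZIP codes, odds, and pricing numbers are invented, and the pricing rule is a deliberately crude stand-in for a real model), no protected attribute is used directly—but pricing by a geographic proxy still charges near-identical applicants different rates based on where they live.

```python
# Toy illustration of "algorithmic redlining" via a proxy feature.
# All data is invented. The model never sees race or ethnicity --
# but ZIP code correlates with them, so a profit-only rule can
# reproduce the old discriminatory pattern anyway.

applicants = [
    # (zip_code, repayment_odds) -- odds shaped by historical disinvestment
    ("10001", 0.93),
    ("10001", 0.91),
    ("60621", 0.90),
    ("60621", 0.89),
]

# Profit-only pricing rule learned from history: price by ZIP-level
# average risk, with no constraint on geographic disparity.
zip_odds = {}
for z, odds in applicants:
    zip_odds.setdefault(z, []).append(odds)
rates = {z: 0.05 + (1 - sum(v) / len(v)) * 0.5 for z, v in zip_odds.items()}

# Applicants with near-identical repayment odds get different rates
# purely because of where they live.
print(rates["10001"], rates["60621"])
```

A fairness-aware pipeline would test exactly this: compare outcomes for similar applicants across neighborhoods (or protected groups) before the pricing rule ships, not after regulators ask.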

But just like Technical Debt, Fairness Debt can be avoided and paid early. We have one such example with Upstart, a consumer-facing online lender.

Imagine, as happened with Upstart, that you are a Fintech founder or technologist waking up to a message from the Consumer Financial Protection Bureau (CFPB) announcing a review of your fair lending practices only a couple of years after launching your product.

You might have been “moving fast and breaking things,” focused only on improving your tech and growing your business. But you are in a highly regulated industry, and there is growing interest in the unintended consequences of what you do. Senators Elizabeth Warren and Doug Jones, for instance, have sent letters asking the Federal Reserve, the Federal Deposit Insurance Corporation (FDIC), and the CFPB about their agencies’ stance on algorithm-based lending.

Although they did not call it that, the regulators rightly saw the possibility of Fairness Debt accruing. But Upstart took the issues seriously and did not let the debt build up.

Upstart’s AI-based models, built to be fair from the start, have shown fantastic results—so good that the CFPB took the unusual step of widely sharing how well they were performing. Upstart was able to approve 23 to 29 percent more applicants than the traditional lending model, with 15 to 17 percent lower interest rates across every tested race, ethnicity, and gender group. In addition, young adults were 32 percent more likely to get approved for loans, and people with incomes under $50,000 were approved 13 percent more often.

Just like Technical Debt, incurring Fairness Debt is a choice, not an inevitability. Pay it off early and often or, even better, do not incur it at all. It is the right choice. AI explainability approaches (which help you understand not only which decision an AI is making, but also why), like those Upstart used and companies such as Fiddler Labs produce, can help with this process. In a world where our technologies and algorithms will be increasingly under scrutiny, companies that are careful about avoiding Fairness Debt will be rewarded later.
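The core idea behind explainability—surfacing not just the decision but the why—can be sketched with a toy model. Here a linear score's per-feature contributions serve as the explanation; this is a simplified stand-in of my own making (real tools like those from Fiddler Labs use richer methods, such as Shapley-value attributions), and every feature, weight, and threshold below is invented.

```python
# Toy "why was this loan decision made?" explanation.
# Model, weights, and features are all invented for illustration.
features = {"income_to_debt": 2.1, "on_time_payments": 0.95, "credit_age_years": 4.0}
weights  = {"income_to_debt": 0.8, "on_time_payments": 1.5, "credit_age_years": 0.1}
bias = -2.5

# The decision: a linear score compared against a threshold.
score = bias + sum(weights[f] * v for f, v in features.items())
approved = score >= 0.0

# The explanation: each feature's contribution to the score,
# sorted by absolute impact -- the "why" behind the decision.
contributions = sorted(
    ((f, weights[f] * v) for f, v in features.items()),
    key=lambda kv: abs(kv[1]),
    reverse=True,
)
print("approved:", approved)
for name, c in contributions:
    print(f"{name:>18}: {c:+.2f}")
```

An attribution view like this is what lets a lender (or a regulator) check whether a decision leaned on legitimate factors—or on a proxy that quietly encodes something impermissible.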