
Apr 6, 2023

AI gone wrong #1: losing $300M on buying houses


The “AI gone wrong” series presents real-world examples of AI being misused. By examining those mistakes, we can learn what went wrong and help prevent them from happening again, or from happening at all.

First up, we will look at how an algorithm led to a company losing $300M and announcing plans to cut 25% of its workforce.

The concern with AI

While ChatGPT has made many people excited about the future of AI, it has also raised concerns. Elon Musk (Tesla, SpaceX) and Steve Wozniak (co-founder, Apple), along with more than 1,000 AI industry professionals, signed an open letter calling for a pause on the development of advanced AI until it can be better understood.

Those concerns relate to advanced AI, which we discussed last year, and which is not the topic of this article. Here, we are looking at what has already gone wrong in the realm of AI as it is actually applied today.

The concerns about advanced AI are warranted. But before we get there, let’s consider narrow, task-specific AI, which is already here and continually produces unexpected outcomes.

Sometimes, they are funny. 

Sometimes, they are disastrous.

A $300M mistake

Zillow, the real estate company, implemented AI to estimate the price of homes. The “Zestimate” also represented a cash offer to buy the property. It was considered a breakthrough for the sector and a way of expediting Zillow’s business processes, while offering homeowners a lower-contact way to sell during the pandemic.

Zillow wrote in June 2019 about the algorithm’s improvements:

“Now, we’ve taught the Zestimate to discern quality by training convolutional neural networks with millions of photos of homes on Zillow, and asking them to learn the visual cues that signal a home feature’s quality. For instance, if a kitchen has granite countertops, the Zestimate now knows — based on the granite countertop’s pixels in the home photo — that the home is likely going to sell for a little more.”
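
Zillow has never published its model, so the following is a minimal, purely illustrative sketch of the idea in that quote: a small convolutional network that scores a home photo for finish quality, with the score feeding a downstream price model as one input among many. The architecture, names, and training setup here are assumptions, not Zillow’s actual system.

```python
# Illustrative sketch only: a tiny CNN that scores a home photo
# for finish quality. Not Zillow's actual architecture.
import torch
import torch.nn as nn

class QualityScorer(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # pool to one value per channel
        )
        self.head = nn.Linear(32, 1)   # single "quality" score

    def forward(self, photo):          # photo: (batch, 3, H, W)
        x = self.features(photo).flatten(1)
        return torch.sigmoid(self.head(x))  # quality in [0, 1]

# After training on labeled examples, a photo of a kitchen with
# granite countertops would push the score, and in turn the
# price estimate, upward.
```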

Impressive, isn’t it? Well, in Q3 2021, Zillow took a $304 million inventory write-down, as reported by CNN at the time.

The reason? The algorithm had led the company to purchase homes at higher prices than its own estimates of their future selling prices.
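
In other words, the guardrail every purchase should clear is a margin-of-safety check, and the estimates feeding that check were unreliable. A minimal sketch, using a hypothetical should_buy helper and made-up figures:

```python
# Illustrative figures and logic only -- not Zillow's actual rules.
def should_buy(offer_price: float, projected_resale: float,
               holding_costs: float, margin: float = 0.05) -> bool:
    """Buy only if the projected resale price clears the all-in
    cost by a safety margin."""
    all_in = offer_price + holding_costs
    return projected_resale >= all_in * (1 + margin)

# The rule is only as good as the projection that feeds it.
# An inflated resale estimate passes the check...
print(should_buy(500_000, 570_000, 30_000))  # True  (570k >= 556.5k)
# ...while the home's true resale value would have failed it.
print(should_buy(500_000, 480_000, 30_000))  # False (480k <  556.5k)
```

When the projection is systematically inflated, purchases pass the check on paper and lose money in reality, and those losses only surface at sale time, which is how they can accumulate into a write-down.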

The AI product was abandoned: Zillow Offers, the business unit built around it, was shut down, and the company announced plans to cut 25% of its staff.

This kind of AI usage – where it is not the foundation of the organization’s business, but a complement to it – will become increasingly popular over time. The key is to understand what you are building, and to have accurate models built on relevant, voluminous data.

How could it be avoided?

The key is accurate data and models that understand how to interpret that data.

It is hard to know, in Zillow’s case, where the problem lies:

  • it could be the quality of the data they were using
  • or the way that data was modeled

HUMAN Protocol could help with the former – the data – without which there is no model at all.

While Zillow appears to have used relatively sophisticated imaging techniques, the outcome makes clear that something failed, either in the training data or in how it was applied. More, and more reliable, data points could have helped address the issue.

HUMAN Global Queries has a specific functionality that could be applied here: data scraping. 

To feed the AI, Zillow could have had a group of HUMAN respondents check and report back on many factors that indicate a property’s price, in real time, and in a way that automated systems perhaps couldn’t (a sketch of how such answers might feed a model follows the lists below):

  • what is the square footage of this property?
  • what is the build date?
  • what is the ZIP code?
  • how many bathrooms does it have? 

Other functions, such as multiple choice, could be employed to ask more qualitative questions, such as:

  • is this house modern?
  • does it have a lawn?
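
This post does not spell out a client API for Global Queries, so the sketch below is hypothetical throughout: the Answer class, the field names, and the aggregation rules are all illustrative assumptions. It only shows the general idea of reducing several independent human answers to the factual and multiple-choice questions above into verified features for a price model.

```python
# Hypothetical sketch -- not the HUMAN Global Queries API.
from dataclasses import dataclass
from statistics import median, mode

@dataclass
class Answer:
    respondent_id: str
    question: str
    value: str  # raw answer as entered by the respondent

def aggregate(answers: list[Answer]) -> dict:
    """Reduce several human answers per question to one feature:
    median for numeric questions, majority vote for categorical."""
    by_question: dict[str, list[str]] = {}
    for a in answers:
        by_question.setdefault(a.question, []).append(a.value)

    features = {}
    for question, values in by_question.items():
        try:
            features[question] = median(float(v) for v in values)
        except ValueError:  # non-numeric: fall back to majority vote
            features[question] = mode(values)
    return features

# Three respondents check the same listing:
answers = [
    Answer("r1", "square_footage", "1850"),
    Answer("r2", "square_footage", "1900"),
    Answer("r3", "square_footage", "1850"),
    Answer("r1", "has_lawn", "yes"),
    Answer("r2", "has_lawn", "yes"),
    Answer("r3", "has_lawn", "no"),
]
print(aggregate(answers))
# {'square_footage': 1850.0, 'has_lawn': 'yes'}
```

Redundancy is the point of the design: a single scraper or annotator can be wrong, but a median or majority vote across independent respondents is much harder to skew.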

This would not necessarily have resolved the problem Zillow had, but it would have been an excellent place to start.

The disaster of Zillow’s house-pricing algorithm led to a $300M loss. With better modeling, and the better data HUMAN Protocol offers, this problem might have been avoided, saving hundreds of millions of dollars, and the 25% of staff that Zillow said it had to cut.

To stay up to date with the latest from HUMAN, follow us on Twitter or join our Discord.

Legal Disclaimer

The HUMAN Protocol Foundation makes no representation, warranty, or undertaking, express or implied, as to the accuracy, reliability, completeness, or reasonableness of the information contained here. Any assumptions, opinions, and estimations expressed constitute the HUMAN Protocol Foundation’s judgment as of the time of publishing and are subject to change without notice. Any projection contained within the information presented here is based on a number of assumptions, and there can be no guarantee that any projected outcomes will be achieved.
