Zillow’s Sold Data Could Cost You

While franchises, brokerages and agents fume and plot how to regain control of the listing data, Zillow Group (ZG), realtor.com, CoreLogic and others are sucking down the MLS-style sold data — that’s what is going to kill you.

The brokerage industry is now running scared because the four main portals capture about 50 percent of all real estate-related Internet traffic, with ZG owning most of it.

ZG is better at marketing the industry’s own information than brokers are. It snags and publishes active listing data, and both Zillow Group and its investors love the profits they make through advertising fees paid by brokers for a shot at the business that those same brokers weren’t skilled enough to attract on their own.

But published listing data is not your real problem.

The fees earned from advertising on actively for-sale homes are simply the short-term cash flow that keeps the ZG machine running while it works to capture the full transaction process of selling and buying real estate.

You see, portals have survived off dusty old county record data for a decade. But with each property displayed at these sites — and eventually sold — they have slowly built up a national database of rich sales information, and this is accelerating exponentially today.

The grungy old county records data barely offered them a style, size, bed and bath count, lot size and some government-conjured tax valuation for creating their Zestimate. But, oh, that rich MLS-style data with text descriptions, pictures and lists of inclusions, which comes as each active property becomes a sold property — that’s gold, or will be before long.

The Zestimate will soon integrate all of the luxurious text descriptions and inclusions from all of those sold properties into its valuation model — another huge chunk removed from the broker model’s foundation. Other companies with access to this kind of data will do the same.
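To make that concrete, here is a minimal sketch of how listing-description text could be folded into an automated valuation model alongside the old structured county-record fields. The listings, features and model below are invented for illustration, and this is an assumed approach, not Zillow's actual Zestimate pipeline.

```python
# Hypothetical sketch: folding MLS-style description text into a price model.
# The toy data, feature choices and model are illustrative assumptions,
# not Zillow's actual Zestimate pipeline.
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge

# A few made-up sold listings: structured county-record-style fields
# plus the MLS-style text description that portals now accumulate.
solds = [
    {"beds": 3, "baths": 2, "sqft": 1450,
     "desc": "updated kitchen granite counters new roof fenced yard",
     "price": 265_000},
    {"beds": 3, "baths": 2, "sqft": 1500,
     "desc": "original condition needs work investor special",
     "price": 198_000},
    {"beds": 4, "baths": 3, "sqft": 2200,
     "desc": "remodeled master suite hardwood floors three car garage",
     "price": 415_000},
]

structured = csr_matrix([[s["beds"], s["baths"], s["sqft"]] for s in solds])
vectorizer = TfidfVectorizer()
text_features = vectorizer.fit_transform(s["desc"] for s in solds)

# Combine the old structured features with the new text signal.
X = hstack([structured, text_features])
y = [s["price"] for s in solds]

model = Ridge().fit(X, y)

# Value a new home whose description mentions the kind of upgrades
# county records never captured.
new_home = hstack([
    csr_matrix([[3, 2, 1480]]),
    vectorizer.transform(["updated kitchen granite counters fenced yard"]),
])
print(round(model.predict(new_home)[0]))
```

The point of the sketch is simply that the text adds signal the tax rolls never had: "updated kitchen" and "investor special" move the estimate in ways bed and bath counts cannot.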

From analyzing all of that sold data, the ZG sites could shortly integrate keyword-matching algorithms into their buyer search tools — another chunk out of the broker model’s foundation. Buyers will flock to ZG, thus more sellers will demand to be there, and more brokers will have to advertise there — and on and on.
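Keyword-matched search over that text is not exotic technology. Here is a toy sketch, with made-up listings and a simple overlap score, of the kind of matching such a tool could do; it is an assumption about the approach, not ZG's actual search code.

```python
# Hypothetical sketch of keyword-matched buyer search over rich listing text.
# The listings and scoring rule are illustrative assumptions, not ZG's search.
listings = {
    "123 Elm St": "updated kitchen granite counters fenced yard near park",
    "45 Oak Ave": "fixer upper original condition large lot",
    "9 Birch Ct": "granite counters finished basement three car garage",
}

def keyword_match(query: str, listings: dict[str, str]) -> list[tuple[str, int]]:
    """Rank listings by how many of the buyer's keywords appear in the description."""
    terms = set(query.lower().split())
    scored = [
        (address, len(terms & set(desc.lower().split())))
        for address, desc in listings.items()
    ]
    # Best matches first; drop listings with no overlap at all.
    return sorted([s for s in scored if s[1] > 0], key=lambda s: -s[1])

print(keyword_match("granite counters fenced yard", listings))
# [('123 Elm St', 4), ('9 Birch Ct', 2)]
```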

Currently the Zestimate claims a median error rate of 8 percent (on a $300,000 home, that means half of all estimates land within roughly $24,000 of the eventual sale price, and half miss by more), and brokers scream that's not good enough. It will get much more accurate with a decade of rich sold data, and then you have another problem.

Now that ZG is making money on listing data, and now that it can improve the buyer search tools and value estimates with a decade of accumulated MLS-style sale data — it can turn its efforts toward controlling the entire sale process.

Every time a potential buyer visits a ZG site, there is also an opportunity to track that buyer. It would be a smart business move for ZG to create learning algorithms and artificial intelligence to understand what actions buyers take just before becoming serious. ZG has all of the active properties, a decade of sold properties, and it gets to witness every action a potential buyer takes. ZG knows who’s about to buy — what’s that list of leads worth to you? Oh, it’s going to be more than what it costs to paste your picture next to somebody else’s listing — place your bid.
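For illustration only, here is a minimal sketch of how tracked on-site behavior could be turned into a "who's about to buy" score. Every feature, label and visitor below is hypothetical; nothing here describes ZG's actual systems.

```python
# Hypothetical sketch: scoring which site visitors look ready to transact,
# based on tracked on-site behavior. Features, labels and model are assumed
# for illustration only, not a description of ZG's actual systems.
from sklearn.linear_model import LogisticRegression

# Per-visitor session features: [saved_searches, repeat_views_of_one_home,
# mortgage_calculator_uses, agent_contact_clicks]
sessions = [
    [0, 1, 0, 0],   # casual browser
    [1, 2, 0, 0],   # casual browser
    [4, 9, 3, 1],   # went on to buy
    [3, 7, 2, 2],   # went on to buy
    [0, 0, 0, 0],   # casual browser
    [5, 12, 4, 1],  # went on to buy
]
bought_within_90_days = [0, 0, 1, 1, 0, 1]

model = LogisticRegression().fit(sessions, bought_within_90_days)

# Score today's visitors; the high scorers are the leads that get auctioned
# back to the industry as advertising.
todays_visitors = [[4, 8, 2, 1], [1, 1, 0, 0]]
for visitor, p in zip(todays_visitors, model.predict_proba(todays_visitors)[:, 1]):
    print(visitor, f"probability of buying soon: {p:.2f}")
```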

Do you agree that sold data from listing portals like ZG could end up costing you more for leads? Do you think this change in data consumption will change the face of the real estate industry? What are your thoughts?
