There’s been a lot of debate about the role – and the perils – of online reviews in the multifamily space. Mostly, the concern, when boiled down, is about ceding control of the message from the marketing department to the consumer/renter. And that’s a daunting prospect, no doubt.
Those who know me know I’ve got my own reservations about user reviews, not so much because, if abused, they can be unfairly damaging to a business (that risk is overblown), but because I often find them more confusing than helpful, and don’t like the anonymity factor. (I have other issues, too. For example, have you noticed recently how pretty much every restaurant on Yelp! is now rated 3-4 stars? It’s uncanny how consistent our restaurants are today!)
Then there’s a problem like the one I discovered below. I was on Google researching hotels in lower Manhattan when I came across a promising boutique hotel that, according to the Google Search results, had an overall user rating of four stars. Not bad. So I clicked through to the Google Place page (which aggregates ratings from around the web) and started to read the first few reviews, all four stars (I thought), and presumably written by satisfied customers. But, wait, these don’t sound like reviews from happy people:
- “The worse (sic) experience we’d had in a long time…”
- “Even if the room was $99 I wouldn’t stay here again.”
- The next one starts out promisingly, “Very stylish, a convenient arrangement,” but then goes on to inform me, “Bad cleaning of room, in the first day in general have forgotten to be cleaned.” Good information, but not necessarily what I would expect from a four-star experience. (And what’s a “convenient arrangement” anyway?)
So something’s wrong here. Either the users clicked the wrong star rating, or they’re from a parallel universe where one-star ratings are the best and five-star the worst. Turns out, I was just reading the content wrong. Each star rating featured on the Google Place page is an aggregate from its source site, not a rating of the individual review shown beside it. In other words, on Priceline.com, TripAdvisor.com and Booking.com (the three review sites most prominently displayed on the Place page), the hotel merited an overall four-star rating on each. The user-written reviews featured on the Google Place page were picked, perhaaps randomly, from those sites, and were clearly not representative of the overall star rating. Confusing? Yes!
I could have made this an anti-Google Places rant, since it’s their poor implementation of the ratings section of the page that led to the confusion. And I certainly don’t want this post to be read as an anti-review/ratings rant. I believe user-generated content is here to stay, and online ratings and social recommendations (especially social recommendations!) will continue to grow as a critical influencing factor for consumers. My point is that the issues run beyond properties having to deal with a rogue former resident talking trash, or even the quality of the user-generated content itself, and into the quality of the user experience and the technology that aggregates and presents this content (I’m talking to you, Google!). Those issues will get ironed out, but until then, will users give reviews the scrutiny they really require?
TripAdvisor and Yelp! are apparently also miffed by Google’s Places implementation:
http://searchengineland.com/review-sites-rancor-rises-with-prominence-of-google-place-pages-62980
Let’s just say my position has “evolved.” :-)