Earlier this year, Yelp was taken to court over its automated review filter, which the site uses to highlight the most legitimate, relevant reviews. The exact algorithm is unknown, but Yelp most likely takes into account the length of the text as well as the content and composition of each review.
Some business owners, including the owner of three California restaurants and a Yelp advertiser, feel that too many legitimate positive reviews from happy customers are unfairly hidden on the site because of Yelp’s filtering process.
Frustrated that positive, legitimate reviews were being suppressed, the restaurant owner challenged Yelp's review filter in court. In the end, the court ruled in favor of Yelp, confirming that the site is not legally liable for filtering users' reviews as it sees fit.
In a statement, Yelp said:
“Yelp has spent considerable time and effort to develop its review filter––a sophisticated tool intended to show the most reliable user reviews. The court rightly confirmed that Yelp’s discussion of the filter and our industry-leading efforts to combat unreliable reviews are protected speech about a matter of public concern, and noted that this action was spurred in part by negative reviews. There will always be businesses that think it may be easier to blame the messenger rather than respond directly to customer criticism, but this case reinforces our belief that the better option is constructive dialogue between consumers and businesses. We are happy to be a key part of that conversation.”
This case sets a precedent that websites face limited legal repercussions for automated content-filtering decisions, and it confirms that Yelp can continue to manage its database of user reviews however it wishes.