Can A Computer Pick Out Fake Online Reviews When Humans Can't?
from the sounds-like-it dept
It's no surprise that there are a ton of "fake" reviews online of just about anything that can be reviewed. Businesses, hotels, authors, musicians, etc., all want to make sure that whatever it is they're selling, people see good reviews when they go searching. But, of course, that's a problem for consumers who rely on such reviews... and for the sites that host them and want them to be as accurate as possible. So it's fascinating to see that some researchers at Cornell (yes, my alma mater) were able to come up with an algorithmic way to figure out which reviews are fake. You can read the full paper here (pdf). It's only 11 pages.

The method was pretty clever. First, they used Mechanical Turk to create 400 faked 5-star reviews of Chicago hotels:
    To solicit gold-standard deceptive opinion spam using AMT, we create a pool of 400 Human-Intelligence Tasks (HITs) and allocate them evenly across our 20 chosen hotels. To ensure that opinions are written by unique authors, we allow only a single submission per Turker. We also restrict our task to Turkers who are located in the United States, and who maintain an approval rating of at least 90%. Turkers are allowed a maximum of 30 minutes to work on the HIT, and are paid one US dollar for an accepted submission.

    Each HIT presents the Turker with the name and website of a hotel. The HIT instructions ask the Turker to assume that they work for the hotel's marketing department, and to pretend that their boss wants them to write a fake review (as if they were a customer) to be posted on a travel review website; additionally, the review needs to sound realistic and portray the hotel in a positive light. A disclaimer indicates that any submission found to be of insufficient quality (e.g., written for the wrong hotel, unintelligible, unreasonably short, plagiarized, etc.) will be rejected.

Then, of course, they needed "real" reviews. But since part of the issue is that many "real" reviews are faked, the team did their best to find a bunch of real reviews from TripAdvisor, narrowing them down based on a few factors:
    For truthful opinions, we mine all 6,977 reviews from the 20 most popular Chicago hotels on TripAdvisor. From these we eliminate:
    - 3,130 non-5-star reviews;
    - 41 non-English reviews;
    - 75 reviews with fewer than 150 characters since, by construction, deceptive opinions are at least 150 characters long...
    - 1,607 reviews written by first-time authors (new users who have not previously posted an opinion on TripAdvisor) since these opinions are more likely to contain opinion spam, which would reduce the integrity of our truthful review data...

    Finally, we balance the number of truthful and deceptive opinions by selecting 400 of the remaining 2,124 truthful reviews, such that the document lengths of the selected truthful reviews are similarly distributed to those of the deceptive reviews. Work by Serrano et al. (2009) suggests that a log-normal distribution is appropriate for modeling document lengths. Thus, for each of the 20 chosen hotels, we select 20 truthful reviews from a log-normal (left-truncated at 150 characters) distribution fit to the lengths of the deceptive reviews.

They then tested how humans judge the two kinds of reviews, and discovered that people can't tell them apart: human accuracy was only slightly above 50%. The researchers then worked out algorithmic ways of distinguishing the "real" reviews from the fake ones, and came up with a system that is roughly 90% accurate in picking out which reviews are which. Apparently, while humans can't pick out the differences, faked reviews have some common characteristics:

    We observe that truthful opinions tend to include more sensorial and concrete language than deceptive opinions; in particular, truthful opinions are more specific about spatial configurations (e.g., small, bathroom, on, location). This finding is also supported by recent work by Vrij et al. (2009) suggesting that liars have considerable difficulty encoding spatial information into their lies. Accordingly, we observe an increased focus in deceptive opinions on aspects external to the hotel being reviewed (e.g., husband, business, vacation)...

    [....]

    ... we find increased first person singular to be among the largest indicators of deception, which we speculate is due to our deceivers attempting to enhance the credibility of their reviews by emphasizing their own presence in the review.

Obviously, it's just one bit of research, but apparently those involved in it have been contacted by... well, just about everyone doing online reviews. Hopefully this means that we're not too far off from better quality online reviews.
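The length-balancing step quoted above (drawing truthful reviews so their lengths match a log-normal distribution, left-truncated at 150 characters, fit to the deceptive reviews) can be sketched roughly as follows. The paper doesn't publish code, so the function names, the moment-based fit, and the nearest-length matching strategy here are illustrative assumptions, not the authors' actual procedure:

```python
import math
import random

def fit_lognormal(lengths):
    """Estimate (mu, sigma) of a log-normal from observed lengths,
    using the mean and standard deviation of the log-lengths."""
    logs = [math.log(x) for x in lengths]
    mu = sum(logs) / len(logs)
    var = sum((v - mu) ** 2 for v in logs) / len(logs)
    return mu, math.sqrt(var)

def sample_matched(truthful_lengths, deceptive_lengths, k, min_len=150, seed=0):
    """Choose k truthful reviews (by index) so their lengths roughly follow
    a log-normal, left-truncated at min_len, fit to the deceptive lengths."""
    rng = random.Random(seed)
    mu, sigma = fit_lognormal(deceptive_lengths)
    # Only truthful reviews at or above the truncation point are eligible.
    pool = [i for i, n in enumerate(truthful_lengths) if n >= min_len]
    chosen = []
    for _ in range(k):
        # Rejection-sample a target length from the truncated log-normal...
        target = 0.0
        while target < min_len:
            target = rng.lognormvariate(mu, sigma)
        # ...then take the still-unused truthful review closest in length.
        best = min(pool, key=lambda i: abs(truthful_lengths[i] - target))
        pool.remove(best)
        chosen.append(best)
    return chosen
```

In the study's setting this would be run per hotel (k=20 for each of the 20 hotels) to produce the 400 balanced truthful reviews.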
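The excerpts above point at word-choice cues (truthful reviews mention things like "bathroom" and "location"; deceptive ones lean on "I", "husband", "vacation"). As a toy illustration of how such cues can drive an automated classifier (this is not the authors' model, and the training data below is invented), here is a tiny unigram naive Bayes classifier in plain Python:

```python
import math
from collections import Counter

class NaiveBayes:
    """Tiny unigram (bag-of-words) naive Bayes classifier
    with add-one (Laplace) smoothing."""

    def fit(self, docs, labels):
        self.classes = sorted(set(labels))
        self.priors = {c: math.log(labels.count(c) / len(labels))
                       for c in self.classes}
        self.counts = {c: Counter() for c in self.classes}
        for doc, lab in zip(docs, labels):
            self.counts[lab].update(doc.lower().split())
        self.vocab = {w for c in self.classes for w in self.counts[c]}
        self.totals = {c: sum(self.counts[c].values()) for c in self.classes}
        return self

    def predict(self, doc):
        def log_score(c):
            s = self.priors[c]
            for w in doc.lower().split():
                # Add-one smoothing so unseen words don't zero the score.
                s += math.log((self.counts[c][w] + 1) /
                              (self.totals[c] + len(self.vocab)))
            return s
        return max(self.classes, key=log_score)
```

The real system reported in the paper is far richer than word counts, but even this sketch shows the mechanism: once labeled examples exist (which is exactly what the Mechanical Turk step produced), per-class word statistics do the discriminating that human readers can't.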
Reader Comments
"...we eliminate: * 3,130 non-5-star reviews;"
Anyone who even reads online "reviews" is a ninny and a sucker.
From the gleaming exterior to the spacious inside this is the right blog for you.
My partner was impressed with the attentiveness of the staff and the expansiveness of the offerings.
Re: "...we eliminate: * 3,130 non-5-star reviews;"
I know you want to shoehorn elitist conspiracies into every single comment you make about every single post, but it doesn't work.
I kinda had that thought too. Who's going to govern what the rules are on the filtering? Probably whoever's paying for it, which makes perfect sense.
This may convert opinions and reviews into what is effectively paid content from the company that hosts them. Immediately this brings the question of bias into the picture.
Consumers will see it as 'controlled' and as factual as any paid advertisement... might be. Then we get to company reputation - and many companies are sorely lacking there. It seems the concept now is that if they all suck, then consumers have no choice.
Re: Re: "...we eliminate: * 3,130 non-5-star reviews;"
The evidence strongly suggests that yes, old blueballs certainly is.
Re: "...we eliminate: * 3,130 non-5-star reviews;"
You're correct. There's nothing elitist there at all. The whole project was around comparing 5-star reviews, so of course it makes sense to eliminate all other reviews.
Not sure how anyone could turn that into a statement on elitism.
Perhaps you misread?
Slightly Off-Topic
I ask this because the only "Turk" I know of that's related to AI (I do know that a country called Turkey exists) is from Terminator: The Sarah Connor Chronicles, so I was wondering if the term was inspired by the series.
Re: Slightly Off-Topic
http://en.wikipedia.org/wiki/The_Turk
(Also, that show was terrible. Terminator ended with T2. Cameron said so)
If the same sort of standards were used to decide what was and was not copyright violations on a file site, you guys would be shitting yourselves and yelling about free speech and all that.
It's amazing to watch you guys go sometimes!
Filter bubbles
It's vital, it's on point, and it addresses just how frightening this concept is... especially if the algorithm runs automatically and never exposes any real logic behind its decisions.
Just read it.