OkCupid co-founder Christian Rudder came clean about the site's experiments in an apparent attempt to defend Facebook, which recently sparked a wave of criticism when it revealed that it had conducted psychological tests on users.
“We noticed recently that people didn’t like it when Facebook 'experimented' with their news feed,” Rudder wrote on the company's blog. “But guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.”
Rudder went on to describe three experiments, including one that involved misrepresenting couples' “match percentages” -- OkCupid's assessment of whether two users would make a good pair.
Specifically, OkCupid told pairs of people whose match percentage was only 30% that the figure was actually 90%. The company “tested things the other way, too,” Rudder says.
“We told people who were actually good for each other, that they were bad, and watched what happened,” he writes.
The results, according to Rudder, show that people put at least some stock in the company's recommendations. Seventeen percent of users who were told their match percentage was 90 -- even though it was really only 30 -- went on to exchange at least four messages. By contrast, only 10% of users who had a match percentage of 30 -- and were told the truth -- exchanged at least four messages.
And 20% of those whose match percentage was 90 -- and who were told the truth -- exchanged at least four messages. But only 16% of people who had a match percentage of 90, but were wrongly told it was 30, did the same.
Those results might be academically interesting. But that doesn't mean they justify deliberately deceiving users.
In that regard, OkCupid's experiment seems more problematic than Facebook's. Even though both companies manipulated users, Facebook at least didn't actively lie to them.
For its experiment, Facebook tinkered with 700,000 users' news feeds to filter out some positive or negative posts. Researchers then tracked users' reactions and concluded that mood was “contagious,” with responses matching the tone of the posts users saw: people shown more negative posts began posting more negative material, while those shown more positive posts responded in kind.
News of the social networking service's experiment recently spurred University of Maryland law professors James Grimmelmann and Leslie Meltzer Henry to urge the Federal Trade Commission to examine the ethical issues raised by online research.
Grimmelmann and Henry raised several arguments in their letter, including at least one that seems to apply to OkCupid: The law professors contend that subjecting people to secret experiments can be a deceptive trade practice on the grounds that “the failure to disclose research is an omission that a reasonable consumer would consider material in deciding whether or not to use a service.”
Regardless of whether the FTC acts, deliberately misleading users doesn't seem like the kind of thing that's going to win a company many fans. As of Monday afternoon, some commenters were already telling OkCupid they weren't happy about its testing.
“Good to know at any one time here forward I can trust or distrust your [match percentages], not knowing if your lab experiment has resurfaced,” one user wrote.
“I hope you didn’t cause me to miss out on a relationship while playing around with the data,” another user added. “People trusting your match [percentage] is what you want, so don’t lie to them about it.”