Can Trust Systems Build a New Economy From Ruin?
As the consumer economy peaks in both economic and environmental terms, the sharing economy offers a more hopeful prospect. Yet a fundamental problem must be addressed for it to achieve scale: in an economy where we share houses, cars, money, and work with strangers, how do we decide who to trust? By Paul M. Davis.
7th September 2012 - Published by Shareable
In 1956, Bill Fair and Earl Isaac created the credit scoring system popularly known as FICO, for Fair Isaac Corporation. Fair and Isaac’s automated scoring system enabled credit card issuers to safely extend consumer credit to the masses. Consumer credit reached global scale with the consolidation of numerous international issuers into VISA in 1975. In 1995, Fannie Mae and Freddie Mac began to require that mortgage lenders include FICO scores, cementing the metric’s authority.
FICO, VISA, subprime mortgages, and other consumer credit innovations fueled unprecedented levels of household consumption and debt in the US and beyond. Never in human history have so many consumed so much in so little time.
Fast forward to today. We're left with a monumental hangover from a planet-threatening orgy of consumption. The US household debt-to-GDP ratio has skyrocketed from 45% in 1982 to 90% today. That's a staggering $11 trillion in household debt. Credit card, student loan, and mortgage defaults are at or near record levels. Stagnant wages and high unemployment are exacerbating the crisis. Most developed countries have reached a similar economic cul-de-sac.
We've gone deeply into environmental debt too. Our natural capital is already dangerously overdrawn: humans currently use 50% more resources annually than the earth can replace.
The consumer economy has arguably peaked. In both economic and environmental terms, we can’t afford to boost consumption further. And as our trust in legacy institutions erodes, our reliance on each other increases. We obviously need something new.
Meanwhile, the sharing economy has emerged offering a hopeful prospect — a higher quality of life at dramatically lower economic and environmental cost. Yet a fundamental problem must be addressed for it to achieve scale: in an economy where we share houses, cars, money, and work with strangers, how do we decide who to trust? And what tools are needed to grow the sharing economy, as FICO scores did for the consumer economy?
Despite a handful of high-profile cases of abuse, sharing economy services have been fortunate to date. Bad apples have been statistical anomalies. But good fortune won’t build a new economy at scale. “Millions of people around the world are investing a surprising amount of trust in the people and businesses they rub shoulders with online and in p2p networks,” says Lisa Gansky, author of The Mesh: Why the Future of Business is Sharing and founder of Mesh Ventures. “But overall, there are few set rules or global standards in place.”
The sharing economy is as physical as it is digital: in transactions where individuals’ property or personal safety are potentially at risk, simple eBay-style seller reviews such as “A++ would buy again” won’t suffice. A January 2012 Campbell-Mithun study forecasting collaborative consumption’s growth stated that 67% of respondents cited concerns about trust as their primary barrier to adopting sharing services.
The case for reputation systems
During a talk at TEDGlobal2012 in June, Rachel Botsman posed the question: “How do we mimic the way trust is built face-to-face online?” That's a fair place to start, but face-to-face relationships are far from foolproof. Mimicking the qualitative judgements we make about individuals in the physical world isn't enough; what's needed is a mix of qualitative and quantitative metrics, so individuals engaging in p2p transactions can confidently judge whether a person is trustworthy. With a mix of face-to-face and digital technologies, there's a chance for a significant upgrade in our species' capacity to judge trustworthiness, and to make that new capacity broadly available.
This is no small feat, posing significant implementation challenges and privacy concerns. The existing reputation systems that collaborative consumption services have developed in-house are piecemeal and offer little portability of user reputation data. House-sharing services such as Airbnb provide user feedback rankings and plug into users' Facebook connections to provide an added layer of social vetting, while Couchsurfing verifies identity through a $25 credit card verification fee, which is also its main revenue source. Taskrabbit performs background checks, while UK-based p2p lending site Zopa opts for identity and credit checks.
Reputation systems siloed within individual services can serve as a disincentive to new users. Existing users of a service have little incentive to trust new users who have yet to build their reputation. And while some services consider user reputation data to be a competitive differentiator, the friction involved in proving your trustworthiness on a particular service may be depressing user adoption of sharing services as a whole. As Botsman memorably declared, sharing is contagious. Few have the time or inclination to build a profile of trustworthiness on a service like Airbnb, only to have to start from scratch when they choose to delve into car sharing.
In the past year, a plethora of reputation services have launched to serve as the connective tissue of reputation and trust across the web. Services like Trustcloud, Legit, Connect.me, Scaffold, and MiiCard take varying approaches to developing portable reputation systems, to address what Legit founder Jeremy Barton characterizes as a fundamental problem of context.
“Reputation as a whole is really about context of the transaction,” says Barton. “It’s hard to trust a stranger in an offline capacity, it’s even harder to trust a stranger in an online capacity. Reputation can give you a lot of context to help you manage expectations in a transaction.”
This point is echoed by Trustcloud co-founder Xin Chung. “The methods and resources for trust in peer marketplaces are fundamentally disorganized,” he says. “There is a high time barrier required of the community to quickly and easily look up trust indicators that will give people confidence in an online transaction.”
The various reputation systems differ not only in their approaches and implementation, but also guiding principles. Yet as a whole, they prompt a number of key questions: How do you rank trustworthiness? Is user data from social media useful in measuring trust? Will a single trust score work across multiple platforms? And what procedures are in place to ensure users’ privacy, the accuracy of the rankings, and the ability to address mistakes in rankings?
How do you rank trustworthiness?
On a fundamental level, the startups vying to address this problem share common traits. They all claim to provide users and services with a portable reputation system. In theory, your good ranking as an Airbnb host could help vet you as a safe bet as a car renter on Relayrides; a background check Taskrabbit ran on you could provide additional context to this transactional data. In the aggregate, a fuller picture of an individual's trustworthiness would presumably emerge.
There are fundamental differences in implementation and philosophy between the various startups. Trustcloud’s goal is to build a portable, “real-time trust resume that offers a running feed of users’ trustworthy and virtuous actions online,” says Chung. This resume is built from three layers of data: user verification established through multiple authentication approaches including email, SMS, and physical address confirmation; a behavioral layer that aggregates a user’s social activity; and a transactional layer that tracks users’ behavior across sharing services. User trust resumes are displayed as Trustcards, a portable profile that integrates with those on partner services such as about.me, Tripping, and Sharetribe, and can be embedded into blogs, forum profiles, and the like.
In contrast to Trustcloud’s reputation profile system, Legit and Scaffold aim to operate more like the plumbing of online reputation, offering collaborative consumption services access to standardized reputation systems accessible via API. Unlike Trustcloud, Legit focuses solely on transaction data to aggregate reputation.
“Legit aims to be the credit system of the sharing economy,” says Barton. “We’re solely focused on transaction and reputation data that lives within a marketplace.” The service is developing the Legit Reputation Group (LRG), “a hub that exists between sharing economy companies.” Barton considers such a hub to be the “core piece of infrastructure to grow the sharing economy as a whole.”
Since sharing economy services have already developed in-house reputation systems, Barton isn’t suggesting that they scrap existing systems and the user loyalty accrued through them. Instead, Legit aims to be a way that these systems can share data to provide additional context to users deciding whether to engage in a transaction.
“Companies store data in different ways, so our technology makes it very easy for a company to connect to our centralized API,” says Barton. “We normalize the reputation and transaction data across services, and also provide identity matching, so we can develop a unified reputation profile across services.”
Is social data relevant?
Providing context is a common goal among the reputation systems, but the companies differ in what they consider to be relevant trust metrics. Chung believes that users’ social activity is key to establishing their background and context, and that credit-based systems insufficiently address the economic and social shifts of a p2p economy.
“Obviously, there are comparisons to credit scores, but as our world changes with the advent of collaborative consumption, we have what I call the perfect storm,” Chung says, citing “the mounds of useful and serious data being thrown off by the social web. That information is not only easily accessible, but increasingly important.”
“Think about how unimportant a tweet was three years ago or a Facebook post was four years ago, and where we are today,” he says. “LinkedIn has taken the high ground of providing a trusted network of professional peers.” Chung believes that Trustcloud can vouch for social reputation in “the wider world” of p2p transactions, from Craigslist to Relayrides.
Barton takes a different view on the value of social data. He explains that an early iteration of Legit bore some resemblance to Klout, but the startup’s approach shifted due to some negative experiences he had with p2p carsharing. “It was very formative,” he says. “I realized that while social data can add context, the core components of the predictive indicator we’re trying to build need to be built on solid transaction data. We didn’t feel that would provide the level of insight needed to determine whether a person would fulfill a transaction in the way they said they would.”
The Human Touch
Connect.me takes a different tack, aiming to bring more qualitative judgement into the equation. The service creates its reputation profile based upon aggregated social and transactional data, as well as the vouches of influential individuals within a user’s network — family, coworkers, acquaintances, and social influencers. It’s one piece of a proposed Respect Network, which founder Drummond Reed envisions as a “full p2p trust network.”
Any system based upon subjective judgement is primed for abuse. Reed notes that as Connect.me scales, it is selectively deciding which users will operate as “trust anchors” to vouch for users, operating as a social deterrent to gaming and abuse. “We created a seed population of trust anchors that is about 750 folks right now,” he says, “and every one of those folks has been individually vetted by multiple people. Ultimately we think there will be about 10,000 founding trust anchors, and building that core nexus. Eventually you’ll get a critical mass where everyone can get the three vouches they need.”
Can there be only one?
It’s debatable whether a single reputation system can account for the breadth of transactions that occur within the sharing economy. A good host may not necessarily be trustworthy with a loan; a generous microlender may be a hellion on the highway. And while the realities of the market suggest that the likes of Legit and Trustcloud are in competition, both Chung and Barton suggest that there may be space for more than one reputation system in the new economy.
“I think it’s going to be a ‘winner takes most’ game,” says Barton. “The web is interesting in that it’s primarily a game of winner takes all, for the most part. But when you think about the fact that there are three major credit bureaus, but the FICO score is the most well-known score, I think that the same will happen with reputation. One score will be the most well-known and respected, but additional context is always going to be appreciated, especially with a question like ‘can I trust this person with my car?’”
Chung also believes that a dominant standard will emerge, but maintains that he has “a pretty holistic and community-centric view of where trust is going. Companies like Trustcloud, Legit, MiiCard, and Scaffold will be able to innovate in our own ways, and I hope that we innovate toward a standard that is respected across the board and becomes a mainstream trust system that is viewed in the same way credit scores are viewed for credit.”
Who watches the watchers?
The promise of a peer-to-peer economy built upon trusted transactions is compelling, but it doesn’t mitigate the privacy concerns reputation systems raise. As Botsman acknowledged in her recent talk, online reputation systems pose “some enormous transparency and privacy issues.”
A January report on the privacy risks posed by online reputation systems by the European Network and Information Security Agency (ENISA) identified “five core areas of risk for users”. The report warned of security risks that could expose personal data to hackers, called for organizations using reputation systems to “become more open about the way their systems operate,” and noted that the “linkability” of social platforms enabled advanced analytics of user behavior, which can be shared with third parties for marketing purposes.
As is often the case with startups, the monetization model for many of the reputation systems remains vague. This should be a point of concern for users, notes Lee Tien, a Senior Staff Attorney with the Electronic Frontier Foundation. “The issue ends up being how much personal information is being monetized and how,” says Tien. “These services generate the score, so who are they selling it to? Who is the market for their data? Do they sell it to ChoicePoint, or one of the other big generalized databases?”
The services interviewed for this piece emphasized that their goal is to provide users with more control of their online reputations, not less, and take user privacy seriously. Barton boasts of Legit’s “bank-level security for any data stored in our system,” and says that the company’s “goal is not to distribute the data in a way that would compromise a user’s privacy — for marketing purposes, for example.” The company plans to support itself by charging participating p2p marketplaces for access to its reputation API.
Chung emphasizes the transparency of Trustcloud’s rankings and methodology, which have been developed with the input of Trusted Advisor’s Charles H. Green and Stanford University’s Sociology Department. He also touts the service’s adoption of standards such as OpenID and ICAM, as well as its data portability and privacy policies, which state that Trustcloud “will never give any personally identifiable data to any third party without permission.”
Rather than building a centralized data store of reputation data, Connect.me favors a federated network where each user’s data is stored in the cloud provider of their choice. “Connect.me is built as a layer on top of social networks,” says Reed. “We don’t rebuild your social graph. You connect through the social web using OAuth,” an open token-based protocol that grants services access only to the user data they need to know. “What we’re doing is giving users a control center” for their aggregate online reputation, Reed says, “whoever your cloud provider is.”
No matter the approach, the notion of a digital dossier that trails us online and off is no doubt disquieting. But reputation system advocates note that digital dossiers are already being collected, and at this point in time, they’re out of our hands. Companies such as Experian and Intelius mine credit histories, criminal records, and legal filings, and increasingly scrape data from the social web to provide background check services to enterprise clients. Meanwhile, startups such as eBureau crunch online purchase data to rate our value, offering clients “a private, digital ranking of American society unlike anything that has come before,” according to The New York Times.
Such dossiers are opaque to individuals, and the procedures for obtaining or disputing them are often prohibitively time-consuming. Botsman argues, “ultimately if we can collect our reputation history, we can control it more.” This is a point echoed by Gansky, who sees a mismatch between the implicit trust we put in large companies versus individuals. She advocates for controlled transparency rather than more privacy. “Like it or not,” she says, “privacy is not in our control, but disclosure is.”
“As we leave behind a searchable and potentially permanent trail of breadcrumbs showing where we’ve been and what we’ve been up to,” she says, “all of us can benefit from a more proactive approach to unveiling who we are on our own terms and time. We are living in a world where intentional and regular public disclosure is the most effective path to establishing, building, and maintaining trust,” she says. “Opting out is not a practical option.”
A bitter pill for privacy advocates, no doubt. Reputation systems designed for a p2p economy may offer a level of transparency and personal agency denied to us by the increasingly sophisticated ranking systems that prop up a faltering and corrosive consumer economy. But this will require that the dominant reputation systems behave in an equally transparent fashion. As the EFF’s Tien warns consumers, “if you don't know what they're doing with your data, you should think twice.” It’s a rule of thumb that will only grow more important in the years ahead, as p2p transactions grow increasingly common, and our online reputations begin to reflect on nearly every aspect of our lives.
Paul M. Davis is the Science, Tech and Government Editor for Shareable Magazine. More at paulmdavis.com.