
How Big Data Gets Real http://bits.blogs.nytimes.com/2012/06/04/how-big-data-gets-real/

The business of Big Data, which involves collecting large amounts of data and then searching it for patterns and new revelations, is the result of cheap storage, abundant sensors and new software. It has become a multibillion-dollar industry in less than a decade. Growing at that speed, it is easy to miss how much remains to be done before the industry has proven standards. Until then, many customers are probably wasting much of their money.

There is essential work to be done training a core of people in very hard problems, like advanced statistics and software that ensures data quality and operational efficiency. Broad-based literacy in the uses of data should probably follow too, along with new kinds of management, better tools for reading the information, and privacy safeguards for corporate and personal information. That so many of these tasks are under way is a good indicator that, even with the hype, Big Data is a big deal.

Last Friday, a number of technologists gathered at a forum hosted by the University of California, Berkeley, iSchool and talked about ways many of these jobs are being done. (Disclosure: I lecture at the iSchool, which is the school of information science, and moderated several panels there.) They talked about the progress so far, and identified a number of good ideas and businesses left to pursue.

In some ways, Big Data is about managing all kinds of weird new data, like social media updates from a mobile phone. Such data is hard to categorize in the first place, and may be used in lots of different ways, from advertising to traffic management. The so-called unstructured database of choice is by now pretty clearly Hadoop. Cloudera, a leading software producer, is now training 1,500 people a month, mostly online, in how to use both the database and associated applications. According to Amr A. Awadallah, Cloudera's chief technical officer, over 10,000 people have been trained on its system.
Data quality from new, diverse sources is still a big problem, as is persuading companies and organizations to let others see data that might be more valuable in a commonly shared algorithm. "I've tried paying money for it, but it's easier for companies to decide not to share," said Gil Elbaz, the founder of Factual, a company that seeks to hold lots of online data. "The only way that works is to get them to take risks in exchange for data that is valuable to them." Much of the fear about exposing data, he said, has to do with competitors learning secrets. Mr. Elbaz thinks there is a good business in developing "de-identifiers" that can make data anonymous, and in privacy insurers specializing in covering the costs of exposure. On a personal level, others think the government or a trusted private institution will hold the personal identifiers of things like medical data, releasing them to trusted parties. "It's a little scary that right now a cab driver using Uber knows more about you than a doctor, who has to take all of your information for the first time," said Peter Skomoroch, principal data scientist at LinkedIn.

Another data-improving business consists of moving the world's older data online. A company called Captricity aims to couple image capture from things like cellphone cameras with cheap workers in Amazon.com's Mechanical Turk service, in order to put older handwritten documents into digital databases. The company's early business is from government and charity sites in Africa and India, but there is no reason why it shouldn't be valuable for most medical records. If someone took the trouble to write it down, the company figures, it is probably valuable data.

There are other businesses trying to take the arcane side of Big Data into the mainstream, with easy-to-use statistical tools and new ways of visualizing data that make it easier to understand. Companies like ClearStory and Platfora "want to make it possible for businesses, for history majors, to use," said Ben Werther, chief executive of Platfora. "We're in the pre-industrial age of Big Data." Martin Wattenberg, creator of a well-known wind map who is now at Google, talked about a necessary revolution in the design of data outcomes that has yet to become widespread.

Away from big online companies like Google, some of the earliest data-driven businesses include hedge funds and insurance companies, which have the money and a tradition of using lots of math. Another, Mr. Skomoroch said, is the Mormon Church. "It is the focus on genealogy data," he says. "If you move, they'll always find you."

What Big Data is seeing now looks like the classic industrial curve. There is the first discovery of something big, leading to the establishment of principles like scientific rules. Science moves toward engineering as a means to manufacturing, resulting in mass deployment. Then things really change.
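The "de-identifiers" Mr. Elbaz describes can be sketched in a few lines. The example below is a minimal illustration, not Factual's method: it replaces direct identifiers with salted-hash tokens so records from different sources can still be joined on the same person without exposing who that person is. The field names and the salt are assumptions for the sketch, and salted hashing alone is not full anonymization, since quasi-identifiers like a ZIP code can still re-identify people; that residual risk is exactly what the proposed privacy insurers would cover.

```python
import hashlib

# Illustrative salt; a real de-identifier would keep this secret and rotate it.
SALT = b"example-secret-salt"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a stable, non-reversible token."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:16]

def de_identify(record: dict, direct_identifiers=("name", "email", "phone")) -> dict:
    """Return a copy of the record with direct identifiers pseudonymized."""
    return {
        k: pseudonymize(v) if k in direct_identifiers else v
        for k, v in record.items()
    }

record = {"name": "Jane Doe", "email": "jane@example.com", "zip": "94720"}
clean = de_identify(record)
# Same input always maps to the same token, so de-identified records can be joined.
assert clean["name"] == pseudonymize("Jane Doe")
assert clean["zip"] == "94720"
```

Because the mapping is stable, two organizations sharing the same salt could pool de-identified records and still link them per person, which is the "commonly shared algorithm" value the article describes.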

Chapter 1: The Personal Data Landscape

Introduction

The digital world is awash in personal data.1 Every day, people send 10 billion text messages, make 1 billion posts to a blog or social network and generate millions of entries into their growing electronic health records.

In addition, with approximately 6 billion mobile telephone subscriptions in the world, it is now increasingly possible to track the location of nearly every person on the planet, as well as their social connections and transactions.2 And mobile phones are not the only devices recording data – Web applications, cars, retail point-of-sale machines, medical devices and more are generating unprecedented volumes of data as they embed themselves into our daily lives. Estimates are that by 2015, 1 trillion devices will be connected to the Internet.3

Companies and governments are using this ocean of Big Data to unleash powerful analytic capabilities. They are connecting data from different sources, finding patterns and generating new insights – all of which adds to the ever deepening pool of data.

In short, the growing quantity and quality of personal data creates enormous value for the global economy. It can help transform the lives of individuals, fuel innovation and growth, and help solve many of society’s challenges. As the first-phase World Economic Forum report on personal data elaborated, personal data represents an emerging asset class, potentially every bit as valuable as other assets such as traded goods, gold or oil.

Historically, the strength of a major economy has been tightly linked to its ability to move physical goods. The Silk Route, the Roman roads and the English fleet all served as economic backbones connecting vast geographies. Data, even though it is a virtual good, is no different: it needs to move to create value. Data sitting alone on a server is like money hidden under a mattress. It is safe and secure, but largely stagnant and underutilized.

As an emerging asset class, personal data currently lacks the rules, norms and frameworks that exist for other assets. The lack of trading rules and policy frameworks for its movement has resulted in a deficit of trust among all stakeholders and could undermine the long-term potential value of personal data. Different jurisdictions are looking to tackle this deficit of trust through different approaches, ranging from fundamental rights-based approaches to harm-minimization approaches.

Several characteristics of personal data make establishing rules and frameworks uniquely challenging:

- The digital nature of personal data means it can be copied infinitely and distributed globally, thereby eliminating many of the trade barriers which exist for physical goods.

- Data, unlike most tangible assets, is not consumed when used; it can be reused to generate value.

- Data grows ever more connected and valuable with use. Connecting two pieces of data creates another piece of data and with it new potential opportunities (as well as new potential harms).

- The role of the individual is changing. Individuals are no longer primarily passive data subjects. They are also increasingly the creators of data. In addition, personal data is intimately linked with an individual’s background and identity, unlike interchangeable commodity goods.

1 There are many definitions of personal data across different jurisdictions and even different sectors within the same jurisdiction. For the purposes of this report, personal data refers to data and metadata relating to a specific, identified or identifiable person. This is similar to the definitions used by both the proposed US Privacy Bill of Rights and the EU's Data Protection Regulation.
2 "Key Global Telecom Indicators for the World Telecommunication Service Sector". International Telecommunication Union, http://www.itu.int/ITU-D/ict/statistics/at_glance/KeyTelecom.html.
3 King, Rachel. "IBM panel discusses tackling big data storage as problem escalates". SmartPlanet, September 2011, http://www.smartplanet.com/blog/smart-takes/ibm-panel-discusses-tackling-big-data-storage-as-problem-escalates/19010.

Rethinking Personal Data: Strengthening Trust


All stakeholders in the personal data ecosystem face a challenge of unprecedented size, speed and complexity. Rules and norms change faster in a hyperconnected world and outstrip the ability of traditional rule-setting approaches to keep pace. Solutions that focus on isolated examples or one-size-fits-all approaches quickly grow outdated as social, commercial and regulatory contexts change. Enterprises, governments and individuals need to creatively collaborate to develop new rules and frameworks that are both robust enough to be enforceable, yet flexible enough to accommodate the world’s accelerating and constant change.

The Opportunity

Personal data plays a vital role in countless facets of our everyday lives. Medical practitioners use health data to better diagnose illnesses, develop new cures for diseases and address public health issues. Individuals are using data about themselves and others to find more relevant information and services, coordinate actions and connect with people who share similar interests. Governments are using personal data to protect public safety, improve law enforcement and strengthen national security. And businesses are using a wide range of personal data to innovate, create efficiencies and design new products that stimulate economic growth.

Estimates are that the Internet economy amounted to US$ 2.3 trillion in value in 2010, or 4.1% of total GDP, within the G20 group of nations. Larger than the economies of Brazil or Italy, the Internet's economic value is expected to nearly double by 2016 to US$ 4.2 trillion.4 But this growth could be severely constrained if the flow of personal data on which e-commerce and other economic activity depends becomes overly restricted.

Consider the many ways personal data can create economic and social value for governments, organizations and individuals:

- Responding to global challenges. Real-time personal data and social media can help to better understand and respond to global challenges like disasters, unemployment and food security. They represent an unprecedented opportunity to track the human impacts of crises as they unfold and to get real-time feedback on policy responses.5 Or consider the case of Google Flu Trends, which uses individuals' ostensibly private flu-related search words and location data6 to detect potential flu outbreaks in real time, as opposed to the weeks-old government data that currently exists.7 Researchers have found that the data correlates strongly with upswings in emergency room activity, and it could provide the basis for early-warning systems to detect pandemics and save millions of lives.8

- Generating efficiencies. For centuries, increased access to information has created more efficient ways of doing business.9 These days, organizations in every industry are using vast amounts of digital data to streamline their operations and boost overall productivity. For example, US$ 700 billion in health cost savings in the US, or about 30% of total healthcare spending today, could result in large part from the increased flow of personal data. Improved information flow could reduce duplicative lab testing and imaging, fraud and inefficiencies, as well as lead to better care coordination and treatment.10 In financial services, personal data is already being used to generate significant efficiencies by facilitating online commerce and payments as well as saving billions of dollars through fraud prevention.

4 Dean, David, Sebastian DiGrande, Dominic Field and Paul Zwillenberg. "The Digital Manifesto: How Companies and Countries Can Win in the Digital Economy." The Boston Consulting Group, January 2012.
5 UN Global Pulse, http://www.unglobalpulse.org/about-new.
6 It is important to note that Google Flu Trends data can never be used to identify individual users because it relies on anonymized, aggregated counts of how often certain search queries occur each week.
7 Google Flu Trends, http://www.google.org/flutrends/about/how.html.
8 Dugas, A. F., Y. H. Hsieh, S. R. Levin, et al. "Google Flu Trends: Correlation with Emergency Department Influenza Rates and Crowding Metrics." Clinical Infectious Diseases, 2012. Described in Science Daily, http://www.sciencedaily.com/releases/2012/01/120109155511.htm.
9 Gleick, James. The Information: A History, a Theory, a Flood. London: Fourth Estate, 2011.

- Making better predictions. Personal data is stimulating innovative new products tailored to and personalized for the specific needs of individuals. For example, Amazon’s “Customers Who Bought This Also Bought” collaborative filtering tool suggests related items to buy that customers might not have discovered otherwise.11 In financial services, tailored insurance products are being developed based on devices that track driving behaviour rather than just age, gender and neighbourhood. News and content websites can customize the articles each individual views based on their interests and preferences. In addition, organizations can use business intelligence derived from the aggregation of millions of individual consumer transactions to prepare for likely events. For example, before a hurricane strikes, Wal-Mart knows to stock its shelves with not only flashlights and batteries, but also with Pop-Tarts.12

- Democratizing access to information. Consumers benefit from "free" services like search engines, e-mail, news sites and social networks that previously either did not exist or carried a significant monetary cost in the offline world. However, individuals are beginning to realize that targeted advertising, based on data about them and their online behaviour,13 fuels most of these ostensibly free services either directly or indirectly. As is widely quoted online: "If you're not paying for something, you're not the customer; you're the product being sold."14

- Empowering individuals. Empowered consumers are taking greater control over the use of data created by and about them. Rather than individuals being passive agents, they are engaging with organizations in a collective dialogue.15 As a result, people are starting to volunteer information that only they know. Such volunteered personal information (VPI) includes their updated contact details as they happen, reasons why they took an action or made a purchase, and future plans and preferences. Some observers have estimated that VPI could reach approximately US$ 32 billion in value by 2020 in the UK alone.16 In addition, individuals are using the information they share about themselves, their beliefs and their preferences to connect like never before. In the past few years, social media tools built on the foundation of widely shared personal data have played a key role in bringing down governments.
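The "Customers Who Bought This Also Bought" tool mentioned above is an instance of collaborative filtering. A minimal co-occurrence version can be sketched in a few lines; the baskets below are toy data (borrowing the hurricane items from the Wal-Mart example), and this is an illustration of the general technique, not Amazon's actual algorithm.

```python
from collections import Counter
from itertools import combinations

# Toy purchase histories; in practice these come from millions of transactions.
baskets = [
    {"flashlight", "batteries", "pop-tarts"},
    {"flashlight", "batteries"},
    {"batteries", "pop-tarts"},
    {"flashlight", "pop-tarts"},
]

# Count how often each ordered pair of items appears in the same basket.
co_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def also_bought(item, n=2):
    """Items most often co-purchased with `item`, best first."""
    ranked = Counter({b: c for (a, b), c in co_counts.items() if a == item})
    return [other for other, _ in ranked.most_common(n)]

print(sorted(also_bought("flashlight")))  # → ['batteries', 'pop-tarts']
```

Real systems add normalization (so universally popular items do not dominate) and scale the counting across distributed machines, but the core idea is exactly this pairwise aggregation of individual transactions.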

10 "Where Can $700 Billion in Waste Be Cut Annually from the U.S. Healthcare System?" Thomson Reuters, 2009. http://www.factsforhealthcare.com/whitepaper/HealthcareWaste.pdf.
11 Tene, Omer and Jules Polonetsky. "Privacy in the Age of Big Data". Stanford Law Review, 2 February 2012. http://www.stanfordlawreview.org/online/privacy-paradox/big-data.
12 "A Different Game". The Economist, 25 February 2010. http://www.economist.com/node/15557465.
13 Most online advertising today relies on anonymous cookies and is linkable to a device rather than an individual.
14 Lewis, Andrew. MetaFilter weblog, 26 August 2010. http://www.metafilter.com/95152/Userdriven-discontent#3256046.
15 Searls, Doc. Cluetrain Manifesto. http://www.cluetrain.com.
16 Ctrl-Shift, http://www.ctrl-shift.co.uk/themes/volunteered_personal_information.

The above list is far from comprehensive. Some of the most important value-creation opportunities from personal data remain as yet unknown. Most personal data still resides in silos separated by different technology standards and legal contracts. And a lack of an effective system of permissions prevents data from moving in a trusted and secure way to create value. Creating such systems will allow ever greater “data leverage”. But such opportunities are not without potential harms and risks, which manifest themselves in a loss of trust among all stakeholders in how personal data is protected and used.

Given the variety of applications in which personal data can be leveraged, the impact of the "economics of trust" is difficult to measure. Existing research by BCG on the Internet economy in the G20 has forecast that online retail will grow to US$ 2 trillion by 2016.17 However, this estimate is influenced by consumer perception of trust in how personal data is used. Online retail could grow even faster, to US$ 2.5 trillion by 2016, with enhanced trust, or to only US$ 1.5 trillion if trust were to be eroded. Given that this US$ 1 trillion range comes from just one part of the broader personal data ecosystem, it indicates the magnitude of the potential economic impact once other sectors (health, financial services, etc.) are considered – potentially in the tens of trillions of dollars.

The Evidence of a Decline in Trust

Ample evidence suggests that there is a decline in trust in the personal data ecosystem. Many of the existing regulatory, commercial and technical mechanisms for strengthening trust are no longer fit to do the job. The widespread loss of trust is unmistakable: security breaches, identity theft and fraud; concern from individuals and organizations about the accuracy and use of personal data; confusion from companies about what they can and cannot do; and increasing attention and sanctions from regulators.

Some recent events serve as leading indicators of the potential for future instability on a much more massive scale. In 2011, Sony revealed breaches in its online video game network that led to the theft of names, addresses and possibly credit card data belonging to more than 100 million accounts, in what was one of the largest-ever Internet security break-ins.18 Experts said the breaches could cost Sony and credit card issuers US$ 1-2 billion.19

Data breaches are also growing more anarchic, increasingly motivated by political and anti-business sentiments. The loose-knit hacking movement known as Anonymous claimed to have stolen thousands of credit card numbers and other personal information belonging to clients of the US-based security think tank Stratfor.20

17 For country-by-country forecasts see Dean, David, Sebastian DiGrande, Dominic Field, Andreas Lundmark, James O'Day, John Pineda and Paul Zwillenberg. "The Internet Economy in the G-20: The $4.2 Trillion Growth Opportunity." The Boston Consulting Group, March 2012.
18 Baker, Liana B. and Jim Finkle. "Sony PlayStation Suffers Massive Data Breach". Reuters, 26 April 2011. http://www.reuters.com/article/2011/04/26/us-sony-stoldendata-idUSTRE73P6WB20110426; "Sony suffers second data breach with theft of 25m more user details". The Guardian, Technology Blog. http://www.guardian.co.uk/technology/blog/2011/may/03/sony-data-breach-online-entertainment.
19 Miller, Mary Helen. "Sony data breach could be most expensive ever". Christian Science Monitor, 3 May 2011. http://www.csmonitor.com/Business/2011/0503/Sony-data-breach-could-be-most-expensive-ever.
20 "Anonymous targets US security think tank Stratfor". Associated Press, 25 December 2011. http://www.guardian.co.uk/technology/2011/dec/25/anonymous-security-thinktank-stratfor.


In addition, some data breaches happen accidentally or unintentionally; think of the employee who leaves an unencrypted laptop on a train containing thousands of data records. And breaches are not limited to the companies that originally collected the personal data, which adds to the loss of trust. In 2011, hackers compromised the database of Epsilon, a marketing company responsible for sending 40 billion marketing e-mails on behalf of 2,500 customers, including such companies as Best Buy, Disney and Chase.21 Very few customers had ever heard of Epsilon, nor did they know it held data about them.

Stakeholders Have Different Perspectives and Concerns

Tension is rising. Individuals are growing concerned that companies and governments are not protecting data about them and that they are instead using it in ways not necessarily in their best interests. Many organizations are struggling to protect and secure the explosion of data they have access to, and they are unsure what they can and cannot do with it.

Governments are trying to strike a balance between protecting individuals and encouraging innovation and growth. Such uncertainty creates instability, which in turn manifests itself differently across three main groups of actors in the personal data ecosystem – individuals, government policy-makers and organizations.

Individuals

Surveys show that individuals are losing trust in how data about them is being collected, used, shared and combined by both organizations and governments. For example, according to European Justice Commissioner Viviane Reding, 72% of European citizens are concerned that their personal data may be misused, and they are particularly worried that companies may be passing on their data to other companies without their permission.22 However, a disconnect exists between what people say and what they do, and the world of personal data is no exception. While many people say they care about privacy, they also share information quite widely on social networks and elsewhere online.

A large part of the concern stems from the fact that individuals often sign up to services without knowing how data about them will be protected or whether it will be shared. While legally they have given organizations their consent to the stated rules of usage, few individuals actually read these privacy policies or terms of service. Individuals therefore have little visibility into the practices of the organizations they are placing their trust in – until their data is breached or misused. As governments' use of personal data grows, so does concern over government protection of individual privacy.

Another concern of individuals is how to manage their online identity and the different aspects of their digital lives. Health data about an individual has a different impact when shared in a healthcare, work, family or social context, and this lack of contextual control and permissions is a particular cause for concern. At the moment, one of the few ways for individuals to keep different parts of their digital lives separate is to use different names and e-mail addresses for different contexts, to use pseudonyms, or to prevent their data being captured or linked to them in the first place.

Government Policy-makers

Policy-makers and regulators around the world share similar objectives – to stimulate innovation and economic growth while at the same time protecting individuals from harmful uses of personal data. Striking this balance lies at the heart of the tension facing the regulatory community. However, regulators are growing concerned about the loss of individuals' trust in how organizations are using and protecting data about them. Meanwhile, different regulators are taking different approaches to balancing these objectives, potentially adding to the instability as they work out how to protect individuals through new laws, policies and regulations. These approaches range from privacy bills of rights and "do-not-track" options for consumers to requirements that consumers be granted access to their personal data. A more comprehensive approach, not limited to one sector or jurisdiction, is needed to allow for a global flow of data.

Following the release of the National Strategy for Trusted Identities in Cyberspace in 2011, which focused on identity assurance to support online interactions, the US government has outlined a comprehensive online Privacy Bill of Rights that aims to give individuals more control over how personal information about them is used on the Internet.23 The Privacy Bill of Rights includes the goals of individual control, transparency, respect for context, security, access and accuracy, focused collection and accountability. The government intends to work with stakeholders to translate the document into specific practices or codes of conduct, to develop legislation based on these rights and to provide the US Federal Trade Commission (FTC) with enforcement authority. The approach builds on existing US self-regulation practices in which the FTC steps in to enforce unfair or deceptive practices.

In Europe, on the other hand, the European Commission has approached the issue from the perspective of protecting fundamental rights, although it shares with the US the same common goal of providing individuals with greater control to restore trust. The proposed Data Protection Regulation issued in January 2012 includes a requirement that Internet companies obtain explicit consent from consumers about the use of their personal data and delete the data forever at the users’ request or face the prospect of fines for failing to comply.24 The rules would also extend the reach of European law to companies outside the EU that handle data relating to EU residents. This raises, however, jurisdictional questions given the global nature of data flows. While many observers have praised these efforts as an important step in giving individuals more control over data about them, some have raised questions about the possible unintended consequences on innovation and economic growth and the use of data for purposes such as health research.25

Many of these regulatory requirements have reasonable motives, but could also have unintended consequences that might undermine economic and social value. For example, the Indian government introduced rules in April 2011 that it said would protect individuals, including requiring companies to obtain written consent from individuals for the use of the sensitive personal information they collect.26 The requirement could have significantly constrained the US$ 75 billion Indian outsourcing industry, which employs 2.5 million people.27 In August 2011, the Indian Ministry of Communications and IT issued a clarification effectively exempting outsourcers from the new law and calling into question the law itself.28

Exhibit 1: A lack of trust has the potential to pull the ecosystem apart

[Exhibit not reproduced. Its labels set stakeholder objectives – enhance control, increase transparency, obtain fair value distribution; drive growth, stimulate innovation, protect individuals; create value, generate efficiencies, predict behaviour – against the forces pulling the ecosystem apart: lack of transparency, significant liabilities, no clear rules and the rapid pace of change.]

21 Lennon, Mike. "Massive Breach at Epsilon Compromises Customer Lists of Major Brands". Security Week, 2 April 2011. http://www.securityweek.com/massive-breach-epsilon-compromises-customer-lists-major-brands.
22 Reding, Viviane. "The EU Data Protection Reform 2012: Making Europe the Standard Setter for Modern Data Protection Rules in the Digital Age." Speech at the Innovation Conference Digital, Life, Design, Munich, Germany, 22 January 2012. http://ec.europa.eu/commission_2010-2014/reding/pdf/speeches/s1226_en.pdf.
23 "We Can't Wait: Obama Administration Unveils Blueprint for a 'Privacy Bill of Rights' to Protect Consumers Online". Office of the Press Secretary, US White House, 23 February 2012. http://www.whitehouse.gov/the-press-office/2012/02/23/we-can-t-wait-obama-administration-unveils-blueprint-privacy-bill-rights.
24 Sengupta, Somini. "Europe Weighs Tough Law on Online Privacy". The New York Times, 23 January 2012. http://www.nytimes.com/2012/01/24/technology/europe-weighs-a-tough-law-on-online-privacy-and-user-data.html.
25 For example, see the US government's position on the draft EC regulation available at http://www.edri.org/files/US_lobbying16012012_0000.pdf; http://blogs.law.harvard.edu/infolaw/2012/01/25/more-crap-from-the-e-u; http://www.pcworld.com/businesscenter/article/251573/proposed_eu_data_laws_under_fire_from_both_sides.html; http://www.marketingweek.co.uk/news/facebook-data-laws-could-stifle-innovation/4000557.article.

The role of regulators is likely to be very different when it comes to ensuring accountability for organizational stewardship of data as opposed to setting rules for what can and cannot be done with personal data. In the latter area, the personal data ecosystem is increasingly too complex, fast-moving and global for traditional regulatory mechanisms to be effective. Some observers argue that approaches treating governance as an afterthought and an economic externality (through regulatory oversight and mechanisms such as notification and consent, hearings, rulings, legal challenges, injunctions and warrants) create huge costs, uncertain liabilities and problematic social acceptance.29

Organizations

Commercial enterprises, not-for-profits and governments that have access to personal data are responding in unique ways to the explosion of information created by and about individuals. Some are unsure of the rules of the game and are concerned about legal liabilities and the negative brand impact of being seen as unfairly exploiting personal data, given the heightened media attention on privacy and data breaches. As a result, some organizations are currently restricting the use of the personal data they hold and are underinvesting in the systems that can help to generate value from this data.

A number of private organizations, particularly those in the telecom sector, face significant legal and long-standing regulatory constraints on how they can use personally identifiable information. This lies in contrast with many of their competitors, which are using personal data much more freely to generate value. Others are innovating and developing new business models that offer individuals the tools to receive some form of control or even payment for the use of data about them.

The relationship individuals hold with Internet giants such as Google and Facebook was tested in 2011, when the FTC announced settlements that place the companies under privacy oversight for the next 20 years. The settlements stemmed from charges related to the unauthorized sharing of personal data with third parties, failure to delete personal data from deactivated accounts, and deceptive tactics that violated established privacy policies.30

26 Wugmeister, Miriam and Cynthia Rich. “India’s New Privacy Regulations”. Morrison & Foerster Client Alert. http://www.mofo.com/files/Uploads/Images/110504-Indias-New-Privacy-Regulations.pdf.
27 “India’s share in global outsourcing market rises to 55 pct in 2010”. International Business Times, 3 February 2011. http://www.ibtimes.com/articles/108475/20110203/nasscom-global-outsourcing-share-india-it-bpo-sector-revenue-growth-of-it-bpo-sector.htm.
28 Ribeiro, John. “India Exempts Outsourcers From New Privacy Rules”. IDG News, 24 August 2011. http://www.pcworld.com/businesscenter/article/238706/india_exempts_outsourcers_from_new_privacy_rules.html.
29 Clippinger, John Henry. “Design for a Trusted Data Platform and a Data-driven Bank: Overview and Next Steps”. ID Cubed, January 2012.

30 “Facebook and privacy: Walking the tightrope”. The Economist. 29 November 2011. http://www.economist.com/blogs/babbage/2011/11/facebook-and-privacy; “FTC Charges Deceptive Privacy Practices in Google’s Rollout of Its Buzz Social Network”. Federal Trade Commission, http://www.ftc.gov/opa/2011/03/google.shtm.

Rethinking Personal Data: Strengthening Trust

In the meantime, some industry sectors are coming together to help establish best practices and norms to guide behaviour. In the marketing world, the Digital Advertising Alliance, which represents more than 90% of online advertisers, has established self-regulatory principles for online behavioural advertising.31 The guidelines aim to give individuals a better understanding of and greater control over ads that are customized based on their online behaviour. They also establish specific limitations for the use of this data in cases when the potential harm is considered highest, such as employment, credit and healthcare eligibility information.32 In addition, the Network Advertising Initiative has established monitoring and accountability mechanisms to encourage compliance with these rules.33

Similar efforts can be found in the mobile space with the recently announced GSM Association privacy principles.34 In the online identity arena, the Open Identity Exchange was formed as a not-for-profit to support private sector-led legal rules and policy development, and to foster an open market for identity and privacy products and services that address online data and identity challenges.35

Governments are also increasingly using personal data for law enforcement and national security, monitoring SMS messages, blogs, social network posts and geolocation data to fight criminal and terrorist activity; this is raising significant surveillance and privacy concerns. A 2011 report by the Brookings Institution noted that rapidly declining storage costs make it technologically and financially feasible for authoritarian governments “to record nearly everything that is said or done within their borders – every phone conversation, electronic message, social media interaction, the movements of nearly every person and vehicle, and video from every street corner”.36

At the same time, governments are trying to improve service delivery and achieve significant cost savings by leveraging personal data. For example, a recent World Economic Forum report estimated that emerging market governments could save up to US$ 100 billion per year through increased use of mobile financial services and more efficient use of personal data.37

Chapter 1: The Personal Data Landscape

31 http://www.aboutads.info/obaprinciples
32 http://www.aboutads.info/msdprinciples
33 http://www.networkadvertising.org/pdfs/NAI_2011_Compliance_Release.pdf
34 http://www.gsma.com/mobile-privacy-principles
35 http://openidentityexchange.org
36 Villasenor, John. “Recording Everything: Digital Storage as an Enabler of Authoritarian Governments”. Brookings Institution, 2011.
37 Galvanizing Support: The Role of Government in Advancing Mobile Financial Services. World Economic Forum, March 2012.

Re-establishing Trust in a Complex Ecosystem

The existing dialogue about personal data is currently anchored in fear, uncertainty and doubt. This fear is potentially made worse by the increasingly short cycle between the discovery of an event or vulnerability and widespread media coverage: a researcher discovers that a website or mobile app has used data improperly, the mainstream press picks up the story and sets off a popular firestorm, and politicians come under pressure to react.

While exposing company missteps is clearly important, the focus on fear and concern could result in a reactive and one-sided response. Unintended consequences could reduce the opportunities for value creation. It seems to be an intractable problem: how to create the rules and tools so that all stakeholders can capture the value from data in a trusted way.

A Social Network that Pays You

Chime.in lets users create pages about their own interests—and plans to give them a cut of the resulting ad revenue.

Tuesday, November 8, 2011

For all the differences among them, the juggernauts of social media rely on a common business model: create free services, then sell ads against users' information. In a dramatic departure, a new social network plans to give its users a 50 percent commission—or even let them sell their own ads and keep all the revenue.

Chime.in is built around users' interests—think photography, politics, or travel—as opposed to friends, professional contacts, or news. The site's founders hope that by creating pages around those interests, the users will attract people with similar affinities, an attractive combination for targeted advertising.

"Because social is going to be so powerful, I feel that the people who are creating the engaging social content should have some stake," says Bill Gross, the serial entrepreneur who is the CEO of both Idealab, a startup incubator, and Ubermedia, a social media developer that launched Chime.in. "Right now that's sort of a heresy—but I almost like it that people think it's heresy. It gives me more of a lead."

Gross is no stranger to creating disruptive business models. The pay-per-click concept for advertising in search listings was born in 1998 at his startup Goto.com, a search engine that was later renamed Overture and sold to Yahoo in 2003 for $1.6 billion. "It took five years [to go from calling Overture] heresy to 'We want to own it,'" Gross recalls.

Chime.in launched last month in beta; the site will officially launch at the end of this year, with the advertising model kicking into gear in 2012. In terms of technology, Chime.in is a highly derivative platform. It most resembles Facebook, with a string of posts and comments beneath them.
Like Twitter, the content is public by default, and users can follow anyone (no friending required), but posts can be longer: 2,000 characters. Users can vote on content, much as they do on Digg. But the emphasis is on users' interests; after joining existing groups or creating their own, users can sort content by those interests. Chime.in says the site is now home to 5,000 interest-based groups that have so far shared more than 25 million "chimes."

In a sense, Chime.in is offering a social-networking version of Web-publishing platforms like WordPress, with full social features like reposting and comment threads. And each post or "chime"—often with a photo or video—fits nicely on a smart-phone screen. The overall idea is that this technology—as well as a promise of at least 50 percent of all ad revenue—will prod people to add and develop state-of-the-art content that other people find trustworthy. With time spent on social networks rising and time on search engines falling, "more and more people will make decisions based on social cues from people they trust, than from something they found on a search engine," Gross says.

The ads would appear on a person's personal profile page, or on a community page created by an individual, brand, or celebrity. Whoever created the page would get 50 percent of the revenue from any advertising Chime.in placed there. Other Chime.in users, most likely companies, could also place ads themselves on their own pages, and collect 100 percent. Gross estimates that some successful pages eventually could bring in thousands of dollars in ad revenue. The site already has gotten several entertainment companies to set up their own Chime.in pages, including E! Entertainment, Universal Pictures, Walt Disney Studios, and Bravo.

The site is not without glitches. I created two accounts: one through Facebook (which let Chime.in search my Facebook profile information) and another directly through Chime.in. Each time it offered me a rather strange, seemingly random collection of 11 interests from which to initially choose: Apple, autos, blogworld, blogging, celebrity chefs & restaurants, comic books & superheroes, Google, marketing & advertising, macro photography, and music discovery. I chose "music discovery" and "autos," but I landed in the "macro photo" group.

Still, this meant I got to meet my first follower: Kayla Connelly, of Moosic, Pennsylvania, a prodigious Chimer and macro photographer. I followed her, too, and was soon enjoying her intimate portraits of vodka labels, grilled-cheese sandwiches, and snow-dusted angel statues. I also learned she likes Coldplay. Her post of a white Christmas-tree light was captioned with this purloined lyric: "Lights will guide you home and ignite your bones, and I will try to fix you."

I wondered why Connelly was Chiming. Finding no way to e-mail her directly within Chime.in, I posted a comment under one of her photos (of paint pots) and disclosed my journalistic purpose. I asked her why she would bother with Chime.in; we already have Facebook, Twitter, Google+, Digg, and many others. "Immediately addicted!" she replied. "So much easier to connect with knowledgeable users with similar interests and get feedback."

Poking around the site, I saw groups working on crowdsourced efforts. One such group is writing a work of fiction. It's called "The Great Story." Here's one of the latest passages: "Chapter 32: Suddenly, I feel a sharp pain in both the right side of my head and in my left triceps. Everything is spinning and my vision is blurry. The pain in my head greatly intensifies before ..."

Within a few minutes of my chat with Connelly, I heard from Chime.in's PR team. It turns out that Chime.in has community managers who do "human curation" of the content, to bring out the higher-quality material. One such manager—who I later learned was Joy Hepp, an "expert on Mexican travel with five Frommer's titles under her belt"—had alerted the authorities to my inquiry. That curated setup has some advantages, but the long-term success of Chime.in will likely depend on users eventually being able to create and manage high-quality, spam-free content without such assistance.

Can the Middle Class Be Saved?

The Great Recession has accelerated the hollowing-out of the American middle class. And it has illuminated the widening divide between most of America and the super-rich. Both developments herald grave consequences. Here is how we can bridge the gap between us. http://www.theatlantic.com/magazine/archive/2011/09/can-the-middle-class-be-saved/8600/

In October 2005, three Citigroup analysts released a report describing the pattern of growth in the U.S. economy. To really understand the future of the economy and the stock market, they wrote, you first needed to recognize that there was “no such animal as the U.S. consumer,” and that concepts such as “average” consumer debt and “average” consumer spending were highly misleading.

In fact, they said, America was composed of two distinct groups: the rich and the rest. And for the purposes of investment decisions, the second group didn’t matter; tracking its spending habits or worrying over its savings rate was a waste of time. All the action in the American economy was at the top: the richest 1 percent of households earned as much each year as the bottom 60 percent put together; they possessed as much wealth as the bottom 90 percent; and with each passing year, a greater share of the nation’s treasure was flowing through their hands and into their pockets. It was this segment of the population, almost exclusively, that held the key to future growth and future returns. The analysts, Ajay Kapur, Niall Macleod, and Narendra Singh, had coined a term for this state of affairs: plutonomy.

In a plutonomy, Kapur and his co-authors wrote, “economic growth is powered by and largely consumed by the wealthy few.” America had been in this state twice before, they noted—during the Gilded Age and the Roaring Twenties. In each case, the concentration of wealth was the result of rapid technological change, global integration, laissez-faire government policy, and “creative financial innovation.” In 2005, the rich were nearing the heights they’d reached in those previous eras, and Citigroup saw no good reason to think that, this time around, they wouldn’t keep on climbing. “The earth is being held up by the muscular arms of its entrepreneur-plutocrats,” the report said. The “great complexity” of a global economy in rapid transformation would be “exploited best by the rich and educated” of our time.

Kapur and his co-authors were wrong in some of their specific predictions about the plutonomy’s ramifications—they argued, for instance, that since spending was dominated by the rich, and since the rich had very healthy balance sheets, the odds of a stock-market downturn were slight, despite the rising indebtedness of the “average” U.S. consumer. And their division of America into only two classes is ultimately too simple. Nonetheless, their overall characterization of the economy remains resonant. According to Gallup, from May 2009 to May 2011, daily consumer spending rose by 16 percent among Americans earning more than $90,000 a year; among all other Americans, spending was completely flat. The consumer recovery, such as it is, appears to be driven by the affluent, not by the masses. Three years after the crash of 2008, the rich and well educated are putting the recession behind them. The rest of America is stuck in neutral or reverse.

Income inequality usually shrinks during a recession, but in the Great Recession, it didn’t. From 2007 to 2009, the most-recent years for which data are available, it widened a little. The top 1 percent of earners did see their incomes drop more than those of other Americans in 2008. But that fall was due almost entirely to the stock-market crash, and with it a 50 percent reduction in realized capital gains. Excluding capital gains, top earners saw their share of national income rise even in 2008. And in any case, the stock market has since rallied. Corporate profits have marched smartly upward, quarter after quarter, since the beginning of 2009.

Even in the financial sector, high earners have come back strong. In 2009, the country’s top 25 hedge-fund managers earned $25 billion among them—more than they had made in 2007, before the crash. And while the crisis may have begun with mass layoffs on Wall Street, the financial industry has remained well shielded compared with other sectors; from the first quarter of 2007 to the first quarter of 2010, finance shed 8 percent of its jobs, compared with 27 percent in construction and 17 percent in manufacturing. Throughout the recession, the unemployment rate in finance and insurance has been substantially below that of the nation overall.

It’s hard to miss just how unevenly the Great Recession has affected different classes of people in different places. From 2009 to 2010, wages were essentially flat nationwide—but they grew by 11.9 percent in Manhattan and 8.7 percent in Silicon Valley. In the Washington, D.C., and San Jose (Silicon Valley) metro areas—both primary habitats for America’s meritocratic winners—job postings in February of this year were almost as numerous as job candidates. In Miami and Detroit, by contrast, for every job posting, six people were unemployed. In March, the national unemployment rate was 12 percent for people with only a high-school diploma, 4.5 percent for college grads, and 2 percent for those with a professional degree. Housing crashed hardest in the exurbs and in more-affordable, once fast-growing areas like Phoenix, Las Vegas, and much of Florida—all meccas for aspiring middle-class families with limited savings and education. The professional class, clustered most densely in the closer suburbs of expensive but resilient cities like San Francisco, Seattle, Boston, and Chicago, has lost little in comparison. And indeed, because the stock market has rebounded while housing values have not, the middle class as a whole has seen more of its wealth erased than the rich, who hold more-diverse portfolios. A 2010 Pew study showed that the typical middle-class family had lost 23 percent of its wealth since the recession began, versus just 12 percent in the upper class.

The ease with which the rich and well educated have shrugged off the recession shouldn’t be surprising; strong winds have been at their backs for many years. The recession, meanwhile, has restrained wage growth and enabled faster restructuring and offshoring, leaving many corporations with lower production costs and higher profits—and their executives with higher pay.

Anthony Atkinson, an economist at Oxford University, has studied how several recent financial crises affected income distribution—and found that in their wake, the rich have usually strengthened their economic position. Atkinson examined the financial crises that swept Asia in the 1990s as well as those that afflicted several Nordic countries in the same decade. In most cases, he says, the middle class suffered depressed income for a long time after the crisis, while the top 1 percent were able to protect themselves—using their cash reserves to buy up assets very cheaply once the market crashed, and emerging from crisis with a significantly higher share of assets and income than they’d had before. “I think we’ve seen the same thing, to some extent, in the United States” since the 2008 crash, he told me. “Mr. Buffett has been investing.”

“The rich seem to be on the road to recovery,” says Emmanuel Saez, an economist at Berkeley, while those in the middle, especially those who’ve lost their jobs, “might be permanently hit.” Coming out of the deep recession of the early 1980s, Saez notes, “you saw an increase in inequality … as the rich bounced back, and unionized labor never again found jobs that paid as well as the ones they’d had. And now I fear we’re going to see the same phenomenon, but more dramatic.” Middle-paying jobs in the U.S., in which some workers have been overpaid relative to the cost of labor overseas or technological substitution, “are being wiped out. And what will be left is a hard and a pure market,” with the many paid less than before, and the few paid even better—a plutonomy strengthened in the crucible of the post-crash years.

The Culling of the Middle Class

One of the most salient features of severe downturns is that they tend to accelerate deep economic shifts that are already under way. Declining industries and companies fail, spurring workers and capital toward rising sectors; declining cities shrink faster, leaving blight; workers whose roles have been partly usurped by technology are pushed out en masse and never asked to return. Some economists have argued that in one sense, periods like these do nations a service by clearing the way for new innovation, more-efficient production, and faster growth. Whether or not that’s true, they typically allow us to see, with rare and brutal clarity, where society is heading—and what sorts of people and places it is leaving behind.

Arguably, the most important economic trend in the United States over the past couple of generations has been the ever more distinct sorting of Americans into winners and losers, and the slow hollowing-out of the middle class. Median incomes declined outright from 1999 to 2009. For most of the aughts, that trend was masked by the housing bubble, which allowed working-class and middle-class families to raise their standard of living despite income stagnation or downward job mobility. But that fig leaf has since blown away. And the recession has pressed hard on the broad center of American society.

“The Great Recession has quantitatively but not qualitatively changed the trend toward employment polarization” in the United States, wrote the MIT economist David Autor in a 2010 white paper. Job losses have been “far more severe in middle-skilled white- and blue-collar jobs than in either high-skill, white-collar jobs or in low-skill service occupations.” Indeed, from 2007 through 2009, total employment in professional, managerial, and highly skilled technical positions was essentially unchanged. Jobs in low-skill service occupations such as food preparation, personal care, and house cleaning were also fairly stable. Overwhelmingly, the recession has destroyed the jobs in between. Almost one of every 12 white-collar jobs in sales, administrative support, and nonmanagerial office work vanished in the first two years of the recession; one of every six blue-collar jobs in production, craft, repair, and machine operation did the same.

Autor isolates the winnowing of middle-skill, middle-class jobs as one of several labor-market developments that are profoundly reshaping U.S. society. The others are rising pay at the top, falling wages for the less educated, and “lagging labor market gains for males.” “All,” he writes, “predate the Great Recession. But the available data suggest that the Great Recession has reinforced these trends.”

For more than 30 years, the American economy has been in the midst of a sea change, shifting from industry to services and information, and integrating itself far more tightly into a single, global market for goods, labor, and capital. To some degree, this transformation has felt disruptive all along. But the pace of the change has quickened since the turn of the millennium, and even more so since the crash. Companies have figured out how to harness exponential increases in computing power better and faster. Global supply chains, meanwhile, have grown both tighter and more supple since the late 1990s—the result of improving information technology and of freer trade—making routine work easier to relocate. And of course China, India, and other developing countries have fully emerged as economic powerhouses, capable of producing large volumes of high-value goods and services.

Some parts of America’s transformation may now be nearing completion. For decades, manufacturing has become continually less important to the economy, as other business sectors have grown. But the popular narrative—rapid decline in the 1970s and ’80s, followed by slow erosion thereafter—isn’t quite right, at least as far as employment goes. In fact, the total number of people employed in industry remained quite stable from the late 1960s through about 2000, at roughly 17 million to 19 million. To be sure, manufacturing wasn’t providing many new jobs for a growing population, but for decades, rising output essentially offset the impact of labor-saving technology and offshoring.

But since 2000, U.S. manufacturing has shed about a third of its jobs. Some of that decline reflects losses to China. Still, industry isn’t about to vanish from America, any more than agriculture did as the number of farm workers plummeted during the 20th century. As of 2010, the United States was the second-largest manufacturer in the world, and the No. 3 agricultural nation. But agriculture is now so mechanized that only about 2 percent of American workers make a living as farmers. American manufacturing looks to be heading down the same path.

Meanwhile, another phase of the economy’s transformation—one more squarely involving the white-collar workforce—is really just beginning. “The thing about information technology,” Autor told me, “is that it’s extremely broadly applicable, it’s getting cheaper all the time, and we’re getting better and better at it.” Computer software can now do boilerplate legal work, for instance, and make a first pass at reading X-rays and other medical scans. Likewise, thanks to technology, we can now easily have those scans read and interpreted by professionals half a world away.

In 2007, the economist Alan Blinder, a former vice chairman of the Federal Reserve, estimated that between 22 and 29 percent of all jobs in the United States had the potential to be moved overseas within the next couple of decades. With the recession, the offshoring of jobs only seems to have gained steam. The financial crisis of 2008 was global, but job losses hit America especially hard. According to the International Monetary Fund, one of every four jobs lost worldwide was lost in the United States. And while unemployment remains high in America, it has come back down to (or below) pre-recession levels in countries like China and Brazil.

Anxiety Creeps Upward

Over time, both trade and technology have increased the number of low-cost substitutes for American workers with only moderate cognitive or manual skills—people who perform routine tasks such as product assembly, process monitoring, record keeping, basic information brokering, simple software coding, and so on. As machines and low-paid foreign workers have taken on these functions, the skills associated with them have become less valuable, and workers lacking higher education have suffered.

For the most part, these same forces have been a boon, so far, to Americans who have a good education and exceptional creative talents or analytic skills. Information technology has complemented the work of people who do complex research, sophisticated analysis, high-end deal-making, and many forms of design and artistic creation, rather than replacing that work. And global integration has meant wider markets for new American products and high-value services—and higher incomes for the people who create or provide them.

The return on education has risen in recent decades, producing more-severe income stratification. But even among the meritocratic elite, the economy’s evolution has produced a startling divergence. Since 1993, more than half of the nation’s income growth has been captured by the top 1 percent of earners, and the gains have grown larger over time: from 2002 to 2007, out of every three dollars of national income growth, the top 1 percent of earners captured two. Nearly 2 million people started college in 2002—1,630 of them at Harvard—but among them only Mark Zuckerberg is worth more than $10 billion today; the rise of the super-elite is not a product of educational differences. In part, it is a natural outcome of widening markets and technological revolution, which are creating much bigger winners much faster than ever before—a result that’s not even close to being fully played out, and one reinforced strongly by the political influence that great wealth brings.

Recently, as technology has improved and emerging-market countries have sent more people to college, economic pressures have been moving up the educational ladder in the United States. “It’s useful to make a distinction between college and post-college,” Autor told me. “Among people with professional and even doctoral [degrees], in general the job market has been very good for a very long time, including recently. The group of highly educated individuals who have not done so well recently would be people who have a four-year college degree but nothing beyond that. Opportunities have been less good, wage growth has been less good, the recession has been more damaging. They’ve been displaced from mid-managerial or organizational positions where they don’t have extremely specialized, hard-to-find skills.”

College graduates may be losing some of their luster for reasons beyond technology and trade. As more Americans have gone to college, Autor notes, the quality of college education has become arguably more inconsistent, and the signaling value of a degree from a nonselective school has perhaps diminished. Whatever the causes, “a college degree is not the kind of protection against job loss or wage loss that it used to be.”

Without doubt, it is vastly better to have a college degree than to lack one. Indeed, on a relative basis, the return on a four-year degree is near its historic high. But that’s largely because the prospects facing people without a college degree have been flat or falling. Throughout the aughts, incomes for college graduates barely budged. In a decade defined by setbacks, perhaps that should occasion a sort of wan celebration. “College graduates aren’t doing badly,” says Timothy Smeeding, an economist at the University of Wisconsin and an expert on inequality. But “all the action in earnings is above the B.A. level.”

America’s classes are separating and changing. A tiny elite continues to float up and away from everyone else. Below it, suspended, sits what might be thought of as the professional middle class—unexceptional college graduates for whom the arrow of fortune points mostly sideways, and an upper tier of college graduates and postgraduates for whom it points progressively upward, but not spectacularly so. The professional middle class has grown anxious since the crash, and not without reason. Yet these anxieties should not distract us from a second, more important, cleavage in American society—the one between college graduates and everyone else.

If you live and work in the professional communities of Boston or Seattle or Washington, D.C., it is easy to forget that nationwide, even among people ages 25 to 34, college graduates make up only about 30 percent of the population. And it is easy to forget that a family income of $113,000 in 2009 would have put you in the 80th income percentile nationally. The true center of American society has always been its nonprofessionals—high-school graduates who didn’t go on to get a bachelor’s degree make up 58 percent of the adult population. And as manufacturing jobs and semiskilled office positions disappear, much of this vast, nonprofessional middle class is drifting downward.

The Bottom 70 Percent

The troubles of the nonprofessional middle class are inseparable from the economic troubles of men. Consistently, men without higher education have been the biggest losers in the economy’s long transformation (according to Michael Greenstone, an economist at MIT, real median wages of men have fallen by 32 percent since their peak in 1973, once you account for the men who have washed out of the workforce altogether). And the struggles of men have amplified the many problems—not just economic, but social and cultural—facing the country today.

Just as the housing bubble papered over the troubles of the middle class, it also hid, for a time, the declining prospects of many men. According to the Harvard economist Lawrence Katz, since the mid-1980s, the labor market has been placing a higher premium on creative, analytic, and interpersonal skills, and the wages of men without a college degree have been under particular pressure. “And I think this downturn exacerbates” the problem, Katz told me. During the aughts, construction provided an outlet for the young men who would have gone into manufacturing a generation ago. Men without higher education “didn’t do as badly as you might have expected, on long-run trends, because of the housing bubble.” But it’s hard to imagine another such construction boom coming to their rescue.

One of the great puzzles of the past 30 years has been the way that men, as a group, have responded to the declining market for blue-collar jobs. Opportunities have expanded for college graduates over that span, and for nongraduates, jobs have proliferated within the service sector (at wages ranging from rock-bottom to middling). Yet in the main, men have pursued neither higher education nor service jobs. The proportion of young men with a bachelor’s degree today is about the same as it was in 1980. And as the sociologists Maria Charles and David Grusky noted in their 2004 book, Occupational Ghettos, while men and women now mix more easily on different rungs of the career ladder, many industries and occupations have remained astonishingly segregated, with men continuing to seek work in a dwindling number of manual jobs, and women “crowding into nonmanual occupations that, on average, confer more pay and prestige.”

As recently as 2001, U.S. manufacturing still employed about as many people as did health and educational services combined (roughly 16 million). But since then, those latter, female-dominated sectors have added about 4 million jobs, while manufacturing has lost about the same number. Men made no inroads into health care or education during the aughts; in 2009, they held only about one in four jobs in those rising sectors, just as they had at the beginning of the decade. They did, however, consolidate their hold on manufacturing—those dwindling jobs, along with jobs in construction, transportation, and utilities, were more heavily dominated by men in 2009 than they’d been nine years earlier.

“I’m deeply concerned” about the prospects of less-skilled men, says Bruce Weinberg, an economist at Ohio State. In 1967, 97 percent of 30-to-50-year-old American men with only a high-school diploma were working; in 2010, just 76 percent were. Declining male employment is not unique to the United States. It’s been happening in almost all rich nations, as they’ve put the industrial age behind them. Weinberg’s research has shown that in occupations in which “people skills” are becoming more important, jobs are skewing toward women. And that category is large indeed. In his working paper “People People,” Weinberg and two co-authors found that interpersonal skills typically become more highly valued in occupations in which computer use is prevalent and growing, and in which teamwork is important. Both computer use and teamwork are becoming ever more central to the American workplace, of course; the restructuring that accompanied the Great Recession has only hastened that trend.

Needless to say, a great many men have excellent people skills, just as a great many men do well in school. As a group, men still make more money than women, in part due to lingering discrimination. And many of the differences we observe between the genders may be the result of culture rather than genetics. All of that notwithstanding, a meaningful number of men have struggled badly as the economy has evolved, and have shown few signs of successful adaptation. Men’s difficulties are hardly evident in Silicon Valley or on Wall Street. But they’re hard to miss in foundering blue-collar and low-end service communities across the country. It is in these less affluent places that gender roles, family dynamics, and community character are changing in the wake of the crash.

A Cultural Separation

In the March 2010 issue of this magazine, I discussed the wide-ranging social consequences of male economic problems, once they become chronic. Women tend not to marry (or stay married to) jobless or economically insecure men—though they do have children with them. And those children usually struggle when, as typically happens, their parents separate and their lives are unsettled. The Harvard sociologist William Julius Wilson has connected the loss of manufacturing jobs from inner cities in the 1970s—and the resulting economic struggles of inner-city men—to many of the social ills that cropped up afterward. Those social ills eventually became self-reinforcing, passing from one generation to the next. In less privileged parts of the country, a larger, predominantly male underclass may now be forming, and with it, more-widespread cultural problems.

What I didn’t emphasize in that story is the extent to which these sorts of social problems—the kind that can trap families and communities in a cycle of disarray and disappointment—have been seeping into the nonprofessional middle class. In a national study of the American family released late last year, the sociologist W. Bradford Wilcox wrote that among “Middle Americans”—people with a high-school diploma but not a college degree—an array of signals of family dysfunction have begun to blink red. “The family lives of today’s moderately educated Americans,” which in the 1970s closely resembled those of college graduates, now “increasingly resemble those of high-school dropouts, too often burdened by financial stress, partner conflict, single parenting, and troubled children.”

“The speed of change,” wrote Wilcox, “is astonishing.” By the late 1990s, 37 percent of moderately educated couples were divorcing or separating less than 10 years into their first marriage, roughly the same rate as among couples who didn’t finish high school and more than three times that of college graduates. By the 2000s, the percentage in “very happy” marriages—identical to that of college graduates in the 1970s—was also nearing that of high-school dropouts. Between 2006 and 2008, among moderately educated women, 44 percent of all births occurred outside marriage, not far off the rate (54 percent) among high-school dropouts; among college-educated women, that proportion was just 6 percent.

The same pattern—families of middle-class nonprofessionals now resembling those of high-school dropouts more than those of college graduates—emerges with norm after norm: the percentage of 14-year-old girls living with both their mother and father; the percentage of adolescents wanting to attend college “very much”; the percentage of adolescents who say they’d be embarrassed if they got (or got someone) pregnant; the percentage of never-married young adults using birth control all the time.

One stubborn stereotype in the United States is that religious roots are deepest in blue-collar communities and small towns, and, more generally, among Americans who do not have college degrees. That was true in the 1970s. Yet since then, attendance at religious services has plummeted among moderately educated Americans, and is now much more common among college grads. So, too, is participation in civic groups. High-school seniors from affluent households are more likely to volunteer, join groups, go to church, and have strong academic ambitions than seniors used to be, and are as trusting of other people as seniors a generation ago; their peers from less affluent households have become less engaged on each of those fronts. A cultural chasm—which did not exist 40 years ago and which was still relatively small 20 years ago—has developed between the traditional middle class and the top 30 percent of society.

The interplay of economic and cultural forces is complex, and changes in cultural norms cannot be ascribed exclusively to the economy. Wilcox has tried to statistically parse the causes of the changes he has documented, concluding that about a third of the class-based changes in marriage patterns, for instance, are directly attributable to wage stagnation, increased job insecurity, or bouts of unemployment; the rest he attributes to changes in civic and religious participation and broader changes in attitudes among the middle class.

In fact, all of these variables seem to reinforce each other. Nonetheless, some of the most significant cultural changes within the middle class have accelerated in the past decade, as the prospects of the nonprofessional middle class have dimmed. The number of couples who live together but are not married, for instance, has been rising briskly since the 1970s, but it really took off in the aughts—nearly doubling, from 3.8 million to 6.7 million, from 2000 to 2009. From 2009 to 2010, that number jumped by nearly a million more. In six out of 10 of the newly cohabitating couples, at least one person was not working, a much higher proportion than in the past.

Ultimately, the evolution of the meritocracy itself appears to be at least partly responsible for the growing cultural gulf between highly educated Americans and the rest of society. As the journalist Bill Bishop showed in his 2008 book, The Big Sort, American communities have become ever more finely sorted by affluence and educational attainment over the past 30 years, and this sorting has in turn reinforced the divergence in the personal habits and lifestyle of Americans who lack a college degree from those of Americans who have one. In highly educated communities, families are largely intact, educational ideals strong, and good role models abundant. None of those things is a given anymore in communities where college-degree attainment is low. The natural leaders of such communities—the meritocratic winners who do well in school, go off to selective colleges, and get their degrees—generally leave them for good in their early 20s.

In their 2009 book, Creating an Opportunity Society, Ron Haskins and Isabel Sawhill write that while most Americans believe that opportunity is widespread in the United States, and that success is primarily a matter of individual intelligence and skill, the reality is more complicated. In recent decades, people born into the middle class have indeed moved up and down the class ladder readily. Near the turn of the millennium, for instance, middle-aged people who’d been born to middle-class parents had widely varied incomes. But class was stickier among those born to parents who were either rich or poor. Thirty-nine percent of children born to parents in the top fifth of earners stayed in that same bracket as adults. Likewise, 42 percent of those whose parents were in the bottom fifth remained there themselves. Only 6 percent reached the top fifth: rags-to-riches stories were extremely rare.

A thinner middle class, in itself, means fewer stepping stones available to people born into low-income families. If the economic and cultural trends under way continue unabated, class mobility will likely decrease in the future, and class divides may eventually grow beyond our ability to bridge them.

What is most worrying is that all of the most powerful forces pushing on the nonprofessional middle class—economic and cultural—seem to be pushing in the same direction. We cannot know the future, and over time, some of these forces may dissipate of their own accord. Further advances in technology may be less punishing to middle-skill workers than recent advances have been; men may adapt better to a post-industrial economy, as the alternative to doing so becomes more stark; nonprofessional families may find a new stability as they accommodate themselves to changing norms of work, income, and parental roles. Yet such changes are unlikely to occur overnight, if they happen at all. Momentum alone suggests years of trouble for the middle class.

Changing the Path of the American Economy

True recovery from the Great Recession is not simply a matter of jolting the economy back onto its former path; it’s about changing the path. No single action or policy prescription can fix the varied problems facing the middle class today, but through a combination of approaches—some aimed at increasing the growth rate of the economy itself, and some at ensuring that more people are able to benefit from that growth—we can ameliorate them. Many of the deepest economic trends that the recession has highlighted and temporarily sped up will take decades to fully play out. We can adapt, but we have to start now.

The rest of this article suggests how we might do so. The measures that I propose are not comprehensive, nor are they without drawbacks. But they are emblematic of the types of proposals we will need to weigh in the coming years, and of the nature of the national conversation we need to have. That conversation must begin with a reassessment of how globalization is affecting American society, and of what it will take for the U.S. to thrive in a rapidly changing world.

In 2010, the McKinsey Global Institute released a report detailing just how mighty America’s multinational companies are—and how essential they have become to the U.S. economy. Multinationals headquartered in the U.S. employed 19 percent of all private-sector workers in 2007, earned 25 percent of gross private-sector profits, and paid out 25 percent of all private-sector wages. They also accounted for nearly three-quarters of the nation’s private-sector R&D spending. Since 1990, they’ve been responsible for 31 percent of the growth in real GDP.

Yet for all their outsize presence, multinationals have been puny as engines of job creation. Over the past 20 years, they have accounted for 41 percent of all gains in U.S. labor productivity—but just 11 percent of private-sector job gains. And in the latter half of that period, the picture grew uglier: according to the economist Martin Sullivan, from 1999 through 2008, U.S. multinationals actually shrank their domestic workforce by about 1.9 million people, while increasing foreign employment by about 2.4 million.

The heavy footprint of multinational companies is merely one sign of how inseparable the U.S. economy has become from the larger global economy—and these figures neatly illustrate two larger points. First, we can’t wish away globalization or turn our backs on trade; to try to do so would be crippling and impoverishing. And second, although American prosperity is tied to globalization, something has nonetheless gone wrong with the way America’s economy has evolved in response to increasingly dense global connections.

Particularly since the 1970s, the United States has placed its bets on continuous innovation, accepting the rapid transfer of production to other countries as soon as goods mature and their manufacture becomes routine, all with the idea that the creation of even newer products and services at home will more than make up for that outflow. At times, this strategy has paid off big. Rapid innovation in the 1990s allowed the economy to grow quickly and create good, new jobs up and down the ladder to replace those that were becoming obsolete or moving overseas, and enabled strong income growth for most Americans. Yet in recent years, that process has broken down.

One reason, writes the economist Michael Mandel, is that America no longer enjoys the economic fruits of its innovations for as long as it used to. Knowledge, R&D, and business know-how depreciate more quickly now than they did even 15 years ago, because global communication is faster, connections are more seamless, and human capital is more broadly diffused than in the past.

As a result, domestic production booms have ended sooner than they used to. IT-hardware production, for instance, which in 1999 the Bureau of Labor Statistics projected would create about 155,000 new jobs in the U.S. over the following decade, actually shrank by nearly 500,000 jobs in that time. Jobs in data processing also fell, presumably as a result of both offshoring and technological advance. Because innovations now depreciate faster, we need more of them than we used to in order to sustain the same rate of economic growth.

Yet in the aughts, as an array of prominent economists and entrepreneurs have recently pointed out, the rate of big innovations actually slowed considerably; with the housing bubble fueling easy growth for much of that time, we just didn’t notice. This slowdown may have been merely the result of bad luck—big breakthroughs of the sort that create whole categories of products or services are difficult to predict, and long droughts are not unknown. Overregulation in certain areas may also have played a role. The economist Tyler Cowen, in his recent book, The Great Stagnation, argues that the scientific frontier itself—or at least that portion of it leading to commercial innovation—has been moving outward more slowly, and requiring ever more resources to do so, for many decades.

Process innovation has been quite rapid in recent years. U.S. multinationals and other companies are very good at continually improving their operational efficiency by investing in information technology, restructuring operations, and shifting work around the globe. Some of these activities benefit some U.S. workers, by making the jobs that stay in the country more productive. But absent big breakthroughs that lead to new products or services—and given the vast reserves of low-wage but increasingly educated labor in China, India, and elsewhere—rising operational efficiency hasn’t been a recipe for strong growth in either jobs or wages in the United States.

America has huge advantages as an innovator. Places like Silicon Valley, North Carolina’s Research Triangle, and the Massachusetts high-tech corridor are difficult to replicate, and the United States has many of them. Foreign students still flock here, and foreign engineers and scientists who get their doctorates here have been staying on for longer and longer over the past 15 years. When you compare apples to apples, the United States still leads the world, handily, in the number of skilled engineers, scientists, and business professionals in residence.

But we need to better harness those advantages to speed the pace of innovation, in part by putting a much higher national priority on investment—rather than consumption—in the coming years. That means, among other things, substantially raising and broadening both national and private investment in basic scientific progress and in later-stage R&D—through a combination of more federal investment in scientific research, perhaps bigger tax breaks for private R&D spending, and a much lower corporate tax rate (and a simpler corporate tax code) overall.

Edmund Phelps and Leo Tilman, professors at Columbia University, have proposed the creation of a National Innovation Bank that would invest in, or lend to, innovative start-ups—bringing more money to bear than venture-capital funds could, and at a lower cost of capital, which would promote more investment and enable the funding of somewhat riskier ventures. The broader idea behind such a bank is that because innovation carries so many ambient benefits—from job creation to the experience gained by even failed entrepreneurs and the people around them—we should be willing to fund it more liberally as a society than private actors would individually.

Removing bureaucratic obstacles to innovation is as important as pushing more public funds toward it. As Wall Street has amply demonstrated, not every industry was overregulated in the aughts. Nonetheless, the decade did see the accretion of a number of regulatory measures that may have chilled the investment climate (the Sarbanes-Oxley accounting reforms and a proliferation of costly security regulations following the creation of the Department of Homeland Security are two prominent examples).

Regulatory balance is always difficult in practice, but Michael Mandel has suggested a useful rule of thumb: where new and emerging industries are concerned—industries that are at the forefront of the economy and could provide big bursts of growth—our bias should be toward light regulation, allowing creative experimentation and encouraging fast growth. The rapid expansion of the Internet in the 1990s is a good example of the benefit that can come from a light regulatory hand early in an industry’s development; green technology, wireless platforms, and social-networking technologies are perhaps worthy of similar treatment today.

Any serious effort to accelerate innovation would mean taking many other actions as well—from redoubling our commitment to improving U.S. schools, to letting in a much larger number of creative, highly skilled immigrants each year. Few such measures will be without costs or drawbacks. Among other problems, a mandate of light regulation on high-potential industries requires the government to “pick winners.” Tilting government spending toward investment and innovation probably means tilting it away from defense and programs aimed at senior citizens. And because the benefits of innovation diffuse more quickly now, the return on national investment in scientific research and commercial innovation may be lower than it was in previous decades. Despite these drawbacks and trade-offs, the alternative to heavier investment and a higher priority on national innovation is dismal to contemplate.

As we strive toward faster innovation, we also need to keep the production of new, high-value goods within American borders for a longer period of time. Protectionist measures are generally self-defeating, and while vigilance against the theft of intellectual property and strong sanctions when such theft is discovered are sensible, they are unlikely to alter the basic trends of technological and knowledge diffusion. (Much of that diffusion is entirely legal, and the long history of industrialization and globalization suggests that attempts to halt it will fail.) What can really matter is a fair exchange rate. Throughout much of the aughts and continuing to the present day, China, in particular, has taken extraordinary measures to keep its currency undervalued relative to the dollar, and this has harmed U.S. industry. We must press China on currency realignment, putting sanctions on the table if necessary.

Given some of the workforce trends of the past decade, doubling down on technology, innovation, and globalization may seem wrongheaded. And indeed, this strategy is no cure-all. But without a vibrant, innovative economy, all other prospects dim. For the professional middle class in particular, an uptick in innovation and a return to faster economic growth would solve many problems, and likely reignite income growth. While technology is eating into the work that some college graduates do, their general skills show little sign of losing value. Recent analysis by the McKinsey Global Institute, for instance, indicates that demand for college grads by American businesses is likely to grow quickly over the next decade even if the economy grows very slowly; rapid economic growth would cause demand for college grads to far exceed supply.

Still, even in boom times, many more people than we would care to acknowledge won’t have the education, skills, or abilities to prosper in a pure and globalized market, shaped by enormous labor reserves in China, India, and other developing countries. Over the next decade or more, even if national economic growth is strong, what we do to help and support moderately educated Americans may well determine whether the United States remains a middle-class country.

Filling the Hole in the Middle Class

In The Race Between Education and Technology, the economists Claudia Goldin and Lawrence Katz write that throughout roughly the first three-quarters of the 20th century, most Americans prospered and inequality fell because, although technological advance was rapid—and mostly biased toward people with relatively high skills—educational advance was faster still; the pool of people who could take advantage of new technologies kept growing larger, while the pool of those who could not stayed relatively small.

There would be no better tonic for the country’s recent ills than a resumption of the rapid advance of skills and abilities throughout the population. Clearly there is room for improvement. About 30 percent of young adults finish college today, yet that figure is 50 percent among those with affluent parents. It follows that with improvements in the K–12 school system, more-stable home environments, and widespread financial access to college, we eventually could move to a 50 percent college graduation rate overall. And because IQ worldwide has been slowly increasing from generation to generation—a somewhat mysterious development known as the “Flynn effect”—higher rates still may eventually come within reach.

Yet the past three decades of experience suggest that this upward migration, even to, say, 40 percent, will be slow and difficult. (From 1979 to 2009, the percentage of people ages 25 to 29 with a four-year college degree rose from 23.1 percent to 30.6 percent—or roughly 1 percentage point every four years.) And ultimately, of course, the college graduation rate is likely to hit a substantially lower ceiling than that for high school or elementary school. For a time, elementary school was the answer to the question of how to build a broad middle class in America. And for a time after that, the answer was high school. College may never provide as comprehensive an answer. At the very least, over the next decade or two, college education simply cannot be the whole answer to the woes of the middle class, since even under the rosiest of assumptions, most of the middle of society will not have a four-year college degree.

Among the more pernicious aspects of the meritocracy as we now understand it in the United States is the equation of merit with test-taking success, and the corresponding belief that those who struggle in the classroom should expect to achieve little outside it. Progress along the meritocratic path has become measurable from a very early age. This is a narrow way of looking at human potential, and it badly underserves a large portion of the population. We have beaten the drum so loudly and for so long about the centrality of a college education that we should not be surprised when people who don’t attend college—or those who start but do not finish—go adrift at age 18 or 20. Grants, loans, and tax credits to undergraduate and graduate students total roughly $160 billion each year; by contrast, in 2004, federal, state, and local spending on employment and training programs (which commonly assist people without a college education) totaled $7 billion—an inflation-adjusted decline of about 75 percent since 1978.

As we continue to push for better K–12 schooling and wider college access, we also need to build more paths into the middle class that do not depend on a four-year college degree. One promising approach, as noted by Haskins and Sawhill, is the development of “career academies”—schools of 100 to 150 students, within larger high schools, offering a curriculum that mixes academic coursework with hands-on technical courses designed to build work skills. Some 2,500 career academies are already in operation nationwide. Students attend classes together and have the same guidance counselors; local employers partner with the academies and provide work experience while the students are still in school.

“Vocational training” programs have a bad name in the United States, in part because many people assume they close off the possibility of higher education. But in fact, career-academy students go on to earn a postsecondary credential at the same rate as other high-school students. What’s more, they develop firmer roots in the job market, whether or not they go on to college or community college. One recent major study showed that on average, men who attended career academies were earning significantly more than those who attended regular high schools, both four and eight years after graduation. They were also 33 percent more likely to be married and 36 percent less likely to be absentee fathers.

Career-academy programs should be expanded, as should apprenticeship programs (often affiliated with community colleges) and other, similar programs that are designed to build an ethic of hard work; to allow young people to develop skills and achieve goals outside the traditional classroom as well as inside it; and ultimately to provide more, clearer pathways into real careers. By giving young people more information about career possibilities and a tangible sense of where they can go in life and what it takes to get there, these types of programs are likely to lead to more-motivated learning, better career starts, and a more highly skilled workforce. Their effect on boys in particular is highly encouraging. And to the extent that they can expose boys to opportunities within growing fields like health care (and also expose them to male role models within those fields), these programs might even help weaken the grip of the various stereotypes that seem to be keeping some boys locked into declining parts of the economy.

Even in the worst of scenarios, “middle skill” jobs are not about to vanish altogether. Many construction jobs and some manufacturing jobs will return. And there are many, many middle-income occupations—from EMTs, lower-level nurses, and X-ray technicians, to plumbers and home remodelers—that trade and technology cannot readily replace, and these fields are likely to grow. A more highly skilled workforce will allow faster, more efficient growth; produce better-quality goods and services; and earn higher pay.

All of that said, the overall pattern of change in the U.S. labor market suggests that in the next decade or more, a larger proportion of Americans may need to take work in occupations that have historically required little skill and paid low wages. Analysis by David Autor indicates that from 1999 to 2007, low-skill jobs grew substantially as a share of all jobs in the United States. And while the lion’s share of jobs lost during the recession were middle-skill jobs, job growth since then has been tilted steeply toward the bottom of the economy; according to a survey by the National Employment Law Project, three-quarters of American job growth in 2010 came within industries paying, on average, less than $15 an hour. One of the largest challenges that Americans will face in the coming years will be doing what we can to make the jobs that have traditionally been near the bottom of the economy better, more secure, and more fulfilling—in other words, more like middle-class jobs.

As the urban theorist Richard Florida writes in The Great Reset, part of that process may be under way already. A growing number of companies have been rethinking retail-workforce development, to improve productivity and enhance the customer experience, leading to more-enjoyable jobs and, in some cases, higher pay. Whole Foods Market, for instance, one of Fortune magazine’s “Best Companies to Work For,” organizes its workers into teams and gives them substantial freedom as to how they go about their work; after a new worker has been on the job for 30 days, the team members vote on whether the new employee has embraced the job and the culture, and hence whether he or she should be kept on. Best Buy actively encourages all its employees to suggest improvements to the company’s work processes, much as Toyota does, and favors promotion from within. Trader Joe’s sets wages so that full-time employees earn at least a median income within their community; store captains, most of them promoted from within, can earn six figures.

The natural evolution of the economy will surely make some service jobs more productive, independent, and enjoyable over time. Yet productivity improvements at the bottom of the economy seem unlikely to be a sufficient answer to the problems of the lower and middle classes, at least for the foreseeable future. Indeed, the relative decline of middle-skill jobs, combined with slow increases in college completion, suggests a larger pool of workers chasing jobs in retail, food preparation, personal care, and the like—and hence downward pressure on wages.

Whatever the unemployment rate over the next several years, the long-term problem facing American society is not that employers will literally run out of work for people to do—it’s that the market value of much low-skill and some middle-skill work, and hence the wages employers can offer, may be so low that few American workers will strongly commit to that work. Bad jobs at rock-bottom wages are a primary reason why so many people at the lower end of the economy drift in and out of work, and this job instability in turn creates highly toxic social and family problems.

American economists on both the right and the left have long advocated subsidizing low-wage work as a means of social inclusion—offering an economic compact with everyone who embraces work, no matter their level of skill. The Earned Income Tax Credit, begun in 1975 and expanded several times since then, does just that, and has been the country’s best anti-poverty program. Yet by and large, the EITC helps only families with children. In 2008, it provided a maximum credit of nearly $5,000 to families with two children, with the credit slowly phasing out for incomes above $15,740 and disappearing altogether at $38,646. The maximum credit for workers without children (or without custody of children) was only $438. We should at least moderately increase both the level of support offered to families by the EITC and the maximum income to which it applies. Perhaps more important, we should offer much fuller support for workers without custody of children. That’s a matter of basic fairness. But it’s also a measure that would directly target some of the biggest budding social problems in the United States today. A stronger reward for work would encourage young, less-skilled workers—men in particular—to develop solid, early connections to the workforce, improving their prospects. And better financial footing for young, less-skilled workers would increase their marriageability.
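The phase-out mechanics described above can be sketched as a simple calculation. This is an illustrative approximation built only from the 2008 two-child figures quoted in the text (a maximum credit of nearly $5,000, shrinking above $15,740 in income and disappearing at $38,646); the real schedule also phases the credit *in* at very low incomes, which this sketch ignores, and the linear phase-out rate here is inferred, not official:

```python
def eitc_estimate(income: float) -> float:
    """Rough sketch of the 2008 two-child EITC phase-out described
    in the text. Simplified: the actual credit also phases in at
    low incomes, which is omitted here for clarity."""
    MAX_CREDIT = 5000.0        # "nearly $5,000" for two children (approximate)
    PHASEOUT_START = 15_740.0  # credit begins shrinking above this income
    PHASEOUT_END = 38_646.0    # credit disappears altogether here
    if income <= PHASEOUT_START:
        return MAX_CREDIT
    if income >= PHASEOUT_END:
        return 0.0
    # Linear phase-out between the two thresholds: the implied rate is
    # MAX_CREDIT / (PHASEOUT_END - PHASEOUT_START), roughly 22 cents of
    # credit lost per dollar of additional income.
    rate = MAX_CREDIT / (PHASEOUT_END - PHASEOUT_START)
    return MAX_CREDIT - rate * (income - PHASEOUT_START)
```

The steep implied phase-out rate is itself part of the policy debate: a worker in the phase-out range faces an effective marginal tax surcharge of about 22 percent on each extra dollar earned, which is one reason expansions of the credit must weigh generosity against work incentives.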

A continued push for better schooling, the creation of clearer paths into careers for people who don’t immediately go to college, and stronger support for low-wage workers—together, these measures can help mitigate the economic cleavage of U.S. society, strengthening the middle. They would hardly solve all of society’s problems, but they would create the conditions for more-predictable and more-comfortable lives—all harnessed to continuing rewards for work and education. These, ultimately, are the most-critical preconditions for middle-class life and a healthy society.

The Limits of Meritocracy

As a society, we should be far more concerned about whether most Americans are getting ahead than about the size of the gains at the top. Yet extreme income inequality causes a cultural separation that is unhealthy on its face and corrosive over time. And the most-powerful economic forces of our times will likely continue to concentrate wealth at the top of society and to put more pressure on the middle. It is hard to imagine an adequate answer to the problems we face that doesn’t involve greater redistribution of wealth.

Soaking the rich would hardly solve all of America’s problems. Holding all else equal, we would need to raise the top two tax rates to roughly 90 percent, then unrealistically assume no change in the work habits of the people in those brackets, merely to bring the deficit in a typical year down to 2 percent of GDP. But even with strong budget discipline and a reduction in the growth of Medicare costs, somewhat higher taxes for most Americans—in one form or another—seem inevitable. If we aim to increase our national investment in innovation, and to provide more assistance to people who are falling out of the middle class (or who can’t step up into it), that’s even more true. The professional middle class in particular should not expect exemption from tax increases.

Over time, the United States has expected less and less of its elite, even as society has oriented itself in a way that is most likely to maximize their income. The top income-tax rate was 91 percent in 1960, 70 percent in 1980, 50 percent in 1986, and 39.6 percent in 2000, and is now 35 percent. Income from investments is taxed at a rate of 15 percent. The estate tax has been gutted.

High earners should pay considerably more in taxes than they do now. Top tax rates of even 50 percent for incomes in the seven-figure range would still be considerably lower than their level throughout the boom years of the post-war era, and should not be out of the question—nor should an estate-tax rate of similar size, for large estates.

The rich have not become that way while living in a vacuum. Technological advance, freer trade, and wider markets—along with the policies that promote them—always benefit some people and harm others. Economic theory is quite clear that the winners gain more than the losers lose, and therefore the people who suffer as a result of these forces can be fully compensated for their losses—society as a whole still gains. This precept has guided U.S. government policy for 30 years. Yet in practice, the losers are seldom compensated, not fully and not for long. And while many of the gains from trade and technological progress are widely spread among consumers, the pressures on wages that result from these same forces have been felt very differently by different classes of Americans.

What’s more, some of the policies that have most benefited the rich have little to do with greater competition or economic efficiency. Fortunes on Wall Street have grown so large in part because of implicit government protection against catastrophic losses, combined with the steady elimination of government measures to limit excessive risk-taking, from the 1980s right on through the crash of 2008.

As America’s winners have been separated more starkly from its losers, the idea of compensating the latter out of the pockets of the former has met stiff resistance: that would run afoul of another economic theory, dulling the winners’ incentives and squashing their entrepreneurial spirit; some, we are reminded, might even leave the country. And so, in a neat and perhaps unconscious two-step, many elites have pushed policies that benefit them, by touting theoretical gains to society—then ruled out measures that would distribute those gains widely.

Even as we continue to strive to perfect the meritocracy, signs that things may be moving in the other direction are proliferating. The increasing segregation of Americans by education and income, and the widening cultural divide between families with college-educated parents and those without them, suggest that built-in advantages and disadvantages may be growing. And the concentration of wealth in relatively few hands opens the possibility that much of the next generation’s elite might achieve their status through inheritance, not hard or innovative work.

America remains a magnet for talent, for reasons that go beyond the tax code; and by international standards, none of the tax changes recommended here would create an excessive tax burden on high earners. If a few financiers choose to decamp for some small island-state in search of the smallest possible tax bill, we should wish them good luck.

In political speeches and in the media, the future of the middle class is often used as a stand-in for the future of America. Yet of course the two are not identical. The size of the middle class has waxed and waned throughout U.S. history, as has income inequality. The post-war decades of the 20th century were unusually hospitable to the American middle class—the result of strong growth, rapid gains in education, progressive tax policy, limited free agency at work, a limited pool of competing workers overseas, and other supportive factors. Such serendipity is anomalous in American history, and unlikely to be repeated.

Yet if that period was unusually kind to the middle class, the one we are now in the midst of appears unusually cruel. The strongest forces of our time are naturally divisive; absent a wide-ranging effort to constrain them, economic and cultural polarization will almost surely continue. Perhaps the nonprofessional middle class is rich enough today to absorb its blows with equanimity. Perhaps plutonomy, in the 21st century, will prove stable over the long run.

But few Americans, no matter their class, will be eager for that outcome.

How Superstars’ Pay Stifles Everyone Else

This article was adapted from “The Price of Everything: Solving the Mystery of Why We Pay What We Do,” by Eduardo Porter, an editorial writer for The New York Times. The book, to be published on Jan. 4 by Portfolio, examines how pricing affects all of our choices.

IN 1990, the Kansas City Royals had the heftiest payroll in Major League Baseball: almost $24 million. A typical player for the New York Yankees, which had some of the most expensive players in the game at the time, earned less than $450,000.

Last season, the Yankees spent $206 million on players, more than five times the payroll of the Royals 20 years ago, even after accounting for inflation. The Yankees’ median salary was $5.5 million, seven times the 1990 figure, inflation-adjusted.

What is most striking is how the Yankees have outstripped the rest of the league. Two decades ago, the Royals’ payroll was about three times as big as that of the Chicago White Sox, the cheapest major-league team at the time. Last season, the Yankees spent about six times as much as the Pittsburgh Pirates, who had the most inexpensive roster.

Baseball aficionados might conclude that all of this points to some pernicious new trend in the market for top players. But this is not specific to baseball, or even to sport. Consider the market for pop music. In 1982, the top 1 percent of pop stars, in terms of pay, raked in 26 percent of concert ticket revenue. In 2003, that top percentage of stars — names like Justin Timberlake, Christina Aguilera or 50 Cent — was taking 56 percent of the concert pie.

The phenomenon is not even specific to the United States. Pelé, from Brazil, the greatest soccer player of all time, made his World Cup debut in Sweden in 1958, when he was only 17. He became an instant star, coveted by every team on the planet. By 1960, his team, Santos, reportedly paid him $150,000 a year — about $1.1 million in today’s money. But these days, that would amount to middling pay. The top-paid player of the 2009-10 season, the Portuguese forward Cristiano Ronaldo, made $17 million playing for the Spanish team Real Madrid.

Of course, the inflated rewards of performers at the very top have to do with specific changes in the underlying economics of entertainment. People have more disposable income to spend on entertainment. Corporate sponsorships, virtually non-existent in the age of Pelé, account today for a large share of performers’ income. In 2009, the highest-earning soccer player was the English midfielder David Beckham, who made $33 million from endorsements on top of a $7 million salary from the Los Angeles Galaxy and AC Milan.

But broader forces are also at play. Nearly 30 years ago, Sherwin Rosen, an economist from the University of Chicago, proposed an elegant theory to explain the general pattern. In an article entitled “The Economics of Superstars,” he argued that technological changes would allow the best performers in a given field to serve a bigger market and thus reap a greater share of its revenue. But this would also reduce the spoils available to the less gifted in the business.

The reasoning fits smoothly into the income dynamics of the music industry, which has been shaken by many technological disruptions since the 1980s. First, MTV put music on television. Then Napster took it to the Internet. Apple allowed fans to buy single songs and take them with them. Each of these breakthroughs allowed the very top acts to reach a larger fan base, and thus command a larger audience and a bigger share of concert revenue.

Superstar effects apply, too, to European soccer, which is beamed around the world on cable and satellite TV. In 2009, the top 20 soccer teams reaped revenue of 3.9 billion euros, more than 25 percent of the combined revenue of all the teams in European leagues.

Pelé was not held back by the quality of his game, but by his relatively small revenue base. He might be the greatest of all time, but few people could pay to experience his greatness. In 1958, there were about 350,000 television sets in Brazil. The first television satellite, Telstar I, wasn’t launched until July 1962, too late for his World Cup debut.

By contrast, the 2010 FIFA World Cup in South Africa, in which Ronaldo played for Portugal, was broadcast in more than 200 countries, to an aggregate audience of over 25 billion. Some 700 million people watched the final alone. Ronaldo is not better than Pelé. He makes more money because his talent is broadcast to more people.

IF one loosens slightly the role played by technological progress, Dr. Rosen’s framework also does a pretty good job explaining the evolution of executive pay. In 1977, an elite chief executive working at one of America’s top 100 companies earned about 50 times the wage of its average worker. Three decades later, the nation’s best-paid C.E.O.’s made about 1,100 times the pay of a worker on the production line.

This has separated the megarich from the merely very rich. A study of pay in the 1970s found that executives in the top 10 percent made about twice as much as those in the middle of the pack. By the early 2000s, the top suits made more than four times the pay of the executives in the middle.

Top C.E.O.’s are not pop stars. But the pay for the most sought-after executives has risen for similar reasons. As corporations have increased in size, management decisions at the top have become that much more important, measured in terms of profits or losses. Top American companies have much higher sales and profits than they did 20 years ago. Banks and funds have more assets.

With so much more at stake, it has become that much more important for companies to put at the helm the “best” executive or banker or fund manager they can find. This has set off furious competition for top managerial talent, pushing the prices of top-rated managers way above the pay of those in the tier just below them. Two economists at New York University, Xavier Gabaix and Augustin Landier, published a study in 2006 estimating that the sixfold rise in the pay of chief executives in the United States over the last quarter century or so was attributable entirely to the sixfold rise in the market size of large American companies.

And therein lies a big problem for American capitalism.

CAPITALISM relies on inequality. Like differences in other prices, pay disparities steer resources — in this case, people — to where they would be most productively employed.

Despite the great danger and cost of crossing the border illegally into the United States, hundreds of thousands of the hardest-working Mexicans are drawn by the relative prosperity they can achieve north of the border — where the average income of a Mexican-American household is more than $33,000, almost five times that of a family in Mexico.

In poor economies, fast economic growth increases inequality as some workers profit from new opportunities and others do not. The share of national income accruing to the top 1 percent of the Chinese population more than doubled from 1986 to 2003. Inequality spurs economic growth by providing incentives for people to accumulate human capital and become more productive. It pulls the best and brightest into the most lucrative lines of work, where the most profitable companies hire them.

Yet the increasingly outsize rewards accruing to the nation’s elite clutch of superstars threaten to gum up this incentive mechanism. If only a very lucky few can aspire to a big reward, most workers are likely to conclude that it is not worth the effort to try. The odds aren’t on their side.

Inequality has been found to turn people off. A recent experiment conducted with workers at the University of California found that those who earned less than the typical wage for their pay unit and occupation became measurably less satisfied with their jobs, and more likely to look for another one, after learning what their peers were paid. Other experiments have found that winner-take-all games tend to elicit much less player effort — and more cheating — than those in which rewards are distributed more smoothly according to performance.

Ultimately, the question is this: How much inequality is necessary? It is true that the nation grew quite fast as inequality soared over the last three decades. Since 1980, the country’s gross domestic product per person has increased about 69 percent, even as the share of income accruing to the richest 1 percent of the population jumped to 36 percent from 22 percent. But the economy grew even faster — 83 percent per capita — from 1951 to 1980, a period in which inequality, measured as the share of national income going to the very top, was declining.

One study concluded that each percentage-point increase in the share of national income channeled to the top 10 percent of Americans since 1960 led to an increase of 0.12 percentage points in the annual rate of economic growth — hardly an enormous boost. The cost for this tonic seems to be a drastic decline in Americans’ economic mobility. Since 1980, the weekly wage of the average worker on the factory floor has increased little more than 3 percent, after inflation.

The United States is the rich country with the most skewed income distribution. According to the Organization for Economic Cooperation and Development, the average earnings of the richest 10 percent of Americans are 16 times those for the 10 percent at the bottom of the pile. That compares with a multiple of 8 in Britain and 5 in Sweden.

Not coincidentally, Americans are less economically mobile than people in other developed countries. There is a 42 percent chance that the son of an American man in the bottom fifth of the income distribution will be stuck in the same economic slot. The equivalent odds for a British man are 30 percent, and 25 percent for a Swede.

NONE of this even begins to account for the damage caused by the superstar dynamics that shape the pay of American bankers.

Remember the ’80s? Gordon Gekko first sashayed across the silver screen. Ivan Boesky was jailed for insider trading. Michael Milken peddled junk bonds. In 1987, financial firms amassed a little less than a fifth of the profits of all American corporations. Wall Street bonuses totaled $2.6 billion — about $15,600 for each man and woman working there.

Yet by current standards, this era of legendary greed appears like a moment of uncommon restraint. In 2007, as the financial bubble built upon the American housing market reached its peak, financial companies accounted for a full third of the profits of the nation’s private sector. Wall Street bonuses hit a record $32.9 billion, or $177,000 a worker.

Just as technology gave pop stars a bigger fan base that could buy their CDs, download their singles and snap up their concert tickets, the combination of information technology and deregulation gave bankers an unprecedented opportunity to reap huge rewards. Investors piled into the top-rated funds that generated the highest returns. Rewards flowed in abundance to the most “productive” financiers, those who took the bigger risks and generated the biggest profits.

Finance wasn’t always so richly paid. Financiers had a great time in the early decades of the 20th century: from 1909 to the mid-1930s, they typically made about 50 percent to 60 percent more than workers in other industries. But the stock market collapse of 1929 and the Great Depression changed all that. In 1934, corporate profits in the financial sector shrank to $236 million, one-eighth what they were five years earlier. Wages followed. From 1950 through about 1980, bankers and insurers made only 10 percent more than workers outside of finance, on average.

This ebb and flow of compensation mimics the waxing and waning of restrictions governing finance. A century ago, there were virtually no regulations to restrain banks’ creativity and speculative urges. They could invest where they wanted, deploy depositors’ money as they saw fit. But after the Great Depression, President Franklin D. Roosevelt set up a plethora of restrictions to avoid a repeat of the financial bubble that burst in 1929.

Interstate banking had been limited since 1927. In 1933, the Glass-Steagall Act forbade commercial banks and investment banks from getting into each other’s business — separating deposit taking and lending from playing the markets. Interest-rate ceilings were also imposed that year. The move to regulate bankers continued in 1959 under President Dwight D. Eisenhower, who forbade mixing banks with insurance companies.

Barred from applying the full extent of their wits toward maximizing their incomes, many of the nation’s best and brightest who had flocked to make money in banking left for other industries.

Then, in the 1980s, the Reagan administration unleashed a surge of deregulation. By 1999, the Glass-Steagall Act lay repealed. Banks could commingle with insurance companies at will. Ceilings on interest rates vanished. Banks could open branches anywhere. Unsurprisingly, the most highly educated returned to banking and finance. By 2005, the share of workers in the finance industry with a college education exceeded that of other industries by nearly 20 percentage points. By 2006, pay in the financial sector was again 70 percent higher than wages elsewhere in the private sector. A third of the 2009 Princeton graduates who got jobs after graduation went into finance; 6.3 percent took jobs in government.

Then the financial industry blew up, taking out a good chunk of the world economy.

Finance will not be tamed by tweaking the way bankers are paid. But bankers’ pay could be structured to discourage wanton risk taking. Similarly, superstar effects are not the sole cause of the stagnant incomes of regular Joes. But the piling of rewards on our superstars is encouraging a race to the top that, if left unabated, could leave very little to strive for in its wake.
