
  • feedwordpress 17:07:06 on 2018/08/01 Permalink
    Tags: geopolitics

    Google and China: Flip, Flop, Flap 

    Google’s Beijing offices in 2010, when the company decided to stop censoring its results and exit the market.

    I’ve been covering Google’s rather tortured relationship with China for more than 15 years now. The company’s off again, on again approach to the Internet’s largest “untapped” market has proven vexing, but as today’s Intercept scoop informs us, it looks like Google has yielded to its own growth imperative, and will once again stand up its search services for the Chinese market. To wit:

    GOOGLE IS PLANNING to launch a censored version of its search engine in China that will blacklist websites and search terms about human rights, democracy, religion, and peaceful protest, The Intercept can reveal.

    The project – code-named Dragonfly – has been underway since spring of last year, and accelerated following a December 2017 meeting between Google’s CEO Sundar Pichai and a top Chinese government official, according to internal Google documents and people familiar with the plans.

    If I’m reading the story correctly, it looks like Google’s China plans, which were kept secret from nearly all of the company’s employees, were leaked to The Intercept by concerned members of Google’s internal “Dragonfly” team, one of whom was quoted:

    “I’m against large companies and governments collaborating in the oppression of their people, and feel like transparency around what’s being done is in the public interest,” the source said, adding that they feared “what is done in China will become a template for many other nations.”

    This news raises any number of issues – for Google, certainly, but given the US/China trade war, for anyone concerned with the future of free trade and open markets. And it revives an age old question about where the line is between “respecting the rule of law in markets where we operate,” a standard tech company response to doing business on foreign soil, and “enabling authoritarian rule,” which is pretty much what Google will be doing should it actually launch the Dragonfly app.

    A bit of history. Google originally refused to play by China’s rules, and in my 2004 book, I reviewed the history, and gave the company props for taking a principled stand, and forsaking what could have been massive profits in the name of human rights. Then, in 2006, Google decided to enter the Chinese market, on government terms. Google took pains to explain its logic:

    We ultimately reached our decision by asking ourselves which course would most effectively further Google’s mission to organize the world’s information and make it universally useful and accessible. Or, put simply: how can we provide the greatest access to information to the greatest number of people?

    I didn’t buy that explanation then, and I don’t buy it now. Google is going into China for one reason, and one reason alone: Profits. As Google rolled out its service in 2006, I penned something of a rant, titled “Never Poke A Dragon While It’s Eating.” In it I wrote:

    The Chinese own a shitload of our debt, and are consuming a shitload of the world’s export base of oil. As they consolidate their power, do you really believe they’re also planning parades for us? I’m pretty sure they’ll be celebrating decades of US policy that looked the other way while the oligarchy used our technology (and that includes our routers, databases, and consulting services) to meticulously undermine the very values which allowed us to create companies like Google in the first place. But those are not the kind of celebrations I’m guessing we’d be invited to.

    So as I puzzle through this issue, understanding how in practical terms it’s really not sensible to expect that some GYMA pact is going to change the world (as much as I might wish it would), it really, honestly, comes down to one thing: The man in the White House.

    Until the person leading this country values human rights over appeasement, and decides to lead on this issue, we’re never going to make any progress. 

    Google pulled out of China in 2010, using a China-backed hacking incident as its main rationale (remember that?!).  The man in the White House was – well let’s just say he wasn’t Bush, nor Clinton, and he wasn’t Trump. In any case, the hacking incident inconveniently reminded Google that the Chinese government has no qualms about using data derived from Google services to target its own citizens.

    Has the company forgotten that fact? One wonders. Back in 2010, I praised the company for standing up to China:

    In this case, Google is again taking a leadership role, and the company is forcing China’s hand. While it’s a stretch to say the two things are directly connected, the seeming fact that China’s government was behind the intrusions has led Google to decide to stop censoring its results in China. This is politics at its finest, and it’s a very clear statement to China: We’re done playing the game your way.

    Seems Google’s not done after all. Which is both sad, and utterly predictable. Sad, because in today’s political environment, we need our companies to lead on moral and human rights issues. And predictable, because Android has a massive hold on China’s internet market, and Google’s lack of a strong search play there threatens not only the company’s future growth in its core market, but its ability to leverage Android across all its services, just as it has in Europe and the United States.

    Google so far has not made a statement on The Intercept’s story, though I imagine smoke is billowing out of some communications war room inside the company’s Mountain View headquarters.  Will the company attempt some modified version of its 2006 justifications? I certainly hope not. This time, I’d counsel, the company should just tell the truth: Google is a public company that feels compelled to grow, regardless of whether that growth comes at a price to its founding values. Period, end of story.

    I’ll end with another quote from that 2006 “Don’t Poke a Dragon” piece:

    …companies like Yahoo and Google don’t traffic in sneakers, they traffic in the most powerful forces in human culture – expression. Knowledge. Ideas. The freedom of which we take as fundamental in this country, yet somehow, we seem to have forgotten its importance in the digital age – in China, one protesting email can land you in jail for 8 years, folks.

    …Congress can call hearings, and beat up Yahoo, Google and the others for doing what everyone else is doing, but in the end, it’s not (Google’s) fault, nor, as much as I wish they’d take it on, is it even their problem. It’s our government’s problem….Since when is China policy somehow the job of private industry?

    Until that government gives (the tech industry) a China policy it can align behind, well, they’ll never align, and the very foundation of our culture – free expression and privacy, will be imperiled.

    After all, the Chinese leaders must be thinking, as they snack on our intellectual property, we’re only protecting our citizens in the name of national security.

    Just like they do in the US, right?

     
  • feedwordpress 14:13:11 on 2018/07/30 Permalink
    Tags: bay area, marin, moving, transition

    On Leaving the Bay Area 


    I first moved to the Bay area in 1983. I graduated from high school, spent my summer as an exchange student/day laborer in England (long story), then began studies at Berkeley, where I had a Navy scholarship (another long story).

    1983. 35 years ago.

    1983 was one year before the introduction of the Macintosh (my first job was covering Apple and the Mac). Ten years before the debut of Wired magazine. Twenty years before I began writing The Search, launching Web 2.0, and imagining what became Federated Media. And thirty years before we launched NewCo and the Shift Forum. It’s a … long fucking time ago.

    According to my laptop’s backup program, which daily and plaintively reminds me of my nomadic existence, it’s been 35 days since I left my home in Marin for good. For the past five weeks (and the next three) my wife, my youngest daughter, and I have lived out of suitcases, in hotels and Airbnbs, across ten or so cities: Boulder, Cincinnati, Florence, New Orleans, Middletown (RI), Tisbury, and of course a few visits to New York and the Bay (mainly to see our two older kids, who live in Berkeley now). It’s actually been rather thrilling to be without an address or a home. But even as we embarked, we knew where we’d eventually end up: We’re moving to New York City.

    In the past few weeks we’ve found a home (in West Chelsea, near the High Line), and on August 15th we’ll become eager, anxious, and excited residents of Manhattan.

    Taking stock of 35 years is exhausting. Moving from a home that’s borne the weight of your collective memories for so long… well, it forces reckoning, it shakes you by the shoulders, it demands repair. If you’ve been wondering why I’ve not been writing much, why I’ve been relatively quiet after months of nearly daily posts… here you have it.

    I can’t explain in a headline, or even a few sentences, why we decided to leave the Bay. But if you’ll bear with me, I’ll do what I do, which is write till I’m done, and hope to explain myself to the extent you might care to know.

    First things first: My wife is from New York, and when I courted her from out in California (and I really did court her), I promised that once this Wired thing played out (I foolishly thought it’d be a few years, if that), we’d move back to her home state. Her mother and brother live in New York, and I always have wanted to live there as well. If you’re at heart a writer, a thinker, and a creator of stuff, you have to live at least once in the most vibrant city in the world.

    But as things turn out, three years in California stretched to five, then our first child was born, and we moved to a place we loved: Marin.  Replete with a truly majestic mountain, a community of extraordinary humans, and a lifestyle built for sending down roots, Marin lulled us into near senescence. Five more companies and two more children came, and with them a commitment to schools, to people we came to love, to the companies we struggled to build.

    But even with all that, over the past five or so years, I’ve felt that the industry which once challenged, thrilled, and engaged me was … missing something. A few things actually. NewCo was, in a small way, my attempt at identifying those things and responding to them. Identifying and celebrating companies that valued mission and purpose over profit and growth, in cities around the world, not just in the Bay area…that seemed the right thing to do five years ago. And while NewCo was not a barn burning success as a business, it thrived as an idea, and along the way my founders and I met incredible leaders, thinkers, and fellow travelers.

    But after more than three decades and six companies started in San Francisco, I’m ready to take a break from the West, from the Bay, and from the Valley. Truth be told, the place is starting to annoy me a bit more than I’m comfortable with.

    I can rationalize San Francisco’s adolescent fits – it’s trying to grow up, and it’s terrible at it – and it seems our industry is trying to press past its bro culture and blinkered focus on tech for tech’s sake. But to be honest, it’s the lack of networked, lateral thinking that’s left me wanting. It feels like nearly everyone in the Bay area is so busy making companies (guilty), they don’t have time to have conversations about much more than … making companies.

    I’ve spent my career chasing essentially one story: the impact of technology on society. Whenever I travel to New York, I find a different approach to that narrative. Sure, folks want to talk shop, but they also want to find connection points to culture, to social issues, to politics, to ideas and to the rest of the world. I feel like a lot of the Valley is habitually talking to itself about things that aren’t that interesting anymore. There’s a much bigger story to chase, and the density of connection and dialog about that story feels way more present in NYC. So I’m headed there, to see what might come of it.

    That said, there are thousands of amazing minds in the Valley who are also fascinated by the story I’m chasing. It’s just hard to connect the dots, given how spread out the damn place is – Marin to San Jose can be a two hour slog, both ways. I’ll be back, frequently, but now as a reporter of sorts, with a mission of understanding tech from an outsider’s point of view. I’ve been in NYC at least once a month for the past few decades. Now I’ll be just flipping the bit, as it were.

    How does this affect my current work with NewCo and Shift? Not much, truth be told. NewCo’s festivals around the world are now all run by wonderful partners who have them well in hand. The Shift site is moving to an open web domain, and keeping the Medium site as well, so our readers there can stay in touch with us. And Shift Forum will continue, but probably be a bit later than usual this coming year, given the disruption this move has driven through my life. I’m in remarkable conversations with a number of folks about what else I might do in New York, and as those conversations yield news, I’ll keep you guys informed about them here.

    So for now, goodbye, Bay area, and thank you for making me who I am. And hello, New York – I’m a bit nervous about what you have in store, but I’m jumping in without reservation. If you live there, let me know. I look forward to the conversation.

     
  • feedwordpress 21:41:10 on 2018/07/28 Permalink

    When Senators Ask Followup Questions, You Answer Them. 

    Following my Senate testimony last month, several Senators reached out with additional questions and clarification requests. As I understand it, this is pretty standard. Given I published my testimony here earlier, I asked if I could do the same for my written followup. The committee agreed; the questions and my answers are below.

    Questions for the Record from Sen. Cortez Masto (D. Nevada)

    Facebook Audits

    On April 4, 2018, following the public controversy over Cambridge Analytica’s use of user data, Facebook announced several additional changes to its privacy policies. The changes include increased restrictions on apps’ ability to gather personal data from users and also a policy of restricting an app’s access to user data if that user has not used the app in the past three months. In addition, Facebook has committed to conducting a comprehensive review of all apps gathering data on Facebook, focusing particularly on apps that were permitted to collect data under previous privacy policies. Facebook will also notify any users affected by the Cambridge Analytica data leak.

    Question 1: What steps can the government take to ensure that there is proper oversight of these reviews and audits?

    John Battelle’s response:

    I think this is a simple answer: Make sure Facebook does what it says it will do, and make sure its response is a matter not only of public record, but also public comment. This should include a full and complete accounting of how the audit was done and the findings.

    Question 2: From a technical standpoint, how effective are forensic methods at ascertaining information related to what data was transferred in these cases?

    John Battelle’s response:

    I’m not a technologist, I’m an entrepreneur, author, analyst and commentator. I’d defer to someone who has more knowledge than myself on issues of forensic data analysis.  

    Technology for Consumer Protection

    Question 1: Are there any technological solutions being developed that can help address some of the issues of consumers’ privacy being violated online?

    John Battelle’s response:

    Yes, there are many, likely too many to mention. Instead, what I’d like to highlight is the importance of the architecture of how data flows in our society. We should be creating a framework that allows data to flow ethically, securely, and with key controls around permissioning, editing, validation, revocation, and value exchange. Blockchains hold great promise here, but are still underdeveloped (though evolving rapidly).

    Data Retention

    Question 1: What should we, as legislators, be thinking about to verify that – when Americans are told that their data has been destroyed – that deletion can actually be confirmed?

    John Battelle’s response:

    Requiring companies such as Facebook to employ independent third-party auditing services seems the most straightforward response. “Trust us” is not enough; we must trust and verify.

    Law Enforcement

    During the hearing we had a brief discussion on the balance between privacy and sharing data with law enforcement.

    Question 1: What should companies keep in mind to ensure that they can appropriately assist in law enforcement investigations?

    John Battelle’s response:

    This is a delicate balance, as evinced in the varied responses to these kinds of cases from companies like Apple, Twitter, Yahoo, and others. Valid search warrants, not fishing expeditions, should be the rule. We’ve got the framework for this already. The issue of how governments and law enforcement deal with encryption is unresolved. However, I fall on the side of enabling strong encryption, as I believe all citizens have the right to privacy. Lose that, and we lose democracy.

    Questions 2: As lawmakers, what should we be aware of as we try to strike the right balance between privacy and safety in this area?

    John Battelle’s response:

    Democracy is open, messy, transparent, and has many failures. But it’s the best system yet devised (in my humble opinion) and privacy lies at its core. That means criminals will be able to abuse its benefits. That is a tradeoff we have to accept and work around. Sure, it’d be great if law enforcement had access to all the data created by its citizens. Until it’s abused, and cases of this kind of abuse by government are easy to find.

    Senator Richard Blumenthal (D. Conn) Questions for the Record 

    Privacy Legislation

    Across hearings and questions for the record, members of Congress have raised concerns about the data collection tactics used by Facebook that are not made clear to its users. As I stated during the hearing, I am interested in putting into place rules of the road for online privacy, taking into consideration the European General Data Protection Regulation. During the hearing Mr. Battelle and others offered support for the intent of GDPR, but expressed reservations about the implementation and unintended consequences. I look forward to any further thoughts from the panelists regarding how to implement data privacy rules in the United States.

     Question for All Panelists:

    Question 1. In addition to any recommendations or comments on what types of legislation or other measures could help protect consumer privacy, what lessons and principles of the California Consumer Privacy Act and the GDPR should Congress consider in privacy legislation?

     John Battelle’s response:

    Implementation of sweeping legislation like those mentioned above is extremely onerous for small businesses. Instead of using that as an excuse to avoid legislation, the policy should incorporate remedies for smaller businesses (i.e., enabling federation of resources and response/compliance, enabling trusted intermediaries).

    The principle of empowering the consumer is embodied in both GDPR and CCPA. While well intentioned, neither envisions how that empowerment will truly be effective in a modern digital marketplace. Take the principle of data portability. It’s one thing to allow consumers to download a copy of their data from a platform or service. But for that data to drive innovation, it must be easily uploaded, in a defined, well-governed, machine-readable format, so that new kinds of services can flourish. Watch how large tech platforms chip away at CCPA and attempt to keep that ecosystem from taking root. Consider how best to ensure that ecosystem will in fact exist. I’m not a legislative analyst, but there must be an enlightened way to encourage a class of data brokers (and yes, they’re not all bad) who enable re-aggregation of consumer data, replete with permissions, revocation, validation, editing, and value exchange. Happy to talk more about this.

    Questions for Mr. Battelle:

    Question 2. You have written at length about the influence of Facebook and Google on the advertising and third party data market. In your experience, has Facebook driven the ad market as a sector to more invasively collect data about people? What other changes in the ad market can be attributed to the dominance of Google and Facebook?

    John Battelle’s response:

    Yes, without question, Facebook has driven what you describe in your initial question. But not for entirely negative reasons. Because Facebook has so much information on its users, larger advertisers feel at a disadvantage. This is also true of publishers who use Facebook for distribution (another important aspect of the platform, especially as it relates to speech and democratic discourse). Both advertisers and publishers wish to have a direct, one to one dialog with their customers, and should be able to do so on any platform. Facebook, however, has forced their business model into the middle of this dialog – you must purchase access to your followers and your readers. A natural response is for advertisers and publishers to build their own sophisticated databases of their customers and potential customers. This is to be expected, and if the data is managed ethically and transparently, should not be considered an evil.

    As for other changes in the ad market that might be attributed to FB and GOOG, let’s start with the venture funding of media startups, or advertising-dependent startups of any kind. Given the duopoly’s dominance of the market, it’s become extremely hard for any entrepreneur to find financing for ideas driven by an advertising revenue stream. Venture capitalists will say “Well, that’s a great (idea, service, product), but no way am I going to fund a company that has to compete with Google or Facebook.” This naturally encourages a downward spiral in innovation.

    Another major problem in ad markets is the lack of portable data and insights between Facebook and Google. If I’m an advertiser or publisher on Facebook, I’d like a safe, ethical, and practical way to know who has responded to my messaging on that platform, and to take that information across platforms, say to Google’s YouTube or AdWords. This is currently far too hard to do, if not impossible in many cases. This also challenges innovation across the business ecosystem.

    Questions for the Record

    Senator Margaret Wood Hassan (D. New Hampshire)

    Question 1. The internet has the potential to connect people with ideas that challenge their worldview, and early on many people were hopeful that the internet would have just that effect. But too often we have seen that social media sites like Facebook serve instead as an echo chamber that polarizes people instead of bringing them together, showing them content that they are more likely to agree with rather than exposing them to new perspectives. Do you agree this is a problem? And should we be taking steps to address this echo chamber effect?

    John Battelle’s response:

    Yes, this filter bubble problem is well defined and I agree it’s one of the major design challenges we face not only for Facebook, but for our public discourse as well. The public square, as it were, has become the domain of private companies, and private companies do not have to follow the same rules as, say, UC Berkeley must follow in its public spaces (Chancellor Carol Christ has been quite eloquent on this topic, see her interview at the NewCo Shift Forum earlier this year).

    As to steps that might be taken, this is a serious question that balances a private corporation’s right to conduct its business as it sees fit, and the rights and responsibilities of a public space/commons. I’d love to see those corporations adopt clear and consistent rules about speech, but they are floundering (see Mr. Zuckerberg’s recent comments on Holocaust deniers, for example). I’d support a multi-stakeholder commission on this issue, including policymakers, company representatives, legal scholars, and civic leaders to address the issue.

    Question 2. In your testimony you discuss the value of data. You stated that you think in some ways, QUOTE, “data is equal to – or possibly even more valuable than – monetary currency.” We in Congress are seeking to figure out the value of data as well to help us understand the costs and benefits of protecting this data. Can you expand on what value you think data has, and how we should be thinking about measuring that value – both as citizens and as legislators?

    John Battelle’s response:

    Just as we had no idea of the value of oil when it first came into the marketplace (it was used for lamps and for paving streets, and no one could have imagined the automobile industry), we still have not conceived of the markets, products, and services that could be enabled by free-flowing, ethically sourced, and permissioned data in our society. It’s literally too early to know, and therefore too early to legislate in sweeping fashion that might limit or retard innovation. However, one thing I am certain of is that data – which is really a proxy for human understanding and innovation – is the most fundamentally valuable resource in the world. All money is simply data, when you think about it, and therefore a subset of data.

    So how to measure its value? I think at this point it’s impossible – we must instead treat it as an infinitely valuable resource, and carefully govern its use. I’d like to add my response to another Senator’s question here, about new laws (GDPR and the California Ballot initiative) as added reference:

    Implementation of sweeping legislation like those mentioned above is extremely onerous for small businesses. Instead of using that as an excuse to avoid legislation, the policy should incorporate remedies for smaller businesses (i.e., enabling federation of resources and response/compliance, enabling trusted intermediaries).

    The principle of empowering the consumer is embodied in both GDPR and CCPA. While well intentioned, neither envisions how that empowerment will truly be effective in a modern digital marketplace. Take the principle of data portability. It’s one thing to allow consumers to download a copy of their data from a platform or service. But for that data to drive innovation, it must be easily uploaded, in a defined, well-governed, machine-readable format, so that new kinds of services can flourish. Watch how large tech platforms chip away at CCPA and attempt to keep that ecosystem from taking root. Consider how best to ensure that ecosystem will in fact exist. I’m not a legislative analyst, but there must be an enlightened way to encourage a class of data brokers (and yes, they’re not all bad) who enable re-aggregation of consumer data, replete with permissions, revocation, validation, editing, and value exchange. Happy to talk more about this.

    Question 3. Mark Zuckerberg has said that he sees Facebook more as a government than a traditional company.  Among other things, governments need to be transparent and open about the decisions they make. Many large institutions have set up independent systems — such as offices of inspectors general or ombudsmen and ethics boards — to ensure transparency and internally check bad decisions.  Facebook has none of those controls. What kinds of independent systems should companies like Facebook have to publicly examine and explain their decision-making?

    John Battelle’s response:

    OK, this one is simple. Facebook is NOT a government. If it is, I don’t want to be a “citizen.” I think Mr. Zuckerberg is failing to understand what a government truly is. If indeed Facebook wishes to become a nation state, then first it must decide what kind of nation state it wishes to be. It needs a constitution, a clear statement of rights, roles, responsibilities, and processes. None of these things exist at the moment. A terms of service does not a government make.

    However, all of the ideas you mention make a ton of sense for Facebook at this juncture. I’d be supportive of them all.

     
  • feedwordpress 17:53:30 on 2018/07/22 Permalink

    The Tragedy of the Data Commons 

    Before, and after?

    A theme of my writing over the past ten or so years has been the role of data in society. I tend to frame that role anthropologically: How have we adapted to this new element in our society? What tools and social structures have we created in response to its emergence as a currency in our world? How have power structures shifted as a result?

    Increasingly, I’ve been worrying a hypothesis: Like a city built over generations without central planning or consideration for much more than fundamental capitalistic values, we’ve architected an ecosystem around data that is not only dysfunctional, it’s possibly antithetical to the core values of democratic society. Houston, it seems, we really do have a problem.

    I know, it’s been a while since I’ve written here, and most of my recent stuff has focused on Facebook. I’ve been on the road the entire summer, and preparing to move from the Bay area to NYC ( that’s another post). But before you roll your eyes in anticipation of yet another Facebook rant, no, this post is not about Facebook, despite that company’s continued inability to govern itself.

    No, this post is about the business of health insurance.

    Last week ProPublica published a story titled Health Insurers Are Vacuuming Up Details About You — And It Could Raise Your Rates. It’s the second in an ongoing series the investigative unit is doing on the role of data in healthcare. I’ve been watching this story develop for years, and ProPublica’s piece does a nice job of framing the issue. It envisions “a future in which everything you do — the things you buy, the food you eat, the time you spend watching TV — may help determine how much you pay for health insurance.” Unsurprisingly, the health industry has developed an insatiable appetite for personal data about the individuals it covers. Over the past decade or so, all of our quotidian activities (and far more) have been turned into data, and that data can be, and is being, sold to the insurance industry:

    “The companies are tracking your race, education level, TV habits, marital status, net worth. They’re collecting what you post on social media, whether you’re behind on your bills, what you order online. Then they feed this information into complicated computer algorithms that spit out predictions about how much your health care could cost them.”

    HIPAA, the regulatory framework governing health information in the United States, covers and protects only medical data – not search histories, streaming usage, or grocery loyalty data. But if you think your search, video, and food choices aren’t related to health, well, let’s just say your insurance company begs to differ.

    Lest we dive into a rabbit hole about the corrosive combination of healthcare profit margins with personal data (ProPublica’s story does a fine job of that anyway), I want to pull back and think about what’s really going on here.

    The Tragedy of the Commons

    One of the most fundamental tensions in an open society is the potential misuse of resources held “in common” – resources to which all individuals have access. Garrett Hardin’s 1968 essay on the subject, “The Tragedy of the Commons,” explores this tension, concluding that the problem of human overpopulation has no technical solution. (A technical solution is one that can be achieved through science or engineering alone; it does not require a shift in human values or morality – that is, a political solution.) Hardin’s essay has become one of the most cited works in social science – the tragedy of the commons is a readily grasped concept that applies to countless problems across society.

    In the essay, Hardin employs a simple example of a common grazing pasture, open to all who own livestock. The pasture, of course, can only support a finite number of cattle. But as Hardin argues, cattle owners are financially motivated to graze as many cattle as they possibly can, driving the number of grass munchers beyond the land’s capacity, ultimately destroying the commons. “Freedom in a commons brings ruin to all,” he concludes, delivering an intellectual middle finger to Adam Smith’s “invisible hand” in the process.

    So what does this have to do with healthcare, data, and the insurance industry? Well, consider how the insurance industry prices its policies. Insurance has always been a data-driven business, built on actuarial risk assessment – a statistical method that predicts the probability of a given event occurring. Creating and refining these risk assessments lies at the heart of the insurance industry, and until recently, the amount of data informing actuarial models has been staggeringly slight. Age, location, and tobacco use are pretty much how policies are priced under Obamacare, for example. Given this paucity, one might argue that it’s unambiguously a *good* thing that the insurance industry is beefing up its databases. Right?
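    To make the point concrete, here’s a minimal sketch of that coarse-grained pricing model. The numbers and the function name are invented for illustration; the ACA’s actual rating rules are more involved, though they do cap the age ratio at 3:1 and the tobacco surcharge at 1.5x:

```python
# Toy sketch (illustrative numbers only) of ACA-style community rating:
# premiums vary on just three coarse inputs -- age, location, tobacco use.
def aca_style_premium(age: int, area_factor: float, tobacco: bool,
                      base_rate: float = 300.0) -> float:
    """Price a monthly policy from a handful of high-level data points."""
    # Simplified age curve, capped to echo the ACA's 3:1 age-band limit.
    age_factor = min(1.0 + max(age - 21, 0) * 0.05, 3.0)
    tobacco_factor = 1.5 if tobacco else 1.0  # ACA caps this surcharge at 1.5x
    return round(base_rate * age_factor * area_factor * tobacco_factor, 2)

# Two very different 18-year-olds in the same city get the same quote,
# because the model simply cannot see what makes them different:
print(aca_style_premium(18, 1.1, False))  # 330.0
print(aca_style_premium(18, 1.1, False))  # 330.0
```

    Note what the model *can’t* do: with only three inputs, it has no way to distinguish the healthy teenager from the one with a lurking genetic risk, which is exactly the pooling the next paragraph describes.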

    Perhaps not. When a population is aggregated on high-level data points like age and location, we’re essentially being judged on a simple shared commons – all 18-year-olds who live in Los Angeles are treated essentially the same, regardless of whether one person has a lurking gene for cancer and another will live without health complications for decades. In essence, we’re sharing the load of public health in common – evening out the societal costs in the process.

    But once the system can discriminate on a multitude of data points, the commons collapses, devolving into a system that rewards whoever has the most profitable profile. That 18-year-old with flawless genes, the right zip code, an enviable inheritance, and all the right social media habits will pay next to nothing for health insurance. But the 18-year-old with a mutated BRCA1 gene, a poor zip code, and a proclivity to sit around eating Pringles while playing Fortnite? That teenager is not going to be able to afford health insurance.
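    The collapse is easy to see in miniature. A sketch with invented cost figures, comparing the pooled commons to fully personalized pricing:

```python
# Toy sketch (invented numbers): what happens to premiums when pricing
# moves from a pooled commons to per-person risk profiles.
members = [
    {"name": "low_risk_18yo",  "expected_cost": 500.0},
    {"name": "high_risk_18yo", "expected_cost": 9500.0},
]

# Commons model: everyone shares the average expected cost.
pooled = sum(m["expected_cost"] for m in members) / len(members)

# Data-rich model: each person pays their own predicted cost.
personalized = {m["name"]: m["expected_cost"] for m in members}

print(pooled)                          # both pay 5000.0
print(personalized["high_risk_18yo"])  # 9500.0 -- likely unaffordable
```

    The total cost to society is identical in both cases; the only thing the extra data changes is who bears it.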

    Put another way, adding personalized data to the insurance commons destroys the fabric of that commons. Healthcare has been resistant to this force until recently, but we’re already seeing the same forces at work in other aspects of our previously shared public goods.

    A public good, to review, is defined as “a commodity or service that is provided without profit to all members of a society, either by the government or a private individual or organization.” A good example is public transportation. The rise of data-driven services like Uber and Lyft has been a boon for anyone who can afford them, but the unforeseen externalities are disastrous for the public good. Ridership, and therefore revenue, falls for public transportation systems, which then fall into a spiral of neglect and decay. Our public streets become clogged with circling rideshare drivers, roadway maintenance costs skyrocket, and – perhaps most perniciously – we become a society of individuals who forget how to interact with each other in public spaces like buses, subways, and trolley cars.

    Once you start to think about public goods in this way, you start to see the data-driven erosion of the public good everywhere. Our public square, where we debate political and social issues, has become 2.2 billion data-driven Truman Shows, to paraphrase social media critic Roger McNamee. Retail outlets, where we once interacted with our fellow citizens, are now inhabited by armies of Taskrabbits and Instacarters. Public education is hollowed out by data-driven personalized learning startups like AltSchool, Khan Academy, or, let’s face it, YouTube how-to videos.

    We’re facing a crisis of the commons – of the public spaces we once held as fundamental to the functioning of our democratic society. And we have data-driven capitalism to blame for it.

    Now, before you conclude that Battelle has become a neo-Luddite, know that I remain a massive fan of data-driven business. However, if we fail to re-architect the core framework of how data flows through society – if we continue to favor the rights of corporations to determine how value flows to individuals, absent the balancing weight of the public commons – we’re heading down a path of social ruin. ProPublica’s warning on health insurance is proof that the problem is not limited to Facebook. It is a problem across our entire society. It’s time we woke up to it.

    So what do we do about it? That’ll be the focus of a lot of my writing going forward. As Hardin presciently wrote in his original essay, “It is when the hidden decisions are made explicit that the arguments begin. The problem for the years ahead is to work out an acceptable theory of weighting.” In the case of data-driven decisioning, we can no longer outsource that work to private corporations with lofty-sounding mission statements, whether they be in healthcare, insurance, social media, ride sharing, or e-commerce.

  • feedwordpress 23:29:34 on 2018/06/19 Permalink

    My Senate Testimony 

    Today I had a chance to testify before the US Senate on the subject of Facebook, Cambridge Analytica, and data privacy. It was an honor, and a bit scary, but overall an experience I’ll never forget. Below is the written testimony I delivered to the Commerce Committee on Sunday, released on its site today. If you’d like to watch, head right here; I think the video will be up soon. Forgive the way the links work – I had to consider that this would be printed and bound in the Congressional Record. I might post the shorter version I read as my verbal remarks next…we’ll see.


     

    Honorable Committee Members –

     

    My name is John Battelle. For more than thirty years, I’ve made my career reporting, writing, and starting companies at the intersection of technology, society, and business. I appreciate the opportunity to submit this written and verbal testimony to your committee.

    Over the years I’ve written extensively about the business models, strategies, and societal impact of technology companies, with a particular emphasis on the role of data and on large, well-known firms. In the 1980s and 90s I focused on Apple and Microsoft, among others. In the late 90s I focused on the nascent Internet industry; the early 2000s brought my attention to Google, Amazon, and later, Twitter and Facebook. My writings tend to be observational, predictive, analytical, and opinionated.

    Concurrently I’ve been an entrepreneur, founding or co-founding and leading half a dozen companies in the media and technology industries. All of these companies, which span magazines, digital publishing tools, events, and advertising technology platforms, have been active participants in what is broadly understood to be the “technology industry” in the United States and, on several occasions, abroad as well. Over the years these companies have employed thousands of staff members, including hundreds of journalists, and helped to support tens of thousands of independent creators across the Internet. I also serve on the boards of several companies, all of which are deeply involved in the technology and data industries.

    In the past few years my work has focused on the role of the corporation in society, with a particular emphasis on the role technology plays in transforming that role. Given this focus, a natural subject of my work has been on companies that are the most visible exemplars of technology’s impact on business and society. Of these, Facebook has been perhaps my most frequent subject in the past year or two.

    Given the focus of this hearing, the remainder of my written testimony will focus on a number of observations related generally to Facebook, and specifically to the impact of the Cambridge Analytica story. For purposes of brevity, I will summarize many of my points here, and provide links to longer form writings that can be found on the open Internet.

    Facebook broke through the traditional Valley startup company noise in the mid 2000s, a typical founder-driven success story backed by all the right venture capital, replete with a narrative of early intrigue between partners, an ambitious mission (“to make the world more open and connected”), a sky-high private valuation, and any number of controversial decisions around its relationship to its initial customers, the users of its service (later in its life, Facebook’s core customers bifurcated to include advertisers). I was initially skeptical about the service, but when Sheryl Sandberg, a respected Google executive, moved to Facebook to run its advertising business, I became certain it would grow to be one of the most important companies in technology. I was convinced Facebook would challenge Google for supremacy in the hyper-growth world of personalized advertising. In those early days, I often made the point that while Google’s early corporate culture sprang from the open, interconnected world wide web, Facebook was built on the precept of an insular walled garden, where a user’s experience was entirely controlled by the Facebook service itself. This approach to creating a digital service not only threatened the core business model of Google (which was based on indexing and creating value from open web pages), it also raised a significant question of what kind of public commons we wanted to inhabit as we migrated our attention and our social relationships to the web. (Examples: https://battellemedia.com/archives/2012/02/its-not-whether-googles-threatened-its-asking-ourselves-what-commons-do-we-wish-for ; https://battellemedia.com/archives/2012/03/why-hath-google-forsaken-us-a-meditation)

    In the past five or so years, of course, Facebook has come to dominate what is colloquially known as the public square – the metaphorical space where our society comes together to communicate with itself, to debate matters of public interest, and to privately and publicly converse on any number of topics. Since the dawn of the American republic, independent publishers (often referred to as the Fourth Estate – from pamphleteers to journalists to bloggers) have always been important actors in the center of this space. As a publisher myself, I became increasingly concerned that Facebook’s appropriation of public discourse would imperil the viability of independent publishers. This of course has come to pass.

    As is well understood by members of this committee, Facebook employed two crucial strategies to grow its service in its early days. The first was what is universally known as the News Feed, which mixed personal news from “friends” with public stories from independent publishers. The second strategy was the Facebook “Platform,” which encouraged developers to create useful (and sometimes not so useful) products and services inside Facebook’s walled garden service. During the rise of both News Feed and Platform, I repeatedly warned independent publishers to avoid committing themselves and their future viability to either News Feed or the Platform, as Facebook would likely change its policies in the future, leaving publishers without recourse. (Examples: https://battellemedia.com/archives/2012/01/put-your-taproot-into-the-independent-web ; https://battellemedia.com/archives/2012/11/facebook-is-now-making-its-own-weather ; https://shift.newco.co/we-can-fix-this-f-cking-mess-bf6595ac6ccd ; https://shift.newco.co/ads-blocking-and-tackling-18129db3c352)

    Of course, the potent mix of News Feed and a subset of independent publishers combined to deliver us the Cambridge Analytica scandal, and we are still grappling with the implications of this incident on our democracy. But it is important to remember that while the Cambridge Analytica breach seems unusual, it is in fact not – it represents business as usual for Facebook. Facebook’s business model is driven by its role as a data broker. Early in its history, Facebook realized it could grow faster if it allowed third parties, often referred to as developers, to access its burgeoning trove of user data, then manipulate that data to create services on Facebook’s platform that increased a Facebook user’s engagement on the platform. Indeed, in his early years as CEO of Facebook, Mark Zuckerberg was enamored with the “platform business model,” and hoped to emulate such icons as Bill Gates (who built the Windows platform) or Steve Jobs (who later built the iOS/app store platform).

    However, Facebook’s core business model of advertising, driven as it is by the brokerage of its users’ personal information, stood in conflict with Zuckerberg’s stated goal of creating a world-beating platform. By their nature, platforms are places where third parties can create value. They do so by leveraging the structure, assets, and distribution inherent to the platform. In the case of Windows, for example, developers capitalized on Microsoft’s well-understood user interface, its core code base, and its massive adoption by hundreds of millions of computer users. Bill Gates famously defined a successful platform as one that creates more value for the ecosystem that gathers around it than for the platform itself. By this test – known as the Gates Line – Facebook’s early platform fell far short. Developers who leveraged access to Facebook’s core asset – its user data – failed to make enough advertising revenue to be viable, because Facebook (and its advertisers) would always preference Facebook’s own advertising inventory over that of its developer partners. In retrospect, it’s now commonly understood in the Valley that Facebook’s platform efforts were a failure in terms of creating a true ecosystem of value, but a success in terms of driving ever more engagement through Facebook’s service.

    For an advertising-based business model, engagement trumps all other possible metrics. As it grew into one of the most successful public companies in the history of business, Facebook nimbly identified the most engaging portions of its developer ecosystem, incorporated those ideas into its core services, and became a ruthlessly efficient acquirer and manipulator of its users’ engagement. It then processed that engagement into advertising opportunities, leveraging its extraordinary data assets in the process. Those advertising opportunities drew millions of advertisers large and small, and built the business whose impact we now struggle to understand.

    To truly understand the impact of Facebook on our culture, we must first understand the business model it employs. Interested observers of Facebook will draw ill-informed conclusions about the company absent a deep comprehension of its core driver – the business of personalized advertising. I have written extensively on this subject, but a core takeaway is this: The technology infrastructure that allows companies like Facebook to identify exactly the right message to put in front of exactly the right person at exactly the right time is, in every sense of the word, marvelous. But the externalities of manufacturing attention and selling it to the highest bidder have not been fully examined by our society. (Examples: https://shift.newco.co/its-the-advertising-model-stupid-b843cd7edbe9 ; https://shift.newco.co/lost-context-how-did-we-end-up-here-fd680c0cb6da ; https://battellemedia.com/archives/2013/11/why-the-banner-ad-is-heroic-and-adtech-is-our-greatest-technology-artifact ; https://shift.newco.co/do-big-advertisers-even-matter-to-the-platforms-9c8ccfe6d3dc )

    The Cambridge Analytica scandal has finally focused our attention on these externalities, and we should use this opportunity to go beyond the specifics of that incident, and consider the broader implications. The “failure” of Facebook’s Platform initiative is not a failure of the concept of an open platform. It is instead a failure by an immature, blinkered company (Facebook) to properly govern its own platform, as well as a failure of our own regulatory oversight to govern the environment in which Facebook operates. Truly open platforms are regulated by the platform creator in a way that allows for explosive innovation (see the Gates Line) and shared value creation. (Examples: https://shift.newco.co/its-not-the-platforms-that-need-regulation-2f55177a2297 ; https://shift.newco.co/memo-to-techs-titans-please-remember-what-it-was-like-to-be-small-d6668a8fa630)

    The absolutely wrong conclusion to draw from the Cambridge Analytica scandal is that entities like Facebook must build ever-higher walls around their services and their data. In fact, the conclusion should be the opposite. A truly open society should allow individuals and properly governed third parties to share their data so as to create a society of what Nobel laureate Edmond Phelps calls “mass flourishing.” My own work now centers on how our society might shift what I call the “social architecture of data” from one where the control, processing and value exchange around data is managed entirely by massive, closed entities like Facebook, to one where individuals and their contracted agents manage that process themselves. (Examples: https://shift.newco.co/are-we-dumb-terminals-86f1e1315a63 ; https://shift.newco.co/facebook-tear-down-this-wall-400385b7475d ; https://shift.newco.co/how-facebook-google-amazon-and-their-peers-could-change-techs-awful-narrative-9a758516210a ; https://shift.newco.co/on-facebook-a156710f2679 ; https://battellemedia.com/archives/2014/03/branded-data-preferences )

    Another mistaken belief to emerge from the Cambridge Analytica scandal is that any company, no matter how powerful, well intentioned, or intelligent, can by itself “fix” the problems the scandal has revealed. Facebook has grown to a size, scope, and impact on our society that outstrips its ability to manage the externalities it has created. To presume otherwise is to succumb to arrogance, ignorance, or worse. The bald truth is this: Not even Mark Zuckerberg understands how Facebook works, nor does he comprehend its impact on our society. (Examples: https://shift.newco.co/we-allowed-this-to-happen-were-sorry-we-need-your-help-e26ed0bc87ac ; https://shift.newco.co/i-apologize-d5c831ce0690 ; https://shift.newco.co/facebooks-data-trove-may-well-determine-trump-s-fate-71047fd86921 ; https://shift.newco.co/its-time-to-ask-ourselves-how-tech-is-changing-our-kids-and-our-future-2ce1d0e59c3c )

    Another misconception: that Facebook does not “sell” its data to any third parties. While Facebook may not sell copies of its data to these third parties, it certainly sells leases to that data, and this distinction bears significant scrutiny. The company may not wish to be understood as such, but it is most certainly the largest data broker in the history of the data industry.

    Lastly, the Cambridge Analytica scandal may seem to be entirely about a violation of privacy, but to truly understand its impact, we must consider the implications relating to future economic innovation. Facebook has used the scandal as an excuse to limit third-party data sharing across and outside its platform. While this seems logical at first glance, it is in fact destructive to long-term economic value creation.

    So what might be done about all of this? While I understand the lure of sweeping legislation that attempts to “cure” the ills of technological progress, such approaches often have their own unexpected consequences. For example, the EU’s adoption of GDPR, drafted to limit the power of companies like Facebook, may in fact only strengthen that company’s grip on its market, while severely limiting entrepreneurial innovation in the process (Example: https://shift.newco.co/how-gdpr-kills-the-innovation-economy-844570b70a7a )

    As policy makers and informed citizens, we should strive to create a flexible, secure, and innovation-friendly approach to data governance that allows for maximum innovation while also ensuring maximum control over the data by all affected parties – including individuals and, importantly, the beneficiaries of future innovations not yet conceived or created. To play forward the current architecture of data in our society – where most of the valuable information is controlled by an increasingly small oligarchy of massive corporations – is to imagine a sterile landscape hostile to new ideas and mass flourishing.

    Instead, we must explore a world governed by an enlightened regulatory framework that encourages data sharing, high standards of governance, and maximum value creation, with the individual at the center of that value exchange. As I recently wrote: “Imagine … you can download your own Facebook or Amazon “token,” a magic data coin containing not only all the useful data and insights about you, but a control panel that allows you to set and revoke permissions around that data for any context. You might pass your Amazon token to Walmart, set its permissions to “view purchase history” and ask Walmart to determine how much money it might have saved you had you purchased those items on Walmart’s service instead of Amazon. You might pass your Facebook token to Google, set the permissions to compare your social graph with others across Google’s network, and then ask Google to show you search results based on your social relationships. You might pass your Google token to a startup that already has your genome and your health history, and ask it to munge the two in case your 20-year history of searching might infer some insights into your health outcomes. This might seem like a parlor game, but this is the kind of parlor game that could unleash an explosion of new use cases for data, new startups, new jobs, and new economic value.”
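    The token idea described above can be sketched in a few lines of code. To be clear, this is a hypothetical illustration of the concept, not a real API; the class, method names, and the Walmart/Amazon scenario are all invented for the sketch:

```python
# Hypothetical sketch of a personal "data token": a bundle of user data
# plus a control panel of revocable, scoped permissions. All names here
# are invented for illustration -- no such API exists today.
class DataToken:
    def __init__(self, owner: str, data: dict):
        self.owner = owner
        self._data = data
        self._grants = {}  # grantee -> set of permitted data scopes

    def grant(self, grantee: str, scope: str) -> None:
        """Allow a named third party to read one scope of data."""
        self._grants.setdefault(grantee, set()).add(scope)

    def revoke(self, grantee: str, scope: str) -> None:
        """Withdraw a previously granted permission at any time."""
        self._grants.get(grantee, set()).discard(scope)

    def read(self, grantee: str, scope: str):
        """A grantee sees a scope only while permission is in force."""
        if scope in self._grants.get(grantee, set()):
            return self._data.get(scope)
        raise PermissionError(f"{grantee} lacks '{scope}' access")

# The user hands Walmart view access to their purchase history...
token = DataToken("alice", {"purchase_history": ["toaster", "books"]})
token.grant("walmart", "purchase_history")
print(token.read("walmart", "purchase_history"))  # ['toaster', 'books']
# ...and can revoke that access whenever they choose.
token.revoke("walmart", "purchase_history")
```

    The design point is that permission lives with the individual’s token, not with the platform: the grantee’s access is contextual, scoped, and revocable, which is exactly the inversion of today’s architecture, where the platform decides what flows where.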

    It is our responsibility to examine our current body of legislation as it relates to how corporations such as Facebook impact the lives of consumers and the norms of our society overall. Much of the argument around this issue turns on the definition of “consumer harm” under current policy. Given that data is non-rivalrous and services such as Facebook are free of charge, it is often presumed there is no harm to consumers (or by extension, to society) in its use. This also applies to arguments about antitrust enforcement. I think our society will look back on this line of reasoning as deeply flawed once we evolve to an understanding of data as equal to – or possibly even more valuable than – monetary currency.

    Most observers of technology agree that data is a new class of currency in society, yet we continue to struggle to understand its impact, and how best to govern it. The manufacturing of data into currency is the main business of Facebook and countless other information-age businesses. Currently the only participatory rights in this value creation for a user of these services are to (a) engage with the services offered and (b) purchase the stock of the company offering them. Neither of these options affords the user – or society – compensation commensurate with the value created for the firm. We can and must do better as a society, and we can and must expect more of our business leaders.

    (More: https://shift.newco.co/its-time-for-platforms-to-come-clean-on-political-advertising-69311f582955 ; https://shift.newco.co/come-on-what-did-you-think-they-do-with-your-data-396fd855e7e1 ; https://shift.newco.co/tech-is-public-enemy-1-so-now-what-dee0c0cc40fe ; https://shift.newco.co/why-is-amazons-go-not-bodega-2-0-6f148075afd5 ; https://shift.newco.co/predictions-2017-cfe0806bed84 ; https://shift.newco.co/the-automatic-weapons-of-social-media-3ccce92553ad )

    Respectfully submitted,

    John Battelle

    Ross, California

    June 17, 2018

     