Tagged: politics

  • feedwordpress 23:02:06 on 2018/11/12 Permalink
    Tags: ideas, politics

    When Tech Loves Its Fiercest Critics, Buyer Beware 

    Detail from the cover of Harari’s latest work, 21 Lessons for the 21st Century.

    A year and a half ago I reviewed Yuval Noah Harari’s Homo Deus, recommending it to the entire industry with this subhead: “No one in tech is talking about Homo Deus. We most certainly should be.”

    Eighteen months later, Harari is finally having his technology industry moment. The author of a trio of increasingly disturbing books – Sapiens, which made his name as a popular historian-philosopher, the aforementioned Homo Deus, which introduced a dark strain of tech futurism to his work, and the recent 21 Lessons for the 21st Century – Harari has cemented his place in the Valley as tech’s favorite self-flagellant. So it’s only fitting that this weekend Harari was the subject of a New York Times profile featuring this provocative title: Tech C.E.O.s Are in Love With Their Principal Doomsayer. The subhead continues: “The futurist philosopher Yuval Noah Harari thinks Silicon Valley is an engine of dystopian ruin. So why do the digital elite adore him so?”

    Well, I’m not sure if I qualify as one of those elites, but I have a theory, one that wasn’t quite raised in the Times’ otherwise compelling profile. I’ve been a student of Harari’s work, and if there’s one clear message, it’s this: We’re running headlong into a world controlled by a tiny elite of superhumans, masters of new technologies that the “useless class” will never understand. “Homo sapiens is an obsolete algorithm,” Harari writes in Homo Deus. A new religion of Dataism will transcend our current obsession with ourselves, and we will “dissolve within the data torrent like a clump of earth within a gushing river.” In other words, we humans are f*cked, save for a few of the lucky ones who manage to transcend their fate and become masters of the machines. “Silicon Valley is creating a tiny ruling class,” the Times writes, paraphrasing Harari’s work, “and a teeming, furious ‘useless class.’”

    So here’s why I think the Valley loves Harari: We all believe we’ll be members of that tiny ruling class. It’s an indefensible, mathematically impossible belief, but as Harari reminds us in 21 Lessons, “never underestimate human stupidity.” Put another way, we are fooling ourselves, content to imagine we’ll somehow all earn a ticket into (or onto) whatever apocalypse-dodging exit plan Musk, Page or Bezos might dream up (they’re all obsessed with leaving the planet, after all). Believing that impossible fiction is certainly a lot easier than doing the quotidian work of actually fixing the problems which lie before us. Better to be one of the winners than to risk losing along with the rest of the useless class, no?

    But we can’t all be winners in the future Harari lays out, and he seems to understand this fact. “If you make people start thinking far more deeply and seriously about these issues,” he said to the Times, “some of the things they will think about might not be what you want them to think about.”

    Exactly, Professor. Now that I’ve departed the Valley, where I spent nearly three decades of my life, I’m starting to gain a bit of perspective on my own complicated relationship with the power structure of the place. I grew up with the (mostly) men who lead companies like Amazon, Google, Facebook and Apple, and early in the industry’s rise, it was heady to share the same stage with legends like Bezos, Jobs, or Page. But as the technology industry becomes the driving force of social rupture, I’m far more skeptical of its leaders’ abilities to, well, lead.

    Witness this nearly idea-free interview with Google CEO Sundar Pichai, also in the Times, where the meticulously media-prepped executive opines on whether his industry has a role to play in society’s ills: “Every generation is worried about the new technology, and feels like this time it’s different. Our parents worried about Elvis Presley’s influence on kids. So, I’m always asking the question, ‘Why would it be any different this time?’ Having said that, I do realize the change that’s happening now is much faster than ever before. My son still doesn’t have a phone.”

    Pichai’s son may not have a phone, but he is earning money mining Ethereum (really, you can’t make this shit up). I’m not sure the son of a centi-millionaire needs to earn money – but it certainly is useful to master the algorithms that will soon control nearly every aspect of human life. So – no, son, no addictive phone for you (even though my company makes them, and makes their operating systems, and makes the apps which ensure their addictive qualities).

    But mining cryptocurrency? Absolutely!

    Should Harari be proven right and humanity become irrelevant, I’m pretty sure Pichai’s son will have a first class ticket out of whatever mess is left behind. But the rest of us? We should probably focus on making sure that kid never needs to use it.

    Cross posted from NewCo Shift. 


    By the way, the other current obsession of Valley folks is author Anand Giridharadas’ Winners Take All – The Elite Charade of Changing the World. Read them together for a one-two punch, if you dare…

     
  • feedwordpress 11:39:41 on 2018/10/24 Permalink
    Tags: commerce, politics, retail, walmart

    This Is How Walmart Beats Amazon 

    A scenario from the future

    (cross posted from NewCo Shift)

    In my last post I imagined a world in which large data-driven platforms like Amazon, Google, Spotify, and Uber are compelled to share machine-readable copies of data with their users. There are literally scores, if not hundreds, of wrinkles to iron out around how such a system would work, and in a future post I hope to dig into some of those questions. But for now, come with me on a journey into the future, where the wrinkles have been ironed out, and a new marketplace of personally-driven information is flourishing. We’ll return to one of the primary examples I sketched out in the aforementioned post: a battle for the allegiance – and pocketbook – of one online shopper, in this case, my wife Michelle.
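
    To make this less abstract, here is a minimal sketch of what such a machine-readable, user-held copy of platform data might look like. Everything here – the field names, the issuer, the scope string – is a hypothetical illustration of the idea, not any platform’s actual export format:

```python
# A purely illustrative "data token": a user-held, machine-readable copy of
# platform data, scoped and revocable. No real platform exports this format.
purchase_token = {
    "issuer": "shopping-platform.example",   # hypothetical data source
    "subject": "user-1234",                  # the person the data describes
    "scope": "purchase-history:read",        # what a holder may do with it
    "expires": "2019-01-01T00:00:00Z",       # consent is time-limited
    "records": [
        {"item": "bath salts", "date": "2018-01-20", "price": 24.50},
        {"item": "chaise lounge", "date": "2014-06-12", "price": 349.00},
    ],
}

def grant(token, requested_scope):
    """A third party may read the records only if the granted scope matches."""
    if token["scope"] != requested_scope:
        raise PermissionError("scope not granted")
    return token["records"]

records = grant(purchase_token, "purchase-history:read")
print(f"{len(records)} records shared; consent expires {purchase_token['expires']}")
```

    The point of the sketch is that the user, not the platform, holds the copy, and any exchange of it is explicit, scoped, and revocable.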

    ***

    It’s a crisp winter mid-morning in Manhattan when the doorbell rings. Michelle looks up from her laptop, wondering who it might be. She’s not expecting any deliveries from Amazon, usually the source of such interruptions. She glances at her phone, and the Ring app (an Amazon service, naturally) shows a well-dressed, smiling young woman at the door. She’s holding what looks like an elegantly wrapped gift in her hands. Now that’s unusual! Michelle checks the date – no anniversaries, no birthdays, no special occasions – so what gives?

    Michelle opens the door and is greeted by a woman who introduces herself as Sheila. She tells Michelle she’s been sent over by Walmart. Walmart? Michelle’s never set foot in a Walmart store, and has a less than charitable view of the company overall. Why on earth would Walmart be sending her a special delivery gift box?

    Sheila is used to exactly this kind of response – she’s been trained to expect it, and to manage the conversation that ensues. Sheila is a college-educated Walmart management associate, and delivering these gift boxes is a mandatory part of her company training. In fact, Sheila’s future career trajectory is based, in part, on her success at converting Michelle into becoming a Walmart customer, and she’s learned from her colleagues back at corporate that the best way to succeed is to be direct and open while engaging with a top-level prospect.

    “Michelle, I know this seems a bit strange, but Walmart has identified you as a premier ecommerce customer – I’m guessing you probably have at least three or four packages a week delivered here?”

    “More like three or four a day,” Michelle answers, warming to her implied status as a premium customer.

    “Yes, it’s amazing how it’s become a daily habit,” Sheila answers. “And as you probably know, Walmart has an online service, but truth be told, we never seem to get the business of folks like you. I’m here to see if we might change that.”

    Michelle becomes suspicious. It doesn’t make sense to her – sending over a manager bearing gifts? Such tactics don’t scale – and feel like an intrusion to boot.

    Sensing this, Sheila continues. “Look, I’m not here to sell you anything. I’ve got this special gift for you from Doug McMillon, the CEO of Walmart. You’ve been selected to be part of a new program we’re testing – we call it Walton’s Circle. It’s named after Sam Walton, our founder, who was pretty fond of the personal touch. In any case, the gift is yours to keep. There’s some pretty cool stuff in there, I have to say, including La Mer skin cream and some Neuhaus chocolate that’s to die for.”

    Michelle smiles. Strange how the world’s biggest retailer, a place she’s never shopped, seems to know her brand preferences for skin care and chocolate. Despite herself, she relaxes a bit.

    “Also inside,” Sheila continues, “is an invitation. It’s entirely up to you if you want to accept it, but may I explain?”

    “Sure,” Michelle answers.

    “Great. Have you heard of the Token Act?”

    Michelle frowns. She’d read about this new piece of legislation, something to do with personal data and the right to exchange it for value across the internet. In the run-up to its passage, her husband wouldn’t shut up about how revolutionary it was going to be, but so far nothing important in her life had changed.

    “Yes, I’ve heard of it,” Michelle answers, “but it all seems pretty abstract.”

    “Yeah, I hear that all the time,” Sheila responds. “But that’s where our invitation comes in. Inside the box is an envelope with a code and a website. I imagine you use Amazon…” Sheila glances toward an empty brown box in the hallway with Amazon’s universal smiling logo. Michelle laughs. “Of course you do! I was a huge Amazon customer for years. And that’s what our invitation is about – it’s an invitation to see what might happen if you became a Walmart customer instead. If you go to our site and enter your code, a program will automatically download your Amazon purchase history and run it through Walmart’s historical inventory. Within seconds, you’ll be given a report detailing what you would have saved had you purchased exactly the same products, at the same time, from us instead of Jeff Bezos.”

    “Huh,” Michelle responds. “Sounds cool but…that’s my information on Amazon, no? I don’t want you to have that, do I?”

    “Of course not,” Sheila says knowingly. “All of your information is protected by LiveRamp Identity, and is never stored or even processed on our servers. You maintain complete control over the process, and can revoke it at any time.”

    Michelle had heard of LiveRamp Identity – it was a third-party guarantor of information safety she’d used for a recent mortgage application. She also came across it when co-signing a car loan for her college-aged daughter.

    “When you put that code into our site, a token is generated that gives us permission to compare our data to yours, and a report is produced,” Sheila explains. “The report is yours to keep and do with what you want. In fact, the report becomes a token in and of itself, and you can submit that token to third-party services like TokenTrust, which will audit our work and tell you if our results can be trusted.”

    TokenTrust was another service Michelle had heard of; her husband had raved about it as one of the fastest-growing new entrants in the tech industry. The company had recently been featured on 60 Minutes – it played a significant role in a story about Google’s search results, if she recalled correctly. DocuSign had purchased the company for several billion just last year. In any case, Michelle’s suspicions are defused – may as well check this out. I mean, why would Walmart risk its reputation stealing her Amazon data? It was worth at least seeing that report.

    Sheila senses the opening. “The reports are pretty amazing,” she says. “I’ve had clients who’ve discovered they could have saved thousands of dollars a year. And here’s the best part: If, after reviewing and validating the report, you switch to Walmart, we’ll credit your account with those savings – in essence, we’ll retroactively deliver the savings you would have had all along.”

    “Wow. That almost sounds too good to be true!” Michelle says. “But… OK, thanks. I’ll check it out. Thanks for coming by.”

    “Absolutely,” Sheila responds. “And here’s my card – that’s my cell, and my email. Let me know if you have any questions.”

    ***

    Michelle heads back inside and places the gift box on the table next to her laptop. Before opening the box, she wants to be sure this thing is for real. She Googles “Walmart Walton Circle Savings Token”  – and the first link is to a Business Insider article: “These Lucky Few Amazon Customers Are Paid Thousands to Switch – By Walmart.” So Sheila wasn’t lying – this program is for real!

    Michelle tugs on the satin ribbon surrounding her gift box and raises its sturdy lid. Nestled on straw inside are two jars of La Mer, several samples of Neuhaus chocolates, two of her favorite bath salts, and various high-end household items. The inside lid of the box proclaims “Welcome to Walton’s Circle!” in elegant script. At the center of the box is a creamy envelope engraved with her name. Michelle opens it, and just as Sheila mentioned, a URL and code are included, along with simple instructions.

    What the hell, may as well see what comes of it. Turning to her laptop, Michelle heads to Walmart.com – for the first time in her life – and enters her code. Almost instantaneously a dialog pops up, informing her that her report is ready. Would she like to review it?

    Why not?! Michelle clicks “Yes” and up comes a side-by-side comparison of her entire Amazon purchase history. She notices that during the early years – roughly until 2006 – there’s not much on the Walmart side of the report. But after that the match rates start to climb, and for the past five or so years, the report shows that 98 percent of the stuff she’s bought at Amazon was also available on Walmart.com. Each purchase has a link, and she tries out one – a chaise lounge she purchased in 2014 (gotta love Prime shipping!). Turns out Walmart didn’t have that exact match, but the report shows several similar alternatives, any of which would have worked. Cool.

    Michelle’s eye is drawn to the bottom of the report, to a large sum in red that shows the difference in price between her Amazon purchases and their Walmart doppelgangers.

    $2,700.

    Holy… cow. Michelle can’t believe it. Is this for real? Anticipating the question, Walmart’s report software pops up a dialog. “Would you like to validate your token’s report using TokenTrust? We’ll pay all fees.” Michelle clicks yes, and a TokenTrust site appears. The site shows a “working” icon for several seconds, then returns a simple message: “TokenTrust has reviewed Walmart’s claims and your Amazon token, and validates the accuracy of this report.”

    Michelle is sold. Next to the $2,700 figure at the bottom of her report is one line of text, and a “Go” link. “Would you like to become a founding member of the Walton Circle? We’ll take care of all your transition needs, and Sheila, whom you’ve already met, will be named as your personal shopping concierge.”

    Michelle hovers momentarily over “Go.” What the hell, she thinks. I can always switch back. And with one click, Michelle does something she never thought she would: She becomes a Walmart customer.

    Satisfied, she turns her eyes back to her work. Several new emails have collected in her inbox. One is from Doug McMillon, welcoming her to Walton’s Circle. As she hovers over it, mail refreshes, and a new message piles on top of McMillon’s.

    Holy shit. Did Jeff Bezos really just email me?! 

    ***

    Is such a scenario even possible? Well, that question remains unexplored, at least for now. As I wrote in my last post, I’m not certain Amazon’s terms of service would allow for such an information exchange, though it’s currently possible to download exactly the information Walmart would need to stand up such a service. (I’ve done it; it takes a bit of poking around, but it’s very cool to see.) The real question is this: Would Walmart spend the thousands of dollars required to make this kind of customer acquisition possible?
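
    As a rough sketch of the mechanics, here’s how the comparison in the story might be computed from an exported order history. The column names and the catalog structure are my invention for illustration – no real export looks exactly like this, and a real system would need fuzzy product matching rather than exact item names:

```python
import csv
from io import StringIO

# Hypothetical order-history export: one row per past purchase.
order_history = """item,date,price
paper towels,2017-03-01,19.99
chaise lounge,2014-06-12,349.00
bath salts,2018-01-20,24.50
"""

# Hypothetical snapshot of a rival's historical prices for matching items.
rival_catalog = {"paper towels": 16.49, "bath salts": 21.00}

def savings_report(export_csv, catalog):
    """For each purchase the rival could have fulfilled, total the price gap."""
    matches, total = [], 0.0
    for row in csv.DictReader(StringIO(export_csv)):
        rival_price = catalog.get(row["item"])
        if rival_price is None:
            continue  # no match in the rival's historical inventory
        total += float(row["price"]) - rival_price
        matches.append(row["item"])
    return matches, round(total, 2)

matches, total = savings_report(order_history, rival_catalog)
print(f"Matched {len(matches)} of 3 purchases; hypothetical savings: ${total}")
```

    Nothing about the computation itself is hard; the hard part, as the scenario suggests, is the permissioned exchange of the history in the first place.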

    I don’t see why not. A high-end e-commerce customer spends more than ten thousand dollars a year online. Over a lifetime, this customer is worth thousands of dollars in profit for a well-run commerce site like Walmart. The most difficult and expensive problem for any brand is switching costs – overcoming them is at the core of the most sophisticated marketing efforts in the world. Ford spends hundreds of millions each year trying to convince customers to switch from GM; Verizon spends equal amounts in an effort to pull customers from AT&T. Over the past five years, Walmart has watched Amazon run away with its customers online, even as it has spent billions building a competitive commerce offering. What Walmart needs are “point to” customers – the kind of people who not only become profitable lifelong buyers, but who will tell hundreds of friends, family members and colleagues about their gift box experience.
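
    The back-of-the-envelope math is worth spelling out. With illustrative numbers of my own choosing – the margin, retention, and acquisition figures below are assumptions, not Walmart’s actual economics – even a multi-thousand-dollar acquisition cost can pay for itself:

```python
# Illustrative lifetime-value arithmetic -- all figures are assumptions.
annual_spend = 10_000    # $/year for a high-end e-commerce customer
net_margin = 0.05        # assumed profit margin on online sales
years_retained = 15      # assumed customer lifetime

lifetime_profit = annual_spend * net_margin * years_retained

# Acquisition cost: the retroactive savings credit from the story, plus a
# rough guess at the gift box and the associate's visit.
acquisition_cost = 2_700 + 500

print(f"Lifetime profit ~ ${lifetime_profit:,.0f}, "
      f"acquisition ~ ${acquisition_cost:,}, "
      f"return ~ {lifetime_profit / acquisition_cost:.1f}x")
```

    Under these assumptions the $2,700 credit more than pays for itself – and that’s before counting the word-of-mouth value of the “point to” customer.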

    But to get there, Walmart needs that Amazon token. Wouldn’t it be cool if such a thing actually existed?

     
  • feedwordpress 14:20:27 on 2018/10/17 Permalink
    Tags: politics

    Facebook Can’t Fix This. 

    The last 24 hours have not been kind to Facebook’s already bruised image. Above are four headlines, all of which clogged my inbox as I cleared email after a day full of meetings.

    Let’s review: Any number of Facebook’s core customers – advertisers – are feeling duped and cheated (and have felt this way for years). A respected reporter, told by Facebook executives that the company would not use data collected by its new Portal product, is now accusing the company of misrepresenting the truth (others would call that lying, but the word lost its meaning this year). The executive formerly in charge of Facebook’s security is… on an apology tour, convinced the place he worked for has damaged our society (and he’s got a lot of company).

    In other news, Facebook has now taken responsibility for protecting the sanctity of our elections, by, among other things, banning “false information about voting requirements and fact-check[ing] fake reports of violence or long lines at polling stations.”

    Yep, a company that, in its core business, is currently charged with evasion, misstatements, and putting growth above civic duty is somehow still solely responsible for fixing the problems it’s created in our civil discourse and attendant democracy.

    Does this feel off to anyone else?

    We’ve had nearly two years of congressional hearings, nearly two years of testimony and apologies and “we must do better-isms.” While the company must be commended for actually making several things better (the ad transparency platform, for example), the fact that we continue to believe that the appropriate remedy for what ails us is to let the fox fix the holes in our chicken coop is downright… baffling.

    I guess this is what you get when the folks in power are happy with the results of our elections.

    But here’s my prediction, and it won’t take long for me to be proven right or wrong: Should the Democrats take control of the House, things are going to change. Quickly. Sure, with only the House, the Democrats can’t actually force any new regulation, nor can they command any cabinet level policy shifts.

    But as Trump well knows (and fears), a subpoena is a powerful thing.

    Now, if the Democrats don’t win the House, well, that’s another column.

    (cross posted from NewCo Shift)

     
  • feedwordpress 16:16:33 on 2018/09/24 Permalink
    Tags: politics

    Governance, Technology, and Capitalism. 

    Or, Will Nature Just Shrug Its Shoulders?

    If you pull far enough back from the day to day debate over technology’s impact on society – far enough that Facebook’s destabilization of democracy, Amazon’s conquering of capitalism, and Google’s domination of our data flows start to blend into one broader, more cohesive picture – what does that picture communicate about the state of humanity today?

    Technology forces us to recalculate what it means to be human – what is essentially us, and whether technology represents us, or some emerging otherness which alienates or even terrifies us.  We have clothed ourselves in newly discovered data, we have yoked ourselves to new algorithmic harnesses, and we are waking to the human costs of this new practice. Who are we becoming?

    Nearly two years ago I predicted that the bloom would fade from the technology industry’s rose, and so far, so true. But as we begin to lose faith in the icons of our former narratives, a nagging and increasingly urgent question arises: In a world where we imagine merging with technology, what makes us uniquely human?

    Our lives are now driven in large part by data, code, and processing, and by the governance of algorithms. These determine how data flows, and what insights and decisions are taken as a result.

    So yes, software has, in a way, eaten the world. But software is not something being done to us. We have turned the physical world into data, we have translated our thoughts, actions, needs and desires into data, and we have submitted that data for algorithmic inspection and processing. What we now struggle with is the result of these new habits – the force of technology looping back upon the world, bending it to a new will.  What agency – and responsibility – do we have? Whose will? To what end?

    • ••

    Synonymous with progress, asking not for permission, fearless of breaking things – in particular stupid, worthy-of-being-broken things like government, sclerotic corporations, and fetid social norms – the technology industry reveled for decades as a kind of benighted warrior for societal good. As one Senator told me during the Facebook hearings this past summer, “we purposefully didn’t regulate technology, and that was the right thing to do.” But now? He shrugged. Now, maybe it’s time.

    Because technology is already regulating us. I’ve always marveled at libertarians who think the best regulatory framework for government is none at all. Do they think that means there’s no governance?

    In our capitalized healthcare system, data, code and algorithms now drive diagnosis, costs, coverage and outcomes. What changes on the ground? People are being denied healthcare, and this equates to life or death in the real world. 

    In our public square, data, code and algorithms drive civil discourse. We no longer share one physical, common square, but instead struggle to comprehend a world comprised of a billion Truman Shows. What changes on the ground? The election results of the world’s most powerful country.

    Can you get credit to start a business? A loan to better yourself through education? Financial decisions are now determined by data, code, and algorithms. Job applications are turned to data, and run through cohorts of similarities, determining who gets hired, and who ultimately ends up leaving the workforce.

    And in perhaps the most human pursuit of all – connecting to other humans – we’ve turned our desires and our hopes to data, swapping centuries of cultural norms for faith in the governance of code and algorithms built – in necessary secrecy – by private corporations.

    • ••

    How does a human being make a decision? Individual decision making has always been opaque – who can query what happens inside someone’s head? We gather input, we weigh options and impacts, we test assumptions through conversations with others. And then we make a call – and we hope for the best.

    But when others are making decisions that impact us, well, those kinds of decisions require governance. Over thousands of years we’ve designed systems to ensure that our most important societal decisions can be queried and audited for fairness, that they are defensible against some shared logic, that they will benefit society at large.

    We call these systems government. It is imperfect but… it’s better than anarchy.

    For centuries, government regulations have constrained social decisions that impact health, job applications, credit – even our public square. Dating we’ve left to the governance of cultural norms, which share the power of government over much of the world.

    But in just the past decade, we’ve ceded much of this governance to private companies – companies motivated by market imperatives which demand their decision making processes be hidden. Our public government – and our culture – have not kept up.

    What happens when decisions are taken by algorithms of governance that no one understands? And what happens when those algorithms are themselves governed by a philosophy called capitalism?

    • ••

    We’ve begun a radical experiment combining technology and capitalism, one that most of us have scarcely considered. Our public commons – that which we held as owned by all, to the benefit of all – is increasingly becoming privatized.

    Thousands of companies are now dedicated to revenue extraction in the course of delivering what were once held as public goods. Public transportation is being hollowed out by Uber, Lyft, and their competitors (leveraging public goods like roadways, traffic infrastructure, and GPS).  Public education is losing funding to private schools, MOOCs, and for-profit universities. Public health, most disastrously in the United States, is driven by a capitalist philosophy tinged with technocratic regulatory capture. And in perhaps the greatest example of all, we’ve ceded our financial future to the almighty 401K – individuals can no longer count on pensions or social safety nets – they must instead secure their future by investing in “the markets” – markets which have become inhospitable to anyone lacking the technological acumen of the world’s most cutting-edge hedge funds.

    What’s remarkable and terrifying about all of this is the fact that the combinatorial nature of technology and capitalism outputs fantastic wealth for a very few, and increasing poverty for the very many. It’s all well and good to claim that everyone should have a 401K. It’s irresponsible to continue that claim when faced with the reality that 84 percent of the stock market is owned by the wealthiest ten percent of the population.

    This outcome is not sustainable. When a system of governance fails us, we must examine its fundamental inputs and processes, and seek to change them.

    • ••

    So what truly is governing us in the age of data, code, algorithms and processing? For nearly five decades, the singular true north of capitalism has been to enrich corporate shareholders. Other stakeholders – employees, impacted communities, partners, customers – do not directly determine the governance of most corporations.

    Corporations are motivated by incentives and available resources. When the incentive is extraction of capital to be placed in the pockets of shareholders, and a new resource becomes available which will aide that extraction, companies will invent fantastic new ways to leverage that resource so as to achieve their goal. If that resource allows corporations to skirt current regulatory frameworks, or bypass them altogether, so much the better.

    The new resource, of course, is the combination of data, code, algorithms and processing. Unbridled, replete with the human right of speech and its attendant purchasing of political power, corporations are quite literally becoming our governance model.

    Now the caveat: Allow me to state for the record that I am not a socialist. If you’ve never read my work, know I’ve started six companies, invested in scores more, and consider myself an advocate of transparently governed free markets. But we’ve leaned too far over our skis – the facts no longer support our current governance model.

    • ••

    We turn our worlds to data; leveraging that data, technocapitalism then terraforms our world. Nowhere is this more evident than with automation – the largest cost of nearly every corporation is human labor, and digital technologies are getting extraordinarily good at replacing that cost.

    Nearly everyone agrees this shift is not new – yes yes, a century or two ago, most of us were farmers. But this shift is coming far faster, and with far less considered governance. The last great transition came over generations. Technocapitalism has risen to its current heights in ten short years. Ten years. 

    If we are going to get this shift right, we urgently need to engage in a dialog about our core values. Can we perhaps rethink the purpose of work, given work no longer means labor? Can we reinvent our corporations and our regulatory frameworks to honor, celebrate and support our highest ideals? Can we prioritize what it means to be human even as we create and deploy tools that make redundant the way of life we’ve come to know these past few centuries?

    These questions beg a simpler one: What makes us human?

    I dusted off my old cultural anthropology texts, and consulted the scholars. The study of humankind teaches us that we are unique in that we are transcendent toolmakers – and digital technology is our most powerful tool. We have nuanced language, which allows us both recollection of the past, and foresight into the future. We are wired – literally at the molecular level – to be social, to depend on one another, to share information and experience. Thanks to all of this, we have the capability to wonder, to understand our place in the world, to philosophize. The love of beauty, philosophers will tell you, is the most human thing of all.

    Oh, but then again, we are uniquely capable of intentionally destroying ourselves. Plenty of species can do that by mistake. We’re unique in our ability to do it on purpose.

    But perhaps the thing that makes us most human is our love of storytelling, for narrative weaves nearly everything human into one grand experience. Our greatest philosophers even tell stories about telling stories! The best stories employ sublime language, advanced tools, deep community, profound wonder, and inescapable narrative tension. That ability to destroy ourselves? That’s the greatest narrative driver in the history of mankind.

    How will it turn out?

    • ••

    We are storytelling engines uniquely capable of understanding our place in the world. And it’s time to change our story, before we fail a grand test of our own making: Can we transition to a world inhabited by both ourselves, and the otherness of the technology we’ve created? Should we fail, nature will indifferently shrug its shoulders. It has billions of years to let the whole experiment play over again.

    We are the architects of this grand narrative. Let’s not miss our opportunity to get it right.

    Adapted from a speech presented at the Thrival Humans X Tech conference in Pittsburgh earlier this week. 

    Cross posted from NewCo Shift. 

     

     
  • feedwordpress 13:43:08 on 2018/09/06 Permalink
    Tags: politics

    Facebook, Twitter, and the Senate Hearings: It’s The Business Model, Period. 

    “We weren’t expecting any of this when we created Twitter over 12 years ago, and we acknowledge the real world negative consequences of what happened and we take the full responsibility to fix it.”

    That’s the most important line from Twitter CEO Jack Dorsey’s testimony yesterday – and in many ways it’s also the most frustrating. But I agree with Ben Thompson, who this morning points out (sub required) that Dorsey’s philosophy on how to “fix it” was strikingly different from that of Facebook COO Sheryl Sandberg (or Google, which failed to send a C-level executive to the hearings). To quote Dorsey (emphasis mine): “Today we’re committing to the people and this committee to do that work and do it openly. We’re here to contribute to a healthy public square, not compete to have the only one. We know that’s the only way our business thrives and helps us all defend against these new threats.”

    Ben points out that during yesterday’s hearings, Dorsey was willing to tie the problems of public discourse on Twitter directly to the company’s core business model, that of advertising. Sandberg? She ducked the issue and failed to make the link.

    You may recall my piece back in January, Facebook Can’t Be Fixed. In it I argue that the only way to address Facebook’s failings as a public square would be to totally rethink its core advertising model, a golden goose which has driven the company’s stock on a six-year march to the stratosphere. From the post:

    “[Facebook’s ad model is] the honeypot which drives the economics of spambots and fake news, it’s the at-scale algorithmic enabler which attracts information warriors from competing nation states, and it’s the reason the platform has become a dopamine-driven engagement trap where time is often not well spent.

    To put it in Clintonese: It’s the advertising model, stupid.

    We love to think our corporate heroes are somehow super human, capable of understanding what’s otherwise incomprehensible to mere mortals like the rest of us. But Facebook is simply too large an ecosystem for one person to fix.”

    That one person, of course, is Mark Zuckerberg, but what I really meant was one company – Facebook. It’s heartening to see Sandberg acknowledge, as she did in her written testimony, the scope and the import of the challenges Facebook presents to our democracy (and to civil society around the world). But regardless of sops to “working closely with law enforcement and industry peers” and “everyone working together to stay ahead,” it’s clear Facebook’s approach to “fixing” itself remains one of going it alone. A robust, multi-stakeholder approach would quickly identify Facebook’s core business model as a major contributor to the problem, and that’s an existential threat.

    Sandberg’s most chilling statement came at the end of her prepared remarks, in which she defined Facebook as engaged in an “arms race” against actors who co-opt the company’s platforms. Facebook is ready, Sandberg implied, to accept the challenge of lead arms producer in this race: “We are determined to meet this challenge,” she concluded.

    Well, I’m sorry, but I don’t want one private company in charge of protecting civil society. I prefer a more accountable social structure, thanks very much.

    I’ve heard this language of “arms races” before, in a far less consequential context: advertising fraud, in particular on Google’s search platforms. To combat this fraud, Google locked arms with a robust network of independent companies, researchers, and industry associations, eventually developing a solution that tamed the issue (though it will never go away entirely). That approach – an open and transparent process, subject to public checks and balances – is what is desperately needed now, and what Dorsey endorsed in his testimony. He’s right to do so. Unlike Google’s ad fraud issues of a decade ago, Facebook and Twitter’s problems extend to life-or-death, on-the-ground consequences – the rise of a dictator in the Philippines, genocide in Myanmar, hate crimes in Sri Lanka, and the loss of public trust (and possibly an entire presidential election) here in the United States. The list is terrifying, and it’s growing every week.

    These are not problems one company, or even a heterogeneous blue ribbon committee, can or should “fix.” Facebook does not bear full responsibility for these problems – any more than Trump is fully responsible for the economic, social, and cultural shifts which swept him into office. But just as Trump has become the face of what’s broken in American discourse today, Facebook – and tech companies more broadly – have become the face of what’s broken in capitalism. Despite its optimistic, purpose-driven, and ultimately naive founding principles, the technology industry has unleashed a mutated version of steroidal capitalism upon the world, failing along the way to first consider the potential damage its business models might wreak.

    In an op-ed introducing the ideas in his new book “Farsighted,” author Steven Johnson details how good decisions are made, paying particular attention to how important it is to have diverse voices at the table capable of imagining many different potential scenarios for how a decision might play out. “Homogeneous groups — whether they are united by ethnic background, gender or some other commonality like politics — tend to come to decisions too quickly,” Johnson writes. “They settle early on a most-likely scenario and don’t question their assumptions, since everyone at the table seems to agree with the broad outline of the interpretation.”

    Sounds like the entire tech industry over the past decade, no?

    Johnson goes on to quote the economist and Nobel laureate Thomas Schelling: “One thing a person cannot do, no matter how rigorous his analysis or heroic his imagination, is to draw up a list of things that would never occur to him.”

    It’s clear that the consequences of Facebook’s platforms never occurred to Zuckerberg, Sandberg, Dorsey, or other leaders in the tech industry. But now that the damage is clear, they must be brave enough to consider new approaches.

    To my mind, that will require objective study of tech’s business models, and an open mind toward changing them. It seems Jack Dorsey has realized that. Sheryl Sandberg and her colleagues at Facebook? Not so much.
