Marketers Have Given Up on Context, And Our National Discourse Is Suffering

It’s getting complicated out there.

(First in a series. Post two, on Twitter’s solution, is here).

Marketers – especially brand marketers: Too many of you have lost the script regarding the critical role you play in society. And while well-intentioned TV spots about “getting through this together” are nice, they aren’t a structural solution. It’s time to rethink the relationship between marketers, media companies (not “content creators,” ick), and the audience.


Tik Tok, Tick Tock…Boom.

Something’s been bugging me about Tik Tok. I’ve almost downloaded it about a dozen times over the past few months. But I always stop short. I don’t have a ton of time (here’s why), so forgive me as I resort to some short-form tricks here. To wit:

  1. China employs a breathtaking model of state-driven surveillance.
  2. The US employs a breathtaking model of capitalist surveillance.

We on the same page so far? OK, great.


Facebook Can’t Fix This.

The last 24 hours have not been kind to Facebook’s already bruised image. Above are four headlines, all of which clogged my inbox as I cleared email after a day full of meetings.

Let’s review: Any number of Facebook’s core customers – advertisers – are feeling duped and cheated (and have felt this way for years). A respected reporter who was told by Facebook executives that the company would not use data collected by its new Portal product is now accusing the company of misrepresenting the truth (others would call that lying, but the word lost its meaning this year). The executive formerly in charge of Facebook’s security is…on an apology tour, convinced the place he worked for has damaged our society (and he’s got a lot of company).

In other news, Facebook has now taken responsibility for protecting the sanctity of our elections, by, among other things, banning “false information about voting requirements and fact-check[ing] fake reports of violence or long lines at polling stations.”

Yep, a company that, in its core business, is currently charged with evasion, misstatements, and putting growth above civic duty is somehow still solely responsible for fixing the problems it’s created in our civil discourse and attendant democracy.

Does this feel off to anyone else?

We’ve had nearly two years of congressional hearings, nearly two years of testimony and apologies and “we must do better-isms.” While the company must be commended for actually making several things better (the ad transparency platform, for example), the fact that we continue to believe that the appropriate remedy for what ails us is to let the fox fix the holes in our chicken coop is downright…baffling.

I guess this is what you get when the folks in power are happy with the results of our elections.

But here’s my prediction, and it won’t take long for me to be proven right or wrong: Should the Democrats take control of the House, things are going to change. Quickly. Sure, with only the House, the Democrats can’t actually force any new regulation, nor can they command any cabinet-level policy shifts.

But as Trump well knows (and fears), a subpoena is a powerful thing.

Now, if the Democrats don’t win the House, well, that’s another column.

(cross posted from NewCo Shift)


Governance, Technology, and Capitalism.

Or, Will Nature Just Shrug Its Shoulders?

If you pull far enough back from the day to day debate over technology’s impact on society – far enough that Facebook’s destabilization of democracy, Amazon’s conquering of capitalism, and Google’s domination of our data flows start to blend into one broader, more cohesive picture – what does that picture communicate about the state of humanity today?

Technology forces us to recalculate what it means to be human – what is essentially us, and whether technology represents us, or some emerging otherness which alienates or even terrifies us.  We have clothed ourselves in newly discovered data, we have yoked ourselves to new algorithmic harnesses, and we are waking to the human costs of this new practice. Who are we becoming?

Nearly two years ago I predicted that the bloom would fade from the technology industry’s rose, and so far, so true. But as we begin to lose faith in the icons of our former narratives, a nagging and increasingly urgent question arises: In a world where we imagine merging with technology, what makes us uniquely human?

Our lives are now driven in large part by data, code, and processing, and by the governance of algorithms. These determine how data flows, and what insights and decisions are taken as a result.

So yes, software has, in a way, eaten the world. But software is not something being done to us. We have turned the physical world into data, we have translated our thoughts, actions, needs and desires into data, and we have submitted that data for algorithmic inspection and processing. What we now struggle with is the result of these new habits – the force of technology looping back upon the world, bending it to a new will.  What agency – and responsibility – do we have? Whose will? To what end?

• • •

Synonymous with progress, asking not for permission, fearless of breaking things – in particular stupid, worthy-of-being-broken things like government, sclerotic corporations, and fetid social norms – the technology industry reveled for decades in its role as a kind of benighted warrior for societal good. As one Senator told me during the Facebook hearings this past summer, “we purposefully didn’t regulate technology, and that was the right thing to do.” But now? He shrugged. Now, maybe it’s time.

Because technology is already regulating us. I’ve always marveled at libertarians who think the best regulatory framework for government is none at all. Do they think that means there’s no governance?

In our capitalized healthcare system, data, code and algorithms now drive diagnosis, costs, coverage and outcomes. What changes on the ground? People are being denied healthcare, and this equates to life or death in the real world. 

In our public square, data, code and algorithms drive civil discourse. We no longer share one physical, common square, but instead struggle to comprehend a world comprised of a billion Truman Shows. What changes on the ground? The election results of the world’s most powerful country.

Can you get credit to start a business? A loan to better yourself through education? Financial decisions are now determined by data, code, and algorithms. Job applications are turned to data, and run through cohorts of similarities, determining who gets hired, and who ultimately ends up leaving the workforce.

And in perhaps the most human pursuit of all – connecting to other humans – we’ve turned our desires and our hopes to data, swapping centuries of cultural norms for faith in the governance of code and algorithms built – in necessary secrecy – by private corporations.

• • •

How does a human being make a decision? Individual decision making has always been opaque – who can query what happens inside someone’s head? We gather input, we weigh options and impacts, we test assumptions through conversations with others. And then we make a call – and we hope for the best.

But when others are making decisions that impact us, well, those kinds of decisions require governance. Over thousands of years we’ve designed systems to ensure that our most important societal decisions can be queried and audited for fairness, that they are defensible against some shared logic, and that they will benefit society at large.

We call these systems government. It is imperfect but… it’s better than anarchy.

For centuries, government regulations have constrained social decisions that impact health, job applications, credit – even our public square. Dating we’ve left to the governance of cultural norms, which share the power of government over much of the world.

But in just the past decade, we’ve ceded much of this governance to private companies – companies motivated by market imperatives which demand their decision making processes be hidden. Our public government – and our culture – have not kept up.

What happens when decisions are taken by algorithms of governance that no one understands? And what happens when those algorithms are themselves governed by a philosophy called capitalism?

• • •

We’ve begun a radical experiment combining technology and capitalism, one that most of us have scarcely considered. Our public commons – that which we held as owned by all, to the benefit of all – is increasingly becoming privatized.

Thousands of companies are now dedicated to revenue extraction in the course of delivering what were once held as public goods. Public transportation is being hollowed out by Uber, Lyft, and their competitors (leveraging public goods like roadways, traffic infrastructure, and GPS). Public education is losing funding to private schools, MOOCs, and for-profit universities. Public health, most disastrously in the United States, is driven by a capitalist philosophy tinged with technocratic regulatory capture. And in perhaps the greatest example of all, we’ve ceded our financial future to the almighty 401(k) – individuals can no longer count on pensions or social safety nets – they must instead secure their future by investing in “the markets” – markets which have become inhospitable to anyone lacking the technological acumen of the world’s most cutting-edge hedge funds.

What’s remarkable and terrifying about all of this is the fact that the combinatorial nature of technology and capitalism outputs fantastic wealth for a very few, and increasing poverty for the very many. It’s all well and good to claim that everyone should have a 401(k). It’s irresponsible to continue that claim when faced with the reality that 84 percent of the stock market is owned by the wealthiest ten percent of the population.

This outcome is not sustainable. When a system of governance fails us, we must examine its fundamental inputs and processes, and seek to change them.

• • •

So what truly is governing us in the age of data, code, algorithms and processing? For nearly five decades, the singular true north of capitalism has been to enrich corporate shareholders. Other stakeholders – employees, impacted communities, partners, customers – do not directly determine the governance of most corporations.

Corporations are motivated by incentives and available resources. When the incentive is extraction of capital to be placed in the pockets of shareholders, and a new resource becomes available which will aid that extraction, companies will invent fantastic new ways to leverage that resource so as to achieve their goal. If that resource allows corporations to skirt current regulatory frameworks, or bypass them altogether, so much the better.

The new resource, of course, is the combination of data, code, algorithms and processing. Unbridled, replete with the human right of speech and its attendant purchasing of political power, corporations are quite literally becoming our governance model.

Now the caveat: Allow me to state for the record that I am not a socialist. If you’ve never read my work, know I’ve started six companies, invested in scores more, and consider myself an advocate of transparently governed free markets. But we’ve leaned too far over our skis – the facts no longer support our current governance model.

• • •

We turn our worlds to data; leveraging that data, technocapitalism then terraforms our world. Nowhere is this more evident than with automation – the largest cost of nearly every corporation is human labor, and digital technologies are getting extraordinarily good at replacing that cost.

Nearly everyone agrees this shift is not new – yes yes, a century or two ago, most of us were farmers. But this shift is coming far faster, and with far less considered governance. The last great transition came over generations. Technocapitalism has risen to its current heights in ten short years. Ten years. 

If we are going to get this shift right, we urgently need to engage in a dialog about our core values. Can we perhaps rethink the purpose of work, given work no longer means labor? Can we reinvent our corporations and our regulatory frameworks to honor, celebrate and support our highest ideals? Can we prioritize what it means to be human even as we create and deploy tools that make redundant the way of life we’ve come to know these past few centuries?

These questions beg a simpler one: What makes us human?

I dusted off my old cultural anthropology texts, and consulted the scholars. The study of humankind teaches us that we are unique in that we are transcendent toolmakers – and digital technology is our most powerful tool. We have nuanced language, which allows us both recollection of the past, and foresight into the future. We are wired – literally at the molecular level – to be social, to depend on one another, to share information and experience. Thanks to all of this, we have the capability to wonder, to understand our place in the world, to philosophize. The love of beauty, philosophers will tell you, is the most human thing of all.

Oh, but then again, we are uniquely capable of intentionally destroying ourselves. Plenty of species can do that by mistake. We’re unique in our ability to do it on purpose.

But perhaps the thing that makes us most human is our love of storytelling, for narrative weaves nearly everything human into one grand experience. Our greatest philosophers even tell stories about telling stories! The best stories employ sublime language, advanced tools, deep community, profound wonder, and inescapable narrative tension. That ability to destroy ourselves? That’s the greatest narrative driver in the history of mankind.

How will it turn out?

• • •

We are storytelling engines uniquely capable of understanding our place in the world. And it’s time to change our story, before we fail a grand test of our own making: Can we transition to a world inhabited by both ourselves, and the otherness of the technology we’ve created? Should we fail, nature will indifferently shrug its shoulders. It has billions of years to let the whole experiment play over again.

We are the architects of this grand narrative. Let’s not miss our opportunity to get it right.

Adapted from a speech presented at the Thrival Humans X Tech conference in Pittsburgh earlier this week. 

Cross posted from NewCo Shift. 

 


Social Media Too Shall Pass

At dinner last night with my wife and our 14 year-old daughter, I noticed a circular table of four teenage girls eating alone. They were about the same age as my daughter, who wasn’t exactly thrilled to be stuck with her parents as company on her first weekend of the school year. As we ate, I paid attention to the group’s dynamics, imagining them to be a possible reflection of what my daughter would be doing once she started going out alone with friends in New York City.

The most striking characteristic of the group was how they used their phones. The default position for each of them – their resting state, if you will – was to hold their device at chin level while gazing into the blue grip of its screen. They looked away only to point out something happening on that screen – at no time during an hour or so of observation did any of them put their phones down to simply talk to one another.

I pointed this out to my daughter – I’m used to seeing kids on their phones, but this was a bit over the top. “Is that normal?” I asked her. “For sure,” she replied, looking over her shoulder at the clutch of zombified girls. “But,” I protested, “at some point they’ll put them down and just be human beings enjoying each other’s company, right?”

“Not really,” my daughter replied casually. “They’re Snapping,” she stated matter-of-factly, deducing the fact from the social and physical interactions particular to that app. “They’re adding their dinner to their stories.”

I ventured into old-person-yelling-from-the-porch territory. “But…they’re not going to do that the entire dinner, are they?”

“No,” she replied, “soon they’ll be taking photos of each other for Instagram.”

Within five minutes, that’s exactly what the girls were doing.

“Surely this can’t be a lasting behavior,” I rejoined. “Twenty years from now, we’re all going to look back at this era and realize what a bunch of idiots we were, right?”

My daughter looked at me, considered my statement, and without any apparent irony, agreed.


When Senators Ask Followup Questions, You Answer Them.

Following my Senate testimony last month, several Senators reached out with additional questions and clarification requests. As I understand it, this is pretty standard. Given I published my testimony here earlier, I asked if I could do the same for my written follow-up. The committee agreed; the questions and my answers are below.

Questions for the Record from Sen. Cortez Masto (D. Nevada)

Facebook Audits

On April 4, 2018, following the public controversy over Cambridge Analytica’s use of user data, Facebook announced several additional changes to its privacy policies. The changes include increased restrictions on apps’ ability to gather personal data from users and also a policy of restricting an app’s access to user data if that user has not used the app in the past three months. In addition, Facebook has committed to conducting a comprehensive review of all apps gathering data on Facebook, focusing particularly on apps that were permitted to collect data under previous privacy policies. Facebook will also notify any users affected by the Cambridge Analytica data leak.

Question 1: What steps can the government take to ensure that there is proper oversight of these reviews and audits?

John Battelle’s response:

I think this is a simple answer: Make sure Facebook does what it says it will do, and make sure its response is a matter not only of public record, but also of public comment. This should include a full and complete accounting of how the audit was done and the findings.

Question 2: From a technical standpoint, how effective are forensic methods at ascertaining information related to what data was transferred in these cases?

John Battelle’s response:

I’m not a technologist – I’m an entrepreneur, author, analyst, and commentator. I’d defer to someone who has more knowledge than I do on issues of forensic data analysis.

Technology for Consumer Protection

Question 1: Are there any technological solutions being developed that can help address some of the issues of consumers’ privacy being violated online?

John Battelle’s response:

Yes, there are many, likely too many to mention. Instead, what I’d like to highlight is the importance of the architecture of how data flows in our society. We should be creating a framework that allows data to flow ethically, securely, and with key controls around permissioning, editing, validation, revocation, and value exchange. Blockchains hold great promise here, but are still underdeveloped (though they’re evolving rapidly).
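
To make that a bit more concrete, here is a minimal sketch – purely illustrative, with invented names, not an existing standard or product – of a data record that carries its own permissioning, revocation, and audit controls as it flows between services:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

# Hypothetical illustration only: a data record that carries its own
# permissioning, validation, and revocation controls as it moves between
# services. All names and fields here are invented for this sketch.


@dataclass
class Permission:
    grantee: str                        # who may use the data, e.g. "ad-measurement-service"
    purpose: str                        # what they may use it for
    granted_at: datetime
    revoked_at: Optional[datetime] = None


@dataclass
class DataRecord:
    owner: str                                          # the person the data describes
    payload: dict                                       # the data itself
    permissions: List[Permission] = field(default_factory=list)
    audit_log: List[str] = field(default_factory=list)

    def grant(self, grantee: str, purpose: str) -> None:
        self.permissions.append(Permission(grantee, purpose, datetime.utcnow()))
        self.audit_log.append(f"granted {grantee} for {purpose}")

    def revoke(self, grantee: str) -> None:
        for p in self.permissions:
            if p.grantee == grantee and p.revoked_at is None:
                p.revoked_at = datetime.utcnow()
        self.audit_log.append(f"revoked {grantee}")

    def may_use(self, grantee: str, purpose: str) -> bool:
        # validation: a consumer of the data checks for an explicit, unrevoked grant
        return any(
            p.grantee == grantee and p.purpose == purpose and p.revoked_at is None
            for p in self.permissions
        )
```

The design point is simply that the controls travel with the data: any service holding the record can check whether it still has permission for a given purpose, and an auditor (or the data’s owner) can read the log.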

Data Retention

Question 1: What should we, as legislators, be thinking about to verify that – when Americans are told that their data has been destroyed – that deletion can actually be confirmed?

John Battelle’s response:

Requiring services such as Facebook to employ independent third-party auditors seems the most straightforward response. “Trust us” is not enough; we must trust and verify.

Law Enforcement

During the hearing we had a brief discussion on the balance between privacy and sharing data with law enforcement.

Question 1: What should companies keep in mind to ensure that they can appropriately assist in law enforcement investigations?

John Battelle’s response:

This is a delicate balance, as evinced in the varied responses to these kinds of cases from companies like Apple, Twitter, Yahoo, and others. Valid search warrants, not fishing expeditions, should be the rule. We’ve got the framework for this already. The issue of how governments and law enforcement deal with encryption is unresolved. However, I fall on the side of enabling strong encryption, as I believe all citizens have the right to privacy. Lose that, and we lose democracy.

Question 2: As lawmakers, what should we be aware of as we try to strike the right balance between privacy and safety in this area?

John Battelle’s response:

Democracy is open, messy, transparent, and has many failures. But it’s the best system yet devised (in my humble opinion) and privacy lies at its core. That means criminals will be able to abuse its benefits. That is a tradeoff we have to accept and work around. Sure, it’d be great if law enforcement had access to all the data citizens create – until that access is abused, and cases of this kind of abuse by government are easy to find.

Senator Richard Blumenthal (D. Conn) Questions for the Record 

Privacy Legislation

Across hearings and questions for the record, members of Congress have raised concerns about the data collection tactics used by Facebook that are not made clear to its users. As I stated during the hearing, I am interested in putting into place rules of the road for online privacy, taking into consideration the European General Data Protection Regulation. During the hearing Mr. Battelle and others offered support for the intent of GDPR, but expressed reservations about the implementation and unintended consequences. I look forward to any further thoughts from the panelists regarding how to implement data privacy rules in the United States.

 Question for All Panelists:

Question 1. In addition to any recommendations or comments on what types of legislation or other measures could help protect consumer privacy, what lessons and principles of the California Consumer Privacy Act and the GDPR should Congress consider in privacy legislation?

 John Battelle’s response:

Implementation of sweeping legislation like the laws mentioned above is extremely onerous for small businesses. Instead of using that as an excuse to avoid legislation, the policy should incorporate remedies for smaller businesses (i.e., enabling federation of resources and response/compliance, and enabling trusted intermediaries).

The principle of empowering the consumer is embodied in both GDPR and CCPA. While well-intentioned, neither envisions how that empowerment will truly be effective in a modern digital marketplace. Take the principle of data portability. It’s one thing to allow consumers to download a copy of their data from a platform or service. But for that data to drive innovation, it must be easily uploaded, in a defined, well-governed, machine-readable format, so that new kinds of services can flourish. Watch how large tech platforms chip away at CCPA and attempt to keep that ecosystem from taking root. Consider how best to ensure that ecosystem will in fact exist. I’m not a legislative analyst, but there must be an enlightened way to encourage a class of data brokers (and yes, they’re not all bad) who enable re-aggregation of consumer data, replete with permissions, revocation, validation, editing, and value exchange. Happy to talk more about this.

Questions for Mr. Battelle:

Question 2. You have written at length about the influence of Facebook and Google on the advertising and third party data market. In your experience, has Facebook driven the ad market as a sector to more invasively collect data about people? What other changes in the ad market can be attributed to the dominance of Google and Facebook?

John Battelle’s response:

Yes, without question, Facebook has driven what you describe in your initial question. But not for entirely negative reasons. Because Facebook has so much information on its users, larger advertisers feel at a disadvantage. This is also true of publishers who use Facebook for distribution (another important aspect of the platform, especially as it relates to speech and democratic discourse). Both advertisers and publishers wish to have a direct, one-to-one dialog with their customers, and should be able to do so on any platform. Facebook, however, has forced its business model into the middle of this dialog – you must purchase access to your followers and your readers. A natural response is for advertisers and publishers to build their own sophisticated databases of their customers and potential customers. This is to be expected, and if the data is managed ethically and transparently, should not be considered an evil.

As for other changes in the ad market that might be attributed to FB and GOOG, let’s start with the venture funding of media startups, or advertising-dependent startups of any kind. Given the duopoly’s dominance of the market, it’s become extremely hard for any entrepreneur to find financing for ideas driven by an advertising revenue stream. Venture capitalists will say “Well, that’s a great (idea, service, product), but no way am I going to fund a company that has to compete with Google or Facebook.” This naturally encourages a downward spiral in innovation.

Another major problem in ad markets is the lack of portable data and insights between Facebook and Google. If I’m an advertiser or publisher on Facebook, I’d like a safe, ethical, and practical way to know who has responded to my messaging on that platform, and to take that information across platforms, say to Google’s YouTube or AdWords. This is currently far too hard to do, if not impossible in many cases. This also challenges innovation across the business ecosystem.

Questions for the Record

Senator Margaret Wood Hassan (D. New Hampshire)

Question 1. The internet has the potential to connect people with ideas that challenge their worldview, and early on many people were hopeful that the internet would have just that effect. But too often we have seen that social media sites like Facebook serve instead as an echo chamber that polarizes people instead of bringing them together, showing them content that they are more likely to agree with rather than exposing them to new perspectives. Do you agree this is a problem? And should we be taking steps to address this echo chamber effect?

John Battelle’s response:

Yes, this filter bubble problem is well defined, and I agree it’s one of the major design challenges we face, not only for Facebook but for our public discourse as well. The public square, as it were, has become the domain of private companies, and private companies do not have to follow the same rules as, say, UC Berkeley must follow in its public spaces (Chancellor Carol Christ has been quite eloquent on this topic; see her interview at the NewCo Shift Forum earlier this year).

As to steps that might be taken, this is a serious question that balances a private corporation’s right to conduct its business as it sees fit against the rights and responsibilities of a public space/commons. I’d love to see those corporations adopt clear and consistent rules about speech, but they are floundering (see Mr. Zuckerberg’s recent comments on Holocaust deniers, for example). I’d support a multi-stakeholder commission – including policymakers, company representatives, legal scholars, and civic leaders – to address the issue.

Question 2. In your testimony you discuss the value of data. You stated that you think in some ways, “data is equal to – or possibly even more valuable than – monetary currency.” We in Congress are seeking to figure out the value of data as well to help us understand the costs and benefits of protecting this data. Can you expand on what value you think data has, and how we should be thinking about measuring that value – both as citizens and as legislators?

John Battelle’s response:

Just as we had no idea of the value of oil when it first came into the marketplace (it was used for lamps and for paving streets, and no one could have imagined the automobile industry), we still have not conceived of the markets, products, and services that could be enabled by free-flowing and ethically sourced and permissioned data in our society. It’s literally too early to know, and therefore, too early to legislate in a sweeping fashion that might limit or retard innovation. However, one thing I am certain of is that data – which is really a proxy for human understanding and innovation – is the most fundamentally valuable resource in the world. All money is simply data, when you think about it, and therefore a subset of data.

So how to measure its value? I think at this point it’s impossible – we must instead treat it as an infinitely valuable resource, and carefully govern its use. I’d like to add my response to another Senator’s question here, about new laws (GDPR and the California Ballot initiative) as added reference:

Implementation of sweeping legislation like the laws mentioned above is extremely onerous for small businesses. Instead of using that as an excuse to avoid legislation, the policy should incorporate remedies for smaller businesses (i.e., enabling federation of resources and response/compliance, and enabling trusted intermediaries).

The principle of empowering the consumer is embodied in both GDPR and CCPA. While well-intentioned, neither envisions how that empowerment will truly be effective in a modern digital marketplace. Take the principle of data portability. It’s one thing to allow consumers to download a copy of their data from a platform or service. But for that data to drive innovation, it must be easily uploaded, in a defined, well-governed, machine-readable format, so that new kinds of services can flourish. Watch how large tech platforms chip away at CCPA and attempt to keep that ecosystem from taking root. Consider how best to ensure that ecosystem will in fact exist. I’m not a legislative analyst, but there must be an enlightened way to encourage a class of data brokers (and yes, they’re not all bad) who enable re-aggregation of consumer data, replete with permissions, revocation, validation, editing, and value exchange. Happy to talk more about this.

Question 3. Mark Zuckerberg has said that he sees Facebook more as a government than a traditional company.  Among other things, governments need to be transparent and open about the decisions they make. Many large institutions have set up independent systems — such as offices of inspectors general or ombudsmen and ethics boards — to ensure transparency and internally check bad decisions.  Facebook has none of those controls. What kinds of independent systems should companies like Facebook have to publicly examine and explain their decision-making?

John Battelle’s response:

OK, this one is simple. Facebook is NOT a government. If it is, I don’t want to be a “citizen.” I think Mr. Zuckerberg is failing to understand what a government truly is. If indeed Facebook wishes to become a nation state, then first it must decide what kind of nation state it wishes to be. It needs a constitution, a clear statement of rights, roles, responsibilities, and processes. None of these things exist at the moment. A terms of service does not a government make.

However, all of the ideas you mention make a ton of sense for Facebook at this juncture. I’d be supportive of them all.


Dear Facebook…Please Give Me Agency Over The Feed

(cross posted from NewCo Shift)

Like you, I am on Facebook. In two ways, actually. There’s this public page, which Facebook gives to people who are “public figures.” My story of becoming a Facebook public figure is tortured (years ago, I went Facebook bankrupt after reaching my “friend” limit), but the end result is a place that feels a bit like Twitter, though with more opportunities for me to buy ads that promote my posts (I’ve tried doing that, and while it certainly increases my exposure, I’m not entirely sure why that matters).

Then there’s my “personal” page. Facebook was kind enough to help me fix this up after my “bankruptcy.” On this personal page I try to keep my friends to people I actually know, with mixed success. But the same problems I’ve always had with Facebook are apparent here — some people I’m actually friends with, others I know, but not well enough to call true “friends.” But I don’t want to be an ass…so I click “confirm” and move on.


On Medium, Facebook, and the Graph Conflict

I did a double take upon arriving at Medium just now, fingers flexed to write about semi-private data and hotel rooms (trust me, it’s gonna be great).

But upon my arrival, I was greeted thusly:

Now, I have no categorical beef with Facebook – I understand the value of its network as much as the next publisher. But it always struck me that Medium was forging a third way — it’s not a blogging platform, quite, at least as we used to understand them. And it’s not a social network, though it has a social feel. It’s something … of itself, and that’s a good thing.
