My most popular posts

I recently started collecting email addresses using MailChimp for those readers who want to receive email updates when I post here. Given my relatively low frequency of posts these days, especially compared to my heyday when I posted almost daily, and given the death of RSS, such an email list may have more value than it once did. You can sign up for that list from my About page.

I have yet to send an email to the list successfully, but let's hope this post will be the first to go out that route. Given this would be the first post to that list, with perhaps some new readers, I thought it would be worth compiling some of my more popular posts in one place.

Determining what those are proved difficult, however. I'd never checked my analytics before, since this is just a hobby, and when I went to the popular content panel on Squarespace I realized that their data only goes back a month. I also don't have data from the Blogger or Movable Type eras of my blog stashed anywhere, and I never hooked up Google Analytics here.

A month's worth of data was better than nothing, as some of the more popular posts still get a noticeable flow of traffic each month, at least by my modest standards. I also ran a search on Twitter for my URL and used that as a proxy for social media popularity of my posts (and in the process, found some mentions I'd never seen before since they didn't include my Twitter handle; is there a way on Twitter to get a notification every time your domain is referenced?).

In compiling the list, I went back and reread these posts for the first time in ages and added a few thoughts on each.

  • Compress to Impress — my most recent post is the one that probably attracted most of the recent subscribers to my mailing list. I regret not including one of the most famous cinematic examples of rhetorical compression, from The Social Network, when Justin Timberlake's Sean Parker tells Jesse Eisenberg, "Drop the 'The.' Just Facebook. It's cleaner." Like much of the movie, probably made up (and also, why wasn't the movie titled just Social Network?), but still a good example of how movies almost always compress information into visually compact scenes. The reason people tend to like the book better than the movie adaptation in almost every case is that, like Jeff Bezos and his dislike of Powerpoint, people who see both the original and the compressed information flows feel condescended to and lied to by the latter. On the other hand, I could only make it through one and a half of the Game of Thrones novels, so I much prefer the TV show's compression of that story, even as I watch every episode with super fans who can spend hours explaining what I've missed, so it feels like I have read the books after all.
  • Amazon, Apple, and the beauty of low margins — one of the great things about Apple is it attracts many strong, independent critics online (one of my favorites being John Siracusa). The others of the FAMGA tech giants (Facebook, Amazon, Microsoft, Google) don't seem to have as many dedicated fans/analysts/critics online. Perhaps it was that void that helped this post on Amazon from 2012 to go broad (again, by my modest standards). Being able to operate with low margins is not, in and of itself, enough to be a moat. Anyone can lower their prices, and more generally, any company should be wary of imitating another company's high variance strategy, lest they forget all the others who did and went extinct (i.e., a unicorn is a unicorn because it's a unicorn, right?). Being able to operate with low margins with unparalleled operational efficiency, at massive scale globally, while delivering more SKUs in more shipments with more reliability and greater speed than any other retailer is a competitive moat. Not much has changed, by the way. Apple just entered the home voice-controlled speaker market with its announcement of the HomePod and is coming in from above, as expected, at $349, as the room under Amazon's price umbrella isn't attractive.
  • Amazon and the profitless business model fallacy — the second of my posts on Amazon to get a traffic spike. It's amusing to read some of the user comments on this piece and recall a time when anything positive I said about Amazon would get me inundated with comments from Amazon shorts and haters. Which is the point of the post: people outside of Amazon really misunderstood the business model. The skeptics have largely quieted down nowadays, and maybe the shorts lost so much money that they finally went in search of weaker prey, but in some ways I don't blame the naysayers. Much of their misreading of Amazon is the result of GAAP rules, which really don't reveal enough to discern how much of a company's losses are due to investments in future businesses or just aggressive depreciation of assets. GAAP rules leave a lot of wiggle room to manipulate your numbers to mask underlying profitability, especially when you have a broad portfolio of businesses munged together into single line items on the income statement and balance sheet. This doesn't absolve professional analysts, who should know better than to ignore unit economics, however. Deep economic analysis isn't a strength of your typical tech beat reporter, which may explain the rise of tech pundits who can fill that gap. I concluded the post by saying that Amazon's string of quarterly losses at the time should worry its competitors more than it should assure them. That seems to have come to fruition. Amazon went through a long transition period from having a few very large fulfillment centers to having many more, smaller ones distributed more broadly, but generally located near major metropolitan areas, to improve its ability to ship to customers more quickly and cheaply. Now that the shift has been completed for much of the U.S., you're seeing the power of the fully operational Death Star, or many tiny ones, so to speak.
  • Facebook hosting doesn't change things, the world already changed — the title feels clunky, but the analysis still holds up. I got beat up by some journalists over this piece for offering a banal recommendation for their malady (focus on offering differentiated content), but if the problem were so tractable it wouldn't be a problem.
  • The network's the thing — this is from 2015, and two things come to mind since I wrote it.
    • As back then, Instagram has continued to evolve and grow, and Twitter largely has not. Twitter did stop counting user handles against character limits and tried to alter its conversation UI to be more comprehensible, but the UI's still inscrutable to most. The biggest change, to an algorithmic rather than reverse chronological timeline, was an improvement, but of course Instagram had beaten them to that move as well. The broader point is still that the strength of any network lies most in the composition of its network, and in that, Twitter and other networks that have seen flattening growth, like Snapchat or Pinterest, can take solace. Twitter is the social network for infovores like journalists, technorati, academics, and intellectual introverts, and that's a unique and influential group. Snapchat has great market share among U.S. millennials and teens, Pinterest among women. It may be hard for them to break out of those audiences, but those are wonderfully differentiated audiences, and it's also not easy for a giant like Facebook to cater to particular audiences when its network is so massive. Network scaling requires that a network reduce the surface area of its network to each individual user, using strategies like algorithmic timelines, graph subdivision (e.g., subreddits), and personalization; otherwise networks run into reverse economies of scale in their user experience.
    • The other point that this post recalls is the danger of relying on any feature as a network moat. People give Instagram, Messenger, FB, and WhatsApp grief for copying Stories from Snapchat, but if a social network has to pin its future on any single feature, all of them trivial to replicate in this software age, that company has a dim future. The differentiator for a network is how its network uses a feature to strengthen the bonds of that network, not the feature itself. Be wary of hanging your hat on an overnight success of a feature the same way predators should be wary of mutations that offer temporary advantages over their prey. The Red Queen effect is real and relentless.
  • Tower of Babel — from earlier this year, and written at a time when I was quite depressed about a reversal in the quality of discourse online, and how the promise of connecting everyone via the internet had quickly seemed to lead us all into a local maximum (minimum?) of public interaction. I'm still bullish on the future, but when the utopian dreams of global connection run into the reality of humans' coalitional instincts and the resentment from global inequality, we've seen which is the more immovable object. Perhaps nothing expresses the state of modern discourse like waking up to see so many of my followers posting snarky responses to one of Trump's tweets. Feels good, accomplishes nothing; let's all settle for the catharsis of value signaling. I've been guilty of this, and we can do better.
  • Thermodynamic theory of evolution — actually, this isn't one of my most popular posts, but I'm obsessed with the second law of thermodynamics and exceptions to it in the universe. Modeling the world as information feels like something from the Matrix but it has reinvigorated my interest in the physical universe.
  • Cuisine and empire — on the elevation of food over music as a scarce cultural signal. I'll always remember this post because Tyler Cowen linked to it from Marginal Revolution. Signaling theory is perhaps one of the three most influential ideas to have changed my thinking in the past decade. I would not underestimate its explanatory power in the rise of Tesla. Elon Musk and team made the first car that allowed wealthy people to signal their environmental values without having to also send a conflicting signal about their taste in cars. It's one example where actually driving one of the uglier, less expensive EVs probably would send the stronger signal, whereas generally the more expensive and useless a signal is, the more effective it is.
  • Your site has a self-describing cadence — I'm fond of this one, though Hunter Walk has done so much more than anyone else to point people to this post that I feel like I should grant him a perpetual license to call it his own. It still holds true: almost every service and product I use online trains me how often to return. The only unpleasant part of rereading this is realizing how my low posting frequency has likely trained my readers to never visit my blog anymore.
  • Learning curves sloping up and down — probably ranks highly only because I have such a short window of data from Squarespace to examine, but I do think that companies built for the long run have to maintain a sense of the slope of their organization's learning curve at all times, especially in technology, where the pace of evolution and thus the frequency of existential decisions is heightened.
  • The paradox of loss aversion — more tech markets than ever are winner-take-all because the internet is the most powerful and scalable multiplier of network effects in the history of the world. Optimal strategy in winner-take-all contests differs quite a bit from much conventional business strategy, so best recognize when you're playing in one.
  • Federer and the Paradox of Skill — the paradox of skill is a term I first learned from Michael Mauboussin's great book The Success Equation. This post applied it to Roger Federer, and if he seems more at peace recently, now that he's older and more evenly matched in skill to other top players, it may be that he no longer feels subject to the outsized influence of luck as he did when he was a better player. In Silicon Valley, with all its high achieving, brilliant people, understanding the paradox of skill may be essential to not feeling jealous of every random person around you who fell into a pool of money. The Paradox of Skill is a cousin to the Red Queen effect, which I referenced above and which tech workers of the Bay Area should familiarize themselves with. It explains so much of the tech sector but also just living in the Bay Area. Every week I get a Curbed newsletter, and it always has a post titled "What $X will get you in San Francisco" with a walkthrough of a recent listing that you could afford on that amount of monthly rent. Over time they've had to elevate the dollar amount just to keep things interesting, or perhaps because what $2,900 can rent you in SF was depressing its readers.

Having had this blog going off and on since 2001, I only skimmed through a fraction of the archives, but perhaps at some point I'll cringe and crawl back further to find other pieces that still seem relevant.

The network's the thing

Last week Instagram announced it was supporting more than just square aspect ratios for photos and videos. This led of course to a Slate article decrying the move, because Slate is that friend that has to be contrarian on every topic all the time, just to be annoying.

The square confines Instagram users to a small area of maneuver. It forces us to consider what details are essential, and which can be cropped out. It spares us from indulgence of the landscape and the false promise of the panorama.
 
But Instagram, which is owned by Facebook, is in the business of accommodating its users, not challenging them. One of the problems with the square, the company explained in its announcement, is that “you can’t capture the Golden Gate Bridge from end to end.” This example speaks to the needs of a certain kind of Instagram user who enjoys planting his flag on settled territory. Like an iPhone videographer at a Taylor Swift concert, the guy Instagramming the Golden Gate Bridge is not creating a rare or essential document, only proof that he saw it with his own eyes.
 
And why did he bother doing that, anyway? Clearly, because photographs cannot really capture the scope of the Golden Gate Bridge, or St. Peter’s Basilica, or the view from your car window as you drive up the Pacific Coast Highway. The impulse to capture these moments on camera is shaded by the knowledge that the moment, in all its immediacy, is too large to fit in a frame of any size.
 

I don't think my friend who snapped a pic of her daughter this morning or the friend who memorialized the little leaf the barista made in the foam on his latte was contemplating how wonderful it was that they were sparing me from the “indulgence of the landscape and the false promise of the panorama” but what do I know. I'm fairly certain the guy Instagramming the Golden Gate Bridge (I've done that a few times on Instagram) realizes he's not “creating a rare or essential document” but it never hurts to remind him, I'm sure he appreciates being set in his artistic place.

I'm glad Instagram is accommodating the additional aspect ratios, and it's a sign of how much their network has matured. People confuse arbitrary limits on social networks—Twitter's 140 character limit, Instagram's square aspect ratio and limited filters, to take two prominent examples—with their core asset, which is the network itself. Sure, the limits can affect the nature of the content shared, but Instagram is above all else a pure and easy way to share visual content with other people and get their feedback. That they started allowing videos and now differing aspect ratios doesn't change the core value of the network, which is the graph.

In fact, this move by Instagram validates the power of their network. If they were failing they either wouldn't have survived long enough to make such a move or it would be positioned as some desperate pivot. Instagram is dealing from a position of strength here, expanding the flexibility of its tools to meet the needs of a still growing user base.

In the same way, Twitter should have lifted the 140 character limit on DMs much earlier than they did. The power of Twitter, deep down, is that it's a public messaging protocol. The 140 character limit is not its secret power. The network is.

I'd actually remove the 140 character limit on Tweets as well, though such a move would undoubtedly spawn even more of a public outcry than Instagram's move, since so many power users of Twitter are journalists. Yes, a 140 character limit enforces some concision in writing, rewarding the witty among us, but it also alienates a lot of people who hate having to edit a thought multiple times just to fit within the arbitrary limit. Lots of those people abandoned Twitter and publish on Facebook instead. Twitter could always choose to limit how much of a Tweet to display in the Timeline so as to preserve a higher vertical density of Tweets for people who are scanning.

Look at how many users of Twitter have to break long thoughts across multiple Tweets, in Tweetstorms or just long linked series of Tweets. Many of those are power users, yet I still see power users do it incorrectly every day, making it really difficult to follow the entire sequence. Users who want to link Tweets in a Tweetstorm, or just to link their own Tweets together in a series, should reply to each of their own previous Tweets, removing their own username in the process. This allows readers who click one Tweet to easily see the rest of the Tweets in the series, and removing one's own username adds back some characters for the content and prevents it from seeming as if you're talking to yourself like a crazy person. That many have no idea how to do it is just one of Twitter's usability issues. It's a wonderfully elegant public messaging protocol, but its insistence on staying so low level is crazy. Don't even get me started on putting a period before a username in a Tweet; try explaining that to your mother with a straight face.

Here's another example. Look at how many experienced Twitter users now turn to apps like OneShot to attach screenshots of text to their Tweets as photos, to circumvent the 140 character limit. I happen to really enjoy those screenshorts, as they're sometimes called now, and they demonstrate how Twitter could expand their 140 character limit without overwhelming the Timeline: just truncate at some point and add a click-to-expand function. This is yet another example of users generating useful innovation on top of Twitter when it should be coming from within the company.

Rather than force users to jump through all these hoops to publish longer content, Twitter could just allow users to write more than 140 characters in one Tweet, truncating the whole of it after some limit and posting a Read More button to allow readers to see the rest of the thought. Blasphemy! many will shout. I can already see the pitchforks in the distance. Some good old blasphemy is just what Twitter needs.

Longer character limits would likely increase the ability to follow conversations and dialogues on the service, too. One of the wonderful things about Twitter is that conversations between specific users can be read by other users. That's one of the greatest things about Twitter as a public messaging protocol. But because replies have to fit within 140 characters, often they need to be broken up into multiple Tweets. Many who reply don't realize that unless they hit the reply button on the previous Tweet in the conversation, the dialogue link is broken. Many mistakenly compose a new Tweet to continue the dialogue, not realizing that any reader clicking on that Tweet will not automatically see other Tweets in that conversation. Instead, it will just display by itself, as an orphan.

I run into this multiple times every day on the service, clicking on a Tweet without any easy way to figure out what it was in response to. If a lot of time has passed, it's often impossible to piece the conversation back together. It drives me crazy. I tried explaining to a few people who had abandoned Twitter how to piece broken conversation threads like this back together, and then realized I sounded like a madman. Why, in this day and age, should they have to learn such low level nonsense? Threaded conversations are, for the most part, a solved UI problem.

I'm not done with the character limits, so hold your disgust. You may wish to bring more than just your pitchforks after I'm done. Every Twitter conversation that involves more than two people devolves into a short series of retorts that eventually dies because each additional username consumes more of the 140 character limit, until there is no more room for actual dialogue.

It's absurd, but it's always been that way. Why usernames count towards the 140 character limit has always befuddled me. Meaningful conversation always has to migrate off of Twitter to some other platform, for no reason other than a stubborn allegiance to an arbitrary limit which made sense in the SMS age but now is a defect. If you're going to keep a character limit (could we at least double it?), let's not have usernames count against the limit. In fact, if I hit reply to someone's Tweet, do we even need to insert that person's username at the front of the Tweet? You can still send the notification to that user that I replied to their Tweet, and suddenly my reply won't seem so oddly formatted to the average reader. There are plenty of ways to indicate who the message is addressed to through contextual formatting, and if I wanted to mention them explicitly I could always write @username in the Tweet. But it's unnecessary to insert it by default.

Vine is perhaps the only network whose chief content-creation limit seems intrinsically part of the network, and that's because video is one type of content which can't be scanned, in which each additional second of content imposes a linear attention cost of one second on the viewer. A six minute video costs the viewer 60X the attention that a 6 second video does, and to even create a 6 second video of any interest requires some clever editing to produce a coherent narrative. A Vine video joke has its own distinct pace; it's like a two line riddle, often a 4.5 second setup with a 1.5 second punchline (at least that's the pacing in most Vines in my home feed).

This 6-second limit still constrains the size of Vine's userbase, and they may be okay with that. I think that's fine. I enjoy Vine, it's its own art form. Still, the 6 second limit means a lot of people don't turn to it for a lot of their video sharing. It's not easy to come up with a succinct 6 second video clip.

Look at how Snapchat has evolved to see another company realizing that its power is not the initial constraint but the network. Snapchat still imposes a 10 second limit on video length. But now you can string many videos together into My Story. This was brilliant on their part; it allows viewers to skip any boring clip with one tap, but it allows the creator to tell longer stories simply by shooting multiple snaps in sequence. They lowered the content generation cost on creators without meaningfully increasing it for viewers.

Furthermore, Snapchat now allows you to download your Stories to your camera roll. Those who claim ephemerality is the key to Snapchat's success might panic at such a change, but all it demonstrates is that they realize they now have users for whom ephemerality isn't the main draw of the service. They haven't confused an arbitrary early limit for being the root of their success, and they understand the underlying power of their platform.

Perhaps more than any other social network, Facebook has long recognized that their chief asset is their graph. They've made all sorts of major changes to their interface, changes that always lead to huge initial outcries from their users, followed by a fade to silence as users continue to access the service in increasing numbers.

That they recognized this and had the courage of their convictions from such an early stage is not to be discounted. Plenty of companies live in fear of their early adopters, who often react negatively to any change. This leaves these companies paralyzed, unable to grow when they hit saturation of their early adopter segment. Because the global market of users has been expanded by the unprecedented reach of connected smart phones, early adopter segments can now number in the tens of millions, confusing companies into thinking that their early adopter segment is actually the mass market.

Twitter, more than any other company, needs to stop listening to its earliest users and recognize that deep down, its core strength is not the 140 character limit per Tweet, nor is it the strict reverse chronological timeline, or many other things its earliest users treat as gospel.

It's not even the ability to follow people, though for its current power users that has proved a useful way to mine some of the most relevant content from the billions of Tweets on the service. If Twitter realizes this, they'll understand that their chief goal should not necessarily be to teach the next several hundred million users how to follow hundreds of people, the way that the early adopters did. To do so is to mistake the next wave of users as being identical in their information consumption preferences and habits to the first 300 million, or whatever the true active count is among that number (I'm going to guess it's in the range of 40 to 80 million truly active daily users, though it's hard to tell without seeing the data).

Twitter's chief strength is that it's an elegant public messaging protocol that allows anyone to write something quickly and easily, and for anyone in the world to see that writing. It's a public marketplace of information. That's an amazing network, and the reason people struggle to describe Twitter is that a platform like that can be used for so many things.

If Twitter realizes that, then they'll realize that making that information marketplace much more efficient is the most critical way to realize the full potential of what is a world-changing concept. How do you match content from people who publish on Twitter with the readers who'd enjoy that content?

A people follow model is one way, but a topic-based matching algorithm is another. Event-based channels are just a specific version of that. Search is one option, but why isn't there browse? I can think of a dozen other ways to turbocharge that marketplace off the top of my head, and the third party developer community, kicked out of the yard by Twitter so many times like stray dogs, could likely come up with dozens of others if they were allowed back in.

Twitter can leave the reverse chronological timeline in place for grumpy early adopters. It can be Twitter Classic. Most of those early adopters are largely happy with things the way they are, and if Twitter is scared to lose them, leave the current experience in place for them. I honestly don't think they'd abandon the service if Twitter raised the 140 character limit, or allowed for following of topics, or any number of other changes suggested here, because I think the power of the network is the network itself, but if the company has any such trepidations, it's not a big deal to leave Twitter Classic in place. The company has a huge engineering and product team; it's easy to park that experience in maintenance mode.

When social networks come into their own, when they realize their power is not in any one feature but in the network itself, they make changes like this that seem heretical. They aren't. Instead, these are fantastic developmental milestones, indicative of a network achieving self-awareness. A feature is trivial to copy. A network, on the other hand, is like a series of atoms that have bonded into a molecule. Not so easy to split.

It's a post for another day, but one of the defining features of our age is the rise of the network. Software may be eating the world, but I posit that networks are going to eat an outsized share because they capitalize disproportionately on the internet. Journalism, advertising, video, music, publishing, transportation, finance, retail, and more—networks are going to enter those spaces faster than those industries can turn themselves into networks. That some of our first generation online social networks have begun self-actualizing is just the beginning of that movement.

“People need dramatic examples to shake them out of apathy and I can't do that as Bruce Wayne. As a man, I'm flesh and blood, I can be ignored, I can be destroyed; but as a symbol... as a symbol I can be incorruptible, I can be everlasting.” Bruce Wayne, Batman Begins

Call me Nostradamus

When Meerkat and then Periscope had the tech world buzzing about live video streaming through mobile phones, I wrote a piece on how the live video streaming space would play out. One bullet in that timeline:

27. Facebook adds a live video streaming button to its app, then shortly after that spins it out into a separate app altogether. They name it Live, and some other company that launched an app called Live that did the same thing a year earlier complains that Facebook stole their name, but no one really pays any attention.
 

From TechCrunch:

Before Periscope and Meerkat jumpstarted the mobile live-streaming craze, Facebook was already quietly working on its own way to let public figures broadcast live videos to their fans. Today, Facebook is launching “Live” as a feature in its Mentions app that’s only available to celebrities with a verified Page.
 
VIPs can start a Live broadcast that’s posted to the News Feed, watch comments overlaid in real-time on their stream, and then make the recording permanently available for viewing. Stars like The Rock and Serena Williams will stream today.
 

Okay, so maybe no one is giving Facebook guff about the name, but I'm still going to give myself a partial high five.

In a post about Venmo and payments as a social network in April, I wrote:

Speaking of the pile of poop emoji, it seems only a matter of time until someone releases an app that allows you to broadcast when you are taking a poop. It should be a mobile app just called Poop. I leave it to the design geniuses at Apple to figure out what type of haptic feedback a poop notification should emit on the Apple Watch.
 

From Mashable this past Friday:

A new chat app called Pooductive aims to create a miniature social network specifically for anyone who gets bored while they are doing a number two, and want to talk to people in the same position.
 
Created by two student developers, the free iPhone app, which began life as a failed Kickstarter, facilitates one-on-one or group chats based on your location. You can choose to message people nearby or be connected with users in other cities or countries.
 
"The fact that there is only little to do whilst tending to ‘number two’ is common knowledge, and truly a first world problem," the developers write on Pooductive's website.
 

Poop is clearly a superior name to Pooductive, so the only reason I didn't nail the name yet again was poor branding instincts on the part of the developers. The sample screenshots of the app in the iTunes App Store are something for the archives: someone actually dreamt up this imaginary chat between two people sitting on the toilet.

I honestly don't know which prediction I'm prouder of.

Facebook hosting doesn't change things, the world already changed

Like any industry, the media loves a bit of navel-gazing (what is the origin of this phrase, because I don't enjoy staring at my own navel; maybe mirror-preening instead?). When Facebook announced they were offering to host content from media sites like The New York Times, the media went into a frenzy of apocalyptic prediction, with Mark Zuckerberg in the role of Mephistopheles.

All this sound and fury, signifying nothing. Whether media sites allow Facebook to host their content or not won't meaningfully change things one way or the other, and much of the FUD being spread would be energy better spent focused on other far larger problems.

Let's just list all the conditions that exist and won't change one bit whether or not you let Facebook host your content:

  • News is getting commodified. The days of being special just for covering a story are over. Beyond the millions of citizen journalists that the internet has unleashed, you're competing with software that can do basic reporting. The tech press inadvertently furnished evidence of the commodification of news when, in the past few years, they all played a giant game of musical chairs, seemingly everyone picking up and moving from one site to the next. Are these sites anything more than a collection of their reporters? If so, did the brands meaningfully change when everyone switched seats? I love and respect many tech reporters, but a lot of others seem interchangeable (though I like some of them, too). Instead of just reporting news, what matters is how you report it: your analysis, the quality of your writing and infographics, the uniqueness of your perspective. The bar is higher to stand out, as it tends to be when...
  • ...distribution is effectively free. Instead of pulp, our words take the form of bits that are distributed across...oh, you know. As the Unfrozen Caveman might say, “Your packets of data frighten and confuse me!” The Internet: seventh wonder of the world. This must be what it feels like to have grown up when electricity first became widespread. Or sewer systems. Okay, maybe not as great as sewer systems, I don't know how people lived before that.
  • Marketing is cheaper. You can use Twitter or Facebook or other social media to make a name for yourself. Big media companies can take advantage of that, too, but the incremental advantage is greater for the indies. Ben Thompson is one of my favorite examples, an independent tech journalist/writer living in Taiwan who built up his brand online to the point that I pay him $10 a month to have him send me email every day, and it's worth every penny. He is smarter about the tech industry than just about every “professional” journalist covering tech, and he's covered a lot of what I'm covering here already. He's just one example of how...
  • ...competition for attention is at an all-time high and getting worse. Facebook already competes with you, whether you let them host your content or not. So does Snapchat, Instagram, Twitter, IM, Yik Yak, television, cable, Netflix, video games, Meerkat/Periscope, movies, concerts, Spotify, podcasts, and soon VR. When it comes to user attention, the one finite resource left in media, most distractions are close substitutes.
  • Facebook will continue to gain audience. Even if Facebook pauses for a rest after having gained over 1 billion users, they also own Instagram, which is growing, and WhatsApp, which will likely hit 1 billion users in the near future, and Oculus, which is one part of the VR market which is one portion of the inception of the Matrix that we will all be living in as flesh batteries for Colonel Sanders in the medium-range future. If you think withholding your content from Facebook will change their audience meaningfully one way or the other, you really may be an unfrozen caveman from journalism's gilded age. The truth is...
  • Facebook and Twitter and other social media drive a huge % of the discovery of content. Media companies can already see this through their referral logs. This isn't unique to the text version of media. Facebook drives a huge share of YouTube video streams, which is why they're building their own video service, because why send all that free ad revenue to a competitor when you can move down the stack and own it yourself. And also, YouTube's ad model is not that great: those poorly targeted banner ads that pop up and cover the video in a blatant show of disrespect for the content, those pre-rolls you have to wait 5 seconds to skip...wait a minute, this sounds a lot like how...
  • ...media ad experiences are awful. I wonder sometimes if folks at media companies ever try clicking their own links from within social media like Twitter or Facebook, just to experience what a damn travesty of a user experience it is. Pop-ups that hide the content and that can't be scrolled in an in-app browser so you effectively can't ever close them to read the article. Hideous banner ads all over the page. Another pop-up trying to get you to sign up for a newsletter for the site when you haven't even read the article to see if you'd even want to get that newsletter (the answer is no, by the way). Forced account creation or login screens, also before you read a word of content. An interstitial ad that tries to load video for a few seconds while you wait patiently for a countdown timer or X button to close it out as quickly as possible. Short articles spread across 3 pages for no reason other than to inflate page views. Articles that take so long to load that you just click away because in-app browsers are already disadvantaged from a speed perspective, and media sites compound the problem by loading a ton of cruft like ad tracking and other crap all over the place, reducing the content to just a fraction of the total payload. It's the reading equivalent of being a beautiful girl at a New York bar, getting hit on by dozens of obnoxious first year investment banking analysts in pinstripe suits and Hermès ties. This is what happens when you treat your reader like a commodified eyeball to monetize and not a living, breathing human whose patronage you appreciate and wish to nurture. And this is why I'm happy when services like Flipboard or Facebook transform content into a more friendly reading experience. Chris Cox of Facebook said that reading on mobile is still a crummy experience, and amen to that. The poor media ad experience is a symptom of the fact that...
  • ...media business models are not great. Monopolies don't have to have great business models, because as Peter Thiel will tell you, being a monopoly is itself a great business model. For the longest time at media sites, and this probably still happens, the reporters sat on a different floor from the ad sales folks. This meant that the way the company made money was divorced from the product people (to use a more techie term). This works great when there isn't a lot of user choice (“No one ever got fired for buying IBM”) and the ad sales people can throw their weight around (before), but not so great when ad buyers suddenly have a whole lot more choice in where to spend their money (now). It turns out that having your best product people separate from your ad team is a dangerous game and leads to a terrible ad experience, which should come as a surprise to no one. Many still defend this practice as a way to preserve journalistic integrity, a separation of church and state that keeps the money from corrupting the writing, but the Internet has other ways to defend against that now. It's great that the New York Times has a public editor in Margaret Sullivan, but today the eyeballs of the world on your content serve as one giant collective public editor, like some human blockchain of integrity. I sympathize with media companies, though, because even if they wanted to improve on this front...
  • ...tech companies have better ad platforms than media companies. Facebook's native ad unit may not be perfect, but it's leaps and bounds better than the god awful ad experience on almost any media site. It's better not just for readers, but likely for advertisers, too. At Flipboard, we went with full-page ads a la glossy fashion magazines because our philosophy was that when content is on the screen, it deserves your full attention, and the same with ads; never the two shall meet. This is exacerbated by the smaller screen sizes of mobile phones and tablets. Trying to split user attention with banner ads is a bad idea for both readers and advertisers, and most every study on ad recall and effectiveness that I've seen bears this out. Because of tech companies' scale and technology advantage, as noted in the previous bullet, their ad platforms will continue to get better and scale, while those at media companies will not. When I was at Hulu, we shopped around for an ad platform that could meet all our needs and couldn't find one, so we just rolled our own. That's possible if you can hire great developers, but if you're a media company, it's not easy, and that's because...
  • ...tech companies have a tech hiring advantage over non-tech companies. This sounds like it's self-evident, but it's critical and worth emphasizing. It's not just media but other businesses that suffer from this (which is particularly awful for consumers when it comes to information security). At this hour of the third industrial revolution, software is eating the world, but we still have a scarcity of software developers, let alone great ones. The ones that are blessed to live in this age want to work with other great developers at cool technology companies where the lunches are free, the dress codes are flexible, the hours vampiric, and ping pong tables abound. It's like being a free range chicken, but with stock options and before the death and refrigeration. Companies like that include Facebook, Google, Apple, Amazon, and so on, but they don't include most media companies, even though most of those also allow you to dress how you want, I think. Maybe someday the market will overcorrect itself and everyone will know how to program, but by that point we will probably all be living lives of leisure while AI software and robots take care of everything and we just lounge around experiencing a never-ending stream of personalized VR pleasure. If David Foster Wallace were alive to rewrite Infinite Jest, VR would be the infinite jest.
  • Design skill is not equally distributed. In an age when software comes to dominate more of the world, the returns to being great at user interface design are still high and will continue to be for some time. It's no wonder that Apple is the world's largest company now given their skill at integrated software and hardware design. That's become the most valuable user experience in the world to dominate. It's not going to let up, either. Every day I still experience a ton of terrible user experiences, from government to healthcare to education to household appliances to retail to you name it. The number of great product and design people in the world is still much too finite, and it happens that a lot of them work for tech companies. Not for companies in all the other industries I named above. Even in tech, the skills are too finite, which is why enterprise software is being disrupted by companies like Dropbox and Slack and others that simply bring a better user experience than the monstrosities that pass for most enterprise software. And yes, these people tend not to work for media companies.
  • Tech companies are rich. Take all the factors above, add them up, and it comes down to the fact that we're living through another gold rush, and this time most of the wealth is flowing into Silicon Valley. Take a bunch of companies that are extremely wealthy and employ great software developers and designers at a time when software is eating the world, add in a healthy dose of world-changing ambition, and you get companies that keep expanding their footprints, to the point where they are all competing in almost every business. People wonder why Apple might build a car, but I say why not? Above all, they are great at building computers, and what is a Tesla other than another portable computer (“The first one is an oversized iPad. The second is a revolutionary transport vehicle. And the third is a portable air conditioner. So, three things: an oversized iPad, a revolutionary transport vehicle, and a portable air conditioner. An iPad, a vehicle, and an air conditioner. An iPad, a vehicle…are you getting it? These are not three separate devices, this is one device, and we are calling it Apple Car.”)? Facebook, Apple, Google, Amazon, et al. all continue to compete directly in more and more spaces because at their heart they are all software companies. I suppose they could have all decided not to compete with each other, but companies looking to maximize return in free markets usually don't behave that way, and so we'll see all of them trying to do more and more of the same things, like stream music and video, build smart phones, deliver stuff, etc. That's how a nuclear arms race happens. Your neighbor has the bomb, it's pointed at some part of your business, you get one too, if for no other reason than defensive purposes. Meanwhile, you also try to do some virgin land grabs, because networked businesses tend to reward first movers, and that's how you end up with tech companies trying to colonize space, build self-driving cars, float balloons around the world to bring the Internet to everyone, and, to bring it full circle, be the new front page for every user.

It's worth repeating: all the things above have been happening, are happening, and will continue to happen whether or not Facebook hosts your content.

By the way, you can still host your content on your own site, even if you also let Facebook host it. Getting yourself set up to host content on Facebook is largely a one-time fixed cost of some time to provide them with a feed. It was the same at Flipboard, though some companies took longer than expected because they couldn't output an RSS feed of their content from their legacy CMS systems. It was shocking to learn that a random blogger on Squarespace or Wordpress or Tumblr could syndicate their content more easily than a famous media company, but that was often the case, and it speaks to the tech deficit in play here.
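
For readers who haven't dealt with syndication plumbing, "a feed" usually means something like RSS. Here's a minimal sketch of the kind of RSS 2.0 output a CMS needs to be able to emit; the Post shape and helper names are my own illustration, not any particular platform's actual ingestion spec.

```typescript
// Minimal sketch of an RSS 2.0 feed, the sort of syndication output a CMS emits.
// The Post interface and function names are illustrative assumptions.
interface Post {
  title: string;
  url: string;
  publishedAt: Date;
  summary: string;
}

// Escape the characters XML treats specially.
function escapeXml(s: string): string {
  return s
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&apos;");
}

function buildRssFeed(siteTitle: string, siteUrl: string, posts: Post[]): string {
  const items = posts
    .map(
      (p) => `    <item>
      <title>${escapeXml(p.title)}</title>
      <link>${escapeXml(p.url)}</link>
      <pubDate>${p.publishedAt.toUTCString()}</pubDate>
      <description>${escapeXml(p.summary)}</description>
    </item>`
    )
    .join("\n");

  return `<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>${escapeXml(siteTitle)}</title>
    <link>${escapeXml(siteUrl)}</link>
    <description>Latest posts from ${escapeXml(siteTitle)}</description>
${items}
  </channel>
</rss>`;
}
```

A random blogging platform generates something like this automatically for every post; a legacy CMS that can't is exactly the tech deficit I mean.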

This may all sound grim for media companies, but here's the kicker: it really is that grim. Wait, are kickers supposed to be positive? Maybe I meant kick in the butt.

Okay, I can offer some positives. A media company may not be able to be world class at every layer of the full stack, from distribution and marketing to ad sales and producing great content, but it doesn't have to be. Far better to be really good at one part of that, the one that tech companies are least likely to be good at, and that's producing great, differentiated content.

The fact is, great content is not yet commodified. That may sound like Peter Thiel's advice to be a monopoly: self-evident, non-trivial, not useful. But much of the best advice is just that, as banal as a fortune cookie prescription but no less true.

Let's take The New Yorker as an example. They don't try to compete on breaking news, though they have beefed up on that front with their online blogs. They hire great writers who go long on topics, and thus they can charge something like $50 a year for a subscription because their content is peerless. I'm subscribed through something like 2020 (so please stop mailing me renewal solicitations, New Yorker, please!?).

Look at Techmeme. They provide value by curating all the tech news out there, using a mix of human and algorithm to prioritize the tech news stream to produce Silicon Valley's front page at any given moment in time. Curation is a key part of discovery; you don't have to focus on producing content yourself. A daily visit for me.

Look at HBO. A media company with great content that you can't easily find a substitute for, with a smart content portfolio strategy that minimizes subscriber churn. They surprised me recently by announcing they were going to launch HBO Now ahead of when I anticipated, at the same price it costs to add it on to a cable package. Kudos to them for not letting the innovator's dilemma handcuff them for too long.

Look at Buzzfeed. Ignore the jealous potshots from their elders and marvel at their ability to create content you can't easily find elsewhere. That's right, I said it. Despite being the company that everyone says just rips off other people's content, Buzzfeed actually has more content I can't find substitutes for than most tech news sites. It's not just their original content and reporting, which is good and getting better. Like Vox trying to make the top news stories of the day digestible for more people, Buzzfeed takes fun content and packages it in a really consumable way. It turns out in a world of abundance, most people would prefer just a portion of their media diet from the heavy news food group. More of their daily diet is from the more fun food groups, and Buzzfeed owns a ton of shelf space in that aisle. It's something other sites can do, but many avoid because they're too proud or because it isn't part of their brand. I saw white and gold, BTW.

Look at Grantland. They also hit the fun part of the daily diet by targeting pop culture and sports with great writers and new content daily. People jab at Bill Simmons a lot now that he is in the media penthouse, but he started as a blogger for AOL, and he was the first writer to really channel the fan's voice and point of view. It could've been you, perhaps, but it wasn't.

Look at Gruber, or Ben Thompson, or Marc Maron, or Serious Eats, or The Wirecutter, or Adam Carolla. Hell, even look at Fox News (just don't look too long). It turns out that differentiated content is differentiated. When the world's an all-you-can-eat buffet of information, you want to be the king crab legs, not the lettuce bowl.

Being a generalist as a reporter, someone who just shows up, asks questions, and transcribes the answers into a summary article, is not that valuable. If you cover an industry, do you understand that industry? Take tech reporters as an example: many of them don't understand the underlying technology they write about. That may have sufficed in a bygone age, but it no longer does, which is good for Gruber's claim chowder business but not good business. Taking the time to become an expert in a domain still has value because it takes hard work, and that is also not a resource that is equally distributed in the world.

Some companies try to tackle more than one part of the stack, with some success. Look at MLBAM. They have managed to hire some strong technologists and build such a powerful platform that other media companies are syndicating it for their own use. Yeah, it's great to have content from a legally sanctioned monopoly to bootstrap your business, but credit to them for embracing the future and leveraging that content goldmine to build a differentiated technology platform.

Is it easy to replicate any of those? No, but your mother should have taught you that lesson long ago. At least what they're doing is clear and understandable to any outside observer.

If you've stuck with me this long, you may still think that hosting your content on Facebook is a Faustian bargain. Maybe Facebook changes their News Feed algorithm and your traffic vanishes overnight, like Zynga. Or maybe Facebook holds you hostage and asks for payment to promote your content more in the News Feed.

It's possible, but that risk exists whether your content is hosted there or not. Maybe hosting minimizes that risk a bit, but Facebook's first priority will always be to keep their users' attention and engagement because that's how they keep the lights on (and pay for the free lunches). If your content is engaging, it will keep a News Feed roof over its head, and if it doesn't, it won't.

Does that mean you have to write clickbait headlines and package stories up into listicles with animated GIFs? I don't think so, and if that's not your brand then by all means steer clear. That doesn't mean you shouldn't write a compelling headline. I despise clickbait headlines that just try to coax a click when the content has barely anything of substance, just to gain a cheap page view, but I appreciate a well-written headline over a dull one, too. Jeff Bezos used to caution us against the “tyranny of the or,” or false tradeoffs. This is one example. I also believe Zuckerberg and other Facebook execs when they say they'd like to weed out the more egregious clickbait from the News Feed. I understand if others don't, but my general belief about most tech companies is that they're just semi-evil.

Let's go deeper into the FUD. What if Facebook decides to go into the media business themselves? What if, instead of hosting your content, they produce their own and prefer it in the News Feed?

First of all, if that ever happens, it won't happen anytime soon. When you're in the phase of convincing folks to hop aboard your platform, you have to remove that possibility or no one will join.

Secondly, content production isn't generally a business that tech companies love. The margins aren't great, it's a headache to manage creative types, content production is messy and labor intensive, and tech companies prefer playgrounds where software economics play better.

It's far more likely that tech companies use their ample cash to license content. Remember how I said tech companies are rich? It turns out they are richer than movie studios and TV networks and newspapers and book publishers and music labels, and it turns out that writing a check for exclusive content hurts in the short-term but is great in the long run paired with the right business model, regardless of whether that's subscription or subsidized by ads. If you have the best ad units and platform, the marginal return on user attention is higher for you than for the next competitor, and that means licensing can make sense. You also get to meet some celebrities, too, who are beautiful and charming.

Lastly, if Facebook wanted to go into the media business, they could do it now, or they could do it in the future, and your Facebook hosting abstinence wouldn't matter one bit. They already have all the eyeballs they need, it's not a situation like Netflix in its early days where they had to build a subscriber base first before they could consider producing their own original content (thank you First Sale Doctrine!). Long before Facebook even had a News Feed where your articles were shared, hundreds of millions of people already tuned in to see what that cute guy or girl was up to, or to see their friends' latest selfie, and other forms of ambient intimacy. I could perhaps even craft an argument where if all the sites out there stood on the sidelines it might accelerate Facebook's move into the space.

And if Facebook did, if they decided to compete with The New York Times and Grantland and all the other media companies, or to buy one or more of them, is that so bad? Maybe you could work for them, if you're unique and differentiated. If you are, you'll do just fine, in this world and the next.

Did I mention they have free lunches?

Information previews in modern UIs

[I don't know if Facebook invented this (and if they didn't, I'm sure one of my readers will alert me to who did), but it's certainly the service which has used it to greatest effect, which I suppose is the case for anything they put to use, given their scale.]

One problem with embedded videos as opposed to text online has always been the high cost of sampling the video. Especially for interviews, I'd almost always rather just have the transcript than be forced to wade through an entire video. Scanning text is more efficient than scanning online video.

Facebook has, for some time now, autoplayed videos in the News Feed with the audio on mute. Not only does it catch your eye, it automatically gives you a motion preview of the video itself (without annoying you with the audio), thus lowering the sampling cost. To play the video, you click on it and it activates the audio. I'm sure the rollout of this UI change increased video clicks in the News Feed quite a bit. Very clever. I've already seen this in many mobile apps and expect it to become a standard for video online.
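
For the curious, here's a rough sketch of how that muted-autoplay pattern might be wired up on the web: play silently while a video is mostly on screen, pause when it scrolls away, unmute on click. The ".feed-video" selector and the 0.6 visibility threshold are assumptions for illustration, not Facebook's actual implementation.

```typescript
// Rough sketch of the muted-autoplay preview pattern.
// ".feed-video" and the 0.6 threshold are illustrative assumptions.
const observer = new IntersectionObserver(
  (entries) => {
    for (const entry of entries) {
      const video = entry.target as HTMLVideoElement;
      if (entry.isIntersecting) {
        video.muted = true;             // preview without annoying the viewer with audio
        video.play().catch(() => {});   // play() returns a promise; ignore autoplay rejections
      } else {
        video.pause();                  // stop previews that have scrolled out of view
      }
    }
  },
  { threshold: 0.6 } // "mostly visible" before the preview starts
);

document.querySelectorAll<HTMLVideoElement>("video.feed-video").forEach((video) => {
  observer.observe(video);
  video.addEventListener("click", () => {
    video.muted = false;                // the click is the signal of real interest
    video.play().catch(() => {});
  });
});
```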

[It's trickier when videos include pre-roll ads; it's not a great user experience to be enticed to watch a video by an autoplayed clip, then to be dropped into an ad as soon as you act on your interest.]

Someday, the autoplayed samples could be even smarter; perhaps the video uploader could define in and out points for a specific sample, or perhaps the algorithm which selects the sample could be smarter about the best moment to select.

It's not just video where sampling costs should be minimized. Twitter shows a title, image, and excerpts for some links in its Timelines, helping you to preview what you might get for clicking on the link. They show these for some but not all links. I suspect they'd increase clickthroughs on those links quite a bit if they were more consistent in displaying those preview Twitter cards.
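
For context on where those previews come from: publishers embed Twitter Card (and Open Graph) meta tags in their pages, and the consuming service scrapes them to build the card. Below is a minimal sketch of that extraction, assuming a browser-like environment with fetch and DOMParser available; in practice this sort of crawling happens server-side, and the LinkPreview shape here is my own illustration.

```typescript
// Sketch of building a link preview from the Twitter Card / Open Graph
// meta tags a publisher embeds in the page <head>.
interface LinkPreview {
  title?: string;
  description?: string;
  image?: string;
}

async function fetchLinkPreview(url: string): Promise<LinkPreview> {
  const html = await (await fetch(url)).text();
  const doc = new DOMParser().parseFromString(html, "text/html");

  // Prefer Twitter Card tags, fall back to Open Graph, then the <title> element.
  const meta = (...names: string[]): string | undefined => {
    for (const name of names) {
      const el =
        doc.querySelector(`meta[name="${name}"]`) ??
        doc.querySelector(`meta[property="${name}"]`);
      const content = el?.getAttribute("content");
      if (content) return content;
    }
    return undefined;
  };

  return {
    title: meta("twitter:title", "og:title") ?? doc.title,
    description: meta("twitter:description", "og:description"),
    image: meta("twitter:image", "og:image"),
  };
}
```

A page that omits these tags gets a bare link instead of a card, which is one plausible reason the previews show up for some links and not others.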

Business Insider and Buzzfeed linkbait-style headlines are a text analogue, albeit one with a poor reputation among some. Given the high and increasing competition for user attention at every waking moment, it's not clear that services can leave any such tactical stones unturned.