My most popular posts

I recently started collecting email addresses using MailChimp for those readers who want to receive email updates when I post here. Given my relatively low frequency of posts these days, especially compared to my heyday when I posted almost daily, and given the death of RSS, such an email list may have more value than it once did. You can sign up for that list from my About page.

I've yet to successfully send an email to the list, but let's hope this post will be the first to go out that route. Since this would be the first post to reach that list, with perhaps some new readers, I thought it would be worth compiling some of my more popular posts in one place.

Determining what those are proved difficult, however. I never checked my analytics before, since this is just a hobby, and I realized when I went to the popular content panel on Squarespace that their data only goes back a month. I also don't have data from the Blogger or Movable Type eras of my blog stashed anywhere, and I never hooked up Google Analytics here.

A month's worth of data was better than nothing, as some of the more popular posts still get a noticeable flow of traffic each month, at least by my modest standards. I also ran a search on Twitter for my URL and used that as a proxy for social media popularity of my posts (and in the process, found some mentions I'd never seen before since they didn't include my Twitter handle; is there a way on Twitter to get a notification every time your domain is referenced?).

In compiling the list, I went back and reread these posts for the first time in ages and added a few thoughts on each.

  • Compress to Impress — my most recent post, and probably the one that attracted most of the recent subscribers to my mailing list. I regret not including one of the most famous cinematic examples of rhetorical compression, from The Social Network, when Justin Timberlake's Sean Parker tells Jesse Eisenberg, "Drop the 'The.' Just Facebook. It's cleaner." Like much of the movie, it's probably made up (and also, why wasn't the movie titled just Social Network?), but it's still a good example of how movies almost always compress information into visually compact scenes. The reason people tend to prefer the book to the movie adaptation in almost every case is that, like Jeff Bezos and his dislike of PowerPoint, people who see both the original and the compressed information flows feel condescended to and lied to by the latter. On the other hand, I could only make it through one and a half of the Game of Thrones novels, so I much prefer the TV show's compression of that story, even as I watch every episode with superfans who can spend hours explaining what I've missed, so it feels like I've read the books after all.
  • Amazon, Apple, and the beauty of low margins — one of the great things about Apple is that it attracts many strong, independent critics online (one of my favorites being John Siracusa). The other FAMGA tech giants (Facebook, Amazon, Microsoft, Google) don't seem to have as many dedicated fans/analysts/critics online. Perhaps it was that void that helped this post on Amazon from 2012 go broad (again, by my modest standards). Being able to operate with low margins is not, in and of itself, enough to be a moat. Anyone can lower their prices, and more generally, any company should be wary of imitating another company's high-variance strategy, lest they forget all the others who did and went extinct (i.e., a unicorn is a unicorn because it's a unicorn, right?). Being able to operate with low margins with unparalleled operational efficiency, at massive scale globally, while delivering more SKUs in more shipments with more reliability and greater speed than any other retailer is a competitive moat. Not much has changed, by the way. Apple just entered the home voice-controlled speaker market with its announcement of the HomePod and is coming in from above, as expected, at $349, as the room under Amazon's price umbrella isn't attractive.
  • Amazon and the profitless business model fallacy — the second of my posts on Amazon to get a traffic spike. It's amusing to read some of the user comments on this piece and recall a time when, every time I said anything positive about Amazon, I'd be inundated with comments from Amazon shorts and haters. Which is the point of the post: people outside of Amazon really misunderstood the business model. The skeptics have largely quieted down nowadays, and maybe the shorts lost so much money that they finally went in search of weaker prey, but in some ways I don't blame the naysayers. Much of their misreading of Amazon is the result of GAAP rules, which really don't reveal enough to discern how much of a company's losses are due to investments in future businesses or just aggressive depreciation of assets. GAAP rules leave a lot of wiggle room to manipulate your numbers to mask underlying profitability, especially when you have a broad portfolio of businesses munged together into single line items on the income statement and balance sheet. This doesn't absolve professional analysts, who should know better than to ignore unit economics, however. Deep economic analysis isn't a strength of your typical tech beat reporter, which may explain the rise of tech pundits who can fill that gap. I concluded the post by saying that Amazon's string of quarterly losses at the time should worry its competitors more than it should assure them. That seems to have come to fruition. Amazon went through a long transition from having a few very large fulfillment centers to having many more, smaller ones distributed more broadly, generally located near major metropolitan areas, to improve its ability to ship to customers more quickly and cheaply. Now that the shift has been completed for much of the U.S., you're seeing the power of the fully operational Death Star, or many tiny ones, so to speak.
  • Facebook hosting doesn't change things, the world already changed — the title feels clunky, but the analysis still holds up. I got beat up by some journalists over this piece for offering a banal recommendation for their malady (focus on offering differentiated content), but if the problem were so tractable it wouldn't be a problem.
  • The network's the thing — this is from 2015, and two things come to mind since I wrote it.
    • As back then, Instagram has continued to evolve and grow, and Twitter largely has not. Twitter did stop counting user handles against character limits and tried to alter its conversation UI to be more comprehensible, but the UI is still inscrutable to most. The biggest change, to an algorithmic rather than reverse-chronological timeline, was an improvement, but of course Instagram had beaten them to that move as well. The broader point is still that the strength of any network lies most in the composition of its network, and in that, Twitter and other networks that have seen flattening growth, like Snapchat or Pinterest, can take solace. Twitter is the social network for infovores like journalists, technorati, academics, and intellectual introverts, and that's a unique and influential group. Snapchat has great market share among U.S. millennials and teens, Pinterest among women. It may be hard for them to break out of those audiences, but those are wonderfully differentiated audiences, and it's also not easy for a giant like Facebook to cater to particular audiences when its network is so massive. Scaling requires that a network reduce its surface area to each individual user, using strategies like algorithmic timelines, graph subdivision (e.g., subreddits), and personalization; otherwise networks run into reverse economies of scale in their user experience.
    • The other point this post recalls is the danger of relying on any feature as a network moat. People give Instagram, Messenger, FB, and WhatsApp grief for copying Stories from Snapchat, but if any social network has to pin its future on a single feature, all of which are trivial to replicate in this software age, that company has a dim future. The differentiator for a network is how its network uses a feature to strengthen the bonds of that network, not the feature itself. Be wary of hanging your hat on an overnight feature success the same way predators should be wary of mutations that offer only temporary advantages over their prey. The Red Queen effect is real and relentless.
  • Tower of Babel — from earlier this year, and written at a time when I was quite depressed about a reversal in the quality of discourse online, and about how the promise of connecting everyone via the internet had so quickly seemed to lead us all into a local maximum (minimum?) of public interaction. I'm still bullish on the future, but when the utopian dreams of global connection run into the reality of humans' coalitional instincts and the resentment from global inequality, we've seen which is the more immovable object. Perhaps nothing expresses the state of modern discourse like waking up to see so many of my followers posting snarky responses to one of Trump's tweets. Feels good, accomplishes nothing; let's all settle for the catharsis of value signaling. I've been guilty of this, and we can do better.
  • Thermodynamic theory of evolution — actually, this isn't one of my most popular posts, but I'm obsessed with the second law of thermodynamics and exceptions to it in the universe. Modeling the world as information feels like something from the Matrix but it has reinvigorated my interest in the physical universe.
  • Cuisine and empire — on the elevation of food over music as a scarce cultural signal. I'll always remember this post because Tyler Cowen linked to it from Marginal Revolution. Signaling theory is perhaps one of the three most influential ideas to have changed my thinking in the past decade. I would not underestimate its explanatory power in the rise of Tesla. Elon Musk and team made the first car that allowed wealthy people to signal their environmental values without having to also send a conflicting signal about their taste in cars. It's one example where actually driving one of the uglier, less expensive EVs probably would send the stronger signal, whereas generally the more expensive and useless a signal is, the more effective it is.
  • Your site has a self-describing cadence — I'm fond of this one, though Hunter Walk has done so much more than anyone to point people to this post that I feel I should grant him a perpetual license to call it his own. It still holds true: almost every service and product I use online trains me how often to return. The only unpleasant part of rereading this is realizing how my low posting frequency has likely trained my readers to never visit my blog anymore.
  • Learning curves sloping up and down — probably ranks highly only because I have such a short window of data from Squarespace to examine, but I do think that companies built for the long run have to maintain a sense of the slope of their organization's learning curve at all times, especially in technology, where the pace of evolution, and thus the frequency of existential decisions, is heightened.
  • The paradox of loss aversion — more tech markets than ever are winner-take-all because the internet is the most powerful and scalable multiplier of network effects in the history of the world. Optimal strategy in winner-take-all contests differs quite a bit from much conventional business strategy, so best recognize when you're playing in one.
  • Federer and the Paradox of Skill — the paradox of skill is a term I first learned from Michael Mauboussin's great book The Success Equation. This post applied it to Roger Federer, and if he seems more at peace recently, now that he's older and more evenly matched in skill with other top players, it may be that he no longer feels subject to the outsized influence of luck as he did when he was a better player. In Silicon Valley, with all its high-achieving, brilliant people, understanding the paradox of skill may be essential to not feeling jealous of every random person around you who fell into a pool of money. The paradox of skill is a cousin of the Red Queen effect, which I referenced above and which tech workers of the Bay Area should familiarize themselves with. It explains so much of the tech sector, but also just living in the Bay Area. Every week I get a Curbed newsletter, and it always has a post titled "What $X will get you in San Francisco" with a walkthrough of a recent listing that you could afford at that amount of monthly rent. Over time they've had to elevate the dollar amount just to keep things interesting, or perhaps because what $2,900 can rent you in SF was depressing its readers.

Having had this blog going off and on since 2001, I've only skimmed through a fraction of the archives, but perhaps at some point I'll cringe and crawl back further to find other pieces that still seem relevant.

The network's the thing

Last week Instagram announced it was supporting more than just square aspect ratios for photos and videos. This led of course to a Slate article decrying the move, because Slate is that friend that has to be contrarian on every topic all the time, just to be annoying.

The square confines Instagram users to a small area of maneuver. It forces us to consider what details are essential, and which can be cropped out. It spares us from indulgence of the landscape and the false promise of the panorama.
 
But Instagram, which is owned by Facebook, is in the business of accommodating its users, not challenging them. One of the problems with the square, the company explained in its announcement, is that “you can’t capture the Golden Gate Bridge from end to end.” This example speaks to the needs of a certain kind of Instagram user who enjoys planting his flag on settled territory. Like an iPhone videographer at a Taylor Swift concert, the guy Instagramming the Golden Gate Bridge is not creating a rare or essential document, only proof that he saw it with his own eyes.
 
And why did he bother doing that, anyway? Clearly, because photographs cannot really capture the scope of the Golden Gate Bridge, or St. Peter’s Basilica, or the view from your car window as you drive up the Pacific Coast Highway. The impulse to capture these moments on camera is shaded by the knowledge that the moment, in all its immediacy, is too large to fit in a frame of any size.
 

I don't think my friend who snapped a pic of her daughter this morning, or the friend who memorialized the little leaf the barista made in the foam of his latte, was contemplating how wonderful it was that they were sparing me from the “indulgence of the landscape and the false promise of the panorama,” but what do I know. I'm fairly certain the guy Instagramming the Golden Gate Bridge (I've done that a few times on Instagram) realizes he's not “creating a rare or essential document,” but it never hurts to remind him; I'm sure he appreciates being set in his artistic place.

I'm glad Instagram is accommodating the additional aspect ratios, and it's a sign of how much their network has matured. People confuse arbitrary limits on social networks—Twitter's 140-character limit, Instagram's square aspect ratio and limited filters, to take two prominent examples—with their core asset, which is the network itself. Sure, the limits can affect the nature of the content shared, but Instagram is above all else a pure and easy way to share visual content with other people and get their feedback. That they started allowing videos and now differing aspect ratios doesn't change the core value of the network, which is the graph.

In fact, this move by Instagram validates the power of their network. If they were failing they either wouldn't have survived long enough to make such a move or it would be positioned as some desperate pivot. Instagram is dealing from a position of strength here, expanding the flexibility of its tools to meet the needs of a still growing user base.

In the same way, Twitter should have lifted the 140 character limit on DMs much earlier than they did. The power of Twitter, deep down, is that it's a public messaging protocol. The 140 character limit is not its secret power. The network is.

I'd actually remove the 140-character limit on Tweets as well, though such a move would undoubtedly spawn even more of a public outcry than Instagram's move, since so many power users of Twitter are journalists. Yes, a 140-character limit enforces some concision in writing, rewarding the witty among us, but it also alienates a lot of people who hate having to edit a thought multiple times just to fit an arbitrary limit. Lots of those people abandoned Twitter and publish on Facebook instead. Twitter could always choose to limit how much of a Tweet to display in the Timeline, allowing for a higher vertical density of Tweets when people are scanning.

Look at how many users of Twitter have to break long thoughts across multiple Tweets, in Tweetstorms or just long linked series of Tweets. Many of those are power users, yet I still see power users do it incorrectly every day, making it really difficult to follow the entire sequence. Users who want to link Tweets in a Tweetstorm, or just link their own Tweets together in a series, should reply to each of their own previous Tweets, removing their own username in the process. This allows readers who click one Tweet to easily see the rest of the Tweets in the series, and removing one's own username adds back some characters for the content and prevents it from seeming as if you're talking to yourself like a crazy person. That many have no idea how to do this is just one of Twitter's usability issues. It's a wonderfully elegant public messaging protocol, but its insistence on staying so low level is crazy. Don't even get me started on putting a period before a username in a Tweet; try explaining that to your mother with a straight face.
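The reply-to-self mechanic above can be sketched in a few lines. To be clear, this is purely illustrative: the client object, its post method, and the in_reply_to parameter are hypothetical stand-ins, not the real Twitter API.

```python
# Hypothetical sketch of reply-to-self threading: each part of a
# Tweetstorm replies to the part before it, so clicking any one Tweet
# lets a reader traverse the whole sequence. `client.post` and its
# `in_reply_to` parameter are made up for illustration.
def post_tweetstorm(client, parts):
    previous_id = None
    posted_ids = []
    for text in parts:
        tweet = client.post(text, in_reply_to=previous_id)
        previous_id = tweet["id"]
        posted_ids.append(previous_id)
    return posted_ids
```

The point is just the chain: every Tweet after the first carries a pointer to its predecessor, which is what keeps the sequence traversable.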

Here's another example. Look at how many experienced Twitter users now turn to apps like OneShot to attach screenshots of text to their Tweets as photos, to circumvent the 140-character limit. I happen to really enjoy those screenshorts, as they're sometimes called now, and they demonstrate how Twitter could expand its 140-character limit without overwhelming the Timeline: just truncate at some point and add a click-to-expand function. This is yet another example of users generating useful innovation on top of Twitter when it should be coming from within the company.

Rather than force users to jump through all these hoops to publish longer content, Twitter could just allow users to write more than 140 characters in one Tweet, truncating the whole of it after some limit and posting a Read More button to allow readers to see the rest of the thought. Blasphemy! many will shout. I can already see the pitchforks in the distance. Some good old blasphemy is just what Twitter needs.
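A minimal sketch of that truncate-and-expand behavior, assuming a display cutoff of 140 characters purely as a running example (the function name and cutoff are mine, not a real product parameter):

```python
# Sketch of truncating a long Tweet for the timeline and flagging it
# for a "Read More" control. The 140-character display cutoff is just
# this post's example, not an actual product value.
DISPLAY_CUTOFF = 140

def timeline_preview(text, cutoff=DISPLAY_CUTOFF):
    """Return (preview_text, needs_read_more)."""
    if len(text) <= cutoff:
        return text, False
    # Break at the last space before the cutoff so no word is split,
    # reserving one character for the ellipsis.
    cut = text.rfind(" ", 0, cutoff - 1)
    if cut == -1:
        cut = cutoff - 1
    return text[:cut] + "…", True
```

Short Tweets pass through untouched; only the long ones grow a Read More affordance, which is why the timeline's scanning density wouldn't suffer.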

Longer character limits would likely increase the ability to follow conversations and dialogues on the service, too. One of the wonderful things about Twitter is that conversations between specific users can be read by other users. That's one of the greatest things about Twitter as a public messaging protocol. But because replies have to fit within 140 characters, often they need to be broken up into multiple Tweets. Many who reply don't realize that unless they hit the reply button on the previous Tweet in the conversation, the dialogue link is broken. Many mistakenly compose a new Tweet to continue the dialogue, not realizing that any reader clicking on that Tweet will not automatically see other Tweets in that conversation. Instead, it will just display by itself, as an orphan.

I run into this multiple times every day on the service, clicking on a Tweet without any easy way to figure out what it was in response to. If a lot of time has passed, it's often impossible to piece the conversation back together. It drives me crazy. I tried explaining how to piece broken conversation threads back together to a few people who had abandoned Twitter and realized I sounded like a madman. Why should they have to learn such low-level nonsense? Threaded conversations are, for the most part, a solved UI problem in this day and age.

I'm not done with the character limits, so hold your disgust. You may wish to bring more than just your pitchforks after I'm done. Every Twitter conversation that involves more than two people devolves into a short series of retorts that eventually dies because each additional username consumes more of the 140 character limit, until there is no more room for actual dialogue.

It's absurd, but it's always been that way. Why usernames count towards the 140 character limit has always befuddled me. Meaningful conversation always has to migrate off of Twitter to some other platform, for no reason other than a stubborn allegiance to an arbitrary limit which made sense in the SMS age but now is a defect. If you're going to keep a character limit (could we at least double it?), let's not have usernames count against the limit. In fact, if I hit reply to someone's Tweet, do we even need to insert that person's username at the front of the Tweet? You can still send the notification to that user that I replied to their Tweet, and suddenly my reply won't seem so oddly formatted to the average reader. There are plenty of ways to indicate who the message is addressed to through contextual formatting, and if I wanted to mention them explicitly I could always write @username in the Tweet. But it's unnecessary to insert it by default.
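To make the suggestion concrete, here's a toy sketch of character accounting in which the leading @usernames of a reply don't count against the limit. The 140 limit and both function names are illustrative only, not any real API.

```python
import re

LIMIT = 140  # the classic limit, used here only as an example

def chars_used(tweet_text):
    # Strip the run of @mentions (and trailing spaces) at the start of
    # a reply; only the remaining message body counts against the limit.
    body = re.sub(r"^(@\w+\s+)+", "", tweet_text)
    return len(body)

def chars_remaining(tweet_text):
    return LIMIT - chars_used(tweet_text)
```

Under this accounting, a reply addressed to three people leaves exactly as much room for actual dialogue as a reply addressed to one.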

Vine is perhaps the only network whose chief content-creation limit seems intrinsically part of the network, and that's because video is one type of content that can't be scanned, in which each additional second of content imposes a linear attention cost of one second on the viewer. A six-minute video costs the viewer 60X the attention cost that a 6-second video does, and even creating a 6-second video of any interest requires some clever editing to produce a coherent narrative. A Vine video joke has its own distinct pace; it's like a two-line riddle, often a 4.5-second setup with a 1.5-second punchline (at least that's the pacing in most Vines in my home feed).

This 6-second limit still constrains the size of Vine's userbase, and they may be okay with that. I think that's fine. I enjoy Vine, it's its own art form. Still, the 6 second limit means a lot of people don't turn to it for a lot of their video sharing. It's not easy to come up with a succinct 6 second video clip.

Look at how Snapchat has evolved to see another company realizing that its power is not the initial constraint but the network. Snapchat still imposes a 10 second limit on video length. But now you can string many videos together into My Story. This was brilliant on their part; it allows viewers to skip any boring clip with one tap, but it allows the creator to tell longer stories simply by shooting multiple snaps in sequence. They lowered the content generation cost on creators without meaningfully increasing it for viewers.

Furthermore, Snapchat now allows you to download your Stories to your camera roll. Those who claim ephemerality is the key to Snapchat's success might panic at such a change, but all it demonstrates is that they realize they now have users for whom ephemerality isn't the main draw of the service. They haven't confused an arbitrary early limit for being the root of their success, and they understand the underlying power of their platform.

Perhaps more than any other social network, Facebook has long recognized that its chief asset is its graph. They've made all sorts of major changes to their interface, changes that always lead to huge initial outcries from their users, followed by a fade to silence as users continue to access the service in increasing numbers.

That they recognized this and had the courage of their convictions from such an early stage is not to be discounted. Plenty of companies live in fear of their early adopters, who often react negatively to any change. This leaves those companies paralyzed, unable to grow when they hit saturation of their early-adopter segment. Because the global market of users has been expanded by the unprecedented reach of connected smartphones, early-adopter segments can now number in the tens of millions, confusing companies into thinking that their early-adopter segment is actually the mass market.

Twitter, more than any other company, needs to stop listening to its earliest users and recognize that deep down, its core strength is not the 140 character limit per Tweet, nor is it the strict reverse chronological timeline, or many other things its earliest users treat as gospel.

It's not even the ability to follow people, though for its current power users that has proved a useful way to mine some of the most relevant content from the billions of Tweets on the service. If Twitter realizes this, they'll understand that their chief goal should not necessarily be to teach the next several hundred million users how to follow hundreds of people the way the early adopters did. To do so is to mistake the next wave of users as identical in their information consumption preferences and habits to the first 300 million, or whatever the true active count is among that number (I'm going to guess it's in the range of 40 to 80 million truly active daily users, though it's hard to tell without seeing the data).

Twitter's chief strength is that it's an elegant public messaging protocol that allows anyone to write something quickly and easily, and for anyone in the world to see that writing. It's a public marketplace of information. That's an amazing network, and the reason people struggle to describe Twitter is that a platform like that can be used for so many things.

If Twitter realizes that, then they'll realize that making that information marketplace much more efficient is the most critical way to realize the full potential of what is a world-changing concept. How do you match content from people who publish on Twitter with the readers who'd enjoy that content?

A people-follow model is one way, but a topic-based matching algorithm is another. Event-based channels are just a specific version of that. Search is one option, but why isn't there browse? I can think of a dozen other ways to turbocharge that marketplace off the top of my head, and the third-party developer community, kicked out of the yard by Twitter so many times like stray dogs, could likely come up with dozens of others if they were allowed back in.

Twitter can leave the reverse chronological timeline in place for grumpy early adopters. It can be Twitter Classic. Most of those early adopters are largely happy with things the way they are, and if Twitter is scared to lose them, leave the current experience in place for them. I honestly don't think they'd abandon the service if Twitter raised the 140 character limit, or allowed for following of topics, or any number of other changes suggested here, because I think the power of the network is the network itself, but if the company has any such trepidations, it's not a big deal to leave Twitter Classic in place. The company has a huge engineering and product team, it's easy to park that experience in maintenance mode.

When social networks come into their own, when they realize their power is not in any one feature but in the network itself, they make changes like this that seem heretical. They aren't. Instead, these are fantastic developmental milestones, indicative of a network achieving self-awareness. A feature is trivial to copy. A network, on the other hand, is like a series of atoms that have bonded into a molecule. Not so easy to split.

It's a post for another day, but one of the defining features of our age is the rise of the network. Software may be eating the world, but I posit that networks are going to eat an outsized share because they capitalize disproportionately on the internet. Journalism, advertising, video, music, publishing, transportation, finance, retail, and more—networks are going to enter those spaces faster than those industries can turn themselves into networks. That some of our first generation online social networks have begun self-actualizing is just the beginning of that movement.

“People need dramatic examples to shake them out of apathy and I can't do that as Bruce Wayne. As a man, I'm flesh and blood, I can be ignored, I can be destroyed; but as a symbol... as a symbol I can be incorruptible, I can be everlasting.” Bruce Wayne, Batman Begins

This week in NBA Twitter

That should be a television show. It's too bad Twitter wasn't around when Michael Jordan was at the height of his basketball powers because his homicidal competitive streak would have had him up all night looking for any perceived slight on Twitter and then responding in some terrifyingly inappropriate manner.

Having MJ-wannabe Kobe actively tweeting is a solid consolation prize, though.

Moments in tech history: surge pricing

According to this site, which claims to be able to surface the first tweet on any subject, the first two tweets about Uber surge pricing were these:

The first wonders what surge pricing is, and then the second, coming just five minutes later, complains about it. A succinct and perfect summary of the public reception to surge pricing for the history books.

We live in glorious times, when the time to the inevitable backlash approaches zero.

“Platform” risk

Last night, Twitter curtailed Meerkat's access to its graph. I saw lots of discussion on Twitter (I'd say this was ironic but it's just expected) about why and whether Twitter should just compete on its own merits with its recent acquisition Periscope.

Some have termed what happened to Meerkat “platform risk,” and it is, but one must be willfully naive to consider ad-monetized social graphs like Facebook and Twitter to be capital P Platforms. I prefer to call them little p “platforms” (I'm drawing air quotes with my fingers in case you aren't watching me live on Meerkat as I write this).

Amazon Web Services (AWS) is a Platform. That is, you can count on it even if you use it to compete with its parent company Amazon. Netflix still uses AWS in their tech stack even as Amazon Instant Video is spending over a billion dollars on content to battle it out with Netflix in the video streaming space, to name one example, and I've yet to hear of any company of any size getting bounced from AWS because they were competitive to Amazon. You could even start a retail company and use AWS. It's a utility like the power company.

The reasons why lie in both Amazon's business model and philosophy. AWS isn't free. This is crucial because Amazon makes money off of its AWS customers regardless of what business they're in. As for AWS's philosophy, you can call it altruistic or just pragmatic or both, but if Amazon wants to compete with a company that uses AWS, Amazon will try to beat them in the marketplace. If they can't, they still get a bite of that competitor's income through AWS fees. It's a win either way, and considering AWS is a fast-growing platform that's a critical piece of the world's technology stack, it's more than a minor one.

Compare this to free tech platforms offered by companies like Facebook and Twitter that make money off of ads targeted at their social graphs. If a company like Meerkat comes along and piggybacks off the Twitter graph to explosive growth and captures a unique graph, in this case around live video-casting, Twitter doesn't make any money. On the contrary, since the network effects of graph-based products tend to lead to “winner takes all” lock-in, Twitter just ends up having armed a formidable competitor that it might have to spend a lot to buy or compete with later. It's a no-win situation.

Facebook has similar ambivalence as a platform. Anyone familiar with the tech space in recent years can name more than one company that rode the Facebook graph and News Feed to explosive growth only to plummet off a cliff when Facebook turned a knob behind the scenes or just cut off access.

None of this should be surprising unless you're some “don't be evil” idealist. Take a more realpolitik view of tech and put yourself in Twitter's and Facebook's shoes. Why do they want developers to build off of their platforms? The ideal developers would be apps and services that publish lots of great content into Facebook's News Feed and Twitter's Timeline so that users spend more time in either service seeing ads.

The worst kind of developer is one that uses the News Feed or Timeline merely as a captive notification stream to build its own competitive social graph. Meerkat is guilty of at least part of that: it leaves random links in Twitter that take users out of the timeline to some other app to experience content, and those stale links just sit in Twitter timelines like branding debris or, worse, spam.

For all its press these past few weeks, Meerkat's graph is relatively shallow. However, the potential for being first to get traction as another near real-time medium of note was rising with every live broadcast notification from another tech influencer. As Twitter knows better than anyone, it's not necessarily how many users you convert at the beginning of your journey to create a high-value graph; it's who you convert, and Meerkat had captured the imagination of some real luminaries. Furthermore, Meerkat is actually more real-time than Twitter, which lays claim to being the best publicly available real-time social network.

Notifications are the most valuable communication channel of the modern age given the ubiquity of smartphones, and Facebook and Twitter are among the most valuable information streams to tap into given their large user bases and extensive graphs. Email is no longer the summit of the communication hierarchy, and both Facebook and Twitter want to avoid the spam issue that polluted email's waterfalls.

This conflict of interest is why I refer to Facebook and Twitter as little p platforms. Developer beware. Unless they change their business model, any developer trying to build some other graph off of Facebook or Twitter should have a second strategy in place in case of explosive growth because access won't persist.

Even before Facebook and Twitter, this type of platform risk from ad-supported businesses lay in wait to trap unsuspecting companies. Google search engine traffic is one of the better-known examples. Google's PageRank algorithm is, for the most part, a black box, and I've encountered many a company that fell on hard times or went out of business after Google tweaked PageRank behind the scenes and turned off the bulk of their organic traffic overnight. As Google enters more and more businesses, that platform risk only escalates.

Alternative Platforms do exist, even if they're not perfect, and that matters because AWS, as developer friendly as it is, doesn't offer a useful graph for companies looking for viral growth.

The most important such platform to date might be Apple's contact book. It's certainly one of the largest graphs in the world, and Apple doesn't rely on advertising to those users for income. The App Store is not completely open, but it's reasonably so, and once your app is approved, it's rare for Apple to pull the rug out from under you the way Facebook and Twitter have.

Phone numbers were the previous generation's most accessible and widespread key for identity and the social graph, and Apple's iOS and Google's Android operating systems, along with the rise of the smartphone, suddenly opened a gateway to that graph. Many messaging apps bootstrapped alternative or parallel social graphs just that way. I doubt the telcos were looking that many moves ahead on the chess board, and even if they had been, I'm not sure they would have had much recourse to prevent it from happening.

Meerkat is a very specific situation, though, and the reason I still think of Twitter and Facebook as valuable platforms, even if it's with a lowercase p, is that developers and these companies can benefit from plenty of other, more symbiotic relationships with each other. These relationships are possible precisely because of the nature of Twitter's and Facebook's primary ad units.

Both companies could do a better job of clarifying just what types of relationships qualify. That would head off more developer frustration and keep developers from writing off those two platforms entirely, as many already have. Given how many have been burned in the past, distrust is high, but I believe a lack of clear and predictable rules accounts for more of the platform risk here than is necessary.

More on that in a follow-up post.