In the aftermath of the dot-com crash, a new era for the web began to take hold, a turning point whose seismic shift was hyped under the moniker "Web 2.0." The concept referred to the web becoming a platform, a home for services whose popularity grew through network effects, user-generated content and collaboration. Blogging, social media sites, wikis, mashups, and more reflected a changing consciousness among the Internet's denizens, one which Tim O'Reilly, whose Web 2.0 conferences helped solidify the term as a part of our everyday lexicon, once described as a "collective intelligence, turning the web into a kind of global brain."
Since that time, because of humans' intrinsic need to apply structure to amorphous things, to give them a semblance of order (things like the web, for example), there have been many attempts to define what "Web 3.0" might be. At one point, the assumption was that Web 3.0 would be the "semantic web": a place where machine-readable metadata is applied, allowing the web and the services that live upon it to understand the content and the links between the people, places, and things that fill its servers.
To some extent, the semantic web did arrive in things like Google's Knowledge Graph, an upgrade to Google Search whose underpinnings include a database filled with millions of objects and billions of connections between them. But semantic technology never became so widespread as to define a new era of the web itself.
Meanwhile, others claimed Web 3.0 would be the shift to mobile devices, the rise of the "Internet of Things," or would emerge from web services growing more personalized to their users (Google's predictive search service "Google Now" could be seen as one example of this, perhaps).
But none of these got to win the Web 3.0 branding, either.
So what will come next?
Will another notable turning point for the web as we know it ever evolve?
Yes, of course, and it's happening now.
It's harder to spot this time around because it's not growing out of the ashes of a largely desiccated web, as Web 2.0 did when it blossomed following the dot-com era's end. Instead, the new web is growing up alongside the web of today. It could, one day, take over, but that remains to be seen.
And we don't have to call it Web 3.0. That's a bit simplistic. But it does deserve recognition.
A Rebellion
In retrospect, what Web 2.0 meant to the vast lot of the web's users was a large number of lightweight services, software perpetually in beta, that ran online, not out of a box. It harnessed the wisdom of the crowds and the willing contributions of user data, which, in the end, became the services' value.
Facebook's social graph and profile data, for instance, are now the product it sells to advertisers, who target anonymized demographic groups based on things like age, education, location, and more. Wikipedia grew from the efforts of thousands who pooled their time and knowledge to build an online encyclopedia. Even the "blogosphere" is a Web 2.0 product, one where a network of writers and publishers linked and commented, reblogged and shared.
But the web is not a static thing. It grows and shifts to reflect the society it serves.
For those who saw the web emerge in their lifetimes, the ability to publish and connect with a vast audience around the world is a marvel. To rediscover long-lost friends on social networks, or chat with someone on the other side of the globe, or share photos with your friends and family so easily, still amazes.
But today, a new group of web users is coming of age. They aren't in awe of the connectivity and openness the web provides; that's just the way it's always been. And sometimes, they even kind of resent it. Barely able to remember a time when the web didn't exist, this group has been forced to grow up online, living in public like the artists in the human terrarium under New York City once did, in an art project-slash-eerie premonition of a future yet to come.
"In the not-so-distant future of life online, we will willingly trade our privacy for the connection and recognition we all deeply desire," said Ondi Timoner, who documented this and other controversial human experiments in her 2009 film "We Live in Public."
She also warned us of the dangers of living our lives exposed, with what now sounds like common sense: "the Internet, as wonderful as it is, is not an intimate medium. It's just not," she said. "If you want to keep something intimate and if you want to keep something sacred, you probably shouldn't post it."
But we did it anyway. We posted it. We liked it. We shared it. We hashtagged it.
And when we ran out of things to document about ourselves, we turned towards our children.
Now of age, those young digital natives whose lives we cataloged without their consent are rebelling. They're discarding the values of the previous generation (those of their parents, the authoritarians) and defining new ones.
They don't want open social networks; they want intimacy. They don't believe every action has to be meaningful and permanent. They imagine the web as deletable.
The Rise Of The Ephemeralnet
The incredible growth of Snapchat, the "ephemeral" messaging service where pictures and videos are taken, shared, then discarded, allowed to become memories, is often pointed to as the key trend defining this new era. But that's just wrong. It's only one example.
Among its active users, Snapchat is engaging and addictive, and representative of an increased desire for privacy. However, it's not the only service out there defining a different kind of experience. The global messaging market as a whole has given way to a fragmented collection of dozens of similar services, each with millions of users of their own. While these may not have the parlor trick of "disappearing" messages, they also represent a rebellion against the "one network to rule them all" concept.
These messaging apps are often used with a close set of friends or family members, where data shared remains fairly isolated and private, as opposed to publicized and findable on the larger web. It's not about anonymity. It's about a different type of community. One not cluttered by bosses or parents. One less searchable.
Even Twitter is returning to its SMS roots among these younger users, who revel in its semi-private nature. Twitter users can adopt pseudonyms, and you can't surface tweets older than a week through Twitter search, which makes it feel less like a permanent record and more like a place with the freedom to be "real" without consequence.
Meanwhile, on the youth-dominated social service Tumblr, users also don't have to sign up with their "real" identities. This allows them to explore and experiment with new identities and sub-cultures, the way young adults naturally do in the offline world.
And on a growing mobile social network for sharing secrets, Whisper, which this month saw over 2.1 billion pageviews, users can express their innermost feelings, even those they're ashamed or scared of, and become connected with a community for support, or, in the case of darker impulses, with actual help. And all this before they identify themselves by name.
While some confuse the "Ephemeralnet" with the so-called "SnapchatNet," in reality, it's not only a new way to socialize online, it's a new way to think about everything. You can see the trend also in the rise of the (somewhat) anonymous and untraceable digital currency Bitcoin. Unlike traditional transactions, Bitcoin is decentralized and doesn't require banks or governmental oversight or involvement. And though it's not entirely anonymous, there are already efforts, like Zerocoin, working to change that.
There are also efforts at making other forms of communication more ephemeral. Phone calls become more private through apps like Burner, SMS becomes secure through apps like Gryphn or Seecrypt, and internal business communications become unarchivable through apps like O.T.R.
As we head into the post-PRISM era, there's even a chance that this trend towards privacy will become further entrenched. Take, for instance, the little-known anonymous search engine DuckDuckGo, which saw traffic spike by 50 percent in just over a week after the PRISM reveal. If it can now find traction for its online service and accompanying mobile apps based not just on PRISM fears, but on connecting with this larger trend of impermanence, then it could even have a shot at siphoning away a big enough handful of users to sustain its business.
But at the end of the day, the Ephemeralnet may never get to become as defining a trend as Web 2.0 once was. Though it may find adoption beyond the demographics of its youngest participants, it will continue to share the web with the services that preceded it: services too big, too habitual, and too lucrative to die off entirely.
But in the meantime, a new social norm could still be established. One where those who play for the cameras are outed and ostracized; where we value human connections enabled by technology over meaningless "social" scores; and where we care more about our relationships, and less about the number of likes and shares we have.
"@aweissman you only use hashtags if you want to be found. I think there's a certain stigma around people who try too hard on the Internet." (ninakix (@ninakix), June 17, 2013)
Image credits: giphy; lead unknown via Weheartit