Now that Web 2.0 has become a redundant term (after all, don't we expect the web to be interactive, feeding, streaming, and end-user tweakable these days?), when will the terminology move to the next step up the version hierarchy?
I have to apologize in advance, because while this lovehate will incorporate many open-ended questions that I invite people to answer for me, the majority of the discussion will not revolve around the real object of my derision: version-branding everything... but I digress 3.5.
When the Web 2.0 moniker became de rigueur around aught four, more people clamped onto the fashion of the version upgradability of the name than actually knew what it was all about.
Remember the first time you could easily manage your RSS feeds, or the first time you got to move widgets around a page on the fly to create a custom portal experience? Remember the first time you Dugg something by clicking a simple button that updated on the fly, or gave a thumbs-up or down to a comment? Remember seeing your first REALLY cool freaky-styley Flash interface and then getting annoyed by it, like so many animated GIFs and beveled buttons of the past? Remember finding websites by designers who learned how to use Flash for substance instead of style? Remember when social networks opened our eyes from a history of forums and newsgroups or even listservs?
Most of us remember all of this as the nascent signs of what would become the explosion of Web 2.0, and yet, to me anyway (and, I'm guessing, to many others), that seems so long ago. Surely we've hit the version change or the upgrade somewhere along the way. We must be at Web 2.4.6 or Web 2.0 SP-1 by now. If we're going to buy into the branding of the Web as a version number, shouldn't we be willing to run with the entire procedure?
And here's a BTW 4.6: whatever happened to the Internet? Now, I could be wrong, but didn't the web use to be part of the Internet? Wasn't "Internet" the global catchphrase dropped by politicians who wanted to seem too cool for their own good? Has Web 2.0 usurped not only "Web" but "Internet" as well? Has the term "Internet" become nothing more than a series of tubes? Have the words become interchangeable due to the Web's popularity?
And just by way of a WTF? 5.1: if Web 3.0 is supposed to be the advancement of server models that handle not just storage and retrieval but execution, as found in already existing web productivity applications like those promoted by Apple, Microsoft and, most prominently, Google, haven't we breached the outer vestiges of Web 3.0 already? Haven't we started to float through the Cloud? Surely we're starting to reach some of the potential, if not the benchmarks, that would constitute a version shift. Yet no one is ready to say we're officially at the Web 3.0 stage. How about 2.5, 2.3, 2.1... hell, I'd even take 2.0.1 alpha at this point, because it seems no version advancement moves as slowly as the one that evolves before our eyes.
And so if constant evolution is actually preventing a clean division between 2.0 and 3.0, who will ultimately be the voice responsible for leading us from the arbitrary muck and mire to the magic number? Will it be something as simple as a prompt from the social networker with the most followers? A well-placed tweet that gets re-tweeted ad infinitum until, by no fault, wish, cause, or ability of our own, we live in a Web 3.0 world, and the first bloggers and eager tech column writers start heralding the advancements that are bound to be present in Web 4.0?
Maybe by then it will be called iWeb or MSWeb or GoogleWeb or The People's Web Republic of the United Provinces of China or Skynet.
I never thought I'd miss the days when some wide-eyed day-shift reporter, who never thought they were going to do anything but read headline copy, would go painstakingly over a diagram or schematic 35 times while so-called experts, usually just whoever could be reached by phone first, were called to comment on the same picture. I never thought I would miss that until the Hudson River Splashdown.
What followed on CNN was some of the most painful reporting I have ever seen since the network kept vigil outside of Brett Favre's plane on the tarmac for 45 minutes when he came to NYC before signing last season.
I don't know, or want to know, the reporter's name as I've tried to burn all record of it from my skull, but CNN talked to a man who saw the plane touch down in the river from his 25th floor office and then disappear from sight behind other buildings. Ben Vonklemperer's moment in the sun was peppered with questions like: "Did the plane seem in distress?", "Were there any obvious signs something was wrong?", and "How do you think the pilot handled the situation?" This is a guy in AN OFFICE! One minute he's pushing papers around a desk and the next he's being called into service as a field correspondent in avionics.
Don't get me wrong, I'm as excited as anyone about the prospect of 10,000 points of information streaming in from an array of sources when a crisis arises. The web is equipped to deal with such information; the television networks are not. Two friends and I sat in front of the CNN web feed while this poor guy was being asked to wax intellectual about things he had no idea about, and one could tell from the tone of his voice that he was as dumbfounded at the questions as the viewers were.
Citizen Journalism is an oxymoron.
A citizen can witness, absorb, and even find ways to interact with a story. Their participation in a story is, in many ways, part of the story itself. A witness is flooded by perceptions from one viewpoint at one time. They are qualified to relay just that, one viewpoint from one time.
A reporter's job is to parse the viewpoints and opinions and statements and subjectivity and objectivity to craft what comes to be, at least with everything available at the time, a definitive statement about the events until the next definitive statement comes along. The web scares the hell out of real reporters and journalists. Let's face it, the concept of being "scooped" has always been the death knell of a story. If someone reports before you do, your story is derivative. Television news, with its current technology, will ALWAYS be scooped by the web.
But reporters shouldn't be afraid of this. They should, instead, still take the time to craft the story instead of providing us the equivalent of a Twitter hashfeed over the air. This immediacy in journalism, while intoxicating to some viewers and strangely excruciating to me, has turned journalism into rehash and reporting into commercial fishing: let's cast a big net and see if we can come up with anything.
Twitter users are not journalists. They only qualify as reporters at the semantic level and, most often, the only real information they're reporting is how many other people are tweeting the same things. Someone who snaps a picture on their iPhone may or may not be a good photographer, but they are certainly not a journalist.
Television journalism is dying because networks are trying to keep up with a medium that moves faster than cable news feeds or satellite hookups. The reason these so-called "citizen journalists" are getting any credibility at all is not because citizen journalists are getting better, but simply that traditional journalism is getting worse.
On a sliding scale between Ron Burgundy and Walter Cronkite, the credibility and attention to crafting a clear, concise story places almost all citizen journalists well below Burgundy. Traditional journalists, however, are showing themselves quite adept at closing that gap... maybe that's how they roll.
A neat idea and smooth interface considering the amount of content on the front page. It was a neat experience to search the "social" category and click on the tab structure to check out all the results without leaving the page: joongel.com
Of trade shows, conventions, gadgets, products, memes and... oh, I don't know, let's say pandas.
Being thoroughly discouraged by what cuts it as an internet meme these days, I've decided to do a little deconstruction to determine what makes a meme into the little slice of temporary pop culture phenomenon that it is.
First, let's not deceive ourselves into thinking that ascertaining a meme's popularity is totally predictable. I maintain that a mainstream meme is the result of sheer luck and circumstance: a well-placed tweet or Digg by a popular blogger, or a surreptitious mention on a popular podcast. So if one's heart is set on creating the next big meme, where does one begin?
Ingredient One: I Can Mistake Inglish?
Back as far as "All Your Base Are Belong To Us" people have flocked to mildly humorous examples of the English language being misrepresented or completely mismanaged to create a lasting effect that ranges from the silly to the absurd. Of course several years after the "Base" meme ran its course, "I can has cheezburger" kept up the trend, but included what will become our second step. The "Base" meme, due to its early nature, took longer to evolve and, because of it, stuck around longer. Several music and video remixes were made that required a certain level of expertise and allowed for the endurance of "Base".
Ingredient Two: Animalz R Phunny
Whether it's a cat, owl, or prairie dog, the surest sign of an odds-on meme is that it will include an animal of some sort. With popularity going back to the early days of cats making unsuccessful jumps from sofas to tables, people love to see animals in two different scenarios: 1) being cute, 2) wiping out. The animal memes rely heavily on the minor abilities of people to use image editing to add text to photos. The partial, yet relatively minor, skills involved in pushing this type of meme forward will spread it far more quickly, but ultimately cause it to flame out faster.
Ingredient Three: Unmotivationals
The minor Photoshopping skills that people require for the text/animal mashups can also be used to create faux motivational posters. While this has become a meme in itself that should have run its course by now, the endless supply of adaptable content has kept this satirical, parody-inspired practice in vogue. Also, the sheer ridiculousness of the ever-growing catalogue of original Motivators will continue to inspire this knockoff meme.
Ingredient Four: People Say/Do the Stupidest Things
"Stupid" people (read: wrong time, wrong place, wrong words for many of them) initiate this style of meme that propagates through video. Let's face it, it only takes the flailing of Star Wars Kid or a beauty pageant candidate exposing her sheer idiocy to capture the imagination of a mashup web generation. Remember "I like turtles!", "I'm not taking my glasses off", or "Leave Britney Alone!"? If you don't, you must have been away from the web or ignoring the Fw:fw:fw: in your webmail boxes during the perfect time period. The stupidity inspires mashups, knockoffs, and responses that can keep these memes alive for a few weeks. The ease of spreading the word about these clips has made them some of the most popular memes of all. After all, what does it really take to email a YouTube link to a friend, post it on Twitter or Facebook, or blog about it? But even if video dries up, you can always just add text to a picture of a person caught in an embarrassing situation that reads "EPIC FAIL!"
Ingredient Five: The Unexpected
From the early efforts of people being redirected to gross out porn to the more recent efforts that have revived Rick Astley's career through Rickrolling, the ability of someone to perform misdirection in link text or similar disguise has become as much an email meme as it has a web meme. Microblogging is a ripe medium for such an effort as it has become so simple to type "You Have to See This Car Accident" and then have the url redirect to Astley or a dozen other crazy clips. Kind of the laziest practical joke going, the misdirected link to unexpected content will always be around in one form or another.
And so we come to the part of the post where I try to create the ultimate meme. While I will try to incorporate as many of the ingredients as possible, I may not hit all of them. Cats have been done to death so I'm mashing up a picture of a soft-shelled turtle splayed out on the sand with its head half peeking out with the all upper case captions "I NEEDZ VIAGRA" across the top and "CLIC HERE TO HELP" across the bottom. In blazing red upper and lower case mix, diagonal to the top right we have "EPiC SHeLL FaiL" and the entire picture, when clicked, links to the misdirection video clip from an 80s band. While I've missed out on the Motivational parody and the human aspect in the original content, I do believe the goofy humans in the video make up for it. So we have a 1-2-5 meme with a dash of post 4.
Please feel free to send the link to as many friends as you like or mashup your own soft-shelled turtle viagra jokes as you can muster... I feel cheap and dirty.
Did you ever notice that, when you've eaten enough of your Cheerios to have the remaining lingerers left bobbing on the ripple surface of the milk like so many little beige inner tubes, they tend to clump together? Their round shapes allow each unit to hug each other in a tenuous fashion until others come to shore up the group in flowery patterns around the central group leader. And with each bite comes decay, disruption, and even the occasional disassembly of one group that prompts a scattered, bobbing drift to a new group. Such are the life patterns of the Cheerios who were far too busy with other things to join the masses of their lemming-like siblings into the orifice of doom.
There used to be a time when the concept of an in-person social network involved a pub, a movie, a dance, a concert, or some other event where like-minded people would gather for the sake of a shared experience. You see, today there's really not that much need to go to a film when we've got screens that fill walls and surround sound that rumbles the seats. Yet we still go out in record numbers to big films, not because we're afraid we're going to miss them, but because of the shared experience. We need the cluster. Even by two we tend to roll off each other.
I used to find the activity of flipping through record or CD bins a couple of times a week very therapeutic. I would flip absent-mindedly, knowing there was little to no chance I would find anything to buy, but there used to be a culture to a record store that was unparalleled for someone in their teens and twenties. There was a certain level of comfort in being able to rhyme off the names of 1000 bands and song titles that most other people hadn't heard of. Sure, maybe we were music snobs, but snobs cherish a certain aloof status that can often verge on the xenophobic. We were not such animals. We could not live without the culture. I knew at least a dozen people by look alone who would rifle through over 60 covers a minute and just wait for the opportunity to share an ounce of precious knowledge with the assembled masses.
Woe be the neophyte that walked in and asked a clerk to identify a song by a broken, dyslexic boopboopbeep melody line that could have been a hundred songs. We craved the ineptitude of the clerk. We wanted to possess that grail of knowledge that could pluck the arcane track from the depths of oceans of discographies. We loved Pete Frame. We floated, avoiding spoons, in this bowl for years. We were comfortable. We were not alone.
And then, just as now, there were "shows". Comic book shows, record shows, trade shows, and collectors would gather from far and wide to barter on limited run indie comics or bootleg concert vinyl or video tape. Again, most of the stuff we saw there wasn't anything that we couldn't have had our local dealer order in, but the mass experience of dozens, if not hundreds, of people sharing a common interest, gathering to pursue acquisition dreams was just too good to pass up. Our clusters got larger. Soon we would fill the top of the bowl and leave nowhere to run should the utensils try to pick us off again. Because while we contained our quiet elitism in our home group, while the cluster ocean was exciting, our elitism was lost - we had become "normal" to this environment. This was not acceptable. We needed a sense of elitism yet again while not being robbed of the ocean's lure.
The face of the gatherings, or the "shows", has changed. Shows still exist at the local level, but the growing ability to communicate their existence has spread knowledge of the conventions to a wider audience. Conventions that used to draw only dealers now reached a select group of consumers. We had found our panacea. We could live out the fantasies of the sprawling ocean of knowledge where we could abandon our elitism and forsake the gravitas we held back in our home clusters. We were no longer afraid to look occasionally uninformed because WE HAD TRAVELED TO THE CONVENTION!
But, like so many snowbirds going south on I-75, traveling to the ocean meant there would always be a locale to return to where we could be the expert. Some people considered us crazy:
"You're paying how much money to go and see a bunch of comic books?"
"You're going to Las Vegas for four days and you're going to look at TVs and DVD players?"
"You're taking time off work so that you can watch a guy in a black turtleneck get on stage and do a commercial for an hour about a computer named after a fruit!?!"
But for every one of the unwashed masses that would bat an eye back home, we were the envy of those in the clusters and the stores and the shows. We finally found a place where we could indulge our obsessive knowledge and wander with admitted awe and reverence. We could share our joy with sometimes thousands of people who shared our predilection for medium or genre. We could share, relax, ingest, experience and enjoy. For when we returned home we would certainly be deities amongst our cluster. We were sure all the other Cheerios would rise on edge out of the bowl and cry, "He has returned! He has returned! Please share your invaluable knowledge with us!"
We were sure of all this until we remembered every one of our friends had watched a streaming video of the entire convention and subsequently read every blog, blogged themselves, tweeted and retweeted a thousand tidbits of information. You discovered that you wouldn't be revered, that your knowledge of the events you attended live was maybe even less than your friends'. And your oncoming disappointment turned to surprise when your friends still gathered 'round, still in sufficient awe, still with excitement to ask, "What was it like?" Because no matter how much knowledge you have about something, no matter how many links you click, or followers you have, or blog postings you read or write, there's nothing that will replace the visceral experience of being among a thousand, ten thousand, or a hundred thousand people with whom you share something.
It's why, forsaking the store and local cluster, we flock to the web, because short of being at a convention, or a concert, or a movie every day, we can at least participate in the illusion of the full bowl of Cheerios all standing as one in defiance of the spoon - and when the visceral is unavailable or unattainable, maybe the illusion is the next best thing.
An examination of the evolution of my shopping habits and how new media is not that new after all.
Click HERE to listen or subscribe to DyscultureD on iTunes!
Full Dysclosure
MacWorld Expo: The Show Before The Show
Phishing for the Fail Whale on Twitter
Facebook Says No Boobs Allowed - Unless You Are One Of Their Policy-Makers
Why Facebook is losing its status as "Treehouse 2.0" - Parents Welcome!
Spotify: The Torrent Alternative
Tech Segment
The 6 Things That Kinda Shoulda Probably Won’t Happen in 2009
Wheel of Pop
Movies 1990
Websites Of The Week
Mike: www.makezine.com
Anth: www.ponoko.com
Musical Selection
Fembots
Web 2.0 has taken us to a place where we now have the tools to disseminate any information far and wide through networks of contacts, friends, followers, etc. Since the web has also become the great repository of content, one would think that the two naturally go hand in hand. And in many cases they do just that. We are about to hit a point, however, where a traditional media critic would shake their head and most people would wander blindly.
We have become far better educated in the workings of traditional media than the outlets would like us to be. For those who care to look, it's not difficult to see that television, radio and print media skew content to suit advertising. In fact, many of us, while we would like to wag our fingers in shame, often just give a wry smile of self-satisfaction that we have once again caught sight of the wizards behind the curtain. We assume there is an agenda behind everything. We're jaded. We may not know the content gatekeepers by name, but we know their motivations and, thus, infer their tactics.
And we do begrudge them in our own ways. Maybe it's because they've cancelled our favorite television show because the ratings never took off. Maybe it's because we can't listen to an afternoon drive-time personality without four-minute commercial breaks every ten minutes. Maybe it's that our local arts & entertainment weekly seems more concerned with filling up pages with ads for massage parlors and escort services than reporting on movies or music. But no matter how much we begrudge them, we reluctantly "get it" and grin and bear it from week to week.
This is what the traditional media critic understands, condemns and rails against when they think it can make a difference. We try to peddle whatever influence we may have with the gatekeepers to shape our vision of a medium more friendly to the consumer. After all, "shouldn't the public airwaves belong to the public?" we decry in our moral outrage.
Over the past decade for most (and two decades for some) the trickle of information from traditional media outlets to online ones has turned to torrents. Where traditional media often suffers from a lack of original content, the web has an abundance. Where traditional media has stifled creators by forcing them to fit a formula for acceptance, the web (on many levels, anyway) is free of formula and parameters. If you want to post or upload something, go right ahead. But how many people could get any of their YouTube content broadcast on traditional media outlets? The gatekeepers that we view so cynically would never let an untrusted, unproven element enter their content. Their filters are what has made their media so safe, constrained, and, in many ways, boring.
The New Media critic is one who is overwhelmed trying to keep up with, and report on, the vast array of content and technologies. They write articles on websites, blogs, microblogs, streams, aggregators, SMS, feeds and every different flavor of alpha and beta associated with them. What I think many of them are missing, or maybe they just can't imagine an audience interested in hearing about it, is the problem inherent in a system with so much unrestrained content: new media is still media. The rules will remain the same. In an ocean of content, the end consumer needs a boat to sail on; that is, gatekeepers have evolved anew from popular consumers who have been granted such authority.
The oft-regurgitated and imitated internet memes are not the result of someone in a suit wringing their hands together in Montgomery Burns-like Machiavellian glee. Instead, maybe it's through sites that offer "suggestions" or "favorites" and then ratings on the favorites. Maybe it's through thousands of re-tweets and site hits based on an innocuous post from someone with 50,000 microblog followers. And it's not just memes: there is a small group of web authorities that knowingly or unknowingly craft popularity within the medium. And while I certainly don't begrudge them their popularity or their influence, they are a big reason why LOLcats exploded. For every popular web authority that dropped a harmless "this is hilarious" and a link to "I Can Haz Cheezburger", 50,000 people went scrambling off in Prell shampoo-like fashion to tell two friends, and they tell two friends, and so on, and so on....
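The "tell two friends, and so on" dynamic above is, at its core, just compound doubling, and it's worth seeing how fast that snowballs. Here's a toy sketch (the follower count and branching numbers are illustrative assumptions, not real Twitter data):

```python
# Toy model of the "tell two friends" cascade: one influential post
# seeds an audience, and each round of sharers passes the meme to a
# fixed number of new people. All numbers are purely illustrative.
def cascade_size(seed_audience: int, branching: int, generations: int) -> int:
    """Total people reached after `generations` rounds of re-sharing."""
    total = seed_audience
    current = seed_audience
    for _ in range(generations):
        current *= branching  # each sharer recruits `branching` new people
        total += current
    return total

# 50,000 followers, each telling two friends, for five rounds:
print(cascade_size(50_000, 2, 5))  # 3,150,000 people in just five hops
```

Five hops from one "this is hilarious" post and the audience has grown sixty-threefold, which is roughly why a single well-followed authority can put a LOLcat in front of millions without any suit in a boardroom lifting a finger.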
I'm not saying that New Media doesn't need a gatekeeper system of some sort, but, plainly, we just can't sit idly by and allow the ocean of content to be reduced to a microscopic trickle. I'm all for the banal and the idiotic popping up its absurdist head into pop culture once in a while. But when banal becomes popular, and popularity breeds more banality, we're becoming no better than the television pilot writer who decides not to write a controversial script because it will never get picked up. Such a pattern reduces, not the entire web, but the popular and public face of it to its lowest common denominator - far from the wild, untamed frontier we might like to think it is.
I hope the pattern isn't inevitable. I hope all the rules of old media don't apply to new media, but our lifetimes of training to be passive consumers have left few of us with any need or want to treat the web any differently from television. While many still see the user-based ownership potential over the web, two things are happening that are very old school: 1) popularity is breeding power and 2) the influx of the general populace means even more people who want to be fed instead of hunt and gather. I fear an impending timeline that will make New Media an outdated concept. The providers of content may be different, but the rules to gain access to the audience will be the same.
If 2008 was the year of Twitter, you can bet 2009 will be the year of more Imitwitters. Not that they haven't been tried already, but as code goes widespread and open source, be ready for your social media map to start looking like this.