Some considerations on what we may be sacrificing to Wolfram Alpha's "computational" engine, and a long lingering question about why the colon separator has become so popular in non-fiction.
In the past few years of webolution we've dealt with the advance of technologies and platforms greeted with great fanfare. What tends to get lost, or at least glossed over as time passes, are some of the ethical questions those technologies raise.
Remember when everyone thought Google was the greatest thing ever? Wait... I guess most people still do. There once was a time when the links that popped up for your searches were suspect. Why those links and not others? Was there some grand design we were unaware of? Was Google harnessing the power of search results, feeding us what they wanted to feed us instead of an above-board, transparent list of links where my little homepage might ever have a chance of hitting the top? I remember thinking about this once... for a few seconds anyway, and then I got to my searching. I had ceded my link aggregation to Google.
Wikipedia has crowdsourced knowledge to the point where noted journalists are buying into its entries as though they're gospel. We search. We find entries that we can pretty much guarantee are suspect in one regard or another, yet we cite, source, report, and pass off the amalgamated ponderings of others as the 21st-century Funk & Wagnalls. Don't get me wrong; I can appreciate the fact that we've moved from an authority system based on the printing press to one based on monitor text. I'm not naive enough to think that print encyclopedias were without bias. I do know, however, that the filtering system to go from research, to edit, to publish at a publishing house is at least tangibly more complex than clicking "edit". We have ceded our knowledge to Wikipedia.
And then there were this past year's fears of Digg manipulating stories to jack up the ratings of "superusers" or otherwise massaging its front-page results. The community cried "Foul!" and most of us went back there anyway.
Let's digress for a minute though...
Google, Wikipedia, Digg - none of them are bound by ANY public recourse or obligation. These are private companies that may be community-driven in some respects, but they are beholden to no one but themselves, or their shareholders. Even so, we have ceded authority to the aggregators... and I'm sadly willing to accept it, because their functionality makes my life easier and I'm far too lazy to pursue the alternative.
So now we are presented with Wolfram Alpha, which purports to be a "computational knowledge engine" - which is way cool and has potential written all over it. But its existence (and future) raises questions concerning our divested authority. While Google and Digg ask us to accept rankings and Wikipedia asks us to accept knowledge, Wolfram Alpha is asking us to accept solutions. While this may seem a fine line (and one that I'm sure I'll be accepting sometime soon), the line does lead down the path to bigger ethical questions than link aggregation.
Is it okay to cede problem solving to the web? Don't get me wrong here. I realize that WA is not apt to solve the world's problems even with the best-placed query. My fear is that the ouroboros of crowdsourcing will feed back on itself at an ever-increasing rate. When does a Wikipedia entry that's received a million hits, because of its listing in Google, become so accepted that it is "fact"? When does "fact" get integrated into research which, itself, gets re-cited back into Wikipedia and other sources? When does Wolfram Alpha generate solutions based on a "fact" that, in itself, gets republished to create new "research"?
And I guess we can go back to the paper v. digital question, where I'm sure someone will correctly assert that this feared pattern has all happened before in paper, ink and press. I'll concede that. My issue is the filtering. Now it can happen in an hour or a day. Research used to be time-intensive and subject to the self-questioning that the research and publishing process would allow. The speed of the web CAN deny such reflection. Where it has always been incumbent on consumers of media to question content providers, the obligation becomes even greater when server-side computation verges on the nascent stages of AI. Alright, I know we're not talking Skynet here, but there's a big difference between "here's where you can go to possibly find the answer" and "here's the answer".
Who's afraid of the Big Bad Wolfram?
Web 2.0 has taken us to a place where we now have the tools to disseminate any information far and wide through networks of contacts, friends, followers, etc. Since the web has also become the great repository of content, one would think that the two naturally go hand in hand. And in many cases they do just that. We are about to hit a point, however, where a traditional media critic would shake their head and most people would wander blindly.
We have become far better educated on the workings of traditional media than the outlets would like us to be. For those who care to look, it's not difficult to see that television, radio and print media skew content to suit advertising. In fact, many of us, while we would like to wag our fingers in shame, often just give a wry smile of self-satisfaction that we have once again caught sight of the wizards behind the curtain. We assume there is an agenda behind everything. We're jaded. We may not know the content gatekeepers by name, but we know their motivations and, thus, infer their tactics.
And we do begrudge them in our own ways. Maybe it's because they've cancelled our favorite television show because the ratings never took off. Maybe it's because I can't listen to an afternoon drive-time personality without four-minute commercial breaks every ten minutes. Maybe it's that my local arts & entertainment weekly seems more concerned with filling up pages with ads for massage parlors and escort services than with reporting on movies or music. But no matter how much we begrudge them, we reluctantly "get it" and grin and bear it from week to week.
This is what the traditional media critic understands, condemns and rails against when they think it can make a difference. We try to wield whatever influence we may have with the gatekeepers to shape our vision of a medium more friendly to the consumer. After all, "shouldn't the public airwaves belong to the public?" we cry in our moral outrage.
Over the past decade for most (and two decades for some), the trickle of information from traditional media outlets to online ones has turned into a torrent. Where traditional media often suffers from a lack of original content, the web has an abundance. Where traditional media has stifled creators by forcing them to fit a formula for acceptance, the web (on many levels anyway) is free of formula and parameters. If you want to post or upload something, go right ahead. But how many people could get any of their YouTube content broadcast on traditional media outlets? The gatekeepers that we view so cynically would never let an untrusted, unproven element enter their content. Their filters are what have made their media so safe, constrained, and, in many ways, boring.
The New Media critic is one who is more overwhelmed trying to keep up with, and report on, the vast array of content and technologies. They write articles on websites, blogs, microblogs, streams, aggregators, SMS, feeds and every different flavor, alpha, and beta associated with them. What I think many of them are missing, or maybe just can't imagine an audience interested in hearing about, is the problem inherent in a system with so much unrestrained content: new media is still media. The rules will remain the same. In an ocean of content, the end consumer needs a boat to sail on - i.e. gatekeepers, who have evolved out of other popular consumers granted such authority.
The oft-regurgitated and imitated internet memes are not the result of someone in a suit wringing their hands together in Montgomery Burns-like Machiavellian glee. Instead, maybe it's through sites that offer "suggestions" or "favorites" and then ratings on the favorites. Maybe it's through thousands of re-Tweets and site hits based on an innocuous post from someone with 50,000 microblog followers. And it's not just memes; there is a small group of web authorities that, knowingly or unknowingly, craft popularity within the medium. And while I certainly don't begrudge them their popularity or their influence, they are a big reason why LOLcats exploded. For every popular web authority that dropped a harmless "this is hilarious" and a link to "I Can Has Cheezburger", 50,000 people went scrambling off in Prell shampoo-like fashion to tell two friends, and they tell two friends, and so on, and so on....
I'm not saying that New Media doesn't need a gatekeeper system of some sort, but, plainly, we just can't sit idly by and let the ocean of content be reduced to a microscopic trickle. I'm all for the banal and the idiotic popping up its absurdist head into pop culture once in a while. But when the banal becomes popular, and popularity breeds more banality, we become no better than the television pilot writer who decides not to write a controversial script because it will never get picked up. Such a pattern reduces not the entire web, but its popular and public face, to its lowest common denominator - far from the wild, untamed frontier we might like to think it is.
I hope the pattern isn't inevitable. I hope all the rules of old media don't apply to new media, but our lifetimes of training to be passive consumers have left few of us with any need or desire to treat the web any differently from television. While many still see the web's potential for user-based ownership, two things are happening that are very old school: 1) popularity is breeding power and 2) the influx of the general populace means even more people who want to be fed instead of hunting and gathering. I fear an impending timeline that will make New Media an outdated concept. The providers of content may be different, but the rules to gain access to the audience will be the same.