Are You Data Literate?

Imagine a world where books were ubiquitous, but nobody knew how to read. Books as fine objets d’art. Connoisseurs prizing intricate bindings. Hand-tooled leather covers, rich papers with hand-torn edges, crisp fonts in dark inks marching across every page. But no comprehension, just a marketplace of… objects. Manufacturers pumping out these beauties so even the lowliest Walmartian could afford a whole room dedicated to blind beauty.

The weaving together and then unraveling of A’s and B’s and C’s into meaning and insight, no matter how mundane, is a critical skill in our modern world. Literacy, the ability to use text to encode, transmit, and decode meaning, context, and texture, enriches our lives. By and large we recognize as much: we teach reading at a very young age, fill in with adult literacy programs, and get into righteous debates about the best way for kids to learn to read.

Reading Data
Is data literacy any less important? As Moore’s Law doubles and redoubles our Digital capabilities, isn’t the data-represented world becoming as ubiquitous as the print-represented world? And yet the evidence is all around us, in big ways and small, that our collective ability to read data, to understand and navigate in data-represented realities, can’t even match the muddled appreciation of our alleged Walmartian mentioned above.

Have you noticed that every single car insurance ad from every single company seems to mention that, on average, drivers that switch to them save a gazillion dollars? How is that possible? It would seem to imply a spiral that eventually has the insurance companies paying us to cover our cars (hmmm… could we make that work for health care?). That trope, of course, depends on a basic misunderstanding of what constitutes good data. Of course drivers that switch save money. Why else would they switch? It’s a population that self-selects, and it says absolutely nothing about the relative costs of insurance between companies in general. Duh.
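If you want to see the trick laid bare, a toy simulation makes the point. This is a sketch with made-up numbers, not anybody’s actuarial tables: both hypothetical insurers quote from the same distribution, so neither is really cheaper.

```python
import random

random.seed(42)

# Every driver has a current premium and a quote from the "other" company,
# drawn from the SAME distribution: no insurer is actually cheaper on average.
drivers = [{"current": random.gauss(1000, 200),
            "quote": random.gauss(1000, 200)} for _ in range(100_000)]

# Self-selection: only drivers offered a cheaper quote bother to switch.
switchers = [d for d in drivers if d["quote"] < d["current"]]

avg_savings = sum(d["current"] - d["quote"] for d in switchers) / len(switchers)
print(f"Drivers who switched saved ${avg_savings:.0f} on average!")
# A perfectly truthful, perfectly meaningless ad: across ALL drivers,
# the expected savings from switching is zero.
```

Every company can run that ad, truthfully, at the same time. That’s the tax on the data illiterate.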

That’s perhaps no big deal, simply a tax on the data illiterate. But if we turn to, say, politics, then things can get serious, such as the recent election and the ideology-over-reality driven agenda of our fledgling 2011 House of Representatives. Take, for example, the health care repeal they recently passed. Whichever side you’re on about universal healthcare, healthcare reform, death panels, etc., the charge up that repeal hill should be a tad puzzling. One of the standards held high in that action was fiscal responsibility. And yet, neutral observers agree that actually succeeding in repeal would cost an incremental $230 billion over the next 10 years or so. Pretty soon, we’re talking real money.

Or let’s turn to Ms. Palin’s disingenuous cry of “blood libel” after the events in Tucson. Absolutely, Jared Loughner is a loon for whom she cannot be held directly responsible. Liberal attempts to make that connection as if in a court of law are, to be generous, reaching. But to shrilly insist that the representations we make (crosshairs on districts) and the Digital cannons that we fire them with (her own website) have no impact on the collective consciousness… Well, I guess I’d trust my Walmartian’s digital data savvy more than hers (or the liberal headhunters’ for that matter).

Data is Truth, Truth Data – That is All?
We seem to have this childish belief that once we have the data, truth will follow, that the ambiguity and confusion will fall away and no further intelligence or judgment is required. When we amplify that belief through Digital means, bizarre stuff begins to happen.

If you’ve followed this rant for any period of time, you know I get jumpy about the editing that occurs when we necessarily trim messy analog reality to fit in the little data boxes that drive Digital capabilities. I still worry about that, but as we get further into this digital revolution, I’m beginning to pay more nervous attention to how those little data boxes get unpacked by us consumers of the Digital (aka everyone).

Like a band of hyper-terriers following a fleet of rats, we and our doctors leap and dart after every new study of pharmaceutical approaches to health. Are you taking statin drugs to reduce cholesterol? I am. It’s a clinically proven fact that statins reduce my chance of having a heart attack. Or maybe not.

It would seem that the ultimate data jockeys, scientists, are beginning to question the, ah, ultimate stability of any data set. In a New Yorker article titled “The Truth Wears Off,” Jonah Lehrer recounts growing concern in the scientific community about something labeled the “Decline Effect.” The shorthand on this is (here I go trimming): many of our treasured, clinically replicated, data-driven findings seem to become less data-driven over time. That is, the data (and the interpreted results) of early studies become less and less replicable over time. True for popular anti-depressant drugs, studies in memory and perception, and a wide range of other scientifically derived “reality.” Over time, the data-driven medical “truths” on which your doc is basing his prescriptions for you may not be quite as true as we first thought.

If the data is getting squishy on the scientists, God help us mere mortals in our data wrestling endeavors.

Towards a More Literate Reality – Represented or Otherwise
So what to do? Run squealing into some imagined Analog Only sanctum? Well, probably not. The Digital drawer of Pandora’s box has been opened and we need to find a way to cope. Digital lives and breathes on data-derived reality. All the digital wonders rely on our ability to package realities into bits and bytes, transport those tidy little packages, and then unpack them at some remove.

Certainly we want to hold our packagers of data, the computer scientists, the engineers, the programmers, to a high standard for their initial transformation of Analog reality into Digital representation. But the burden of fidelity is not just on those that do the packaging. It also resides with all of us, whenever we open those tidy little packages and let them bloom back into the Analog light of day.

We cannot be confused, as were early consumers of moving pictures, between the picture and the represented reality. Likewise, we cannot ignore the preconceptions, the values, and the biases that the packagers bring to their tasks. Over the years, the decades, the millennia, we have become quite adept at appreciating not only the object of art, but also the impact of the artist’s beliefs, skill, and context-of-the-moment on the work in front of us.

Whether Caveman, Dutch Master, or skuzzy corner performance artist, there is a part of us that is made human by our representations of reality and our appreciation of them. Digital doesn’t change that, but it does amplify and accelerate the impacts of our representations and the underlying beliefs and savvy with which we create them. With great capability comes great accountability. Be literate out there.

Pigs and Professionals in a Digital Age

No, I’m not talking the crumb-infested, corner-cubicle-coveting, pontificating bane on everyone’s existence from the IT department. I’m talking real pigs.

Let me explain. When I first moved to the Mid-West, I was a bit mystified to hear one of my colleagues extolling the local pigs. Seems she had just been to the state fair and reported back, “Them pigs! They were real professionals! They knew just where the Oreos were and right how to get there!” On further investigation, turns out that pig racing is a big thing at the fair. It involves a little track and Oreos at the end to entice the pigs to hustle around the corners, which they apparently do with some skill and enthusiasm.

An Extremely Brief History of Professionalism
Enquiry into the nature and practice of professionalism has longer legs than your average Mid-Western racing pig, reaching at least back to the Middle Ages and its system of professional guilds and apprenticeships. Even in our own, more proximate, middle ages, the 20th century, some folks were at great pains to draw distinctions between “the professions” such as medicine, clergy and law and the occupations of the rest of us wage slaves. Trying to untangle the nuances of those arguments is an exercise for a more bored and idle time.

However, as we cede more and more decision making to the Digital boxes, it’s probably worth taking a moment to consider what such a shift means for us as individuals and professionals.

Professionals and Digital Surrogates
Don’t get me wrong. I’m more than happy to have a chip monitoring the temperature in my house and making the call of too hot, too cold, or just right in best Goldilocks fashion. I’m glad, on those long interstate sprints, to turn the moment-by-moment decision making of faster, slower, or maintain over to an engine control module. Frees me up so I can cast my focus elsewhere, such as being amazed at the guy one car over who’s eating with one hand and texting with the other. Guess that’s why God invented knees.

There are whole volumes of decisions, once fraught with ambiguity and danger, which have become so well understood that no real on-the-fly judgment is required to balance out risks and desired outcomes. Hurrah for science, technology, and industry, computer or otherwise!

Unfortunately, there is no commonly upheld, exhaustive list delineating which decisions fall in that bucket and which decisions still require someone with a body of training and experience to make a call. That’s where the conversation about being professional and our growing Digital reality collide.

Anatomy of a Digital Decision
From one vantage point, Digital is all about canned decision making. A bunch of “Is this a ‘0’ or a ‘1’?” decisions get munged together into ever-expanding chains. Those nano-decisions collude to create all kinds of miracles, from robotic surgery to the latest version of Angry Birds on my cell phone.
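That Goldilocks thermostat from a few paragraphs back is about as pure a canned decision as you’ll find. A minimal sketch, with thresholds I’ve invented for illustration:

```python
def thermostat(temp_f: float) -> str:
    """A fully canned decision: every possible input has a pre-decided answer."""
    if temp_f < 68.0:
        return "heat"   # too cold
    if temp_f > 74.0:
        return "cool"   # too hot
    return "hold"       # just right

# No judgment required at runtime; the chain of nano-decisions is exhaustive.
for reading in (62.5, 71.0, 78.3):
    print(reading, "->", thermostat(reading))
```

Chain a few billion of those together and you get the miracles. Leave a case out and you get the crashes.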

It is important to remember, though, that these miracles are completely dependent on the “canned” nature of both the nano-decisions and their answers. If you can’t predict the choices that need to be made or the appropriate answers, Digital need not apply. Digital isn’t the only thug out there beating the snot out of common sense, but it’s certainly at the party. Digital has a penchant for substituting data for direct experience and accelerating any decision process. Couple those two with suspect quality control and the result is not likely to be pretty.

If you listen to the Lean Six Sigma priesthood, you can’t make good decisions without good data, which is true for a certain kind of decision. However, it seems to me that most of the really interesting decisions call for more than just data and established process, something not quite so susceptible to the lies, damn lies, and statistics of modern business and political discourse (with apologies to Twain, but not Fox News).

The trick, of course, is to know when a decision is ready to be canned, when we know enough and the boxes are savvy enough to permit the prediction and its reliable encoding into this system or that. Get it right and some new miracle of efficiency and precision is born. Get it wrong and you get the decisional equivalent of botulism.

Decisional Wheat, Digital Chaff
Real professionals, true leaders in their chosen fields, don’t earn their street cred by parroting back the same decisions described in the textbooks or reference manuals of their education and apprenticeships. True professionals make their mark, build their legacy, when a gap opens up between the present reality and the available data, information, and solutions. The better the professional, the quicker they see that gap and the more creative they are in filling it, whether they’re doctors, managers, programmers, or plumbers.

Designer Reality

It would seem we’ve made our Digital world in our own image. “Well, duh!” you might say, wondering how it took me so long to reach that pretty obvious conclusion. And it is pretty obvious on the face of it. However, if we’ve made this Digital world in our own image, what can we learn about it and ourselves if we dig beneath the surface a bit?

Regular readers of the Underground know a steady theme is attention to what’s edited out by our Digital representations of Analog. That editing has always been cast as a conscious choice, made in a hubristic attempt to “improve” things. Turns out that editing may begin long before we summon up any conscious picture of what better looks like.

Making a Monkey Out of Perception
In their book, The Invisible Gorilla, Christopher Chabris and Daniel Simons recount a number of psychological experiments on attention, knowledge, and logic. Turns out we don’t know or notice as much as we think we do. The opening chapter recounts a simple experiment in which subjects are asked to watch a videotape and count the number of passes between basketball players. To make it a little more challenging, some players wear white uniforms, some wear black, and the subjects are only to count the passes between players in white. Pretty straightforward, and most subjects deliver an accurate count of the number of passes.

There is one other little detail that makes this experiment more interesting and relevant to our conversation. About halfway through the video, a person in a full gorilla suit wanders through the middle of the basketball exercise. They stop in the middle of the frame, wave their hands, dance around, etc. They’re not trying to sneak in and back out without being noticed. But almost half the subjects, when asked about what they saw in the video, will talk only about the number of passes. If asked outright about seeing a gorilla, they will say they saw no gorilla. Many in that group, if shown the same video again, will deny that the gorilla was there the first time through.

It only gets worse from there as they recount other experiments that call into question our abilities to accurately remember significant events, draw rational conclusions about cause and effect, and in general reliably estimate our own capabilities and potential. As remedy, Chabris and Simons have an almost childlike faith that experimentation can clearly reveal an accurate knowledge of the world as it is. Their book’s subtitle is “And Other Ways Our Intuitions Deceive Us.”

I’m not so sure. Or at least not sure that experimentation really solves the more complex social, environmental, or personal challenges, which are not reducible to simple experiments. Last I checked, we only get to live a given life one way. No winding the clock back and trying some other path.

The phenomenon of editing is not a revelation in the Digital world. It’s all about editing. But as we play God in creating that Digital reality, it’s probably worth asking if our Analog selves are more like Yahweh, the God of Ages, in full control and delivering our certain prognostications in a rumbling voice of thunder. Or are we more like Epimetheus, the Greek god known for “running forward while looking behind”?

It’s an Architected Life
If popular movies are any indication of our baser intuitions, it seems we’re casting our instinctual vote for Epimetheus. There’s a trend in movies over the last decade of Digital emergence to include the role of a flawed God, more commonly called “The Architect.” These “Architects” are not to be confused with the humble practitioner down the street designing McMansions for the boobocracy, nor even the soaring spirits that have designed our finest public monuments. These “Architects” are designing whole worlds, but not worlds that will actually be built, touched, inhabited, but rather worlds only experienced either as base neural stimulations (The Matrix series) or in dreams (Inception).

These imagined worlds, no matter the skill of the architect, are always flawed, eventually falling into some kind of chaos. It is as if the designed reality lacks some stabilizing or self-correcting element to prevent eventual disaster; as if the designer, the architect, by definition cannot see a big enough frame to create a self-sustaining reality.

Of course, just because the movies say it don’t make it so, but this one rings true both in intuition born of experience and in experimentation with invisible gorillas.

Go find someone who has the title “Enterprise Architect.” About any self-respecting medium-to-large IT shop will have at least one these days. We’re a sorry lot (did my time there several years ago), charged with the impossible task of articulating a comprehensive vision for robust technology in the modern organization. Actually, that’s not so bad. Where the trouble comes is in actually implementing and maintaining that vision. Like the sci-fi architects writ large across the big screen, we seem doomed to failure, to watch time and the newest buzzwords relentlessly crumble our creations to chaos and dust.

Better than Perfect
But there is a kind of glory and honor to be had from this Sisyphean task, the restless plucking of dripping bits of order from the relentless chaos. We achieve that glory and honor not in realizing the perfection of our designs, but rather in understanding both our Digital and Analog worlds are improved by a certain humility, an acceptance that even our best designs are flawed in ways we cannot, at first, imagine. Honor comes not from our perfection, but rather from our living through, beyond and above our realized flaws.

i-Mutter

The Analog/Digital divide is warping and shimmering like the horizon on a sun-baked Texas highway in the middle of a blue-sky summer day. In particular, two new digital devices have come to our offices at One Analog Way that are rocking our world. One’s a digital single-lens reflex camera and the other is an i-mumble. Yeah, that’s right, an i-cough.

We’ll get to the camera in a minute, but I suppose I should get the i-hrrmph out of the way. Yeah, I scoffed at the Apple dudes and all their iPad hypoteering. Yes, I told them to put their collective heads between their collective knees and breathe into a paper bag for a while. No, I’m not entirely prepared to take that back. But after a couple of months with my own i-Mutter, while I still can’t say the name without a hitch in my get-along, I have to admit it’s a pretty cool device. It doesn’t rise to the asserted level of “magical and revolutionary,” but it sure has changed a few things in my house and my Analog and Digital habits.

iRead Therefore iAm
The most notable change is in my reading habits or at least my choice of media. I was one of those folks known to assert that you’d take away my print-on-paper books when you pried them from my cold, dead hands. I haven’t taken my library out in the street and built a bonfire to dance merrily around (too damn hot in the middle of this summer for that kind of shenanigans). But anybody tracking my media purchase history (shout out to my buds at Nielsen) would note a distinct downward trend in the print-on-paper purchases.

I’ve gone from an emphatic print-on-paper-only policy to a print-on-paper-only-for-special-situations policy in the space of about one e-book. I like my e-readers on the i-Mutter. I like that I can carry 20 or 30 books around with me all the time, dipping into the three or four that I’m actively reading. I like always having a lineup waiting when I finish any given book. I’m smitten with reading about a book in the New York Times Book Review and being able to instantly acquire it. I absolutely love having that moment of curiosity and, right then, being able to go find a relevant book or magazine even if it is the middle of the night.

And that’s just the e-book capabilities. High-def photo album that I can take anywhere. Oh my. (No, all those phone screens just don’t count. Size does matter.) Movies when I want them without the Netflix or cable box. Oh yes! Almost all of my information consumption needs, neatly packaged and delivered in a highly tote-able form. Suuuuwheeeet.

Fear not, dear reader. I have not been infected with the “This Changes EVERYTHING” disease. That hagiography to the i-Mutter is not done without a visceral awareness of something lost as well as gained. The i-Mutter has the heft of a small book. I actually feel like I’m reading a book. Curling up on the couch with this thing plopped on my chest and my glasses popped up on my old fart forehead feels authentically like a reading experience.

But, oh, that delicious scent that rises to my nose when first opening a print-on-paper book. That incense of compacted writer’s craft, of distilled memory of every other book I’ve ever read, of the wafted intimation of story yet to be told. Somehow the smell of metal and plastic just doesn’t carry the same weight, doesn’t prepare me as well mentally for engagement with the other worlds, the voices, the truths of good writing.

I Loan Therefore I Am
And then there’s the social act of lending a book. All the social networking pointers and referrals are not the same as handing over an object to a friend, knowing it will engage them intellectually, emotionally, spiritually. And some day having that object handed back, looking in their eyes and reading the changes there that have been wrought by this now shared encounter.

I have one book on the shelves of my library that is almost as well-traveled as I am. In an ironic twist that only reality can generate, it is Blue Highways, by William Least Heat Moon, a recounting of traveling the back roads of the U.S. I read it and fell instantly in love with the author’s written hand. Almost as quickly, I thought of a fellow wanderer, and the morning after I finished it, loaned the book to him.

I don’t loan books lightly. It is too intimate an exchange. I’m wounded in some fundamental way when loaned books are not returned. My wanderer friend knows this, so I was a bit surprised to learn that he had the same reaction I did to the book and, in his enthusiasm, loaned it to someone who was a complete stranger to me while on vacation with friends several states away!

Part of the reason I like this guy is the quality of people he knows, and while my book was put in a stranger’s hands, I need not have worried, because about a month later a package arrived with the book and a short note thanking me for finding that gem. With this particular book, that pattern repeated itself two more times: loaned to a friend, received back from a stranger. Needless to say, I’m not going to loan out my i-Mutter in that way. It might be the source of other kinds of bonding, but not in quite that serendipitous mode. There is something of our relationships to our Digital objects that is simultaneously both more possessive and more casual, as if their representational, second-order-of-reality nature both obsesses and bores us.

I See Therefore I Am

Which brings me to the camera. Nothing about this camera repels me. Out of the box, I walked into the backyard and took one of the best pictures I’ve ever taken in forty years of amateur snap-ology. But it does raise this interesting question about digital representations of representations, a dream within a dream. In 1/100 of a second, I began to learn things about bees (they’re furry) and coneflowers (there’s a hallucinogenic sunrise at the tip of every flower) that I never knew.

Not a huge big deal, but as I was drawn to this other world, actual, but not entirely visible to the naked eye, I began to remember how attached my dad was to photography, began to sense the curiosity and sensitivity to beauty that he had. While he was alive, I never appreciated that, being too impatient or embarrassed with yet another stop to capture yet another sunset or arrangement of autumn leaves. Stoopid me. Perhaps slowing down, looking more closely at things had a potential I hadn’t appreciated then.

Beyond my personal epiphany, though, is another, more generalizable nugget. Analog comes packaged with all its data and incompressible timeline (talk about object-oriented!) whether we’re prepared to process it or not. Digital mimics that, but any representation has fidelity issues. The challenge with Digital is not so much that lost fidelity as it is our insistence that the loss is insignificant, even an improvement over the real thing. We seem to believe the picture data paints to be more real, more factual, than the reality it only mimics. Rather than derive meaning directly from what is in front of us, we have a bias to believe the numbers instead of the obvious reality. This seems especially true when the scale, either up to huge volumes or down to microscopic truths, overwhelms us. Rather than slowing down, digging in, reflecting, we take the first read of the data that comes along and build entire world views from that single point of analysis.

Kind of like the teenage me sitting in the back seat of the car, fuming as my Dad stops to take another picture. No real attempt to understand more than my first read of the data of my Dad doing something different, standing out, wasting my precious… well what…? Can’t remember anymore where I was in such a hurry to get to, some other place or life I guess.

Ain’t nothing wrong with more and better data unless we let it encourage us to trivialize and intentionally misunderstand ourselves and the world around us. Yet, sometimes, the Digital picture, that representation of representations, the most Digital of moments gets turned to Analog ends, to my attachment, my love for my Father and to my understanding of him and myself in new ways. Not bad for 1/100th of a second, but then I guess most revelation happens about that quickly. Perhaps we humans are not so unprepared for the nano-world of Digital as it sometimes seems.

How Digital Are You?

I started chewing on this when a keynote speaker at a conference polled the audience for their reactions to a text message interrupting a conversation at a restaurant. His choices were a) irritation, b) indifference, or c) enthusiasm.

His analysis was that, generally, the gray hairs (or no hairs in my case) would be offended, viewing the device and its communication stream as somehow inferior to the immediacy of the interactions around the table. I believe the word “rude” was used in his description. On the flip side, he suggested that another, younger demographic would not skip a beat and might actually join the text stream on their own phones.

The audience response, given before his analysis, seemed to confirm at least the first half of his conjecture. Bunch of gray hairs and we pretty uniformly raised our hands for the first option. After talking us through the options again, he asked how many of us would have invited a friend who actually walked up to the table to sit down and join the party. The speaker suggested that there’s a part of the population out there that views that hypothetical text in exactly the same way as someone walking up to the table.

I’ve had the opportunity to try this out on several audiences since then and, damn, he’s right! Just a few weeks later I was talking with a group of about 60 college students, posed the same question, and got exactly the response he predicted. Most would have been at least indifferent to, and likely enthusiastic about, inviting the texter to the virtual table.

May I See Your Papers, Please?
Since then, I’ve been noticing a lot of chatter about Digital Natives versus Digital Immigrants. The natives are those folks young enough to have grown up in a digitally saturated environment. The immigrants are those of us who came to that environment later in life, who, no matter how fluent, don’t have Digital as our native tongue. We retain some of those quaint habits and odd perspectives from the old country.

So back to the original question. How Digital are you? Suddenly that seems like a much more personal question. Perhaps our, ah, misguided friends in Arizona can figure out how to check our papers for that. Or perhaps a Facebook quiz is in order. Or maybe there’s an app for that. The last two approaches would favor the natives and the first, ironically enough, would probably favor the immigrants as it smacks of old world sensibilities.

Is it even important to ask the question? Do we really need the officer of our conscience to pull us over to the virtual curb of our brain, lights flashing, and demand to see some verification that we have a right to be in this Digital age? Well, maybe that much useless drama isn’t necessary, but asking the question does open some interesting lines of discussion.

A Bit of Profiling
If language shapes how we view the world, how we present ourselves, what’s the impact of having Digital as your native tongue? As with any native/immigrant discussion, one has to tread carefully to avoid blurring reality with stereotypes. In addition, good will between the parties is a better starting point than enmity.

One of those observations that pops up a little too frequently to be entirely true is that natives are better multi-taskers than immigrants. If I hear one more parent shake their head with wonder at their teenager doing homework, listening to music, chatting in four different windows, all while the television’s on, I’m going to… well… start listening with one ear, while reading a book, driving through the wastelands of the western Florida interstate highway system, eating my way through a bucket of fried chicken. It only seems like dark magic from the outside. Trust me.

Another rapidly emerging, if not particularly robust, piece of conventional wisdom is that the natives aren’t very good in a face-to-face conversation. Apparently all that intermediated connection, the chats, the texts, even the old-school e-mail, has dulled their ability to react in real time verbally. Yeah, right. I hate to remind my fellow oldsters, but communication skills have always been a sore point between technologists and regular humans, and between adolescents and the next tick up the age ladder (though in my experience, the kids are getting better at bridging their gap). Does the intermediated thing aggravate the problem? Perhaps. Did it create the problem? No way.

I’ve also heard tell that us immigrants are less collaborative, more hierarchical than the natives, that our desire for position and power overwhelms our sense of connection and community approbation. Hmmmm… Wall Street? Every randy elected official? Deepwater Horizon? There would seem to be a few data points there. And yet. And yet. It was my mom and her brothers, all reaching into their 70s at the time, who formed up the family e-mail list. They brought me back together with some of my sweet cousins that I hadn’t spoken to for years. Now we’re following each other on Facebook, on blogs, through Twitter and the e-mail list.

The Melting Pot
Bottom line? As always with the Analog Underground, the call is not to roll back the clock to some better, old world times. The call is to pay attention to how the strange ways of this new land may change us, for better or worse. To examine what gets amplified, what gets attenuated. And to make those choices consciously with as much awareness of the consequences, intended and unintended, as is possible. Sorry, Arizona, but that kind of judgment probably isn’t very susceptible to documentation and legislation. Still requires an engaged brain and open heart.

Doin’ the HowWhat Dance!

Somewhere beneath the rolling pyroclastic hype of the ’90s web and the mobile ’00s, as most meaningful discussion of people and technology got polished off the face of the planet, something really did change. All that sound and fury, while not particularly articulate or well directed, was, at least, like an adolescent’s hormones, either the source or the signal of something seismic. Experienced Analog Undergrounders know to take that kind of blatant, yet ambiguous, marker as an entry point to something deeper.

As best I can tell, all that noise, motion, and stink was related to the emergence of the Digital from being a “how” into being a “what.” “Pyroclastic hype, indeed,” you might say. O.k., let me try again.

Digital DNA
It’s been practically imprinted in the DNA of every corporate IT drone that IT comes second to some mysterious inspiration about businessy things somewhere else in the “business.” That attitude spilled over into development of the Digital in general. Rarely a “what for,” almost always a “how to.” Someone else figures out where we’re going. We just figure out how to get there. Yeah, there’s always been that rogue element wandering back alleys with a solution in search of a problem to club into submission, but you wouldn’t want your kid marrying one of those hypoteers.

Then Glick’s First Law of Software kicked in. To refresh, my first law of software is that users will invent uses for applications that their creators never imagined. Back in the pre-big-bang miasma of early networking, there were all kinds of proprietary document sharing and discussion management networked applications. Then this guy in Switzerland (how appropriate) proposed a platform-neutral way to share and interlink research docs across the emerging internet. Bang. Really big bang. But not immediately. The World Wide Web was spread pretty thinly over the world’s population for the first several years of its existence, with most of the small constituency well plugged into its original intent. But then the software genie, as it always does with killer apps, got out of the bottle.

Users and Genetic Engineering
Al Gore to the contrary, nobody really invented the internet in all its random, evolving glory. I’m pretty sure there was never a small cabal of businessmen somewhere that woke up one morning and said we’re going to do something that will shift commercial power from the producers, where it’s been since the beginning of the industrial revolution, to the consumers, who previously pretty much had to take whatever was offered at whatever price was stated.

And there were aftershocks to that big webby bang. Mobility was almost as big, maybe bigger, but certainly of significance whatever the measure. There probably wasn’t, at the beginning anyway, a plan that said let’s take that handheld device that carries voice, replace the voice with text and music and apps, and embed it so deeply in the young adult experience as to make it a new social currency. Ya’ can’t make that stuff up! It just happens, emerging from the roiling chaos of creativity, striving, and greed that is an open market.

Somewhere along the line, all those Digital “how-to’s” added up to a “what-for.” You’d have to be in some kind of pharmaceutically induced denial to suggest that technology hasn’t driven business and society in all kinds of different ways over the last 10 or 15 years. Some of it’s been pretty amazing, some of it pretty amazingly foolish, but the most successful businesses have been as good at reacting to technology as they had been at driving it previously.

Reality Version 329722.1
A new “what.” So what? Well, when “what” wanders over the border from Analog reality, with all its well understood and socially supported constraints, to the wild west of Digital Virtuality, we need to start paying more attention to how that virtuality will influence our new reality.

I don’t want to stumble into the quagmire of whether there are inherent values embedded in this or that technology. It’s enough to know that technology in general, and Digital in particular, makes some outcomes more likely. How we decide to value those outcomes is another conversation, but the probabilities are, within reasonable margins of error, easily discernible facts, at least over time. Yeah, one of the glories of being human is the way we confound those probabilities, but… damn, just got the boots of reason and logic sucked off my feet.

So back to the, ah, factual probabilities… Let’s take the rich media, multi-tasked environment that digital natives swim in. All those images, video clips, downloaded individual songs, simultaneous chat sessions, and text messages swirl together into a rich stew of data steaming off all kinds of information, right?

Well, yeah, but no matter how tasty the confection, is that all you want to eat?

The More Things Change…
All that virtuality may, in practice, not be adequate preparation for some mundane hallway conversation with your boss, or some dining room discussion with your spouse. As some of our “whats” cross the Analog/Digital border, we have to remember that even now, not all the “hows” came along for the ride. Before the IT tribe starts our drumming, dancing celebration of having finally seized a few “what” flags, we might benefit from recalling our ingrained experience that “how” inevitably follows “what.” What’s more, some of the “hows” of meaningful change, contribution, and value, like relationship, communication, and meaning, haven’t exactly been our strong suits, nor have we always effectively built them into our digital progeny.

Make no mistake. Seizing the flag of “what” is not the same as creating sustainable value. Value emerges in the interactions between people and groups, organizations and societies. As all that Digital “What” lava begins to cool, as the land firms up and the first green sprouts appear, let’s not forget all we’ve learned in the Analog world about connections and constraints, value and meaning.

Be Bop Bazzzzzzit!

O.k. Analog Undergrounders. Prepare to have your mind blown! If you don’t know Pat Metheny, you should. He’s written the Analog Underground anthem. Hell, he’s written the whole hymnal. As with all things T.A.U., it’s not quite what I would have predicted (no words!), but it is unmistakably music for dancing along the Analog/Digital divide. Take a minute, seven actually, hop over to YouTube, and take in an interview with Pat on his latest project, Orchestrion.

I’m no musical savant, so when I first caught a blurb on this, I had to go look it up. Yeah, it’s a real word, meaning a machine that produces the effect of an orchestra. Seeing “machine” and “music” in the same sentence is a bit jarring and not something that I’d expect to lead to an Analog epiphany, but reality has its own agenda and we’re only rarely its master.

If you haven’t gone out to the interview, Pat’s latest project is a complete jazz band driven entirely off his guitar. “So what?” you say. “Isn’t that true of any ensemble with a star lead?” Well, yeah, but in this case the ensemble is just him. The rest is all automation. The hot bass line, the ride cymbals, the drums, the piano, every riff and run, every improvisation, it’s all ’bots. Cables, solenoids, and levers wrap around the instruments, poking and jerking and whirring over the bits that generate sound like some mad scientist’s nightmare of a band. It doesn’t seem possible that the result could be anything but alien, mechanical, sterile.

So much for being constrained by the possible, though. The result is another rich example of that mystical expression of the human condition we call music, that howl and whisper, the ecstatic shout and the caressing murmur that life elicits from each of us in one form or another.

How can this be? Music is one of those most human of preserves, like painting, or preparing the family holiday banquet, or poetry, or playing chess (whoops, never mind, we already lost that one to the boxes). No. No. No. NO, I DON’T WANT TO LIKE THIS. But I do. It’s even more than that. Not only do I like this, but I recognize myself in it, both in the sound and the approach. I can almost feel the Analog ground shifting under me.

I guess I shouldn’t be surprised, as I’ve always disowned the Luddite Analog-Only approach to navigating our Digital age. Perhaps familiarity does breed contempt, but it can also be the ground for admiration, or at least grudging respect, and such is the emerging partnership between Analog and Digital. Like a smarmy romantic comedy or Hallmark Channel tragedy, we just know the two opposites are going to attract, but not before some either hilarious or tragic conflict, and maybe a little of both.

The whole game for the machines and their computer brains is leverage. Extension of our physical and logical clout, baby! The Digital allows a near miraculous scaling of human capabilities. Digital allows us to throw space shuttles into orbit and drive a surgeon’s blade with microscopic precision. It allows us to react in nanoseconds and manage consistent action across decades and even, perhaps, millennia. We collect millions of bits of type into a massive library on the head of a pin. It is all amazing and wonderful. Why not extend the musician’s fingertips and lips beyond a single instrument?

The risk is that, without some guided intention, all the leverage in the world only falls on us like noise, like an avalanche, like a mysterious act of a malevolent God. It is that guided intention that draws my Analog Underground eye. The scaling up and down of human capabilities, all that leverage, needs some guidance to achieve its best effect. Scale, up or down, without sense, without perspective, is a woeful thing. Like eternal teenagers, when we figure out how to do something, we’re almost helpless to stop ourselves from actually trying it. Unfortunately, history is littered with examples of how inadequate a moral compass the phrase “Because we can” usually turns out to be. Our current economic crisis is only the most recent example.

All things digital are based on differentiation and step functions. Yeah, given our Digital capabilities we can often tune those step functions well below our physical ability to perceive them in sight and sound. But a funny thing happens on the journey from our physical senses to making sense, making meaning. For better or worse, it seems that most meaning that means anything emerges from the space outside of the Digitally replicated states, explicitly perceived or not. Something happens in between, beyond those static reference points, no matter how many we stack up and stream together. Low fidelity, as mentioned in other posts, is one thing. No fidelity is something else altogether.
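To make the step-function point concrete, here’s a toy sketch of what Digital does to a smooth Analog signal. The sample rate and bit depth are picked out of thin air for illustration:

```python
import math

SAMPLE_RATE = 8    # samples per second: how often we bother to look
LEVELS = 4         # a 2-bit converter: only four allowed values

def analog(t: float) -> float:
    """The 'real' signal: a smooth, continuous sine wave."""
    return math.sin(2 * math.pi * t)

def digitize(x: float) -> float:
    """Snap a value in [-1, 1] onto one of LEVELS discrete steps."""
    step = 2.0 / (LEVELS - 1)
    return round((x + 1) / step) * step - 1

for i in range(SAMPLE_RATE):
    t = i / SAMPLE_RATE
    print(f"t={t:.3f}  analog={analog(t):+.3f}  digital={digitize(analog(t)):+.3f}")
# Everything between the allowed steps, and everything between the samples,
# simply isn't there anymore. Crank up SAMPLE_RATE and LEVELS and the gaps
# shrink below perception, but they never disappear.
```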

The Digital world gives us ever increasing multipliers on our intentions. Balance would suggest we also become ever more proficient at processing and applying the continuous Analog feedback that daily life provides. Whether we’re paying attention or not, Analog is always putting out a steady stream of on-the-ground leading indicators of result. We may not always be able to summon those up into conscious, articulated thought. They may not be infinitely replicable or rapidly transportable between people, places, and situations, but that doesn’t mean they are not real or useful. A friend of mine has dubbed this “the optician view of morality.” You know, two experiences and then the question, “Better or worse?” The unique value of Analog is that it lets us play this vital game on an infinitely sliding scale with the middle relentlessly attached to the ends, paying attention not only to the data but also to the actual outcome.

Riding the Certainty Sine Wave

If you take a long enough view, the future is certain. Whether through a Christian/Mayan mélange of apocalypse, or the more mundane operations of thermodynamics, sooner or later there is no future. From an equally lofty viewpoint, the past is pretty much fully baked as well. The show got started as either the casual casting of God’s mind on the emptiness around 40,000 years ago or the flicking of some cosmic light switch several orders of magnitude earlier. Smack dab in the middle is the present moment, feeling equally inflexible in its restraints of the laws of physics, recent history, and our own various attachments.

Yet change happens. Somewhere between that eventual nothing and this specific now, the future gets complex. Like a towering growler of a North Shore wave avalanching towards the present, all the possibilities, all the mysteries pile up in a roiling, thrashing maelstrom of choice, aspiration and longing. Eventually the gravity of time pulls that skyscraper down into a single specific present. The ride isn’t over, though, as all that accumulated momentum and energy then whistles off into a mid-term past that endlessly kaleidoscopes through the various lenses of our belief and interpretation.

And then it instantly repeats. Again. And again. And… well, anyways, like walking, if you stop to think about it too much, you probably can’t do it very well.

‘Scoping the Future
Nonetheless, those of us swept up in the Analog/Digital tango might benefit from a closer examination of the repetitive nature of this rolling series of near-random events we call reality.

The physicists and mathematicians among us know that a sine wave is a smooth repetitive oscillation, which brings us some of our favorite consumables such as sound and light. Those of us that know the physicists and mathematicians among us also know how challenging it is to try and follow them through a description of those waves collapsing into particles as we turn on the kitchen light. Which gives us a perfect metaphor for a day in the life of an IT professional.
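For the recovering math majors in the crowd, that smooth oscillation has a tidy closed form, with amplitude, frequency, and phase as its only knobs:

$$y(t) = A \sin(2\pi f t + \varphi)$$

Three parameters and the wave’s entire past and future are spoken for. Hold that thought; it’s exactly the kind of certainty the rest of this piece is about to mess with.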

‘Scoping IT
Each and every morning we can count on something unexpected happening that will trouble the emotionless minds of our digital charges. We’ll have to rush in with some urgent patch, retreat into deep-dive analysis of the problem, or, if we’re lucky, march into the undefined territory of new product development. If we’re lucky and stoned (aka working at Apple on the iPad), we might even get involved in something “magical and revolutionary.” Hey, Apple dudes. Word of advice. Put your heads between your knees and breathe into a paper bag for a while.

At a more abstract level, any moment in the life of a sine wave inevitably contains its own opposite. If you’re riding the sine wave of certainty, you’re also riding the sine wave of uncertainty, as surely as a Calvinist destiny. In IT, we spend inordinate amounts of time hammering up cages made out of requirements and specifications, test cases and training manuals, to drive a herd of needs into a corral of solutions. Then some sales person goes out and actually meets with a potential customer and suddenly we’re talking steam engines and not horses anymore.

The boxes can’t cope with that degree of uncertainty very well. But we, their technology nursemaids, can, if we decide to. We can dry those tears of 404 Page Not Found errors and the temper tantrums of program ab-ends, given enough time. Being human in a digital age doesn’t mean becoming more like the machines, but rather getting back in touch with the things that make us human in the first place.

Like Any Good Parent
Raising our digital charges, helping them to become more productive members of our society as opposed to stumbling bureaucratic Frankensteins, requires us to employ our broader array of intelligences. The boxes need context, and the kinds of context they can’t provide themselves. In addition to the expected analytical and systems intelligence, we need to bring our emotional, creative, and even spiritual intelligence to bear.

Emotions in the workplace! In the digital workplace! Flame on, you might say! Well, yeah, if we’re as dull and limited as our binary offspring. The problem is not that we have too much emotion in the workplace, on the net, and at home. It’s that we have too little facility with emotion, too little familiarity with the constructive application of passion. That first stirring of fear or anger, that single tick on passion’s Geiger counter, are wonderful leading indicators of the need for focus, honed over years and generations of learning. The problems come when we let those emotions swirl into the all-too-common morass of useless, unharnessed energy and desire.

Same with our spiritual selves. I’m not talking religion here. Atheists are invited to come along. I’m talking about the part of ourselves that makes meaning, that helps us understand what is significant and what is not. Like sancti-mommies run wild, left to their own devices the boxes will create an impression of significance for any output they produce, regardless of real value. Ask the refugees from any failed SAP implementation.

Lord knows, I’m not calling for better, more improved self-righteous puffery at work or at home. We’ve got plenty of that, thanks to the likes of Karl Rove, Al Franken, Rush Limbaugh, Ralph Nader, Nancy Grace, and Glenn Beck. What we need is more discipline and persistence in sorting through all the random stimulants to find those that actually lead us to our better selves, better organizations, and a better world.

The boxes are really good at helping us manage the floods of data along the sine wave of certainty/uncertainty. Making sense of all that data, knowing how it feels, figuring out what it means and managing our response… that probably still requires a human touch.

The 100% Club and Why I Don’t Want to Join

Somewhere between that last nine in 99.999999 and giving 110%, we’ve lost sight of the big picture. In our quest for perfect control, that platinum-club, queue-skipping, arrow-straight, GPS-tuned line from idea to delivery, we seem to have forgotten that reality isn’t quite that neat and tidy. We’ve got a risk-free monkey on our back. To be perfectly fair (chuff!), I can’t lay this addiction at the feet of the Digital, but you can bet Digital is not on the menu at the Perfect Control Rehab Clinic either.

Back in 1772, Voltaire said “The perfect is the enemy of the good,” so apparently we’ve got a running start at this jones for dominion over variation. It took a mechanical engineer, Frederick Winslow Taylor, and his time-and-motion studies to really get rolling with the application of assembly-line thinking to the masses. By the time ENIAC started frying apocryphal moths against its vacuum tubes in the 1940s, we barely felt the prick of the needle seeking that last good vein.

Who Let That .000001 In?
You’d think the software engineering industry would get that there’s always a gap between the actual and the ideal. After all, software engineering is nothing more than the translation of some reality from natural language through progressively more structured formats until we arrive at something the boxes can apply. That process isn’t perfect.

We clarify and trim a hallway conversation into a meeting agenda and PowerPoint slides. The meeting notes refine debate into a requirements document. Some designer edits and prioritizes to accommodate budget, technical, or political limitations. Then some engineer, hectored by syntax or admiration of their own creativity, lays some Jackson Pollock-like description of the envisioned state at the foot of the boxes. They then race off at the speed of electrons to recreate a fifth-hand rumor of the original idea. All those clipped-off bits of yellowed understanding and crumbling clarity are scattered like fallen leaves into some dusty cubicle corner. At the end of it all, you’re left standing in front of a confused clerk who pushes the same button on the same point-of-sale terminal for the 15th time, hoping for a different result.
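You can feel the loss if you run the pipeline on a sentence. A deliberately silly sketch; the stages are real enough, but the word-count trimming rule is mine:

```python
# Each stage keeps only what fits in its own little box.
stages = [
    ("hallway conversation", lambda s: s),
    ("meeting agenda",       lambda s: " ".join(s.split()[:12])),
    ("requirements doc",     lambda s: " ".join(s.split()[:8])),
    ("design spec",          lambda s: " ".join(s.split()[:5])),
    ("code",                 lambda s: " ".join(s.split()[:3])),
]

idea = ("customers want returns handled quickly and politely "
        "even when the receipt is missing or the box is damaged")
for name, trim in stages:
    idea = trim(idea)  # every hand-off trims a little more context
    print(f"{name:22}: {idea}")
# By the last stage, the point-of-sale terminal knows only
# "customers want returns." Everything else fell in the cubicle corner.
```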

Sorry, Mr. Taylor, but Meaning Gets Lost in All That Time and Motion
We act like Bermuda-shorted tourists with our 10-megapixel cameras, shooting blurred pictures of what might or might not have been the latest B-list reality show star spilling their coffee at Starbucks. One of the best software engineers I know says there isn’t much value in trying to take the measure of something in units finer than the precision of the available tools. Put another way, what we do with the boxes is an approximation of some reality. Much as we might wish it otherwise, the fidelity of that approximation is limited by the precision of our tools, both intellectual and physical. When we acknowledge these limitations and act within them, good things happen. When we don’t, watch out.

I’d love to suggest that this is just a matter of more disciplined analysis and engineering. However, we’re not talking some street corner thugs stealing cigarettes. When executives, engineers, and society as a whole collude to ignore the limitations of our Digital tools, it’s more like the Mob moving in to muscle out understanding and meaning. Tonight, reality sleeps with the fishes, and no RICO-wielding Digital DA is ever going to figure out what happened.

Celebrating the .000001
The Analog is an infinitely variable, no-solution calculus. Digital is always an attempt to take its measure, and fans of quantum physics know you can’t measure something without changing it. Our application of machines and computers to the physics and logic of reality cannot help but change that reality. We almost always base our designs on some snapshot of the passing show, freezing reality, dumbing it down so we can apply the mechanical advantage and logical efficiency of our machines. We spend a tremendous amount of energy chasing that absolutely perfect, 100% accurate snapshot. However, whatever our intent, only the snapshot is frozen. Reality moves on. Eventually every machine and every computer is, at best, solving a problem that doesn’t matter anymore, or, at worst, creating new problems more intractable than the old ones.

The machines are always going to have a greater mechanical advantage. The boxes are always going to be quicker at puzzling out some mundane piece of logic. We don’t become better human beings by getting better at playing second fiddle in those games, endlessly refining out those .000001 variations and deficiencies. We become better human beings when we acknowledge and even celebrate the endless variation we call reality. We become better human beings as we come together at that source of all growth and learning.

Digital Ways in an Analog World

It seems we can’t go long without hearing the Digital mob raise its favorite chant “This changes everything!” Like first time parents, they’re convinced there’s never been anything quite as wonderful as their new baby, be it the latest operating system, the next killer app, the next generation of phone, or the unfortunately named iPad. It’s tough not to roll one’s eyes, to turn away to more substantial considerations, and leave the myopic bright eyes (digitally enhanced no doubt) to their eventual disappointment or distraction.

But Digital IS different. It may not, ahem, change everything, but like any new experience it may cast a new light across our known world, illuminating features we hadn’t noticed before.

Most of the time when I get cranked up here, I’m drawing attention to ways that Digital hasn’t changed the fundamentals of what it means to be human, either personally or professionally. I’m not quite ready to climb down off that soapbox, but I’d be remiss if I didn’t acknowledge that as we track outcomes on the way to more Digital, we may learn something about being human as well.

A Digital Lens
Digital is a fertile ground for metaphors of the human experience. Forget something? You “lost a pointer into memory.” Can’t quite function before that first cup of coffee? You’re just having trouble “booting up.” Feeling the need to get away? You just need to “go offline” for a while. Beyond those throwaway phrases, though, are some richer veins of understanding to be mined.

Most recently this was brought to mind when trying to implement some minor bit of process at my current job. Hey WAIT A MINUTE! Before you run off screaming over the horizon at the mere mention of the word “process” for fear of its evil twin, “bureaucracy,” take a deep breath. There’s a story here that might be worth hearing.

I’m the first one to rant that process isn’t some magic elixir you pour down some poor drone’s throat to get instant conformant, efficient behavior, even if it happens to have “Best Practice” printed on the label. Implementing effective process, at least in situations that mean anything, is always more complex than a single PowerPoint slide.

The evil twin, Bureaucracy, didn’t start out leaden and overly complex. Rather, it was born in that moment when we imagine there is a simple, single, repeatable solution to a problem that is only poorly understood. “If we just do… THIS, the problem is solved, right?” Mmmm, but then what about this condition, or that exception? A few iterations, and all of a sudden we’re filling out form 732-B5.2 in triplicate while standing in a line that isn’t moving because somebody at the front is waving form 327-A4.5 in the face of an uncomprehending, uncaring clerk.
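Anybody who’s watched code age knows exactly what that slide looks like. A hypothetical sketch (the form numbers come from the paragraph above; every name and rule here is invented):

```python
# Version 1: "If we just do... THIS, the problem is solved, right?"
def approve_request_v1(amount: float) -> bool:
    return amount <= 500

# A few iterations later, every "but what about..." has grown its own clause.
def approve_request_v5(amount, requester, department, quarter_end,
                       has_form_732_B52, waving_form_327_A45):
    if waving_form_327_A45:
        return False             # wrong form; see the uncaring clerk
    if not has_form_732_B52:
        return False             # in triplicate, please
    if quarter_end and department == "Sales":
        return amount <= 2500    # somebody's special case
    if requester == "CEO":
        return True              # somebody else's special case
    return amount <= 500         # the original "simple" rule, still buried inside
```

Neither version is wrong, exactly. The second one is just bureaucracy, rendered in code.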

Does that sound familiar? It should to anybody that’s ever worked on a piece of software that is more than fifteen minutes old. We may all be only six degrees of separation away from Kevin Bacon, but trust me, we’re all infinitely removed from the mind of the original software engineer for a piece of unfamiliar code. And that’s just the well written (if poorly documented) stuff, a vanishingly small portion of all the software rat-racing down all the digital circuits around the world.

Digitally Enabled Godzilla
So what’s my point? That all software developers are really just closet bureaucrats? While perhaps true, I’m headed somewhere else. Developing good software is an amazingly difficult undertaking. Oh, it may not be too hard to come up with the happy path of functions, that yellow brick road of the well known and easily anticipated. But then we push our little digital undertaking up against the realities of the everyday. In a flash it’s not so much Fred Astaire as Godzilla with a huge snort of cocaine up his prodigious nose, rampaging through the neat little processes of our workplace.

Going on thirty years ago, one of my CompSci professors mused that if we built bridges the way we build software, we’d all be dead. True then and probably still true today, but in the past thirty years we’ve learned a lot about how to codify and structure an activity in such a way that it can be explained to, and satisfactorily replicated by, our dear if somewhat dim digital boxes.

Hmmmm. Codify and structure an activity. Software or process. You make the call, but I can’t help but think that some of the approaches we’ve developed over time to deal with the amorphous translation of ideas into software could also be applied to the translation of objectives into process for our organizations.

We’d never imagine that we could take a thirty year old application written for an IBM mainframe, shove it onto our sleek little iPhone and get good results. Yet it seems like you can’t turn around at work without some newly minted VP attempting just that with process. And bureaucracy is born.

Process, just like software, is only as good as its assumptions about inputs, outputs, and translations. Good software employs a degree of flexibility as a way to predictability. Lazy software just suppresses that flexibility (or throws the computer equivalent of a temper tantrum and crashes). Same deal with good process. Good process supports and enhances the application of good judgment to the unexpected on the way to predictability. Lazy, bureaucratic process drains away any opportunity for judgment and any flexibility to address the unexpected.
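In code, that difference is easy to spot. A toy contrast, with names and rules invented for the example:

```python
# Lazy software: anything it didn't anticipate becomes a crash
# (the computer equivalent of a temper tantrum).
def parse_quantity_lazy(field: str) -> int:
    return int(field)  # "12 boxes" -> ValueError, and the line stops moving

# Flexible software: absorb the messy-but-interpretable inputs on the
# way to a predictable result, and flag only the truly hopeless ones.
def parse_quantity_flexible(field: str) -> int:
    digits = "".join(ch for ch in field if ch.isdigit())
    if digits:
        return int(digits)  # "12 boxes" -> 12
    raise ValueError(f"Can't make sense of quantity: {field!r}")

print(parse_quantity_flexible("12 boxes"))  # prints 12
```

Good process does for people what the second function does for input: it leaves room for judgment at the edges instead of pretending the edges don’t exist.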

We Think Therefore We Stomp
Unfortunately for the process engineers, whether at home or at work, the platform for software, a computer, is much more structured and predictable than the platform for process, human beings. It doesn’t take long to get a good bead on the underlying capabilities and limitations of a particular box or application. Humans? Well, not so much. We constantly surprise each other in both useful and not so useful ways. The human capacity for imagination and interpretation separates us from the boxes and is the blessing and the bane of any process engineer.

Which brings us back to more familiar Analog Underground territory. The practices of the Digirati may cast some light on how we choose to program ourselves and live our lives. Those practices, however, cannot be applied as if there is no difference between machines and humans. At best we’d create many minor irritations, inconveniences, and other petty bureaucracy. At worst we’d edit out all the random, chaotic pathways to unexpected meaning and connection that make us fully human.