Zombies!!! – The Undead Technologies Among Us

Is a zombie technology eating the brains out of your organization or, worse yet, your personal life?  You know what I’m talking about.  The applications and devices that stumble around on rotting, obsolete limbs.  They’ve long overstayed their usefulness, unaware they’re dead but not gone. They feed with an insatiable lust on our time and attention and in the process turn us into … well… them.

In the world of work, there are thousands of COBOL programs muttering through our halls waiting to lunge out at us, diverting us from productivity to the sustenance of their undead existence. More recently, that new app or new mobile-device policy shows up at the door all spiffy in new clothes and fresh-faced, only to decay into a dirty, threadbare, unintelligible monster pulling us into a dark corner never to emerge again.

Our Own Personal Zombies

In the good old days, most of our personal encounters with zombie technology came through the intermediary of some bureaucrat.  We might get battered about a bit as they struggled to fit the reality of us into some truculent program created in search of some long gone efficiency.  The stink of rotting purpose might linger as we grumbled about the wait at the DMV but pretty quickly we were back in the light and sunshine.

That was before the internet and mobile technology. All of a sudden we're forced to deal directly with the grisly automatons of the brainless websites, the morons of texting and driving, the ever so friendly but clueless voice response systems. If we're lucky, all they want is our currency and trade. Worst case, some malevolent shaman is sending them shambling off to do us real harm.

The crux of the matter is that even the best technology can’t compensate for bad process, but really good process might improve bad technology.  We’ve all seen the poorly architected, fragile technology succeed wildly through wily marketing, whole cottage industries of support, and the occasional intervention of blind luck.  “Would you like Windows with that?”  And we’ve seen really sweet, well thought out technology wither on the vine from the lack of sustaining integration and deployment with the mass market or just our departmental business process.

And don’t let the “P” word lull you into thinking that zombie technology only roams the halls at work.  All I ask, gentle reader, is that you substitute the word “habit” for “process” and then we can talk.

Zombies – Shiny and New – Get Them While They’re Hot!

The problem is that almost no technology shows up looking undead. COBOL was as huge an innovation in its day as Facebook and the iPad seem to us now. The zombie emerges slowly over time as our needs, wants, and understanding mature and evolve, but the technology doesn't. The zombie technology is frozen in time at the moment some Digital conjurer baked the passing Analog stream into a set of fleet, but rigid, bits and bytes. No matter how dazzling and innovative when new, it can't easily regenerate a new incarnation to address some unforeseen situation. Any incarnation of technology moving from the ethereal Digital world to the rough and tumble Analog stream slowly acquires that ragged, falling-apart look from repeated impacts of new, unaccounted-for reality. "Would you like Facebook or iPad with that?"

The funny thing (if you like that kind of humor) is that there is a completely human predecessor for this Digital behavior. It's called ideology. On the upside, you get some visionary true believer like Frank Lloyd Wright who creates uplifting architecture that just isn't so great for actually living in long term. "Would you like Steve Jobs with that?" On the downside you get nut jobs like Osama Bin Laden or that guy who drags his family around to the funerals of our soldiers with some bizarre message about divine retribution in the face of honor and sorrow. Somewhere in between, you get folks like our current round of politicians who have lots of ideas but can't seem to do basic math.

In every case, Digital or Analog, someone with a kind of tunnel vision cooks up a set of rules away from the messy facts and complexities of life as it is lived.  Armed with that convenient, second order picture of reality, they come charging out into the real world, hacking away at anything that doesn’t fit their particular version of a rosy picture.  God forbid you have brains, ‘cuz they’re hungry.

Being the Anti-Zombie
What's a poor human to do as the moaning zombie mob closes in? Despite I, Robot, The Matrix I, II and III, and all the Terminator movies, Digital and the other various stripes of fundamentalism aren't all that flexible, and that's our out. We don't win by being better zombies than the zombies themselves. We win, at work and at home, by constantly evolving our selves to account for new realities. We win by leading with empathy and a healthy suspicion that we don't know as much as we think we do. That's the baseball bat to the zombie cranium.

Analog Art and Digital Avatars

There is a kind of art in the ever more ubiquitous Digital representations of our world. The software engineers, the hardware jockeys, the web designers wield their various electronic paint brushes to draw us a picture of the world. Like Analog art, those pictures are sometimes a representation of the world as we think it is, sometimes the world as we wish it were, or even sometimes a world we want others to believe in. In a similar vein, there are better or worse painters, and viewers with more or less sophisticated ways of absorbing and interpreting those Digital creations.

Picasso's Les Demoiselles d'Avignon
Interestingly enough, however, we don't seem to have had our Digital Picasso or Pollock moment, at least not intentionally. Where's the software engineer who's taught us a different way of seeing, the hardware designer showing us a new way of interpreting the world?



Pollock's Painting #1, 1950

Grace Murray Hopper certainly has left her mark on the industry. We may not be writing a ton of new COBOL code, but God forbid it suddenly disappears from the planet. Perhaps, in its time, it was as revolutionary as Picasso's Les Demoiselles, that first step down the Cubist path in Analog art.

I'm sure there are those who will raise the Steve Jobs flag here, but truth be told Jobs has been more a packager of others' ideas, more Barnum than Pollock, though the end result has a certain abstract expressionist vibe to it. Tim Berners-Lee comes closer with his hypertext seed that grew into the web, but even there his role is more the guy who gave Picasso his paints rather than Picasso himself. The same could be said of Jack Kilby and Robert Noyce (semiconductors).

What is Art? – The Digital Redux
As anyone who's taken an Art Appreciation class knows, on the Analog side we've never arrived at a conclusive answer to the question "What is Art?" Perhaps we're destined to that same wandering-in-the-fog on the Digital side, but I can't help but think there's a difference between the endeavor of Analog Art and Digital representations. I'm suspicious that we don't have the Digital Picasso or Pollock because the work of Digital is driven by a different instinct than that of the painter or the sculptor. What's more, I think that difference can be traced back to two distinct roots. The first is a matter of timelines; the second, a matter of pragmatism.

Denis Dutton in his book, The Art Instinct: Beauty, Pleasure, and Human Evolution, suggests that at least some of our artistic sensibilities are hard-wired at the level of instinct. He cites studies that show a cross-cultural preference in landscapes for a lot of blue sky, some kind of savannah, with a border of trees and available water. Makes sense if our sense of beauty got wired in when we were coming out of the trees and beginning to walk upright. The folks who survived that stage of our evolution would have had a preference for exactly that kind of landscape and passed that preference on to their progeny.

Perhaps we don't have the Digital Picasso yet because we are, at least Digitally, just coming down out of the trees, exploring what it means to stand upright, so to speak. There is, after all, some 30,000+ years between the cave drawings in Chauvet, France and Pollock's "Painting #1, 1950."

Another way of thinking about this is the perhaps apocryphal story of the king who declared he wanted a fine painting of a rooster. His advisors began a search for the finest painter of animals in the realm and eventually landed on one artist. The king personally appeared at his studio and commissioned the work. When asked when he could expect to take possession of the picture, the artist told the king to come back in one year. A year passed and the king became ever more obsessed with having the finest painting of a rooster in existence. Finally the day arrived and he swooped back into the artist’s studio, barely able to contain his anticipation of the great work, a year in the painting.

The artist looks up from his work and says "Ah, the king arrives." He gets up, walks over to a blank canvas, and in a few minutes a beautiful picture of a proud rooster emerges. All assembled admire the subtle coloring, the cocky strut, the vibrancy that almost has them hearing the rooster's call. The king, however, is not amused. He sputters out an angry declamation that he did not wait a year to have this artist slapdash something together, no matter how good it might appear. A deathly silence falls across the studio, but the artist remains cool.

He summons the king to the door of a back room. He opens the door and reveals thousands and thousands of drawings, paintings and sculptures of roosters, and even one or two of the live birds picking through the chaos. “My most revered highness, I could not have done that one painting without the year it took to work through all of these.”

Perhaps Digital is just not that far along in its own exploratory “year.” We are still early on in the process of understanding both the medium and its subjects.

Why is Art? – Digital Redux II
If the first art history question is “What is Art?” then surely the second has to be “Why is Art?” That leads directly to the second foundation of difference in Digital and Analog art. The “Why?” question on the Analog side spurs more debate than answer.  On the other hand, the why of the art of the Digital is usually pretty obvious. There is some pragmatic problem to be solved, some unanswered (and sometimes unimagined) need to be filled. There is a defining pragmatism in the art of the Digital that is not present in the Analog Arts.

The art of the Digital is rarely concerned with purely philosophical or aesthetic ends (which might explain all the really ugly software and websites out there). At least part of the fervor for Apple products is their ability to wander into those realms while at least making a wave at meeting the uber-measure of good Digital, which is practical application. We flock that direction because it satisfies a part of us that technology does not usually touch and in fact does its best to ignore. Trimming that quirky, individual human perception out of the binary, either/or representations is necessary to sustainable Digital experience. Those very aspects of our selves that make us most human are least susceptible to representation in the art of the Digital as currently practiced.

So the Analog Underground calls out to all the artists of the Digital. Practice on! Labor through our “year” of exploration! And as we create the digital avatars of our various selves, perhaps it is worth a look aside to the world of Analog art. We might find some guidance there, from that span of 30,000+ years, on how to represent those things that make us most human.

Are You Data Literate?

Imagine a world where books were ubiquitous, but nobody knew how to read. Books as fine objets d'art. Connoisseurs prizing intricate bindings. Hand-tooled leather covers, rich papers with hand-torn edges, crisp fonts in dark inks marching across every page. But no comprehension, just a marketplace of… objects. Manufacturers pumping out these beauties so even the lowliest Walmartian could afford a whole room dedicated to blind beauty.

The weaving together and then unraveling of A's and B's and C's into meaning and insight, no matter how mundane, is a critical skill in our modern world. Literacy, the ability to use text to encode, transmit, and decode meaning, context, and texture, enriches our lives. By and large we recognize that, teaching reading at a very young age, filling in with adult literacy programs, and getting into righteous debates about the best way for kids to learn to read.

Reading Data
Is data literacy any less important? As Moore's Law doubles and redoubles our Digital capabilities, isn't the data-represented world becoming as ubiquitous as the print-represented world? And yet the evidence is all around us, in big ways and small, that our collective ability to read data, to understand and navigate in data-represented realities, can't even match the muddled appreciation of our alleged Walmartian mentioned above.

Have you noticed that every single car insurance ad from every single company seems to mention that on average, drivers that switch to them save a gazillion dollars? How is that possible? It would seem to imply a spiral that eventually has the insurance companies paying us to cover our cars (hmmm… could we make that work for health care?). That trope, of course, depends on a basic misunderstanding of what constitutes good data. Of course drivers that switch save money. Why else would they switch? It's a population that self-selects, and it says absolutely nothing about the relative costs of insurance between companies in general. Duh.

That's perhaps no big deal, simply a tax on the data illiterate. But if we turn to, say, politics, then things can get serious, such as the recent election and the ideology-over-reality driven agenda of our fledgling 2011 House of Representatives. Take, for example, the health care repeal they recently passed. Whichever side you're on about universal healthcare, healthcare reform, death panels, etc., the charge up that repeal hill should be a tad puzzling. One of the standards held high in that action was fiscal responsibility. And yet, neutral observers agree that actually succeeding in repeal would cost an incremental $230 billion over the next 10 years or so. Pretty soon, we're talking real money.

Or let's turn to Ms. Palin's disingenuous cry of "blood libel" after the events in Tucson. Absolutely Jared Loughner is a loon for whom she cannot be held directly responsible. Liberal attempts to make that connection as if in a court of law are, to be generous, reaching. But to shrilly insist that the representations we make (cross hairs on districts) and the Digital cannons that we fire them with (her own website) have no impact on the collective consciousness… Well, I guess I'd trust my Walmartian's digital data savvy more than hers (or the liberal head hunters' for that matter).

Data is Truth, Truth Data – That is All?
We seem to have this childish belief that once we have the data, truth will follow, that the ambiguity and confusion will fall away and no further intelligence or judgment is required. When we amplify that belief through Digital means, bizarre stuff begins to happen.

If you’ve followed this rant for any period of time, you know I get jumpy about the editing that occurs when we necessarily trim messy analog reality to fit in the little data boxes that drive Digital capabilities. I still worry about that, but as we get further into this digital revolution, I’m beginning to pay more nervous attention to how those little data boxes get unpacked by us consumers of the Digital (aka everyone).

Like a band of hyper-terriers following a fleet of rats, we and our doctors leap and dart after every new study of pharmaceutical approaches to health. Are you taking statin drugs to reduce cholesterol? I am. It's a clinically proven fact that statins reduce my chance of having a heart attack. Or maybe not.

It would seem that the ultimate data jockeys, scientists, are beginning to question the, ah, ultimate stability of any data set. In a New Yorker article titled "The Truth Wears Off," Jonah Lehrer recounts growing concern in the scientific community about something labeled the "Decline Effect." The shorthand on this is (here I go trimming): many of our treasured, clinically replicated, data-driven findings seem to become less data-driven over time. That is, the data (and the interpreted results) of early studies become less and less replicable over time. True for popular anti-depressant drugs, studies in memory and perception, and a wide range of other scientifically derived "reality." Over time, the data-driven medical "truths" on which your doc is basing his prescriptions for you may not be quite as true as we first thought.

If the data is getting squishy on the scientists, God help us mere mortals in our data wrestling endeavors.

Towards a More Literate Reality – Represented or Otherwise
So what to do? Run squealing into some imagined Analog Only sanctum? Well, probably not. The Digital drawer of Pandora's box has been opened and we need to find a way to cope. Digital lives and breathes on data-derived reality. All the digital wonders rely on our ability to package realities into bits and bytes, transport those tidy little packages, and then unpack them at some remove.

Certainly we want to hold our packagers of data, the computer scientists, the engineers, the programmers to a high standard for their initial transformation of Analog reality into Digital representation. But the burden of fidelity is not just on those that do the packaging. It also resides with all of us, whenever we open those tidy little packages and let them bloom back into the Analog light of day.

We cannot confuse, as did early consumers of moving pictures, the picture with the represented reality. Likewise we cannot ignore the preconceptions, the values, and the biases that the packagers bring to their tasks. Over the years, the decades, the millennia, we have become quite adept at appreciating not only the object of art, but also the impact of the artists' beliefs, skill, and context-of-the-moment on the work in front of us.

Whether Caveman, Dutch Master, or skuzzy corner performance artist, there is a part of us that is made human by our representations of reality and our appreciation of them. Digital doesn't change that, but it does amplify and accelerate the impacts of our representations and the underlying beliefs and savvy with which we create them. With great capability comes great accountability. Be literate out there.

Pigs and Professionals in a Digital Age

No, I’m not talking the crumb-infested, corner-cubicle-coveting, pontificating bane on everyone’s existence from the IT department. I’m talking real pigs.

Let me explain. When I first moved to the Mid-West, I was a bit mystified to hear one of my colleagues extolling the local pigs. Seems she had just been to the state fair and reported back, "Them pigs! They were real professionals! They knew just where the Oreos were and right how to get there!" On further investigation, turns out that pig racing is a big thing at the fair. It involves a little track and Oreos at the end to entice the pigs to hustle around the corners, which they apparently do with some skill and enthusiasm.

An Extremely Brief History of Professionalism
Enquiry into the nature and practice of professionalism has longer legs than your average Mid-Western racing pig, reaching at least back to the Middle Ages and its system of professional guilds and apprenticeships. Even in our own, more proximate, middle ages, the 20th century, some folks were at great pains to draw distinctions between “the professions” such as medicine, clergy and law and the occupations of the rest of us wage slaves. Trying to untangle the nuances of those arguments is an exercise for a more bored and idle time.

However, as we cede more and more decision making to the Digital boxes, it’s probably worth taking a moment to consider what such a shift means for us as individuals and professionals.

Professionals and Digital Surrogates
Don't get me wrong. I'm more than happy to have a chip monitoring the temperature in my house and making the call of too hot, too cold, or just right in best Goldilocks fashion. I'm glad, on those long interstate sprints, to turn the moment-by-moment decision making of faster, slower, or maintain over to an engine control module. Frees me up so I can cast my focus elsewhere, such as being amazed at the guy one car over who's eating with one hand and texting with the other. Guess that's why God invented knees.

There are whole volumes of decisions, once fraught with ambiguity and danger, which have become so well understood that no real on-the-fly judgment is required to balance out risks and desired outcomes. Hurrah for science, technology, and industry, computer or otherwise!

Unfortunately, there is no commonly upheld, exhaustive list delineating which decisions fall in that bucket and which decisions still require someone with a body of training and experience to make a call. That’s where the conversation about being professional and our growing Digital reality collide.

Anatomy of a Digital Decision
From one vantage point, Digital is all about canned decision making. A bunch of “Is this a ‘0’ or a ‘1’?” decisions get munged together into ever expanding chains. Those nano-decisions collude to create all kinds of miracles from robotic surgery to the latest version of Angry Birds on my cell phone.

It is important to remember, though, that these miracles are completely dependent on the "canned" nature of both the nano-decisions and their answers. If you can't predict the choices that need to be made or the appropriate answers, Digital need not apply. Digital isn't the only thug out there beating the snot out of common sense, but it's certainly at the party. Digital has a penchant for substituting data for direct experience and accelerating any decision process. Couple those two with suspect quality control and the result is not likely to be pretty.

If you listen to the Lean Six Sigma priesthood, you can't make good decisions without good data, which is true for a certain kind of decision. However, it seems to me that most of the really interesting decisions call for more than just data and established process, something not quite so susceptible to the lies, damn lies, and statistics of modern business and political discourse (with apologies to Twain, but not Fox News).

The trick, of course, is to know when a decision is ready to be canned, when we know enough and the boxes are savvy enough to permit the prediction and its reliable encoding into this system or that. Get it right and some new miracle of efficiency and precision is born. Get it wrong and you get the decisional equivalent of botulism.

Decisional Wheat, Digital Chaff
Real professionals, true leaders in their chosen fields, don’t earn their street cred by parroting back the same decisions described in the text books or reference manuals of their education and apprenticeships. True professionals make their mark, build their legacy, when a gap opens up between the present reality and the available data, information, and solutions. The better the professional, the quicker they see that gap and the more creative they are in filling it, whether they’re doctors, managers, programmers or plumbers.

Designer Reality

It would seem we’ve made our Digital world in our own image. “Well, duh!” you might say, wondering how it took me so long to reach that pretty obvious conclusion. And it is pretty obvious on the face of it. However, if we’ve made this Digital world in our own image, what can we learn about it and ourselves if we dig beneath the surface a bit?

Regular readers of the Underground know a steady theme is attention to what’s edited out by our Digital representations of Analog. That editing has always been cast as a conscious choice, made in a hubristic attempt to “improve” things. Turns out that editing may begin long before we summon up any conscious picture of what better looks like.

Making a Monkey Out of Perception
In their book, The Invisible Gorilla, Christopher Chabris and Daniel Simons recount a number of psychological experiments on attention, knowledge, and logic. Turns out we don't know or notice as much as we think we do. The opening chapter recounts a simple experiment in which subjects are asked to watch a videotape and count the number of passes between basketball players. To make it a little more challenging, some players wear white uniforms, some wear black, and the subjects are only to count the passes between players in white. Pretty straightforward, and most subjects deliver an accurate count of the number of passes.

There is one other little detail that makes this experiment more interesting and relevant to our conversation. About halfway through the video, a person in a full gorilla suit wanders through the middle of the basketball exercise. They stop in the middle of the frame, wave their hands, dance around, etc. They're not trying to sneak in and back out without being noticed. But almost half the subjects, when asked about what they saw in the video, will talk only about the number of passes. If asked outright about seeing a gorilla, they will say they saw no gorilla. Many in that group, if shown the same video again, will deny that the gorilla was there the first time through.

It only gets worse from there as they recount other experiments that call into question our abilities to accurately remember significant events, draw rational conclusions about cause and effect, and in general reliably estimate our own capabilities and potential. As a remedy, Chabris and Simons have an almost child-like faith that experimentation can clearly reveal an accurate knowledge of the world as it is. Their book's subtitle is "And Other Ways Our Intuitions Deceive Us."

I'm not so sure. Or at least I'm not sure it really solves the more complex social, environmental, or personal challenges, which are not reducible to simple experimentation. Last I checked we only get to live a given life one way. No winding the clock back and trying some other path.

The phenomenon of editing is not a revelation in the Digital world. It's all about editing. But as we play God in creating that Digital reality, it's probably worth asking whether our Analog selves are more like Yahweh, the God of Ages, in full control and delivering certain prognostications in a rumbling voice of thunder, or more like Epimetheus, the Greek God known for "running forward while looking behind."

It’s an Architected Life
If popular movies are any indication of our baser intuitions, it seems we're casting our instinctual vote for Epimetheus. There's a trend in movies over the last decade of Digital emergence to include the role of a flawed God, more commonly called "The Architect." These "Architects" are not to be confused with the humble practitioner down the street designing McMansions for the boobocracy, nor even the soaring spirits that have designed our finest public monuments. These "Architects" are designing whole worlds, worlds that will never actually be built, touched, or inhabited, only experienced either as base neural stimulations (The Matrix series) or in dreams (Inception).

These imagined worlds, no matter the skill of the architect, are always flawed, eventually falling into some kind of chaos. It is as if the designed reality lacks some stabilizing or self-correcting element to prevent eventual disaster; as if the designer, the architect, by definition cannot see a big enough frame to create a self-sustaining reality.

Of course, just because the movies say it don’t make it so, but this one rings true both in intuition born of experience and in experimentation with invisible gorillas.

Go find someone who has the title "Enterprise Architect." Just about any self-respecting medium-to-large IT shop will have at least one these days. We're a sorry lot (I did my time there several years ago) charged with the impossible task of articulating a comprehensive vision for robust technology in the modern organization. Actually that's not so bad. Where the trouble comes is in actually implementing and maintaining that vision. Like the sci-fi architects writ large across the big screen, we seem doomed to failure, to watch time and the newest buzzwords relentlessly crumble our creations to chaos and dust.

Better than Perfect
But there is a kind of glory and honor to be had from this Sisyphean task, the restless plucking of dripping bits of order from the relentless chaos. We achieve that glory and honor not in realizing the perfection of our designs, but rather in understanding both our Digital and Analog worlds are improved by a certain humility, an acceptance that even our best designs are flawed in ways we cannot, at first, imagine. Honor comes not from our perfection, but rather from our living through, beyond and above our realized flaws.

i-Mutter

The Analog/Digital divide is warping and shimmering like the horizon on a sun-baked Texas highway in the middle of a blue-sky summer day. In particular, two new digital devices have come to our offices at One Analog Way that are rocking our world. One's a digital single-lens-reflex camera and the other is an i-mumble. Yeah, that's right, an i-cough.

We’ll get to the camera in a minute, but I suppose I should get the i-hrrmph out of the way. Yeah. I scoffed at the Apple dudes and all their iPad hypoteering. Yes, I told them to put their collective heads between their collective knees and breathe into a paper bag for a while. No, I’m not entirely prepared to take that back. But after a couple of months with my own i-Mutter, while I still can’t say the name without a hitch in my get along, I have to admit it’s a pretty cool device. Doesn’t rise to the asserted level of “magical and revolutionary” but it sure has changed a few things in my house and my Analog and Digital habits.

iRead Therefore iAm
The most notable change is in my reading habits or at least my choice of media. I was one of those folks known to assert that you’d take away my print-on-paper books when you pried them from my cold, dead hands. I haven’t taken my library out in the street and built a bonfire to dance merrily around (too damn hot in the middle of this summer for that kind of shenanigans). But anybody tracking my media purchase history (shout out to my buds at Nielsen) would note a distinct downward trend in the print-on-paper purchases.

I've gone from an emphatic print-on-paper-only policy to a print-on-paper-only-for-special-situations policy in the space of about one e-book. I like my e-readers on the i-Mutter. I like that I can carry 20 or 30 books around with me all the time, dipping into the three or four that I'm actively reading. I like always having a lineup waiting when I finish any given book. I'm smitten with reading about a book in the New York Times Book Review and being able to instantly acquire it. I absolutely love having that moment of curiosity and, right then, being able to go find a relevant book or magazine even if it is the middle of the night.

And that's just the e-book capabilities. High-def photo album that I can take anywhere. Oh my. (No, all those phone screens just don't count. Size does matter.) Movies when I want them without the Netflix or cable box. Oh yes! Almost all of my information consumption needs, neatly packaged and delivered in a highly tote-able form. Suuuuwheeeet.

Fear not, dear reader. I have not been infected with the “This Changes EVERYTHING” disease. That hagiography to the i-Mutter is not done without a visceral awareness of something lost as well as gained. The i-Mutter has the heft of a small book. I actually feel like I’m reading a book. Curling up on the couch with this thing plopped on my chest and my glasses popped up on my old fart forehead feels authentically like a reading experience.

But, oh, that delicious scent that rises to my nose when first opening a print-on-paper book. That incense of compacted writer’s craft, of distilled memory of every other book I’ve ever read, of the wafted intimation of story yet to be told. Somehow the smell of metal and plastic just doesn’t carry the same weight, doesn’t prepare me as well mentally for engagement with the other worlds, the voices, the truths of good writing.

I Loan Therefore I Am
And then there’s the social act of lending a book. All the social networking pointers and referrals are not the same as handing over an object to a friend, knowing it will engage them intellectually, emotionally, spiritually. And someday having that object handed back, looking in their eyes and reading the changes there that have been wrought by this now shared encounter.

I have one book on the shelves of my library that is almost as well-traveled as I am. In an ironic twist that only reality can generate, it is Blue Highways, by William Least Heat Moon, a recounting of traveling the back roads of the U.S. I read it and fell instantly in love with the author’s written hand. Almost as quickly, I thought of a fellow wanderer, and the morning after I finished it, loaned the book to him.

I don’t loan books lightly. It is too intimate an exchange. I’m wounded in some fundamental way when loaned books are not returned. My wanderer friend knows this, so I was a bit surprised to learn that he had the same reaction I did to the book and, in his enthusiasm, loaned it, while on vacation with friends several states away, to someone who was a complete stranger to me!

Part of the reason I like this guy is the quality of people he knows, and while my book was put in a stranger’s hands, I need not have worried, because about a month later a package arrived with the book and a short note thanking me for finding that gem. With this particular book, that pattern repeated itself two more times: loaned to a friend, received back from a stranger. Needless to say, I’m not going to loan out my i-Mutter in that way. It might be the source of other kinds of bonding, but not in quite that serendipitous mode. There is something of our relationships to our Digital objects that is simultaneously both more possessive and more casual, as if their representational, second-order-of-reality nature both obsesses and bores us.

I See Therefore I Am

Which brings me to the camera. Nothing about this camera repels me. Out of the box I walked into the backyard and took one of the best pictures I’ve ever taken in forty years of amateur snap-ology. But it does raise this interesting question about digital representations of representations, a dream within a dream. In 1/100 of a second, I began to learn things about bees (they’re furry) and coneflowers (there’s a hallucinogenic sunrise at the tip of every flower) that I never knew.

Not a huge deal, but as I was drawn to this other world, actual but not entirely visible to the naked eye, I began to remember how attached my dad was to photography, began to sense the curiosity and sensitivity to beauty that he had. While he was alive, I never appreciated that, being too impatient or embarrassed with yet another stop to capture yet another sunset or arrangement of autumn leaves. Stoopid me. Perhaps slowing down, looking more closely at things, had a potential I hadn’t appreciated then.

Beyond my personal epiphany, though, is another, more generalizable nugget. Analog comes packaged with all its data and incompressible timeline (talk about object oriented!) whether we’re prepared to process it or not. Digital mimics that, but any representation has fidelity issues. The challenge with Digital is not so much that lost fidelity as it is our insistence that the loss is insignificant, even an improvement over the real thing. We seem to believe the picture the data paints to be more real, more factual, than the reality it only mimics. Rather than derive meaning directly from what is in front of us, we have a bias to believe the numbers instead of the obvious reality. This seems especially true when the scale, either up to huge volumes or down to microscopic truths, overwhelms us. Rather than slowing down, digging in, reflecting, we take the first read of the data that comes along and build entire world views from that single point of analysis.

Kind of like the teenage me sitting in the back seat of the car, fuming as my Dad stops to take another picture. No real attempt to understand more than my first read of the data of my Dad doing something different, standing out, wasting my precious… well what…? Can’t remember anymore where I was in such a hurry to get to, some other place or life I guess.

Ain’t nothing wrong with more and better data unless we let it encourage us to trivialize and intentionally misunderstand ourselves and the world around us. Yet, sometimes, the Digital picture, that representation of representations, the most Digital of moments gets turned to Analog ends, to my attachment, my love for my Father and to my understanding of him and myself in new ways. Not bad for 1/100th of a second, but then I guess most revelation happens about that quickly. Perhaps we humans are not so unprepared for the nano-world of Digital as it sometimes seems.

How Digital Are You?

I started chewing on this when a keynote speaker at a conference polled the audience for their reactions to a text message interrupting a conversation at a restaurant. His choices were a) irritation, b) indifference, or c) enthusiasm.

His analysis was that, generally, the grey hairs (or no hairs in my case) would be offended, viewing the device and its communication stream as somehow inferior to the immediacy of the interactions around the table. I believe the word “rude” was used in his description. On the flip side, he suggested that another, younger, demographic would not skip a beat and might actually join the text stream on their own phones.

The audience response, given before his analysis, seemed to confirm at least the first half of his conjecture. Bunch of grey hairs, and we pretty uniformly raised our hands for the first option. After talking us through the options again, he asked how many of us would have invited a friend who actually walked up to the table to sit down and join the party. The speaker suggested that there’s a part of the population out there that views that hypothetical text in exactly the same way as someone walking up to the table.

I’ve had the opportunity to try this out on several audiences since then and, damn, he’s right! Just a few weeks later I was talking with a group of about 60 college students, posed the same question, and got exactly the response he predicted. Most would have been at least indifferent about, and likely enthusiastic about, inviting the texter to the virtual table.

May I See Your Papers, Please?
Since then, I’ve been noticing a lot of chatter about Digital Natives versus Digital Immigrants. The natives are those folks young enough to have grown up in a digitally saturated environment. The immigrants are those of us who came to that environment later in life, who, no matter how fluent, don’t have Digital as our native tongue. We retain some of those quaint habits and odd perspectives from the old country.

So back to the original question. How Digital are you? Suddenly that seems like a much more personal question. Perhaps our, ah, misguided friends in Arizona can figure out how to check our papers for that. Or perhaps a Facebook quiz is in order. Or maybe there’s an app for that. The last two approaches would favor the natives and the first, ironically enough, would probably favor the immigrants as it smacks of old world sensibilities.

Is it even important to ask the question? Do we really need the officer of our conscience to pull us over to the virtual curb of our brain, lights flashing, and demand to see some verification that we have a right to be in this Digital age? Well, maybe that much useless drama isn’t necessary, but asking the question does open some interesting lines of discussion.

A Bit of Profiling
If language shapes how we view the world, how we present ourselves, what’s the impact of having Digital as your native tongue? As with any native/immigrant discussion, one has to tread carefully to avoid blurring reality with stereotypes. In addition, good will between the parties is a better starting point than enmity.

One of those observations that pops up a little too frequently to be entirely true is that natives are better multi-taskers than immigrants. If I hear one more parent shake their head with wonder at their teenager doing homework, listening to music, chatting in four different windows while the television’s on, I’m going to… well… start listening with one ear, while reading a book, driving through the wastelands of the western Florida interstate highway system, eating my way through a bucket of fried chicken. It only seems like dark magic from the outside. Trust me.

Another rapidly emerging, if not particularly robust, piece of conventional wisdom is that the natives aren’t very good in a face-to-face conversation. Apparently all that intermediated connection, the chats, the texts, even the old-school e-mail, has dulled their ability to react in real time verbally. Yeah, right. I hate to remind my fellow oldsters, but communication skills have always been a sore point between technologists and regular humans and between adolescents and the next tick up the age ladder (though in my experience, the kids are getting better at bridging their gap). Does the intermediated thing aggravate the problem? Perhaps. Did it create the problem? No way.

I’ve also heard tell that us immigrants are less collaborative, more hierarchical than the natives, that our desire for position and power overwhelms our sense of connection and community approbation. Hmmmm… Wall Street? Every randy elected official? Deepwater Horizon? There would seem to be a few data points there. And yet. And yet. It was my mom and her brothers, all reaching into their 70s at the time, who formed up the family e-mail list. They brought me back together with some of my sweet cousins that I hadn’t spoken to for years. Now we’re following each other on Facebook, on blogs, through Twitter and the e-mail list.

The Melting Pot
Bottom line? As always with the Analog Underground, the call is not to roll back the clock to some better, old world times. The call is to pay attention to how the strange ways of this new land may change us, for better or worse. To examine what gets amplified, what gets attenuated. And to make those choices consciously with as much awareness of the consequences, intended and unintended, as is possible. Sorry, Arizona, but that kind of judgment probably isn’t very susceptible to documentation and legislation. Still requires an engaged brain and open heart.