Nothing is more worth studying than human discourse. We are the language-using species and if we don’t understand how we use it we’ll never understand anything. Recently, courtesy of the Net, we’ve been using it in smaller pieces which require smaller investments of time and attention. These are new things; are they good things? [Warning: long.]
Credits · I wrote this after reading Nicholas Carr’s Clutter, which was provoked in part by my Empty Walls and quotes at length from Steven Johnson’s How the E-Book Will Change the Way We Read and Write. Go read Carr and Johnson; they’re both good. Carr says:
“Whatever its charms, the online world is a world of clutter. It’s designed to be a world of clutter - of distractions and interruptions, of attention doled out by the thimbleful, of little loosely connected bits whirling in and out of consciousness.”
That’s a great sentence and I haven’t been able to stop thinking about it since.
Also, I can’t ever write something of the form “If you don’t understand that you’ll never understand anything” without an indirect tip of the hat to Margaret Atwood.
History · Consider the 95% or so of the human time-span that predates writing, when language took one form: speech. Whatever is built into us about language and thought was built in by evolution, and that process was pretty well over by the time writing arrived.
Speech is remarkably plastic; at one extreme, the monosyllables of technical specialists or lovers; debate flowing around a meeting table somewhere in the middle; and at the long end, oration, notable examples of which extend in length from a handful of minutes to many hours.
There’s nothing much on the Net that’s without precedent in spoken language. What’s new is that written discourse is becoming less like oration and more like conversation. It’s not clear that this is bad.
What’s New? · The Net has had a twofold effect on short-form publishing: First, it’s cheap, verging on free. Second, it enjoys many routes to potentially large audiences. It’s the second that’s interesting. Until recently, you simply couldn’t find a large audience for your thoughts unless you had books on shelves in stores, and books had to be a couple of hundred pages long (usually more) to get on shelves in stores. This resulted, among other things, in many unnecessarily thick books.
Thus, books are now competing, on a fairly level playing field, with the Net media: blogs and Twitter and mailing lists and fora of other flavors. News Flash: Books are losing market share! Unsurprising, because when you start at 100%, there’s nowhere to go but down.
Length Matters · Pardon me for stating the obvious: A tweet, an essay-length piece like the one you’re reading, and a book-size work are essentially and qualitatively different in the reader’s experience. But I’m not sure it matters much whether the words are on a screen or a page.
Here’s one thing I’m sure of: there is no danger of all human discourse converging on the short, medium, or long form. One Day In The Life of Ivan Denisovich is a very short book and does not need to be a single word longer. At the other end of the spectrum, A Suitable Boy would be diminished by the omission of any of its thousand-plus pages.
Four minutes is the right length for a pop-music performance on YouTube. Forty-two minutes is the right length for a Lost episode. Three hours is the right length for a baseball broadcast.
Where the Words Are · Carr and Johnson argue, essentially, that the medium is the message. Carr:
“A change in form is always, as well, a change in content. That is unavoidable, as history tells us over and over again. One reads an electronic book differently than one reads a printed book - just as one reads a printed book differently than one reads a scribal book and one reads a scribal book differently than one reads a scroll and one reads a scroll differently than one reads a clay tablet.”
I’m unconvinced. I have no direct experience with scrolls or tablets, but here’s a thought experiment. Let’s imagine different ways of doing Twitter in print: I imagine little slips floating out of a wall-slot, or old-fashioned paper-roll printers grinding away. As long as we preserve the essential characteristics — absolute user control over who you read, latency of less than a minute, and primitive but usable threading — would it feel different, qualitatively? It’s just not obvious.
And yes, I think Marshall McLuhan was not only wilfully obscurantist but also mostly wrong.
Johnson’s argument is subtler: that since a Kindle or equivalent isn’t one book but millions, reading on it is inherently more subject to interruption:
“Because they have been largely walled off from the world of hypertext, print books have remained a kind of game preserve for the endangered species of linear, deep-focus reading.”
Maybe. And maybe the lesson is that in a busy world, sometimes you need a little help in managing your attention. There are other approaches than limiting yourself to dead trees.
Distracted · The Carr and Johnson arguments converge on the issue of attention, and they’re not alone there; the hand-wringing is intense among traditionalists who lament the slipping-away of the Golden Age of lengthy undisturbed rumination. And, as always, Kids These Days are no good. The hand-wringers may be comforted by finding that their concerns are shared by many of us who live by choice in the thick of the thousand-channel flow.
Here is some raw unsorted evidence:
Linda Stone’s Continuous Partial Attention.
Wikipedia on The Flow.
Freedom enforces freedom. For up to eight hours at a time.
WriteRoom: “Distraction free writing.”
Inbox Zero. It doesn’t work for me, but it’s got buzz.
Focus ·
Guy walks into a doctor’s office:
“Doctor, it hurts when I do this.”
“So, don’t do that.”
I agree with the traditionalists: It requires an investment of serious amounts of fully-focused time to accomplish anything that matters. If there’s something you need to get done and you can’t make that investment, you have a problem. If one of the reasons is that you’re fast-switching between your email and your chat and your Twitter and your phone, you have to stop doing that.
But, speaking from personal experience, I’m not convinced there’s anything new here. There are two kinds of things that I need to do: let’s call them obsessions and chores. When I’ve got an obsession cooking, I have the opposite problem: a tendency to tune out not just the Internet, but my wife and children and finances and scheduled meetings and sleep. Hours vanish with no particular effort of will. I’ve learned by bitter experience that I really ought not pick up an interesting new book when, around midnight, I put work aside.
When, on the other hand, I’ve got a chore on the to-do list (I’m looking at you, travel expenses), I’ll let anything, no matter how trivial and stupid, distract me. It sure doesn’t take the Internet; I’ll actually find myself dusting my office or going through backlogs in mailing lists I don’t care about, or checking out my son’s baseball team’s grounds-keeping schedule.
If you need extra mental discipline or tool support to get the focus you need to do what you have to do, there’s nothing wrong with that, I suppose. But if none of your work is pulling you into The Zone, quite possibly you have a job problem not an Internet problem.
The world is distracting, and particularly when you’re open to distraction. But then, it always has been.
Comment feed for ongoing:
From: Spencer Rose (Apr 27 2009, at 20:41)
Great article. I am an attorney/web developer and communication (especially written) is so important to me, and usually I see it done in such a poor fashion. Anyway, I think "distraction" as a whole is increasing. Distractions have always been there, but distraction as the large information commotion of today's world seems a little more intense, and steadily increasing. Just a thought while reading.
[link]
From: S. (Apr 27 2009, at 22:17)
I imagine Twitter in print as ticker-tape.
[link]
From: bananaranha (Apr 27 2009, at 23:31)
<b>“A change in form is always, as well, a change in content. That is unavoidable, as history tells us over and over again. One reads an electronic book differently than one reads a printed book - just as one reads a printed book differently than one reads a scribal book and one reads a scribal book differently than one reads a scroll and one reads a scroll differently than one reads a clay tablet.”
I’m unconvinced. I have no direct experience with scrolls or tablets, but here’s a thought experiment. Let’s imagine different ways of doing Twitter in print: I imagine little slips floating out of a wall-slot, or old-fashioned paper-roll printers grinding away. As long as we preserve the essential characteristics — absolute user control over who you read, latency of less than a minute, and primitive but usable threading — would it feel different, qualitatively? It’s just not obvious.</b>
Well, you missed the whole premise behind the "A change in form is always, as well, a change in content" argument: that after a change in form you cannot just "preserve the essential characteristics", because the mechanics of content delivery and consumption are changed.
Case in point, your own thought experiment: a Twitter in print would not be or feel the same as the current Twitter. In fact it would be quite different qualitatively. You naively assume that we could "preserve the essential characteristics", but the mechanisms for "absolute user control over who you read, latency of less than a minute, and primitive but usable threading" just aren't there in print (or come at prohibitive cost with a very difficult implementation), while they are free and/or a given in an internet app. (And of course, they were nowhere near achievable before print.)
[link]
From: Martin Probst (Apr 28 2009, at 01:01)
<blockquote>I’m unconvinced. I have no direct experience with scrolls or tablets, but here’s a thought experiment. Let’s imagine different ways of doing Twitter in print: I imagine little slips floating out of a wall-slot, or old-fashioned paper-roll printers grinding away. As long as we preserve the essential characteristics — absolute user control over who you read, latency of less than a minute, and primitive but usable threading — would it feel different, qualitatively? It’s just not obvious.</blockquote>
I think you get the definition of medium wrong. At least to me, the most defining characteristics of a medium are not if it's printed on paper or displayed on a screen, but rather the way you interact with it and the possibilities that medium opens for consumption and re-use.
So where you say "as long as we preserve the essential characteristics ...", I'd argue that these are indeed the defining features of a medium. If you keep those identical, not much will change. If you change these, you get a different medium and interesting things happen.
[link]
From: david (Apr 28 2009, at 04:15)
An eBook can be read two ways: in a reader on my computer, or in a reader that, in some way, mimics the form of a book. I've come to the conclusion that reading on a computer screen *is* different from reading a book, at least for me. I find myself scanning rather than reading and I don't become immersed.
Years ago I tried reading on my Palm and I found that it too was an unsatisfactory experience. I didn't scan as I did sitting at the computer but I never got past the fact I was reading on a PDA. The immersion never took place.
Due to my experience with a PDA I never took ebook readers such as Sony's or Amazon's seriously. This winter while visiting family I had the chance to try a Kindle and found that it *is* just like reading a book; maybe even better. It isn't as heavy as a hardback Baldacci tome, it doesn't smell like a new paperback, and the pages don't stick.
[link]
From: Mike Hayes (Apr 28 2009, at 06:27)
When faced with a chore I don't want to do, I sometimes grab a pen and a piece of paper and then start in on the chore.
I write down the distraction as it pops into my head and try to resist the urge to do anything with it.
When you do this, while you probably won't get the chore completed, you will have a good list of things that have been rattling around in your head taking up space and hopefully some good ideas. It's amazing what your mind will come up with to avoid drudgery.
The downside of course is that you may end up with even more chores. But I think it helps to have those things down on paper rather than in your head.
[link]
From: Carolyn (Apr 28 2009, at 06:48)
>>I agree with the traditionalists: It requires an investment of serious amounts of fully-focused time to accomplish anything that matters. If there’s something you need to get done and you can’t make that investment, you have a problem. If one of the reasons is that you’re fast-switching between your email and your chat and your Twitter and your phone, you have to stop doing that.>>
Indeed. You are talking about one of my past favorite pet research projects, on “Attention Deficit Trait”, a term I believe was coined by Dr. Edward Hallowell for an environmentally induced condition. I am glad to see you and Nick Carr converge on the topic. I’m also glad to see you give it attention.
Carolyn A. Colborn
[link]
From: len (Apr 28 2009, at 07:00)
A former VP I worked for would get upset when I worked on the web servers. It was a time when few knew much about them and although my job was analysis for the business development group (intermediary), I was the one who knew how. He said, "Your job is interrupt driven and I want you to focus on that. The programmers have non-interrupt jobs and I want them to focus on that".
Carr is not wrong that the medium changes the style of content delivery and that in turn, some aspects of content. It is not a new observation as communication theorists have noted this particularly since the advent of the telegraph. It is a change not just in content style but reach. In a way, Twitter is a return to the telegraph but with a much higher frequency, lower amplitude, and greater reach. As a result, the content shifts away from the important to the mundane.
I don't tweet. Why? I've read almost nothing in the twits that could not have waited or actually merited saying. It IS clutter and unlike some, I dislike clutter that adds little beyond an interruption. Would Twitter help me do my job which is already interrupt driven? No. Will it help me compose music and videos or 3D? Absolutely not.
Is it worth doing? No.
There is another story coming from this. The notion that medium changes the form of content, and that this changes the character of language use, is hardly worth mentioning in the sense that it is very old news. What is becoming noticeable are the wasted resources being expended and lauded on technical hula hoops.
As I watch another effort by major players to rewrite history and toss out years of the work of others to advance the technical pleasure of the few (yes, we are still debating if standards are worth having on the web), the chasm between those who program and those who must manage the products of these people becomes wider. We are in the time of technical gimcrackery, the time when selling the digital hoop skirt dominates the finances of the web.
Twitter is to communications what a beaver skin hat is to the well-dressed man of the 19th century: fashionable but otherwise worthless. So the question comes down to: am I willing to enable yet another insistent set of interruptions to my day for the sake of a hat that won't keep its shine and smells after a year of wearing it?
[link]
From: Andrej (Apr 28 2009, at 09:24)
You made some excellent points about attention and distraction, but I too think you've missed the mark about differences between media.
There are several factors that make reading (as opposed to skimming) powerfully sucky on a computer:
- you're staring at a giant lightbulb (eye strain)
- the pixels, even on a laptop, are huge (eye strain) -- text on a laptop is still crappy and lo-res compared to print
- you have to contort yourself to the device (whole body strain, poor posture causes fatigue)
- funnily, I find text on a computer to be more linearized than in a book -- there's no 3rd dimension (book thickness) and no ability to use all our marvellous spatial perception wetware to quickly leap back and forth between sections of text.
This leads to a massive qualitative difference in the physical experience of reading from a computer screen vs. a printed page – the physical experience of using the device causes a great deal of strain and fatigue, which interfere with the kind of concentration needed for deep reading.
E-book readers are OK for linear, entertainment-oriented fiction, but they’re not there yet for anything requiring deep study or back- or forward-references. Or, for that matter, the display of more than one paperback-sized page at a time. They also prevent you from using your spatial wetware, and their screens, while much more comfortable than monitors, are still sucky compared to real paper.
I think the “new” thing is the assertion that display screens are as good as paper for the consumption of complicated long-form narratives, textbooks, etc. They’re just not. They make a fantastic adjunct, though – searching for a half-remembered reference in a paper book doesn’t work very well.
[link]
From: Rick Jelliffe (Apr 28 2009, at 11:43)
Monks can be monastical, and hermits eremitical. But ordinary folks, even the bearded pater familias, cannot and probably should not.
In the great rush to identify trends, to schematify tomorrow so that we are ready for it, we can fall into the trap of universalizing those trends, as if they are forces acting on all of us rather than accidents that naturally accompany a particular lifestyle to one extent or other.
It may be that some people's world is full of clutter and distractions, but I don't think mine is at all, for example. Mind you, I am probably a hoarder, so I would think that.
If you want to get rid of clutter, don't get rid of the books, get rid of the TV and the electronic gadgets. Get rid of the new thing, the next thing, the big thing, the thing that simulates personal responsible connections but which can be turned off, the grass-is-greener thing. Don't twitter, grow vegies; or only twitter about the vegies.
It seems to me that any techno-social trendwatcher who does not have any idea of 'vocation' in their analysis is fatally simplifying. The vocations of being a parent, lover, carer, musician, soldier, infant, and so on each will dictate far more how the world of things compete with our internal life than the latest gizmo, it seems to me. The social trends a la Linda Stone may be interesting, but their interest should be on whether they make the lot of different vocations easier or harder, it seems to me. "We" barely exist, where "we" are the subjects of pundits' universal trends.
[link]
From: Jim Fuller (Apr 28 2009, at 11:46)
Very interesting article ... perhaps one trend in mediums to point out is that in the modern age most successful communications mediums tend to fill up with distractions quickly (that is, advertising). For some reason it seemed sacrilege, in books (or scrolls for that matter), to print an advert on the back of every page.
SMS and Twitter are great examples of mediums that have resisted attempts at advertising (remember that telephone service that was free but you had to first listen to an advert ... it never took off), but once they do get commoditized the 'borg will phase shift again' to create new mediums which are free from non-relevant distractions (advertising opportunities). Email is an example of a successful medium that got commoditized ... well, because 80% of the stuff is spam, and eventually the savings in electricity will get optimized/realized in some other way of communicating.
I think the end game to all this is the notion of private roving radio networks, kind of like the guys from Lopht were demonstrating 10-15 years ago. Complete ownership in the infrastructure will effectively keep things private and perhaps we will see internetworks based on families, companies etc...
[link]
From: len (Apr 29 2009, at 07:12)
Longer reply:
Here is a distraction that won't fit the model. It is *more* like oration, less like a line of text and demonstrates the power of standards over the pronouncements of pundits (Clay Shirky's infamous "What is VRML good for? Good riddance!"). The material used to make this movie is 12 years old. It still runs in modern X3D browsers enabling me to make this movie (in four parts - this is part 1).
In the rush to do the 'next big thing', we are losing track of the game. We are failing to preserve and worse to enable reuse of digital assets. We are swapping immediate stimulation for long term sustainable works of depth.
The first volume/chapter (four total) of the IrishSpace movie is on YouTube. (approx 9:39 min/secs).
http://www.youtube.com/watch?v=-1b5wajK5Bk
This project uses both the Vivaty and the BitManagement Contact engines for rendering, Jing for screen capture, and Sony Vegas for video production and editing. 2D images have been composited via Vegas.
Some points:
1. The VRML97 code was written in 1996/97 on a schedule of 3.5 months. None of the authors met until the final assembly in Ireland. Final assembly was a day. Details of the project are online.
2. The only modifications made to the code to run it for this capture was to remove LoadURL statements that were part of the original kiosk GUI.
3. After 12 years, VRML97 code still works brilliantly in current X3D browsers. While the graphics may seem primitive by comparison to current work, this was done fast, online with multiple authors, an immovable deadline and very famous people in attendance (Neil Armstrong, Deputy Prime Minister and other Irish VIPs). Only one of the authors was a full-time professional (Paul Hoffman).
4. This work combines real time 3D, images, a full narration (performed in Ireland by citizens of Tralee) and musical score.
X3D/VRML97 has proven its suitability for long life cycle projects, for archival of real time 3D graphics, for assembly in modern editing systems, for applications such as product demonstrations, entertainment and machinima.
This is the important point:
Without a standard and process supported by a consortium with open IP policies and sustainable business models, it would not be possible for this current version of work over a decade old to be repurposed and improved. This is critical to the *business interests* of the customers of any company creating 3D products today. Teams are required to work at this level of complexity and length. Those teams have to be supported with open tools, may not be in the same locale, may work for long periods under intense pressure, and must produce products capable of being maintained, modified, archived and repurposed at minimum expense and effort, with a minimum need to maintain skills.
We have to stop throwing away what we've built and moving on to the 'next new thing' when that thing forces us back ten years. Reach is no longer a challenge. Creating something worth preserving is.
[link]
From: Norman Walsh (Apr 29 2009, at 08:58)
Nice piece. You're spot on about focus. That's exactly the way it works for me too.
[link]
From: T.J. Hart (Apr 30 2009, at 04:40)
Research would suggest that multitasking is a myth: http://www.brainrules.net/pdf/references_attention.pdf
[link]
From: rvr (May 04 2009, at 09:42)
i haven't even finished reading the post yet, but i have to take issue with something you write near the top:
"Whatever is built-in to us about language and thought was built in by evolution, and that process was pretty well over by the time writing arrived."
you're kidding, right? evolution is over? the whole idea of evolution is that it doesn't end.
anyway, just wanted to point that out. back to reading the post.
[link]