Hallway conversations

Rachael in the elevator: "So, Mike, are you going to do a doctorate?" Dr. Tibbo as she was leaving her office: "So, Mike, has Carolyn talked to you about joining the doctoral program?"

Digital History Hacks

William Turkel, an assistant professor of history at the University of Western Ontario, runs a great blog, “Digital History Hacks: Methodology for the infinite archive.” I first ran across his blog last year via a couple of his research-related posts, the kind of “how to succeed at grad school” material that I continue to scarf up. One, on knowing when to stop doing research, offered great advice from one of his advisors: “Your research is done when it stops making a difference to your interpretation.”

Another post recommended just writing down the direct quotes and avoiding paraphrasing. He diagnoses his students’ note-taking problems as simply not using enough sources (but, again, know when it’s time to stop looking).

But what really fires Turkel up is using technology to grapple with history, and I find his ideas and opinions invigorating. Just as historians want to get their hands on old documents, Turkel wants to use today’s digital tools to examine historical evidence.

His About page says, “In my blog I discuss the kinds of techniques that are appropriate for an archive that has near-zero transaction costs, is constantly changing and effectively infinite.” Given that one of the themes of my education includes providing curated homes for digital materials, I’m curious about how he attacks the problem of dealing with digital records as historical documents and historical documents transformed into digital records. I also think his embrace of technology–especially programming–within a humanities-oriented discipline provokes some interesting ideas on how technology could be used or promoted within the academy.

He has a definite zest for the tech side and encourages digital historians to embrace programming as a tool that’s as creative, useful, and ubiquitous as email or RSS feeds have become. He has co-authored an e-book and web site called The Programming Historian that introduces the tools and basic knowledge needed to create simple programs in Python and JavaScript. The goal isn’t necessarily to become a programmer, but to introduce historians and other scholars in the humanities to a new set of tools they can use to further their research and scholarship. Instead of scouring SourceForge for a unique one-off utility, says Turkel, create your own. The intellectual experience alone is enough to grow your capacity for looking at problems in a different way and, I would say, to build your confidence for attacking bigger and more unusual problems.
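In that spirit, here’s a minimal sketch of the kind of one-off utility The Programming Historian has its readers build: a word-frequency count over a digitized text, in plain Python. This is my illustration of the idea, not code from the book, and the filename is made up.

    # Count word frequencies in a plain-text source -- the classic first
    # utility for a digital historian. "trial_transcript.txt" is a
    # hypothetical filename; point it at any text you have on disk.
    import re
    from collections import Counter

    def word_frequencies(path, top_n=20):
        """Return the top_n most common words in a plain-text file."""
        with open(path, encoding="utf-8") as f:
            text = f.read().lower()
        words = re.findall(r"[a-z']+", text)  # crude tokenizer, fine for a first pass
        return Counter(words).most_common(top_n)

    if __name__ == "__main__":
        for word, count in word_frequencies("trial_transcript.txt"):
            print(f"{count:6d}  {word}")

A dozen lines like these, once you can write them yourself, replace a whole afternoon of hunting for someone else’s utility.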

Turkel provides a great example of what he’s talking about in his series of posts titled “A Naive Bayesian in the Old Bailey,” a step-by-step account of the tools and approaches he used to perform data mining on over 200,000 XML files of digitized records from the Old Bailey. His final post sums up the experience, his decisions, and the value such an endeavor can provide.
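For the curious, the core of a naive Bayesian classifier -- the technique Turkel applied to the Old Bailey records -- fits in a page of Python. This is a toy sketch of the general method, not Turkel’s code; the offence categories and training snippets are invented for illustration.

    # Toy naive Bayes: score a new text against each offence category by
    # log prior plus log word likelihoods, with add-one smoothing.
    import math
    from collections import Counter, defaultdict

    def train(labeled_docs):
        """labeled_docs: iterable of (category, text) pairs."""
        word_counts = defaultdict(Counter)  # category -> word tallies
        doc_counts = Counter()              # category -> number of docs
        for category, text in labeled_docs:
            doc_counts[category] += 1
            word_counts[category].update(text.lower().split())
        return word_counts, doc_counts

    def classify(text, word_counts, doc_counts):
        total_docs = sum(doc_counts.values())
        scores = {}
        for category, counts in word_counts.items():
            total_words = sum(counts.values())
            vocab_size = len(counts)
            score = math.log(doc_counts[category] / total_docs)  # prior
            for word in text.lower().split():
                score += math.log((counts[word] + 1) / (total_words + vocab_size))
            scores[category] = score
        return max(scores, key=scores.get)

    docs = [("theft", "stole a silver watch from the prosecutor"),
            ("theft", "feloniously stealing two loaves of bread"),
            ("assault", "did strike and beat the victim with a stick")]
    word_counts, doc_counts = train(docs)
    print(classify("stealing a watch", word_counts, doc_counts))  # -> theft

Scaled up from three toy documents to 200,000 trial records, the same arithmetic is what lets a classifier sort unlabeled trials by offence.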

Turkel’s vigorous advocacy of learning basic programming and tech tools reminds me of this post from the blog “Getting Things Done in Academia,” where physiological ecologist Carlos Martinez del Rio suggests that science grad students pick up two tools, with at least one being a programming language. This lets the eventual scientist add to their own toolkit, encourages logical thinking, and provides flexibility and enhanced ground speed when it comes to research.

This is not an attitude I’ve seen in many of the courses I’ve taken so far at SILS. There is certainly a zeal for programming and technology that arises naturally from the students themselves; they’re so fluent with the web and a zillion different web apps and sites that they can imagine a solution to a problem in their minds and see PHP, CSS, JavaScript, and so on as building blocks–or perhaps a latticework–that will eventually solve the puzzle. And I know the faculty encourages the students to explore. No one is holding them back.

But, to be fair, it’s more likely that that attitude really isn’t germane to the primarily introductory classes I’ve been taking for the last 4 semesters. I’ve only recently settled on a focus area that will help me choose courses and a line of study for the next 4 semesters. Most of the technology I’ve played with so far–such as the Protege ontology editor–has served as a fine introduction to what’s out there, but there’s no time to practice mastery.

The master’s program’s primary goal is to introduce us to a body of literature and a field of study; soak us in the basic ideas and concepts; and raise our awareness of the issues and problems that exist. If you want to go deeper and get more technical, that’s fine, you can do that, and your master’s project offers an opportunity to develop a skill if you want it. But SILS occupies an unusual position in the campus course offerings. UNC’s computer science department doesn’t offer some basic courses, so SILS feels it needs to offer them; for example, courses on web databases and XML. It’s acknowledged that these courses don’t meet the standards of those taught by the regular faculty. Still, they offer a safe place to practice and make mistakes, and that’s valuable. And, as one professor told me, if you’re smart, you’ll be able to pick up what you need and get out of it what you want. The important thing is to just start, wallow around for a while, and see what emerges.

The last word goes to Turkel, who says here that historians are better positioned than practitioners in other disciplines to pick up the basics of programming, in a passage I find rather inspiring, and not just for students:

Historians have a secret advantage when it comes to learning technical material like programming: we are already used to doing close readings of documents that are confusing, ambiguous, incomplete or inconsistent. We all sit down to our primary sources with the sense that we will understand them, even if we’re going to be confused for a while. This approach allows us to eventually produce learned books about subjects far from our own experience or training.

I believe in eating my own dogfood, and wouldn’t subject my students to anything I wouldn’t take on myself. As my own research and teaching moves more toward desktop fabrication, I’ve been reading a lot about materials science, structural engineering, machining, CNC and other subjects for which I have absolutely no preparation. It’s pretty confusing, of course, but each day it all seems a little more clear. I’ve also been making a lot of mistakes as I try to make things. As humanists, I don’t think we can do better than to follow Terence’s adage that nothing human should be alien to us. It is possible to learn anything, if you’re willing to begin in the middle.

You will have to understand that the logic of success is radically different from the logic of vocation. The logic of what our society means by “success” supposedly leads you ever upward to any higher-paying job that can be done sitting down. The logic of vocation holds that there is an indispensable justice, to yourself and to others, in doing well the work that you are “called” or prepared by your talents to do.

And so you must refuse to accept the common delusion that a career is an adequate context for a life. The logic of success insinuates that self-enlargement is your only responsibility, and that any job, any career will be satisfying if you succeed in it.

But I can tell you, on the authority of much evidence, that a lot of people highly successful by that logic are painfully dissatisfied. I can tell you further that you cannot live in a career, and that satisfaction can come only from your life. To give satisfaction, your life will have to be lived in a family, a neighborhood, a community, an ecosystem, a watershed, a place, meeting your responsibilities to all those things to which you belong.

Notes - The Book, The Internet, Literature

First heard of the "Is Google Making Us Stupid/Killing Literature" foomfahrah via this Mark Hurst post and this follow-up. Kevin Kelly was quite a player in the debate also, here and here, and all the above links will let you read all sides to your heart's desire. Clay Shirky's post questioning the "cult of literature" really popped the cork. Both Kelly and Hurst agreed with Jeremy Hatch's post that it's not the medium that disturbs your reading focus so much as your inability to discipline your reading habits, whether online or off. I wish I had the rhetorical power and skill (and time) to write a blessay on the subject, but here are the rough notes I made today as I criss-crossed cyberspace reading, skimming, and frowning. They add different vegetables to an already spicy gumbo.

  • Hatch and Kelly (and others) have no problem with reading on a computer screen. Hurst and Kelly both highlight this quote from Hatch's post: "...your ability to concentrate on a long text is not a function of the medium of delivery, but a function of your personal discipline and your aims in reading." I would say that that is probably true for Hatch, but not so true for me. I've had surgeries on both eyes for detached retinas and cataracts (and follow-up laser treatments to burn off lens plaque); reading online for long periods tires my eyes in a way that reading paper-based materials does not. Perhaps this is because the light is being pushed to my eyes via my 20" Trinitron monitor rather than being reflected off the page; I don't know. My cataract doctor also urged me and every computer user I know to use wetting drops or lubricant eye drops at least hourly. He said he's observed computer and laptop users not blinking their eyes for nearly a minute, and this aggravates dryness and irritation of the eyeball. Kelly asked for some scientific studies of how reading online is materially or measurably different from reading books. In addition to scans of brain activity, why not also check eye movements, eye health, posture, etc.?
  • Better equipment may also help. I did read a book or two on my Clie in years past and it was OK, but it's not an experience I sought out very much. (Also, reading on my Clie isn't the event that an evening spent reading a book is, for me.) My 13" MacBook has a great screen for reading, but most PDFs I get don't fit comfortably on that screen, so I often wind up changing zoom levels and scrolling around a lot. On my PC, running the monitor means running my big desktop PC with the loud fan, which is annoying. Also, the hummmm of the equipment impels me to do something--don't just read! My apparatus for online reading isn't as transparent as the typical book apparatus I'm used to. I do often print out the things I want to read and take them with me.
  • Kelly, I think, points to arguments about how word processors changed writing styles. Other commentators pointed out how every new technology changed how we created or consumed stories or (ugh) content. James Burke's series "The Day The Universe Changed" makes the point strongly that writing altered people's memories; it certainly had implications for the creation and performance of epic poems. I think it's safe to assume that the online experience will change reading habits, but we don't know how.
  • I was fascinated by Hatch's post where he said he really hasn't known life without computers around. I'm part of the generation that bridged the computing divide; I didn't use computers for full-time work until 1989, when I started using a Mac II for writing and laying out a newsletter. And the Internet (in the form of Compuserve) and the Web weren't part of my life till about 5 years afterward. Before that, yep, it was books, typewriters, and lots of scratch paper.
  • If people are having trouble reading books because they're reading online too much, it may be, as Hatch says, more a matter of discipline or habit. But we're talking experienced readers and computer users here. It may be that the computer offers wonderful distractions. But it may be a generational thing, where we older readers are comforted by the handrails a book offers: pagination, tactile response, heft, the ability to open a book to 3 places at one time to check the TOC, endnotes, and a diagram. I find I miss the handrails when reading online: I have to use a little more cognitive juice to gauge how far I've come and how far I have to go in a book (though the scroll bar suffices), I have to think about how to set a bookmark if I want to go back and check something I've read before, I have to think about how to implement marginalia. I know all of these can be done online, but I have to think about how to do them; these tasks feel more "natural" (that is to say, "practiced" and "learned" and "I already know how to do it") with a book in hand.
  • I remember a long-ago question to Marilyn vos Savant. A guy noticed he was having trouble concentrating. What was the one best thing he could do to regain his focus? Her answer: read a novel.
  • "Is Google Making Us Stupid?" Were we stupid before? Or are we letting ourselves get lazy? Is that the same thing?
  • I'll probably change this answer after reading Carr's article, but: I think the simple answer would be to just shut the damn computer off and stop the input for a while.
  • How much of our reading mechanisms are "natural"--that is to say, innate, inborn? Our brain's hardware hasn't really changed all that much over the last several thousand years. How important are training and association, and simply what we're most comfortable with? Could we refer to these latter components as the "software" running on our wonderful hardware?
  • Burke said in his series that, with a book, you could hold a man's mind in your hand, argue with him, learn from him--without having to go and see him. But books (and the publishing industry that grew up around them) eventually grew to serve as mediators and quality gates for centuries, becoming another effective barrier. If text (like music) is now flowing at us in a stream, it means that we're now again accepting unmediated information. Lots of that information may be worthless, but other mediators will arise (like the NY Times, Slate, Salon, Yahoo, and others), readers will choose which they prefer to use to sample the stream's myriad contents, and the mediation will continue, but in new forms.
  • I suppose one test you could do to check the efficacy of online vs book reading would be to have book-reader James Wood and bits-reader Jeremy Hatch read the same book in their preferred formats and see how the discussion proceeds. Does the medium change what they notice or what they talk about? Methinks that the conversation we'd overhear (and I'd love to overhear it) would be two excellent readers discussing what impressed them about the book, the (ugh) content. Instead of references to "that scene on page 12" we might instead hear "that scene where she cuts the watermelon", but that's not a big deal.
  • I do like Kelly's point about redefining what a book is, what are its boundaries. "Book" to me means a specific physical object. We need a new name, a new metaphor, a new image.
  • But truthfully, and I think even the digital partisans would agree, some subjects just work better in a book or folio form. Large-format art books, for example. I have a great big book of illuminated journals and letters that I adore turning the pages of, and my Absolute Watchmen and Alice in Sunderland volumes are just exquisite pleasures to read, browse, linger over, and they're easy on my poor eyes. I get great joy from appreciating the craft of the book, its art. There's also something about the possession of a beautiful physical object I can hold in my hands that I don't feel with digital objects.
  • Is the worry that we're becoming illiterate or aliterate? People may choose not to read because there are other things they'd rather be doing. I'd say the latter is more precisely the issue some worry about. But haven't there always been fewer literate, educated people in the world than the reverse? (How many copies of a book do you need to sell to get on the NY Times Bestseller List? Compare that to the opening weekend attendance of the worst summer movie in the world. Which is larger? By what magnitude? There's no going back.)
  • Reminds me of Gore Vidal's comment that, at the dawn of civilization, song and poetry were at the center of the culture. Then books occupied the center, and pushed poetry out to the edges. Then movies and radio occupied the center, pushed books and novels to the edges, pushing poetry even further out. Then television rose in the center, and so on and so on. While none of these earlier artforms have died out, they aren't at the center and their enthusiasts talk to each other more than they talk to the mass audience.
  • I was struck by some commentators' replies that they loved their PDAs or iPhones to read books while standing in line, making use of downtime, etc. (A friend at work calls reading while on the toilet "parallel processing.") Not to be a prig, but -- is that really the best use of your time? Wouldn't your brain benefit from no input AT ALL for just a few minutes? When I'm in line at the grocery, I'll say a mantra to just pass the time and put me in a good mood. I'd hate to start reading something, get lost in it, and then have to hurriedly close it to push my cart forward. When I start reading, I want to stay in that world for a while. When I'm not reading, I want to stay in this world and be aware of what's around me or just mull things over.
  • Kelly mentioned audiobooks as a medium that no one was talking about. I listen to mine in the car, so I only ever hear them in snippets; it makes for a somewhat disjointed experience. With Steve Martin's memoir, which I got through Audible.com, I lost the photos that appeared in the print book, but I got banjo interludes between chapters and Martin actually singing some of his songs. So that was a good trade-off.
  • Genre became an issue with Shirky's essay and with Birkerts, too; fiction vs. non-fiction seemed to be the dividing line. Would the discussion change if we were talking about poetry rather than prose? Could you read a few lines of Shakespeare or Keats or the Iliad while waiting in the grocery line, and then could you say you really read it? And what do I mean by "really reading it"? Does the context of where and how you're reading affect how you read a specific genre? (Obligatory mention of Poetry Daily, which I do visit daily.)
  • I'm surprised Wendell Berry hasn't weighed in by now (but then, someone would have to print out all the essays and send them to him). Wendell would add some more fun to the discussion.

Update: Talk about serendipity. Listened to a BBC Radio 3 discussion on the Future of the Book. In addition to talking about how a book, being self-contained, excludes other distractions, they mentioned the signaling aspects of book-readers, particularly subway or tube readers. Their choice of book signals to the other riders what kind of person they are; a "One Hundred Years of Solitude" reader might be advertising something about themselves quite different from a "Da Vinci Code" reader. One presumes a Kindle or iPhone reader is also advertising something about themselves to the people around them.

Update: "The Amazon Kindle I passed around the room was so forgettable that no one mentioned it during the next 90 minutes."

Got that? There’ll be a quiz. Originally from Little Pet’s Picture Alphabet, 1850s. (via Nonist Annex)

Bene Gesserit Litany Against Fear

I must not fear. Fear is the mind-killer. Fear is the little-death that brings total obliteration. I will face my fear. I will permit it to pass over me and through me. And when it has gone past, I will turn the inner eye to see its path. Where the fear has gone there will be nothing. Only I will remain.


Frank Herbert, Dune

Here’s some advice for successfully reading a book: You need to stay focused, so try to avoid distractions. Avoid multitasking. Avoid task switching. Turn off the TV. Shift positions occasionally so you don’t get cramps or backaches. Don’t get too comfortable or you might fall asleep. (Interestingly, many of these same rules apply to having sex, except that you can read a book with a cat in your lap.)

Stephen Fry on arguments between cousins

My previous post on winning arguments unfairly reminded me of a blog posting by the actor, writer, wit, and all-around bon vivant Stephen Fry. In this post (scroll down to “Getting Overheated”), Fry discusses how the English and Americans differ when having an argument. While he and his fellow Englishmen love a good hearty tussle of ideas, he finds Americans discomfited by the idea of argument or debate of any kind.

I was warned many, many years ago by the great Jonathan Lynn, co-creator of “Yes Minister” and director of the comic masterpiece “My Cousin Vinnie”, that Americans are not raised in a tradition of debate and that the adversarial ferocity common around a dinner table in Britain is more or less unheard of in America. When Jonathan first went to live in LA he couldn’t understand the terrible silences that would fall when he trashed a statement he disagreed with and said something like “yes, but that’s just arrant nonsense, isn’t it? It doesn’t make sense. It’s self-contradictory.” To a Briton pointing out that something is nonsense, rubbish, tosh or logically impossible in its own terms is not an attack on the person saying it – it’s often no more than a salvo in what one hopes might become an enjoyable intellectual tussle. Jonathan soon found that most Americans responded with offence, hurt or anger to this order of cut and thrust. Yes, one hesitates ever to make generalizations, but let’s be honest the cultures are different, if they weren’t how much poorer the world would be and Americans really don’t seem to be very good at or very used to the idea of a good no-holds barred verbal scrap. I’m not talking about inter-family ‘discussions’ here, I don’t doubt that within American families and amongst close friends, all kinds of liveliness and hoo-hah is possible, I’m talking about what for good or ill one might as well call dinner-party conversation. Disagreement and energetic debate appears to leave a loud smell in the air.

Studying for the GRE

I stopped updating my previous blog, Oddments of High Unimportance, after Google's Blogger-bots thought it was a spam-blog and prevented me from making posts for about 2 weeks. They finally decided I was for real and basically republished the blog, adding a "9" to the first part of the URL. This has the charming side-effect of breaking links to all of my old articles. Now, Oddments was my first blog and it was a place to just pin to the wall various Web and other ephemera that crossed my path. I messed about with blogging but was never a serious, dedicated blogger. However, I did take the time and trouble to write some longer posts now and then, and it would be a shame to lose them.

So I thought I'd rescue two of those posts, on what I learned from studying for the GRE in the summer of 2006. My commitment to the GRE project surprised even me, I must say; I knew it needed to be done and I took the steps needed to do it.

V:800 Q:640: http://highunimportance9.blogspot.com/2006/08/v800-q640.html

Rating my GRE study materials: http://highunimportance9.blogspot.com/2006/08/rating-my-gre-study-materials.html

Winning Arguments (Unfairly)

The following notes are from a 1982 book by Daniel Cohen called “Re:thinking: How to Succeed by Learning How to Think.” (Bookfinder link – this book is WAY old, people!) It struck me at the time I read it, sometime in the mid-90s, as a coherent summary of the mind literature extant in 1982 for a mainstream audience, along with basic primers on logical fallacies and the like.

It’s rather interesting to read notes on a book that predates the computer and internet revolutions. In many ways, the brain’s hardware and software haven’t changed all that much, and his advice and tips, particularly on creativity, ideas, and handling “information overload,” echo through lots of the “25 Ways to do/be/have X” posts the blogosphere is littered with.

What struck me the most from my notes were the following tips on arguing and how to unfairly win arguments. Cohen spent a bit of time in his book dealing with logical fallacies and illustrating how to break out of one’s default thinking habits. Arguing as a way to change others’ thinking habits never works, Cohen says; he characterizes arguments as street fights and asks the reader to consider the following before starting one:

  • I’m not going to change anyone’s mind and I’m probably not going to learn anything.
  • Can I walk away from this?
  • If I win, what will I win and what do I stand to lose?
  • If I lose, what do I lose and what do I stand to gain?
  • Do I know what we are really arguing about?

But if you find yourself in an argument, Cohen provides a handy checklist of ways to unfairly win an argument–or, if you’d rather, of gambits others may pull on you. I’m unfamiliar with classic debating strategies, so these may be old hat, but I found them quite interesting to review in this political season, as the Reps, Dems, and Fox News pull these tricks in press releases, media statements, chatter-TV, and the like.

  • Appear calm. Decry the opposition for his “emotionalism.”
  • A well-directed show of anger can be effective, as it puts the opposition on the defensive.
  • Be sure of your facts if the opposition knows something about the subject; otherwise, stick to generalities and attack the opposition on trivial errors.
  • Ask the opposition to cite sources–and then discredit the sources.
  • Ask the opposition to “define their terms” and then attack the definitions.
  • All-or-nothing: extend the opposition’s point to the logical (but absurd) extreme.
  • Claim the opposition has misstated your case, which puts him on the defensive.
  • If you’re trapped in a misstatement, claim your words have been taken out of context.
  • Deny inconsistency. Bring your previous statements in line with what you’ve just said.
  • Distract the opposition with a side issue.
  • Damn the alternatives.
  • Justify your position by insisting it’s necessary because of the evil deeds of the opposition.
  • Personal attack. “I never argue with such people.”
  • Be gracious, as it makes a good impression on the audience.
  • A tie is better than a loss. “You and I are basically in agreement.”
  • Declare the question not yet settled and that more investigation/thought/time is needed.

Write what you feel

Advice for the creative writer, yes. But the student? My manager is taking a summer class and his teacher told the class, "Don't write down what I say. Write down what you feel about what I say." Interesting advice for a note-taker who's thinking about regurgitating the content for the next test. My reporting background feeds into my natural tendencies to observe and notate, to somehow duplicate what I'm reading or listening to in class; it's distancing. Paraphrasing what the teacher says during a lecture is a good idea, but the cognitive load of paraphrasing, in my own words, something said a minute ago while new content streams in is too much for me.

But I like the idea of recording my reactions in class, even if they're baffled. It's fast, it's in the moment, it hooks me. Engage me on the emotional level, and I'm halfway there. That said, I can see this strategy applying more to issues-oriented topics than information retrieval algorithms. But it's a new tool I definitely want to try out this fall.

Links 18-Jul-08

  1. Convenience and impermanence. But look at the size of that keyboard! And her happy smile! This is one of the issues that's ruefully discussed in some of my SILS classes, particularly the digital archiving and electronic records courses. It's become one of those burdens we've chosen to shoulder, I think, without really examining why we do it in the first place. Or rather, we propose lots of solutions as we try to understand the problem, which is likely not a technological one at all.
  2. I love homilies and rules of thumb, and this Zhurnaly page collects a great set from physicist David Stern. It traverses the small (write yourself notes and index them) to the large ("Being a physicist is a great privilege. Be worthy of it. Most of humanity spends its life doing boring repetitive tasks."). Here's a slightly different version by Stern from his web page.

Mark Hurst's "Bit Literacy"

Mark Hurst’s book Bit Literacy: Productivity in the Age of Information and E-mail Overload attacks a problem that, of all people, my Alexander Technique therapist mentioned to me today. She said that evolution has granted our bodies numerous ways to deal with few or no calories, but no way -- except obesity -- to deal with too many calories. Likewise, our brains are adapted to recognize patterns and intuit deductions from minimal information, and they do this unconsciously and automatically. But our brains can’t naturally accommodate too much information, and the excess can stun them into paralysis. "Information overload" is the conventional term for this condition.

Hurst’s book is an attempt, in this Web 2.0 age of Lifehackery and GTD’ing, to lay out his own methods for stemming the flow of information and decreasing the sense of overwhelm.

Various reviews I found on the web marvel that this young guy -- and an MIT computer science grad, to boot -- has a seemingly curmudgeonly attitude to applications and computer habits: he uses older versions of Mac apps, he eschews Web 2.0 services, he trusts in text files and recommends copying emails you want to save into text files you store on your own hard drive.

This is the kind of book I would push on a relative or person older than me who’s not computer-literate and doesn’t quite know what to do with or how to handle the files they compile on their PC. It’s bad enough that most PC/Mac owners inevitably become their own sysadmins; it adds insult to injury that their computers don’t automatically read minds and track all the info they find interesting and keep their files and photos nice and orderly without significant manual intervention.

I was irked a bit by some of Hurst's assumptions that drive this book's messages. But even as an old computer hand, I learned -- re-learned, actually -- some good lessons and reminders regarding file-naming, directory organization, and being responsible for the bits I invite into my life.

What follows are various thoughts, criticisms, and observations about the book. For more information on Hurst, visit his web site, Good Experience, or subscribe to his sensibly formatted newsletter.

  • Hurst’s big idea is Let the bits go. Similar to the basic instructions on organization--do, delegate, defer, or delete--Hurst’s advice is to act on what’s actionable, deliberately save only what you think you need, and let the rest go. This lets you move swiftly through all the RSS feeds and downloaded files while still being able to find the one file you really need. “Just in case” is not really a good reason to save anything.
  • Hurst prefers the bits (i.e., electronically captured and shared data) over paper. Paper requires energy to produce and transport, it doesn’t scale, and it can’t capture the instant arrival and transformation of bits. Paper is old-fashioned and simply can’t keep up with the flow.
  • I disagree with all of Hurst’s opinions about paper. In regards to the energy needed to produce paper--exactly how many nuclear-, hydro-, or coal-powered plants are needed to produce the electricity for you to read these words? If paper isn’t a good repository for to-dos or information, then maybe it’s because people didn’t learn good habits on how to use it? If the bits are so wonderful, our use of paper should have naturally declined. Instead, we need Hurst’s book to tell us how to use the bits--just as many people for many years taught knowledge workers how to use and file paper. So maybe it isn't the medium that's at fault here.
  • As for the inability of paper to transform bits on the fly--if the goal is to transform an email into a to-do, then I phrase the to-do in my head (which is the hardest part of the task, incidentally--I’m continually re-learning how to phrase a to-do so it’s actionable), write down the task in my paper diary for whatever day I need to do it, and then delete the email or file it for reference. The to-do is thus ready for me to tackle when I'm ready to do it, and since I use my paper diary daily, I don't have to worry that I'll forget to do it. A paper diary well-used -- I prefer Mark Forster’s Do It Tomorrow system -- is to me superior to all the electronic tools I’ve tried.
  • I think paper is not the disadvantage. Nor are excessive bits. The disadvantage is that people haven’t decided what information is really important to them and then been schooled in how to use either method effectively. Paper and electronic methods for handling info exist and either one will work fine. But if you think that everything is important or that you may need this information “someday,” then you do curse yourself into being a custodian of huge wodges of information for a long time and that is a thankless task.
  • Hurst’s contempt for paper is oddly reflected in his self-published book’s contempt for the niceties of book design, which impairs a good reading experience. The paragraphs are separated by a blank line (drafted in a text file, no doubt) instead of more visually attractive line spacing. And--this is what really annoyed me--there’s no friggin’ index! How am I expected to find the reference to the reformatted New York Times article links? Or to the Macintosh apps he recommends? The table of contents is no help. Guess I’ll have to thumb through the book until I find the footnote on page 177 that lists them all -- but then, how will I remember them? Write them down? On PAPER?? A simple back-of-the-book index is an example of a sensible device for navigating paper-based information, exactly the kind of device that Hurst doesn’t acknowledge exists.
  • As for handling to-dos, I tried his Gootodo service and it just didn’t mesh with how I process my tasks using my paper diary and Forster’s DIT system. I agree with the school of thought that says writing things down by hand engages parts of the brain that typing doesn’t. Forster describes how the simple act of writing down an idea that occurs to you, rather than acting on it when you get it, automatically puts distance between you and the task, allowing you to think more clearly about what actually needs to be done. Deferring a task is also possible with Gootodo, of course, but I'd offer this as another example of how, if you know what you want to accomplish, either digital or paper methods should work fine.
  • It sounds like I’m anti-Hurst, but I’m not. I agree that users need to take responsibility for their “stuff,” and I’ve hit on my own file- and folder-naming strategies, similar to Hurst's, that enable me to store and scan efficiently, based on my own needs. My own flirtations with various proprietary applications like Lotus Agenda, Infoselect, and Evernote have taught me that I accumulate way more info than I ever need (“just in case”), that that info never survives intact when transformed, and that I hardly ever need that info anyway. As a result, I’m saving more stuff in txt or rtf files (usually procedures or projects I'm pursuing at the time), I’m stockpiling bookmarks in Delicious, and I'm squirreling web pages and other information away using a Gmail REF label. I don't perceive that storing them causes a cognitive burden on me. Although the bits are not truly "gone," were I to lose them, I wouldn’t be sad.
  • I liked his description of the best way to save photo files. Very good and sensible advice. I was doing something similar but tweaked my layout to match his rules. It's curious, though, that his book doesn't address ways to save and access downloaded music or video files, which are surely as ubiquitous as digital photos. Perhaps, as a Mac Man, he uses iTunes, which handles a lot of that for him. For myself, I use MediaMonkey on my PC to handle that chore, and I prefer a directory-based layout as the foundation layer for any music apps.
  • On maintaining a media diet, I agree with his statement that "an unbounded bitstream tends toward irrelevance." Alas, I still maintain too many RSS feeds, but hardly any hard-copy publications. For my RSS feeds, I have a single must-read folder, a second read-when-I-have-a-moment folder, and the rest are all optional. As with many of Hurst's other suggestions, the aim is to control the limited resource that is your time and attention; being profligate with your energy and focus on digital snack-food doesn't help your cause.
  • His chapters on file formats, naming, and storing files are what I wish I'd had when I started using PCs lo those many years ago.
  • I very much agree with his advice to find a "bit lever," which is essentially a global AutoCorrect app that will expand abbreviations to full words, phrases, paragraphs, URLs, etc. I'd also suggest a good clipboard management program. For Windows, the best is ClipMate; I haven't found a great one for the Mac, but am evaluating CopyPaste Pro. I also like having a macro program around; for the PC, I've used Macro Express for years, but ActiveWords looks good, too. As for managing passwords, I've relied on Roboform on Windows, but haven't really investigated such apps for the Mac.
  • Hurst advocates the Dvorak keyboard layout, which I pick up and put down two or three times a year. When I'm in a crunch, I usually return to Qwerty and stay there.
  • For the index: page 151 lists the programs he recommends for specifying frequently used folders and directories. I have to tip my hat to him for recommending FileBox eXtender for Windows, which I've been pretty happy with so far.
  • For screenshots: SnagIt on the PC. For backups to the cloud: JungleDisk and the Amazon S3 service.
  • Disagree about not using Excel as a database. It works quite well as a flat-file database. If you want to keep a simple list of names and addresses, a text file or Excel is preferable to a database program.
  • Most of Hurst's recommendations, though, he would probably consider small potatoes compared to his bigger vision of re-tooling users for the future as he describes it: more bits, more proprietary file formats or protections (like DRM), more social software and the implication of every bit being tracked and stored somewhere for someone to process. I think there will always be a need for strong opinions on "here's how you should do it" because many of us simply don't have the time or take the time to think through all the implications of the tools we're directed to use. These bit-level tactics will always be needed and will always need re-tooling for the next wave of technology that washes over us.
  • I think, in addition to Hurst's prescriptions, the real key will be in people deciding what they want to do with the technology, with the bits, with their digital tools. If they haven't decided what's really important to them (which is the problem addressed by Hurst's "media diet" chapter), then they'll need all the help they can get to stay on top of it. If they've decided what's of interest to them and their lives and work, then--like Donald Knuth and Alan Lightman--they can choose to eschew email and other bit-processors totally, and get on with what they were put on Earth to do.

Update 08/06/2012

I have been using Hurst's Goodtodo web service for about a year now and have woven it into my daily/weekly task management. It works great as a future-reminder system. I may blog later about how my always-evolving system, which includes Goodtodo, works nowadays.

"Why you should throw books out"

That's the title of today's post from Tyler Cowen both at his blog and as a guest blogger at Penguin. His point seems to be that the book you've read is likely not the best book you could be reading, and by passing it down the line (via donation or BookMooch or leaving it somewhere in public) your "gift" is preventing someone from reading something better. He says the calculations here are tricky; you could give the book to a friend, but if the friend is highly discriminating, then your standing in their eyes could suffer by proffering them a substandard book. Better to avoid those calculations and simply throw the book in the trash. The author has been paid, you've gotten what you want out of the book, and you've saved some poor schlub from having to make the calculations you made when you thought about buying the book in the first place.

His commenters are mainly book-lovers who beseech, implore, and adjure him to donate the book to a library for its book sale, or a thrift store, or just leave it somewhere as a serendipitous gift for someone else. They also point out that Tyler may not know his friends as well as he thinks and that the second-hand bookstore or thrift shop would know better than he what value books have in their local market.

I go through periodic book purges. My usual method is to pile them up in a box (along with any CDs I've stopped listening to) and take them to BDFAR or Nice Price for trade. Whatever they don't take, I donate to the library for their book sale. And then the box goes back into the closet to collect more books, the making of which there is no ending.

I had a friend years ago who threw away an Anais Nin book because she thought it was so trashy she couldn't bear it anymore. I remember being astonished at the time (I was in my 20s) at the thought of throwing a book into the trash. Even books I despised, I would still trade for something better. Today, I'm still more likely than not to write in the margins and trade my books if possible, even though I have less time than ever to read. My goal now is to either borrow books from the library or in some other way reduce the flow of books taking up room on my shelves, so that I reduce the time spent purging them later.