Advice to a 40-odder on re-entering school

When I let it be known around the office back in 2006 that I was interested in going back to school, and that I'd targeted UNC's SILS, an acquaintance introduced me to a friend of hers who had just gotten her MSLS degree from there. I think we talked in January or February, and I was amazed at how compact and pointed her impromptu advice was. It was a great example of how, when you make your intentions specific and known, life opens its hand and leads you where you want to go. Anyway, here's the advice she gave me, with a few tips and embellishments from me.

  • Read the professors' bios and see what their backgrounds are. Focus on the ones whose interests match yours. One of them could be your advisor after you enter BUT--start talking to your fellow students after you arrive and get their advice on potential advisors as well.
  • Avoid applying for the fall semester. Apply in January instead--the application pool is smaller and there's less competition.
  • One of the things that made me an attractive candidate was not just my work experience, but also that I wouldn't require a scholarship.
  • Information Science is wide open and encompasses a broad field. Even if you don't know exactly where you'll land in SILS, you should be able to find a place in it.
  • Feel free to call the office and ask to set up a visit. The staff is very friendly, and they often conduct tours of the building and surrounding area for prospects.
  • The GRE is a formality if the admissions committee thinks you'll contribute to the program. (That didn't make the GRE any more pleasurable!)
  • This was the best advice: she suggested taking some SILS classes, even online classes, as a continuing ed student via the Friday Center. The courses are cheaper than if you're in a degree program, and provide some familiarity with the school and professors (though adjuncts often teach the online courses). I took two classes this way and those hours transferred in very easily after I was accepted.
  • I scheduled my first cont-ed class during a summer session. This allowed me to get familiar with campus and the bus schedules at a more relaxed pace than I could have done during the general crush and chaos of the fall or spring semester. And no long line waiting to get an ID card!
  • Be aware that the Friday Center and the Graduate School are two separate entities. If you fill out the North Carolina residency form for one, you also have to fill one out for the other. An instance of the bureaucracy being set up for the bureaucracy's convenience rather than the student's. (And no, no one tells you this. You either have to find out on your own or read some big dumb blogger passing on his hard-won wisdom.)

Related post: Studying for the GRE

To suggest that people disconnect from a glorious stream of free speech, live news, and entertainment — all in the name of increased productivity — is a bit like saying that you should start timing your bathroom breaks in an effort to get them all under the 30 second mark, or that foreplay and dessert menus should both be banned.

Hallway conversations

Rachael in the elevator: "So, Mike, are you going to do a doctorate?" Dr. Tibbo as she was leaving her office: "So, Mike, has Carolyn talked to you about joining the doctoral program?"

NEVER wear white socks with dress shoes. It’s akin to “finishing” too early on your honeymoon night. Which means you should be ashamed.

Links Harvest: novels, narrative, BAE

  1. Narrative and novels as models for social relations and as simulations of economic approaches.
  2. First in a series of BBC4 radio programs on what the novelist's imagination can offer sociological research on place. Settings: the rural idyll, the city, and the suburb.
  3. "Once you've restricted yourself to information that turns up in Google searches, you begin having a very distorted view of the world...A book is not 150 successive blog entries, just like a novel isn't 150 character sketches, descriptions, and scraps of dialog." A narrative, even in a computer book, helps to order experience. Computer book author Charles Petzold on the grim economics and reality of book authorship.
  4. Grim? Grim. Writer and editor Susie Bright explains why she's stopped editing the Best American Erotica series, laments the collapse of the short-story market (no readers = no markets), and predicts what could happen next. (Her blog is NSFW, if you need to know that sort of thing.) One of many money quotes: "Book reading is not in vogue any longer, it's eccentric. No one would even bother to have an obscenity fight over text, because so few people would be in 'danger' of reading it."

Digital History Hacks

William Turkel, an assistant professor of history at the University of Western Ontario, runs a great blog, “Digital History Hacks: Methodology for the infinite archive.” I first ran across his blog last year via a couple of his research-related posts, the kind of “how to succeed at grad school” material that I continue to scarf up. One, on knowing when to stop doing research, offered great advice from one of his advisors: “Your research is done when it stops making a difference to your interpretation.”

Another post recommended writing down direct quotes and avoiding paraphrasing. He diagnoses his students’ note-taking problems as simply not using enough sources (but, again, know when it’s time to stop looking).

But what really fires Turkel up is using technology to grapple with history and I find his ideas and opinions invigorating. Similar to how historians want to get their hands on old documents, Turkel wants to use today’s digital tools to examine historical evidence.

His About page says, “In my blog I discuss the kinds of techniques that are appropriate for an archive that has near-zero transaction costs, is constantly changing and effectively infinite.” Given that one of the themes of my education is providing curated homes for digital materials, I’m curious about how he attacks the problem of dealing with digital records as historical documents, and historical documents transformed into digital records. I also think his embrace of technology–especially programming–within a humanities-oriented discipline provokes some interesting ideas on how technology could be used or promoted within the academy.

He has a definite zest for the tech side and encourages digital historians to embrace programming as a tool that’s as creative and useful and ubiquitous as email or RSS feeds have become. He has co-authored an e-book and web site called The Programming Historian that introduces the tools and basic knowledge needed to create simple programs in Python and JavaScript. The goal isn’t necessarily to become a programmer, but to introduce to historians and other scholars in the humanities a new set of tools they can use to further their research and scholarship. Instead of scouring SourceForge for a unique one-off utility, says Turkel, create your own. The intellectual experience alone is enough to grow your capacity for looking at problems in a different way and, I would say, builds your confidence for attacking bigger and more unusual problems.

Turkel provides a great example of what he’s talking about in his series of posts titled “A Naive Bayesian in the Old Bailey,” a step-by-step account of the tools and approaches he used to perform data mining on over 200,000 XML files of digitized records from the Old Bailey. His final post sums up the experience, his decisions, and the value such an endeavor can provide.
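I haven't reproduced Turkel's actual code, but the core technique he used--a naive Bayesian classifier over word counts--can be sketched in a few lines of Python. Everything below is invented for illustration: the two offence categories and the miniature "training" documents are placeholders, where his real input was over 200,000 XML trial records.

```python
# A toy naive Bayesian classifier over word counts, with add-one
# (Laplace) smoothing so unseen words don't zero out a category.
import math
from collections import Counter

def train(docs_by_label):
    """docs_by_label maps label -> list of token lists. Returns a model."""
    model = {}
    for label, docs in docs_by_label.items():
        counts = Counter(tok for doc in docs for tok in doc)
        model[label] = (counts, sum(counts.values()), len(docs))
    return model

def classify(model, tokens):
    """Return the label with the highest log-probability for the tokens."""
    vocab = {tok for counts, _, _ in model.values() for tok in counts}
    n_docs = sum(n for _, _, n in model.values())
    best_label, best_score = None, -math.inf
    for label, (counts, total, n) in model.items():
        score = math.log(n / n_docs)  # prior: fraction of docs with this label
        for tok in tokens:
            # Add-one smoothing over the vocabulary.
            score += math.log((counts[tok] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Invented miniature training set; real inputs would be tokenized records.
model = train({
    "theft":   [["stole", "watch"], ["stole", "purse", "silver"]],
    "assault": [["struck", "victim"], ["beat", "struck"]],
})
print(classify(model, ["stole", "silver", "watch"]))  # prints "theft"
```

The appeal of the method for a historian is that the "features" are just the words of the record itself, so a misclassified document is often more interesting than a correctly classified one: it tells you where the categories blur.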

Turkel’s vigorous advocacy of learning basic programming and tech tools reminds me of this post from the blog “Getting Things Done in Academia,” where physiological ecologist Carlos Martinez del Rio suggests that science grad students pick up two tools, with at least one being a programming language. This lets the eventual scientist add to their own toolkit, encourages logical thinking, and provides flexibility and enhanced ground speed when it comes to research.

This is not an attitude I’ve seen in many of the courses I’ve taken so far at SILS. There is certainly a zeal for programming and technology that arises naturally from the students themselves; they’re so fluent with the web and a zillion different web apps and sites that they can imagine a solution to a problem in their minds and see PHP, CSS, JavaScript, and so on as building blocks–or perhaps a latticework–that will eventually solve the puzzle. And I know the faculty encourages the students to explore. No one is holding them back.

But, to be fair, it’s more likely that that attitude really isn’t germane to the primarily introductory classes I’ve been taking for the last 4 semesters. I’ve only recently settled on a focus area that will help me choose courses and a line of study for the next 4 semesters. Most of the technology I’ve played with so far–such as the Protege ontology editor–has served as a fine introduction to what’s out there, but there’s no time to practice mastery.

The master’s program’s primary goal is to introduce us to a body of literature and a field of study; soak us in the basic ideas and concepts; and raise our awareness of the issues and problems that exist. If you want to go deeper and get more technical, that’s fine, you can do that, and your master’s project offers an opportunity to develop a skill if you want it. But SILS occupies an unusual position in the campus course offerings. UNC’s computer science department doesn’t offer some basic courses–on web databases and XML, for example–so SILS feels it needs to offer them. It’s acknowledged that these courses don’t meet the standards of those taught by the regular computer science faculty. Still, these courses offer a safe place to practice and make mistakes, and that’s valuable. And, as one professor told me, if you’re smart, you’ll be able to pick up what you need and get out of it what you want. The important thing is to just start, wallow around for a while, and see what emerges.

The last word goes to Turkel, who says here that historians are better positioned than practitioners in other disciplines to pick up the basics of programming, in a passage I find rather inspiring, and not just for students:

Historians have a secret advantage when it comes to learning technical material like programming: we are already used to doing close readings of documents that are confusing, ambiguous, incomplete or inconsistent. We all sit down to our primary sources with the sense that we will understand them, even if we’re going to be confused for a while. This approach allows us to eventually produce learned books about subjects far from our own experience or training.

I believe in eating my own dogfood, and wouldn’t subject my students to anything I wouldn’t take on myself. As my own research and teaching moves more toward desktop fabrication, I’ve been reading a lot about materials science, structural engineering, machining, CNC and other subjects for which I have absolutely no preparation. It’s pretty confusing, of course, but each day it all seems a little more clear. I’ve also been making a lot of mistakes as I try to make things. As humanists, I don’t think we can do better than to follow Terence’s adage that nothing human should be alien to us. It is possible to learn anything, if you’re willing to begin in the middle.

You will have to understand that the logic of success is radically different from the logic of vocation. The logic of what our society means by “success” supposedly leads you ever upward to any higher-paying job that can be done sitting down. The logic of vocation holds that there is an indispensable justice, to yourself and to others, in doing well the work that you are “called” or prepared by your talents to do.

And so you must refuse to accept the common delusion that a career is an adequate context for a life. The logic of success insinuates that self-enlargement is your only responsibility, and that any job, any career will be satisfying if you succeed in it.

But I can tell you, on the authority of much evidence, that a lot of people highly successful by that logic are painfully dissatisfied. I can tell you further that you cannot live in a career, and that satisfaction can come only from your life. To give satisfaction, your life will have to be lived in a family, a neighborhood, a community, an ecosystem, a watershed, a place, meeting your responsibilities to all those things to which you belong.

Notes - The Book, The Internet, Literature

First heard of the "Is Google Making Us Stupid/Killing Literature" foomfahrah via this Mark Hurst post and this follow-up. Kevin Kelly was quite a player in the debate also, here and here, and all the above links will let you read all sides to your heart's desire. Clay Shirky's post questioning the "cult of literature" really popped the cork. Both Kelly and Hurst agreed with Jeremy Hatch's post that it's not the medium that disturbs your reading focus so much as your inability to discipline your reading habits, whether online or off. I wish I had the rhetorical power and skill (and time) to write a blessay on the subject, but here are the rough notes I made today as I criss-crossed cyberspace reading, skimming, and frowning. They add different vegetables to an already spicy gumbo.

  • Hatch and Kelly (and others) have no problem with reading on a computer screen. Hurst and Kelly both highlight this quote from Hatch's post: "...your ability to concentrate on a long text is not a function of the medium of delivery, but a function of your personal discipline and your aims in reading." I would say that that is probably true for Hatch, but not so true for me. I've had surgeries on both eyes for detached retinas and cataracts (and follow-up laser treatments to burn off lens plaque); reading online for long periods tires my eyes in a way reading paper-based materials does not. Perhaps this is because the light is being pushed to my eyes via my 20" Trinitron monitor rather than reflected off the page; I don't know. My cataract doctor also urged me and every computer user I know to use wetting drops or lubricant eye drops at least hourly. He said he's observed computer and laptop users not blinking their eyes for nearly a minute, and this aggravates dryness and irritation of the eyeball. Kelly asked for some scientific studies of how reading online is materially or measurably different from reading books. In addition to scans of brain activity, why not also check eye movements, eye health, posture, etc.?
  • Better equipment may also help. I did read a book or two on my Clie in years past and it was OK, but it's not an experience I sought out very much. (Also, reading on my Clie isn't the event that an evening spent reading a book is, for me.) My 13" MacBook has a great screen for reading, but most PDFs I get don't fit comfortably on that screen, so I often wind up changing zoom levels and scrolling around a lot. On my PC, running the monitor means running my big desktop PC with the loud fan, which is annoying. Also, the hummmm of the equipment impels me to do something--don't just read! My apparatus for online reading isn't as transparent as the typical book apparatus I'm used to. I do often print out the things I want to read and take them with me.
  • Kelly, I think, points to arguments about how word processors changed writing styles. Other commentators pointed out how every new technology changed how we created or consumed stories or (ugh) content. James Burke's series "The Day The Universe Changed" heavily makes the point that writing altered people's memories; it certainly had implications for the creation and performance of epic poems. I think it's safe to assume that the online experience will change reading habits, but we don't know how.
  • I was fascinated by Hatch's post where he said he really hasn't known life without computers around. I'm part of the generation that bridged the computing divide; I didn't use computers for full-time work until 1989, when I started using a Mac II for writing and laying out a newsletter. And the Internet (in the form of Compuserve) and the Web weren't part of my life till about 5 years afterward. Before that, yep, it was books, typewriters, and lots of scratch paper.
  • If people are having trouble reading books because they're reading online too much, it may be, as Hatch says, more a matter of discipline or habit. But we're talking experienced readers and computer users here. It may be that the computer offers wonderful distractions. But it may be a generational thing, where we older readers are comforted by the handrails a book offers: pagination, tactile response, heft, the ability to open a book to 3 places at one time to check the TOC, endnotes, and a diagram. I find I miss the handrails when reading online: I have to use a little more cognitive juice to gauge how far I've come and how far I have to go in a book (though the scroll bar suffices), I have to think about how to set a bookmark if I want to go back and check something I've read before, I have to think about how to implement marginalia. I know all of these can be done online, but I have to think about how to do it; these tasks feel more "natural" (that is to say, "practiced" and "learned" and "I already know how to do it") with a book in hand.
  • I remember a long-ago question to Marilyn vos Savant. A guy noticed he was having trouble concentrating. What was the one best thing he could do to regain his focus? Her answer: read a novel.
  • "Is Google Making Us Stupid?" Were we stupid before? Or are we letting ourselves get lazy? Is that the same thing?
  • I'll probably change this answer after reading Carr's article, but: I think the simple answer would be to just shut the damn computer off and stop the input for a while.
  • How much of our reading mechanisms are "natural"--that is to say, innate, inborn? Our brain's hardware hasn't really changed all that much for the last several thousands of years. How important is training and association, and simply what we're most comfortable with? Could we refer to these latter components as the "software" running on our wonderful hardware?
  • Burke said in his series that, with a book, you could hold a man's mind in your hand, argue with him, learn from him--without having to go and see him. But books (and the publishing industry that grew up around them) eventually grew to serve as mediators and quality gates for centuries, becoming another effective barrier. If text (like music) is now flowing at us in a stream, it means that we're now again accepting unmediated information. Lots of that information may be worthless, but other mediators will arise (like the NY Times, Slate, Salon, Yahoo, and others), readers will choose which they prefer to use to sample the stream's myriad contents, and the mediation will continue, but in new forms.
  • I suppose one test you could do to check the efficacy of online vs book reading would be to have book-reader James Wood and bits-reader Jeremy Hatch read the same book in their preferred formats and see how the discussion proceeds. Does the medium change what they notice or what they talk about? Methinks that the conversation we'd overhear (and I'd love to overhear it) would be two excellent readers discussing what impressed them about the book, the (ugh) content. Instead of references to "that scene on page 12" we might instead hear "that scene where she cuts the watermelon", but that's not a big deal.
  • I do like Kelly's point about redefining what a book is, what are its boundaries. "Book" to me means a specific physical object. We need a new name, a new metaphor, a new image.
  • But truthfully, and I think even the digital partisans would agree, some subjects just work better in a book or folio form. Large-format art books, for example. I have a great big book of illuminated journals and letters that I adore turning the pages of, and my Absolute Watchmen and Alice in Sunderland volumes are just exquisite pleasures to read, browse, linger over, and they're easy on my poor eyes. I get great joy from appreciating the craft of the book, its art. There's also something about the possession of a beautiful physical object I can hold in my hands that I don't feel with digital objects.
  • Is the worry that we're becoming illiterate or aliterate? People may choose not to read because there are other things they'd rather be doing. I'd say the latter is more precisely the issue some worry about. But haven't there always been fewer literate, educated people in the world than the reverse? (How many copies of a book do you need to sell to get on the NY Times Bestseller List? Compare that to the opening weekend attendance of the worst summer movie in the world. Which is larger? By what magnitude? There's no going back.)
  • Reminds me of Gore Vidal's comment that, at the dawn of civilization, song and poetry were at the center of the culture. Then books occupied the center, and pushed poetry out to the edges. Then movies and radio occupied the center, pushed books and novels to the edges, pushing poetry even further out. Then television rose in the center, and so on and so on. While none of these earlier artforms have died out, they aren't at the center and their enthusiasts talk to each other more than they talk to the mass audience.
  • I was struck by some commentators' replies that they loved their PDAs or iPhones to read books while standing in line, making use of downtime, etc. (A friend at work calls reading while on the toilet "parallel processing.") Not to be a prig, but -- is that really the best use of your time? Wouldn't your brain benefit from no input AT ALL for just a few minutes? When I'm in line at the grocery, I'll say a mantra to just pass the time and put me in a good mood. I'd hate to start reading something, get lost in it, and then have to hurriedly close it to push my cart forward. When I start reading, I want to stay in that world for a while. When I'm not reading, I want to stay in this world and be aware of what's around me or just mull things over.
  • Kelly mentioned audiobooks as a medium that no one was talking about. I listen to mine in the car, so I only ever hear them in snippets; it makes for a somewhat disjointed experience. With Steve Martin's memoir, which I got through Audible.com, I lost the photos that appeared in the book, but I got banjo interludes between chapters and Martin actually singing some of his songs. So that was a good trade-off.
  • Genre became an issue with Shirky's essay and Birkerts, too; the dividing line seemed to be fiction vs. non-fiction. Would the discussion change if we were talking about poetry rather than prose? Could you read a few lines of Shakespeare or Keats or the Iliad while waiting in the grocery line, and then could you say you really read it? And what do I mean by "really reading it"? Does the context of where and how you're reading affect how you read a specific genre? (Obligatory mention of Poetry Daily, which I do visit daily.)
  • I'm surprised Wendell Berry hasn't weighed in by now (but then, someone would have to print out all the essays and send them to him). Wendell would add some more fun to the discussion.

Update: Talk about serendipity. Listened to a BBC Radio 3 discussion on the Future of the Book. In addition to talking about how a book, being self-contained, excludes other distractions, they mentioned the signaling aspects of book-readers, particularly subway or tube readers. Their choice of book signals to the other riders what kind of person they are; a "One Hundred Years of Solitude" reader might be advertising something about themselves quite different from a "Da Vinci Code" reader. One presumes a Kindle or iPhone reader is also advertising something about themselves to the people around them.

Update: "The Amazon Kindle I passed around the room was so forgettable that no one mentioned it during the next 90 minutes."

“There is nothing funny about a clown in the moonlight.” -Lon Chaney

Got that? There’ll be a quiz. Originally from Little Pet’s Picture Alphabet, 1850s. (via Nonist Annex)

Look, a lot of people out there write editorials and try to persuade you to change an opinion based on “evidence”. All I’m saying is this – I don’t need that phony shit. When I say something, you can believe it. Turn off that thing in your brain that questions things, and just listen to the truth in front of you. This is how America was made. (Fact.). ((Fun fact - Things in parentheses – always fact.))

Bene Gesserit Litany Against Fear

I must not fear. Fear is the mind-killer. Fear is the little-death that brings total obliteration. I will face my fear. I will permit it to pass over me and through me. And when it has gone past, I will turn the inner eye to see its path. Where the fear has gone there will be nothing. Only I will remain.


Frank Herbert, Dune

Here’s some advice for successfully reading a book: You need to stay focused, so try to avoid distractions. Avoid multitasking. Avoid task switching. Turn off the TV. Shift positions occasionally so you don’t get cramps or backaches. Don’t get too comfortable or you might fall asleep. (Interestingly, many of these same rules apply to having sex, except that you can read a book with a cat in your lap.)

America’s battle is yet to fight; and we, sorrowful though nothing doubting, will wish her strength for it. New Spiritual Pythons, plenty of them; enormous Megatherions, as ugly as were ever born of mud, loom huge and hideous out of the twilight Future on America; and she will have her own agony, and her own victory, but on other terms than she is yet quite aware of. — Thomas Carlyle, 1850.

Make of it what you will.

Stephen Fry on arguments between cousins

My previous post on winning arguments unfairly reminded me of a blog posting by the actor, writer, wit, and all-around bon vivant Stephen Fry. In this post (scroll down to “Getting Overheated”), Fry discusses how Englishers and Americans differ when having an argument. While he and his fellow Englishmen love a good hearty tussle of ideas, he finds Americans discomfited by the idea of argument or debate of any kind.

I was warned many, many years ago by the great Jonathan Lynn, co-creator of “Yes Minister” and director of the comic masterpiece “My Cousin Vinny”, that Americans are not raised in a tradition of debate and that the adversarial ferocity common around a dinner table in Britain is more or less unheard of in America. When Jonathan first went to live in LA he couldn’t understand the terrible silences that would fall when he trashed a statement he disagreed with and said something like “yes, but that’s just arrant nonsense, isn’t it? It doesn’t make sense. It’s self-contradictory.” To a Briton pointing out that something is nonsense, rubbish, tosh or logically impossible in its own terms is not an attack on the person saying it – it’s often no more than a salvo in what one hopes might become an enjoyable intellectual tussle. Jonathan soon found that most Americans responded with offence, hurt or anger to this order of cut and thrust. Yes, one hesitates ever to make generalizations, but let’s be honest the cultures are different, if they weren’t how much poorer the world would be and Americans really don’t seem to be very good at or very used to the idea of a good no-holds-barred verbal scrap. I’m not talking about inter-family ‘discussions’ here, I don’t doubt that within American families and amongst close friends, all kinds of liveliness and hoo-hah is possible, I’m talking about what for good or ill one might as well call dinner-party conversation. Disagreement and energetic debate appears to leave a loud smell in the air.