Rachael in the elevator: "So, Mike, are you going to do a doctorate?" Dr. Tibbo as she was leaving her office: "So, Mike, has Carolyn talked to you about joining the doctoral program?"
William Turkel, an assistant professor of history at the University of Western Ontario, runs a great blog, “Digital History Hacks: Methodology for the infinite archive.” I first ran across his blog last year via a couple of his research-related posts, the kind of “how to succeed at grad school” material that I continue to scarf up. One, on knowing when to stop doing research, offered great advice from one of his advisors: “Your research is done when it stops making a difference to your interpretation.”
Another post recommended just writing down the direct quotes and avoiding paraphrasing. He diagnoses his students’ note-taking problems as simply not using enough sources (but, again, know when it’s time to stop looking).
But what really fires Turkel up is using technology to grapple with history, and I find his ideas and opinions invigorating. Just as historians want to get their hands on old documents, Turkel wants to use today's digital tools to examine historical evidence.
His About page says, “In my blog I discuss the kinds of techniques that are appropriate for an archive that has near-zero transaction costs, is constantly changing and effectively infinite.” Given that one of the themes of my education is providing curated homes for digital materials, I’m curious about his approach to digital records as historical documents, and to historical documents transformed into digital records. I also think his embrace of technology–especially programming–within a humanities-oriented discipline provokes some interesting ideas on how technology could be used or promoted within the academy.
He has a definite zest for the tech side and encourages digital historians to embrace programming as a tool that’s as creative, useful, and ubiquitous as email or RSS feeds have become. He has co-authored an e-book and web site called The Programming Historian that introduces the tools and basic knowledge needed to create simple programs in Python and JavaScript. The goal isn’t necessarily to become a programmer, but to introduce historians and other scholars in the humanities to a new set of tools they can use to further their research and scholarship. Instead of scouring SourceForge for a unique one-off utility, says Turkel, create your own. The intellectual experience alone is enough to grow your capacity for looking at problems in a different way and, I would say, to build your confidence for attacking bigger and more unusual problems.
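To make that concrete, here is the sort of one-off utility he means. This is my own toy sketch in Python, not an example from The Programming Historian, and the filename is hypothetical: it counts the most frequent words in a plain-text source.

    import re
    from collections import Counter

    def word_frequencies(path, top_n=20):
        """Return the top_n most common words in a text file."""
        with open(path, encoding="utf-8") as f:
            words = re.findall(r"[a-z']+", f.read().lower())
        return Counter(words).most_common(top_n)

    if __name__ == "__main__":
        # "trial_transcript.txt" is a stand-in for whatever source you're mining.
        for word, count in word_frequencies("trial_transcript.txt"):
            print(word, count)

Ten minutes of work, and you have a tool shaped exactly to your question; that, as I read him, is the pitch.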
Turkel provides a great example of what he’s talking about in his series of posts titled “A Naive Bayesian in the Old Bailey,” a step-by-step account of the tools and approaches he used to perform data mining on over 200,000 XML files of digitized records from the Old Bailey. His final post sums up the experience, his decisions, and the value such an endeavor can provide.
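For anyone wondering what “a naive Bayesian” actually does, here is a toy Python sketch of the core technique. It is emphatically not Turkel’s code, and the three “documents” below are made up, standing in for the tagged trial records his series actually mines: train on labeled text, then assign a new document the category with the highest Laplace-smoothed log-posterior.

    import math
    from collections import Counter, defaultdict

    def train(docs):
        """docs: list of (category, text) pairs. Returns counts for classification."""
        cat_counts = Counter()
        word_counts = defaultdict(Counter)
        vocab = set()
        for cat, text in docs:
            cat_counts[cat] += 1
            for w in text.lower().split():
                word_counts[cat][w] += 1
                vocab.add(w)
        return cat_counts, word_counts, vocab

    def classify(text, cat_counts, word_counts, vocab):
        """Return the category with the highest Laplace-smoothed log-posterior."""
        total_docs = sum(cat_counts.values())
        best_cat, best_score = None, float("-inf")
        for cat in cat_counts:
            score = math.log(cat_counts[cat] / total_docs)
            total_words = sum(word_counts[cat].values())
            for w in text.lower().split():
                score += math.log((word_counts[cat][w] + 1) / (total_words + len(vocab)))
            if score > best_score:
                best_cat, best_score = cat, score
        return best_cat

    # Hypothetical training data standing in for tagged trial records.
    docs = [
        ("theft", "stole a silver watch from the prosecutor"),
        ("theft", "took goods from the shop without paying"),
        ("assault", "struck the victim with a stick in the street"),
    ]
    model = train(docs)
    print(classify("he stole a watch", *model))  # -> theft

Scale that idea up to 200,000 XML files and you get a sense of both the promise and the drudgery Turkel describes.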
Turkel’s vigorous advocacy of learning basic programming and tech tools reminds me of this post from the blog “Getting Things Done in Academia,” where physiological ecologist Carlos Martinez del Rio suggests that science grad students pick up two tools, at least one of them a programming language. These add to the eventual scientist’s toolkit, encourage logical thinking, and make for flexibility and enhanced ground speed when it comes to research.
This is not an attitude I’ve seen in many of the courses I’ve taken so far at SILS. There is certainly a zeal for programming and technology that arises naturally from the students themselves; they’re so fluent with the web and a zillion different web apps and sites that they can imagine a solution to a problem in their minds and see PHP, CSS, JavaScript, and so on as building blocks–or perhaps a latticework–that will eventually solve the puzzle. And I know the faculty encourage the students to explore. No one is holding them back.
But, to be fair, it’s more likely that the attitude simply isn’t germane to the primarily introductory classes I’ve been taking for the last 4 semesters. I’ve only recently settled on a focus area that will help me choose courses and a line of study for the next 4 semesters. Most of the technology I’ve played with so far–such as the Protégé ontology editor–has served as a fine introduction to what’s out there, but there’s been no time to work toward mastery.
The master’s program’s primary goal is to introduce us to a body of literature and a field of study; to soak us in the basic ideas and concepts; and to raise our awareness of the issues and problems that exist. If you want to go deeper and get more technical, that’s fine, you can do that, and your master’s project offers an opportunity to develop a skill if you want it. But SILS occupies an unusual position in the campus course offerings. UNC’s computer science department doesn’t offer some basic courses–on web databases and XML, for example–so SILS feels it needs to offer them. It’s acknowledged that these courses aren’t up to the standards of those taught by the regular faculty. Still, they offer a safe place to practice and make mistakes, and that’s valuable. And, as one professor told me, if you’re smart, you’ll be able to pick up what you need and get out of them what you want. The important thing is to just start, wallow around for a while, and see what emerges.
The last word goes to Turkel, who argues here, in a passage I find rather inspiring (and not just for students), that historians are better positioned than practitioners in other disciplines to pick up the basics of programming:
Historians have a secret advantage when it comes to learning technical material like programming: we are already used to doing close readings of documents that are confusing, ambiguous, incomplete or inconsistent. We all sit down to our primary sources with the sense that we will understand them, even if we’re going to be confused for a while. This approach allows us to eventually produce learned books about subjects far from our own experience or training.
I believe in eating my own dogfood, and wouldn’t subject my students to anything I wouldn’t take on myself. As my own research and teaching moves more toward desktop fabrication, I’ve been reading a lot about materials science, structural engineering, machining, CNC and other subjects for which I have absolutely no preparation. It’s pretty confusing, of course, but each day it all seems a little more clear. I’ve also been making a lot of mistakes as I try to make things. As humanists, I don’t think we can do better than to follow Terence’s adage that nothing human should be alien to us. It is possible to learn anything, if you’re willing to begin in the middle.
You will have to understand that the logic of success is radically different from the logic of vocation. The logic of what our society means by “success” supposedly leads you ever upward to any higher-paying job that can be done sitting down. The logic of vocation holds that there is an indispensable justice, to yourself and to others, in doing well the work that you are “called” or prepared by your talents to do.
And so you must refuse to accept the common delusion that a career is an adequate context for a life. The logic of success insinuates that self-enlargement is your only responsibility, and that any job, any career will be satisfying if you succeed in it.
But I can tell you, on the authority of much evidence, that a lot of people highly successful by that logic are painfully dissatisfied. I can tell you further that you cannot live in a career, and that satisfaction can come only from your life. To give satisfaction, your life will have to be lived in a family, a neighborhood, a community, an ecosystem, a watershed, a place, meeting your responsibilities to all those things to which you belong.
You will be told also – ignoring our permanent dependence on food, clothing, and shelter – that you live in a “knowledge-based economy,” which in fact is deeply prejudiced against all knowledge that does not produce the quickest possible return on investment.
First heard of the "Is Google Making Us Stupid/Killing Literature" foofaraw via this Mark Hurst post and this follow-up. Kevin Kelly was quite a player in the debate also, here and here, and all the above links will let you read all sides to your heart's desire. Clay Shirky's post questioning the "cult of literature" really popped the cork. Both Kelly and Hurst agreed with Jeremy Hatch's post that it's not the medium that disturbs your reading focus so much as your inability to discipline your reading habits, whether online or off. I wish I had the rhetorical power and skill (and time) to write a blessay on the subject, but here are the rough notes I made today as I criss-crossed cyberspace reading, skimming, and frowning. They add different vegetables to an already spicy gumbo.
Update: Talk about serendipity. Listened to a BBC Radio 3 discussion on the Future of the Book. In addition to talking about how a book, being self-contained, excludes other distractions, they mentioned the signaling aspects of book-readers, particularly subway or tube readers. Their choice of book signals to the other riders what kind of person they are; a "One Hundred Years of Solitude" reader might be advertising something about themselves quite different from what a "Da Vinci Code" reader advertises. One presumes a Kindle or iPhone reader is also advertising something about themselves to the people around them.
The only thing you get to do in this world is choose what a good life is and then aim for it. But that requires being opinionated. Every day you are choosing what’s a good life for you.
Got that? There’ll be a quiz. Originally from Little Pet’s Picture Alphabet, 1850s. (via Nonist Annex)
Bene Gesserit Litany Against Fear
I must not fear. Fear is the mind-killer. Fear is the little-death that brings total obliteration. I will face my fear. I will permit it to pass over me and through me. And when it has gone past, I will turn the inner eye to see its path. Where the fear has gone there will be nothing. Only I will remain.
Here’s some advice for successfully reading a book: You need to stay focused, so try to avoid distractions. Avoid multitasking. Avoid task switching. Turn off the TV. Shift positions occasionally so you don’t get cramps or backaches. Don’t get too comfortable or you might fall asleep. (Interestingly, many of these same rules apply to having sex, except that you can read a book with a cat in your lap.)
Declining book sales have led some publishers into thinking that the way to revive books is to make them more like an online experience. That is truly a mistake! It’s like trying to get people to exercise by making it more like napping.
My previous post on winning arguments unfairly reminded me of a blog posting by the actor, writer, wit, and all-around bon vivant Stephen Fry. In that post (scroll down to “Getting Overheated”), Fry discusses how the English and Americans differ when having an argument. While he and his fellow Englishmen love a good hearty tussle of ideas, he finds Americans discomfited by the idea of argument or debate of any kind.
I was warned many, many years ago by the great Jonathan Lynn, co-creator of “Yes Minister” and director of the comic masterpiece “My Cousin Vinny”, that Americans are not raised in a tradition of debate and that the adversarial ferocity common around a dinner table in Britain is more or less unheard of in America. When Jonathan first went to live in LA he couldn’t understand the terrible silences that would fall when he trashed a statement he disagreed with and said something like “yes, but that’s just arrant nonsense, isn’t it? It doesn’t make sense. It’s self-contradictory.” To a Briton pointing out that something is nonsense, rubbish, tosh or logically impossible in its own terms is not an attack on the person saying it – it’s often no more than a salvo in what one hopes might become an enjoyable intellectual tussle. Jonathan soon found that most Americans responded with offence, hurt or anger to this order of cut and thrust. Yes, one hesitates ever to make generalizations, but let’s be honest the cultures are different, if they weren’t how much poorer the world would be and Americans really don’t seem to be very good at or very used to the idea of a good no-holds-barred verbal scrap. I’m not talking about inter-family ‘discussions’ here, I don’t doubt that within American families and amongst close friends, all kinds of liveliness and hoo-hah is possible, I’m talking about what for good or ill one might as well call dinner-party conversation. Disagreement and energetic debate appears to leave a loud smell in the air.
Don’t pack for the worst scenario. Pack for the best scenario and simply buy yourself out of any jams.
Look for kindred souls. They are few and far between, and nothing is more precious.
I stopped updating my previous blog, Oddments of High Unimportance, after Google's Blogger-bots decided it was a spam-blog and prevented me from making posts for about 2 weeks. They finally decided I was for real and basically republished the blog, adding a "9" to the first part of the URL. This has the charming side-effect of breaking links to all of my old articles. Now, Oddments was my first blog, and it was a place to just pin to the wall various Web and other ephemera that crossed my path. I messed about with blogging but was never a serious, dedicated blogger. However, I did take the time and trouble to write some longer posts now and then, and it would be a shame to lose them.
So I thought I'd rescue two of those posts, on what I learned from studying for the GRE in the summer of 2006. My commitment to the GRE project surprised even me, I must say; I knew it needed to be done and I took the steps needed to do it.
V:800 Q:640: http://highunimportance9.blogspot.com/2006/08/v800-q640.html
Rating my GRE study materials: http://highunimportance9.blogspot.com/2006/08/rating-my-gre-study-materials.html
The following notes are from a 1982 book by Daniel Cohen called “Re:thinking: How to Succeed by Learning How to Think.” (Bookfinder link – this book is WAY old, people!) It struck me when I read it, sometime in the mid-’90s, as a coherent summary of the mind literature extant in 1982 for a mainstream audience, along with basic primers on logical fallacies and the like.
It’s rather interesting to read notes on a book that predates the computer and internet revolutions. In many ways, the brain’s hardware and software haven’t changed all that much, and his advice and tips, particularly on creativity, ideas, and handling “information overload,” echo through lots of the “25 Ways to do/be/have X” posts the blogosphere is littered with.
What struck me the most from my notes were the following tips on arguing and how to unfairly win arguments. Cohen spent a bit of time in his book dealing with logical fallacies and illustrating how to break out of one’s default thinking habits. Arguing as a way to change others’ thinking habits never works, Cohen says; he characterizes arguments as street fights and asks the reader to consider the following before starting one:
But if you find yourself in an argument, Cohen provides a handy checklist of ways to unfairly win an argument–or, if you’d rather, how others may pull these gambits on you. I’m unfamiliar with classic debating strategies so these may be old-hat, but I found it quite interesting to review in this political season, as the Reps, Dems, and Fox News pull these tricks in press releases, media statements, chatter-TV, and the like.
Sometimes I get emails that are more than two pages long, attempting to explain a problem. I’m going to tell you something: All career problems can be described in under 100 words. If you are going over 100 words, you don’t know your problem. If you are going over 1000 words, it’s because your self-knowledge is really bad, so that is your problem.
Advice for the creative writer, yes. But the student? My manager is taking a summer class, and his teacher told the class, "Don't write down what I say. Write down what you feel about what I say." Interesting advice for a note-taker who's thinking about regurgitating the content for the next test. My reporting background feeds my natural tendencies to observe and notate, to somehow duplicate what I'm reading or listening to in class; it's distancing. Paraphrasing what the teacher says during a lecture is a good idea, but the cognitive load of recasting something said a minute ago into my own words, while new content is still streaming in, is too much for me.
But I like the idea of recording my reactions in class, even if they're baffled. It's fast, it's in the moment, it hooks me. Engage me on the emotional level, and I'm halfway there. That said, I can see this strategy applying more to issues-oriented topics than information retrieval algorithms. But it's a new tool I definitely want to try out this fall.
The bottom line is that you should never spend more than $1500 on art unless you know at least roughly what it is worth at auction. One of life’s good rules of thumb.
Mark Hurst’s book Bit Literacy: Productivity in the Age of Information and E-mail Overload attacks a problem that, of all people, my Alexander Technique therapist mentioned to me today. She said that evolution has granted our bodies numerous ways to deal with few or no calories, but no way -- except obesity -- to deal with too many calories. Likewise, our brains are adapted to recognize patterns and intuit deductions from minimal information, and they do this unconsciously and automatically. But our brains can’t naturally accommodate too much information; it can stun them into paralysis. "Information overload" is the conventional term for this condition.
Hurst’s book is an attempt in this Web 2.0 age of Lifehackery and GTD’ing to advise on his own methods of stemming the flow of information so as to decrease the sense of overwhelm.
Various reviews I found on the web marvel that this young guy -- and an MIT computer science grad, to boot -- has a seemingly curmudgeonly attitude to applications and computer habits: he uses older versions of Mac apps, he eschews Web 2.0 services, he trusts in text files and recommends copying emails you want to save into text files you store on your own hard drive.
This is the kind of book I would push on a relative or person older than me who’s not computer-literate and doesn’t quite know what to do with or how to handle the files they compile on their PC. It’s bad enough that most PC/Mac owners inevitably become their own sysadmins; it adds insult to injury that their computers don’t automatically read minds and track all the info they find interesting and keep their files and photos nice and orderly without significant manual intervention.
I was irked a bit by some of Hurst's assumptions that drive this book's messages. But even as an old computer hand, I learned -- re-learned, actually -- some good lessons and reminders regarding file-naming, directory organization, and being responsible for the bits I invite into my life.
What follows are various thoughts, criticisms, and observations about the book. For more information on Hurst, visit his web site, Good Experience, or subscribe to his sensibly formatted newsletter.
Update 08/06/2012
I have been using Hurst's Goodtodo web service for about a year now and have woven it into my daily/weekly task management. It works great as a future reminder system. I may blog later about how my always evolving system, which includes Goodtodo, works nowadays.
Nowadays, instead of saying, “He’s a prick,” I’ll say, “He’s complicated.”
That's the title of today's post from Tyler Cowen both at his blog and as a guest blogger at Penguin. His point seems to be that the book you've read is likely not the best book you could be reading, and by passing it down the line (via donation or BookMooch or leaving it somewhere in public) your "gift" is preventing someone from reading something better. He says the calculations here are tricky; you could give the book to a friend, but if the friend is highly discriminating, then your standing in their eyes could suffer if you proffer a substandard book. Better to avoid those calculations and simply throw the book in the trash. The author has been paid, you've gotten what you want out of the book, and you've saved some poor schlub from having to make the calculations you made when you thought about buying the book in the first place.
His commenters are mainly book-lovers who beseech, implore, and adjure him to donate the book to a library for its book sale, or to a thrift store, or to just leave it somewhere as a serendipitous gift for someone else. They also point out that Tyler may not know his friends as well as he thinks, and that a secondhand bookstore or thrift shop would know better than he what value books have in their local market.
I go through periodic book purges. My usual method is to pile them up in a box (along with any CDs I've stopped listening to) and take them to BDFAR or Nice Price for trade. Whatever they don't take, I donate to the library for their book sale. And then the box goes back into the closet to collect more books, the making of which there is no ending.
I had a friend years ago who threw away an Anais Nin book because she thought it was so trashy she couldn't bear it anymore. I remember being astonished at the time (I was in my 20s) at the thought of throwing a book into the trash. Even for books I despised, I still would trade them for something better. Today, I'm still more likely than not to write in the margins and trade them if possible, even though I have less time than ever to read books. My goal now is to either borrow books from the library or in some other way reduce the flow taking up room on my shelves, so that I spend less time purging them later.
Do you read a lot of contemporary fiction these days? Like everyone else, no, I don’t.