Oddments of High Unimportance
  • Look for kindred souls. They are few and far between, and nothing is more precious.
    All I Really Need to Know about Physics I had to Dig Up by Myself by David P. Stern
    → 10:06 AM, Jul 22
  • Studying for the GRE

    I stopped updating my previous blog, Oddments of High Unimportance, after Google's Blogger-bots thought I was a spam-blog and prevented me from making posts for about 2 weeks. They finally decided I was for real and basically republished the blog, adding a "9" to the first part of the URL. This has the charming side-effect of breaking links to all of my old articles. Now, Oddments was my first blog, and it was a place to just pin to the wall various Web and other ephemera that crossed my path. I messed about with blogging but was never a serious, dedicated blogger. However, I did take the time and trouble to write some longer posts now and then, and it would be a shame to lose them.

    So I thought I'd rescue two of those posts, on what I learned from studying for the GRE in the summer of 2006. My commitment to the GRE project surprised even me, I must say; I knew it needed to be done and I took the steps needed to do it.

    V:800 Q:640: http://highunimportance9.blogspot.com/2006/08/v800-q640.html

    Rating my GRE study materials: http://highunimportance9.blogspot.com/2006/08/rating-my-gre-study-materials.html

    → 6:37 PM, Jul 21
  • Winning Arguments (Unfairly)

    The following notes are from a 1982 book by Daniel Cohen called "Re:thinking: How to Succeed by Learning How to Think." (Bookfinder link -- this book is WAY old, people!) It struck me at the time I read it, sometime in the mid-90's, as a coherent summary of the mind literature extant in 1982 for a mainstream audience, along with basic primers on logical fallacies and the like.

    It's rather interesting to read notes on a book that predates the computer and internet revolutions. In many ways, the brain's hardware and software haven't changed all that much, and his advice and tips, particularly on creativity, ideas, and handling "information overload," echo through lots of the "25 Ways to do/be/have X" posts the blogosphere is littered with.

    What struck me the most from my notes were the following tips on arguing and how to unfairly win arguments. Cohen spent a bit of time in his book dealing with logical fallacies and illustrating how to break out of one's default thinking habits. Arguing as a way to change others' thinking habits never works, Cohen says; he characterizes arguments as street fights and asks the reader to consider the following before starting one:

    • I'm not going to change anyone's mind and I'm probably not going to learn anything.
    • Can I walk away from this?
    • If I win, what will I win and what do I stand to lose?
    • If I lose, what do I lose and what do I stand to gain?
    • Do I know what we are really arguing about?

    But if you find yourself in an argument, Cohen provides a handy checklist of ways to unfairly win an argument--or, if you'd rather, how others may pull these gambits on you. I'm unfamiliar with classic debating strategies so these may be old-hat, but I found it quite interesting to review in this political season, as the Reps, Dems, and Fox News pull these tricks in press releases, media statements, chatter-TV, and the like.

    • Appear calm. Decry the opposition for his "emotionalism."
    • A well-directed show of anger can be effective, as it puts the opposition on the defensive.
    • Be sure of your facts if the opposition knows something about the subject; otherwise, stick to generalities and attack the opposition on trivial errors.
    • Ask the opposition to cite sources--and then discredit the sources.
    • Ask the opposition to "define their terms" and then attack the definitions.
    • All-or-nothing: extend the opposition's point to the logical (but absurd) extreme.
    • Claim the opposition has misstated your case, which puts him on the defensive.
    • If you're trapped in a misstatement, claim your words have been taken out of context.
    • Deny inconsistency. Bring your previous statements in line with what you've just said.
    • Distract the opposition with a side issue.
    • Damn the alternatives.
    • Justify your position by insisting it's necessary because of the evil deeds of the opposition.
    • Personal attack. "I never argue with such people."
    • Be gracious, as it makes a good impression on the audience.
    • A tie is better than a loss. "You and I are basically in agreement."
    • Declare the question not yet settled and that more investigation/thought/time is needed.
    → 6:34 PM, Jul 21
  • Sometimes I get emails that are more than two pages long, attempting to explain a problem. I’m going to tell you something: All career problems can be described in under 100 words. If you are going over 100 words, you don’t know your problem. If you are going over 1000 words, it’s because your self-knowledge is really bad, so that is your problem.
    Three bad career questions people ask me all the time » Brazen Careerist by Penelope Trunk
    → 12:40 PM, Jul 19
  • Write what you feel

    Advice for the creative writer, yes. But the student? My manager is taking a summer class and his teacher told the class, "Don't write down what I say. Write down what you feel about what I say." Interesting advice for a note-taker who's thinking about regurgitating the content for the next test. My reporting background feeds into my natural tendencies to observe and notate, to somehow duplicate what I'm reading or listening to in class; it's distancing. Paraphrasing what the teacher says during a lecture is a good idea, but the cognitive load of paraphrasing something said a minute ago in my own words as new content is also streaming in is too much for me.

    But I like the idea of recording my reactions in class, even if they're baffled. It's fast, it's in the moment, it hooks me. Engage me on the emotional level, and I'm halfway there. That said, I can see this strategy applying more to issues-oriented topics than information retrieval algorithms. But it's a new tool I definitely want to try out this fall.

    → 8:06 PM, Jul 18
  • Links 18-Jul-08

    1. Convenience and impermanence. But look at the size of that keyboard! And her happy smile! This is one of the issues that's ruefully discussed in some of my SILS classes, particularly the digital archiving and electronic records courses. It's become one of those burdens we've chosen to shoulder, I think, without really examining why we do it in the first place. Or rather, we propose lots of solutions as we try to understand the problem, which is likely not a technological one at all.
    2. I love homilies and rules of thumb, and this Zhurnaly page collects a great set from physicist David Stern. It traverses the small (write yourself notes and index them) to the large ("Being a physicist is a great privilege. Be worthy of it. Most of humanity spends its life doing boring repetitive tasks."). Here's a slightly different version by Stern from his web page.
    → 7:59 PM, Jul 18
  • The bottom line is that you should never spend more than $1500 on art unless you know at least roughly what it is worth at auction. One of life’s good rules of thumb.
    Marginal Revolution: Do not buy art on cruise ships
    → 5:57 PM, Jul 18
  • Mark Hurst's "Bit Literacy"

    Mark Hurst’s book Bit Literacy: Productivity in the Age of Information and E-mail Overload attacks a problem that, of all people, my Alexander Technique therapist mentioned to me today. She said that evolution has granted our bodies numerous ways to deal with few or no calories, but no way -- except obesity -- to deal with too many calories. Likewise, our brains are adapted to recognize patterns and intuit deductions from minimal information, and they do this unconsciously and automatically. But our brains can’t naturally accommodate too much information; the excess can stun them into paralysis. "Information overload" is the conventional term for this condition.

    Hurst’s book is an attempt in this Web 2.0 age of Lifehackery and GTD’ing to advise on his own methods of stemming the flow of information so as to decrease the sense of overwhelm.

    Various reviews I found on the web marvel that this young guy -- and an MIT computer science grad, to boot -- has a seemingly curmudgeonly attitude to applications and computer habits: he uses older versions of Mac apps, he eschews Web 2.0 services, he trusts in text files and recommends copying emails you want to save into text files you store on your own hard drive.

    This is the kind of book I would push on a relative or person older than me who’s not computer-literate and doesn’t quite know what to do with or how to handle the files they compile on their PC. It’s bad enough that most PC/Mac owners inevitably become their own sysadmins; it adds insult to injury that their computers don’t automatically read minds and track all the info they find interesting and keep their files and photos nice and orderly without significant manual intervention.

    I was irked a bit by some of Hurst's assumptions that drive this book's messages. But even as an old computer hand, I learned -- re-learned, actually -- some good lessons and reminders regarding file-naming, directory organization, and being responsible for the bits I invite into my life.

    What follows are various thoughts, criticisms, and observations about the book. For more information on Hurst, visit his web site, Good Experience, or subscribe to his sensibly formatted newsletter.

    • Hurst’s big idea is Let the bits go. Similar to the basic instructions on organization--do, delegate, defer, or delete--Hurst’s advice is to act on what’s actionable, deliberately save only what you think you need, and let the rest go. This lets you move swiftly through all the RSS feeds and downloaded files while still being able to find the one file you really need. “Just in case” is not really a good reason to save anything.
    • Hurst prefers the bits (i.e., electronically captured and shared data) over paper. Paper requires energy to produce and transport, it doesn’t scale, and it can’t capture the instant arrival and transformation of bits. Paper is old-fashioned and simply can’t keep up with the flow.
    • I disagree with all of Hurst’s opinions about paper. As for the energy needed to produce paper--exactly how many nuclear-, hydro-, or coal-powered plants are needed to produce the electricity for you to read these words? If paper isn’t a good repository for to-dos or information, then maybe it’s because people never learned good habits for using it? If the bits are so wonderful, our use of paper should have naturally declined. Instead, we need Hurst’s book to tell us how to use the bits--just as, for many years, many people taught knowledge workers how to use and file paper. So maybe it isn't the medium that's at fault here.
    • As for the inability of paper to transform bits on the fly--if the goal is to transform an email into a to-do, then I phrase the to-do in my head (which is the hardest part of the task, incidentally--I’m continually re-learning how to phrase a to-do so it’s actionable), write down the task in my paper diary for whatever day I need to do it, and then delete the email or file it for reference. The to-do is thus ready for me to tackle when I'm ready to do it, and since I use my paper diary daily, I don't have to worry that I'll forget to do it. A paper diary well-used -- I prefer Mark Forster’s Do It Tomorrow system -- is to me superior to all the electronic tools I’ve tried.
    • I think paper is not the disadvantage. Nor are excessive bits. The disadvantage is that people haven’t decided what information is really important to them and then been schooled in how to use either method effectively. Paper and electronic methods for handling info exist and either one will work fine. But if you think that everything is important or that you may need this information “someday,” then you do curse yourself into being a custodian of huge wodges of information for a long time and that is a thankless task.
    • Hurst’s contempt for paper is oddly reflected in his self-published book's contempt for the niceties of book design, which impairs the reading experience. The paragraphs are separated by blank lines (drafted in a text file, no doubt) instead of more visually attractive spacing. And--this is what really annoyed me--there’s no friggin’ index! How am I expected to find the reference to the reformatted New York Times article links? Or to the Macintosh apps he recommends? The table of contents is no help. Guess I’ll have to thumb through the book until I find the footnote on page 177 that lists them all -- but then, how will I remember them? Write them down? On PAPER?? A simple back-of-the-book index is an example of a sensible device for navigating paper-based information, exactly the kind of device that Hurst doesn’t acknowledge.
    • As for handling to-dos, I tried his Goodtodo service and it just didn’t mesh with how I process my tasks using my paper diary and Forster’s DIT system. I agree with the school of thought that says writing things down by hand engages parts of the brain that typing doesn’t. Forster describes how the simple act of writing down an idea that occurs to you, rather than acting on it when you get it, automatically puts distance between you and the task, allowing you to think more clearly about what actually needs to be done. Deferring a task is also possible with Goodtodo, of course, but I'd offer this as an example of how, if you know what you want to accomplish, either digital or paper methods will work fine.
    • It sounds like I’m anti-Hurst, but I’m not. I agree that users need to take responsibility for their “stuff,” and I’ve hit on my own file- and folder-naming strategies, similar to Hurst's, that enable me to store and scan efficiently, based on my own needs. My own flirtations with various proprietary applications like Lotus Agenda, Infoselect, and Evernote have taught me that I accumulate way more info than I ever need (“just in case”), that that info never survives intact when transformed, and that I hardly ever need that info anyway. As a result, I’m saving more stuff in txt or rtf files (usually procedures or projects I'm pursuing at the time), I’m stockpiling bookmarks in Delicious, and I'm squirreling web pages and other information away using a Gmail REF label. I don't perceive that storing them places a cognitive burden on me. Although the bits are not truly "gone," were I to lose them, I wouldn’t be sad.
    • I liked his description of the best way to save photo files. Very good and sensible advice. I was doing something similar but tweaked my layout to match his rules. It's curious, though, that his book doesn't address ways to save and access downloaded music or video files, which are surely as ubiquitous as digital photos. Perhaps, as a Mac Man, he uses iTunes, which handles a lot of that for him. For myself, I use Media Monkey on my PC to handle that chore, and I prefer a directory-based layout as the foundation layer for any music apps.
    • On maintaining a media diet, I agree with his statement that "an unbounded bitstream tends toward irrelevance." Alas, I still maintain too many RSS feeds, but hardly any hard-copy publications. For my RSS feeds, I have a single must-read folder, a second read-when-I-have-a-moment folder, and the rest are all optional. As with many of Hurst's other suggestions, the aim is to control the limited resource that is your time and attention; being profligate with your energy and focus on digital snack-food doesn't help your cause.
    • His chapters on file formats, naming, and storing files are what I wish I'd had when I started using PCs lo those many years ago.
    • I very much agree with his advice to find a "bit lever," which is essentially a global AutoCorrect app that will expand abbreviations to full words, phrases, paragraphs, URLs, etc. I'd also suggest a good clipboard management program. For Windows, the best is ClipMate; I haven't found a great one for the Mac, but am evaluating CopyPaste Pro. I also like having a macro program around; for the PC, I've used Macro Express for years, but ActiveWords looks good, too. As for managing passwords, I've relied on Roboform on Windows, but haven't really investigated such apps for the Mac.
    • Hurst advocates the Dvorak keyboard layout, which I pick up and put down two or three times a year. When I'm in a crunch, I usually return to Qwerty and stay there.
    • For the index: page 151 lists the programs he recommends for specifying frequently used folders and directories. I have to tip my hat to him for recommending FileBox eXtender for Windows, which I've been pretty happy with so far.
    • For screenshots: SnagIt on the PC. For backups to the cloud: JungleDisk and the Amazon S3 service.
    • Disagree about not using Excel as a database. It works quite well as a flat-file database. If you want to keep a simple list of names and addresses, a text file or Excel is preferable to a database program.
    • Most of Hurst's recommendations, though, he would probably consider small potatoes compared to his bigger vision of re-tooling users for the future as he describes it: more bits, more proprietary file formats or protections (like DRM), more social software and the implication of every bit being tracked and stored somewhere for someone to process. I think there will always be a need for strong opinions on "here's how you should do it" because many of us simply don't have the time or take the time to think through all the implications of the tools we're directed to use. These bit-level tactics will always be needed and will always need re-tooling for the next wave of technology that washes over us.
    • I think, in addition to Hurst's prescriptions, the real key will be in people deciding what they want to do with the technology, with the bits, with their digital tools. If they haven't decided what's really important to them (which is the problem addressed by Hurst's "media diet" chapter), then they'll need all the help they can get to stay on top of it. If they've decided what's of interest to them and their lives and work, then--like Donald Knuth and Alan Lightman--they can choose to eschew email and other bit-processors totally, and get on with what they were put on Earth to do.
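    Since the photo-filing advice above boils down to a directory-based, date-first layout, here's a minimal sketch of what I mean. The folder names are my own illustrative convention, not Hurst's published rules:

```shell
#!/bin/sh
# Hypothetical date-first photo layout: a year folder, then
# YYYY-MM-DD-description folders inside it. Plain directories,
# readable by any app, no proprietary catalog required.
mkdir -p photos/2008/2008-07-04-cookout
mkdir -p photos/2008/2008-07-13-beach-trip
mkdir -p photos/2007/2007-12-25-christmas

# Because the date leads each folder name, a plain alphabetical
# listing is also a chronological one:
ls photos/2008
```

    The appeal of this as a foundation layer is that sorting comes for free in `ls` or any file manager, and the layout survives no matter which photo or music application sits on top of it.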

    Update 08/06/2012

    I have been using Hurst's Goodtodo web service for about a year now and have woven it into my daily/weekly task management. It works great as a future reminder system. I may blog later about how my always evolving system, which includes Goodtodo, works nowadays.

    → 10:53 AM, Jul 13
  • Nowadays, instead of saying, “He’s a prick,” I’ll say, “He’s complicated.”
    What I’ve Learned: Burt Reynolds -   MSN Lifestyle: Men
    → 4:03 PM, Jul 4
  • "Why you should throw books out"

    That's the title of today's post from Tyler Cowen both at his blog and as a guest blogger at Penguin. His point seems to be that the book you've read is likely not the best book you could be reading, and by passing it down the line (via donation or BookMooch or leaving it somewhere in public) your "gift" is preventing someone from reading something better. He says the calculations here are tricky; you could give the book to a friend, but if the friend is highly discriminating, then your standing in their eyes could suffer by proffering them a substandard book. Better to avoid those calculations and simply throw the book in the trash. The author has been paid, you've gotten what you want out of the book, and you've saved some poor schlub from having to make the calculations you made when you thought about buying the book in the first place.

    His commenters are mainly book-lovers who beseech, implore, and adjure him to donate the book to a library for its book sale, or a thrift store, or just leave it somewhere as a serendipitous gift for someone else. They also point out that Tyler may not know his friends as well as he thinks, and that the second-hand bookstore or thrift shop would know better than he what value books have in their local market.

    I go through periodic book purges. My usual method is to pile them up in a box (along with any CDs I've stopped listening to) and take them to BDFAR or Nice Price for trade. Whatever they don't take, I donate to the library for their book sale. And then the box goes back into the closet to collect more books, the making of which there is no ending.

    I had a friend years ago who threw away an Anais Nin book because she thought it was so trashy she couldn't bear it anymore. I remember being astonished at the time (I was in my 20s) at the thought of throwing a book into the trash. Even for books I despised, I still would trade them for something better. Today, I'm still more likely than not to write in the margins and trade them if possible, even though I have less time than ever to read books. My goal now is to either borrow books from the library or in some other way reduce the flow of books taking up room on my shelves, so that I spend less time purging them later.

    → 7:53 PM, Jun 27
  • Do you read a lot of contemporary fiction these days? Like everyone else, no, I don’t.
    Questions for Gore Vidal - Literary Lion - Questions For Gore Vidal - Deborah Solomon - Interview - NYTimes.com
    → 1:54 PM, Jun 26
  • There is a kind of heroic pessimism running through this work, and one is inclined to appropriate for the sort of essay collected in this volume a lament Vidal once delivered for the novel: “Our lovely vulgar and most human art is at an end, if not the end. Yet that is no reason not to want to practice it, or even to read it. In any case, rather like priests who have forgotten the meaning of the prayers they chant, we shall go on for quite a long time talking of books and writing books, pretending all the while not to notice that the church is empty and the parishioners have gone elsewhere to attend other gods, perhaps in silence or with new words.”
    ‘The Selected Essays of Gore Vidal’ - Los Angeles Times
    → 12:11 AM, Jun 23
  • Links 25-May-2008

    • Penelope Trunk has an excellent post on how she got her current favorite mentor, to complement her other posts on the topic. As a forty-odder among twenty-somethings, I find that my mentors are not just the professors, but my peers who have longer experience of being a student, being at SILS, being connected to many other students who they think may be good for me to meet. I have a couple of trusted mentors -- including, of course, The Illimitable Cassidy -- both 20 years younger than me, who provide me with excellent advice and guidance.  I hope to be of use to them one day, or to pay it forward in some way.
    • I recall an author reading I went to years ago; she'd written a book about the Book of the Month club. Her opinion at that time was that literate book-culture was seeing its history growing smaller in a rearview mirror, hence the explosion of books about books, books about reading, books about bibliophiles. There's a strong flavor of sadness and melancholy in these books. I thought of this when reading the UK Guardian review of Alberto Manguel's "The Library at Night":

    The traditional library was a citadel sacred to the notion of omniscience; the web, by contrast, is 'the emblem of our ambition of omnipresence', like a supermarket that boundlessly proliferates in space and deluges the planet with its tacky wares. 'The library that contained everything,' Manguel laments, 'has become the library that contains anything.'

    • In junior high school, I got hooked on the Doc Savage novels with the James Bama covers. William Denton somehow located the author Lester Dent's Master Fiction Plot Formula for any 6000-word story. While you're there, check out William's library science pages. And I'll probably try his index card system for organizing my school work this fall. Update: I tried it for a while but it duplicated other systems for tracking work and reading that were more convenient, so I dropped it.
    → 6:17 PM, May 25
  • Links 22-May-08

    • This paper studies the CVs of assistant professors of economics at several American universities and finds "evidence of a strong brain drain" and a "predominance of empirical work." If you searched the CVs of assistant professors at top-10 IS/LS schools, what do you think you'd find? [via Marginal Revolution]
    • Michael Leddy (of the consistently fun Orange Crate Art blog) recommends this Atlantic article written by a teacher in the academic trenches. Professor X's message to her/his students? "[T]hey lack the most-basic skills and have no sense of the volume of work required; that they are in some cases barely literate; that they are so bereft of schemata, so dispossessed of contexts in which to place newly acquired knowledge, that every bit of information simply raises more questions. They are not ready for high school, some of them, much less for college." Note, though, the type of college the Professor works at. Does this lack of preparation prevail at better colleges also?
    • A great NY Times profile of the great Mad fold-in artist Al Jaffee. By hand, people!! And the Times did a fabulous job of animating some of the fold-ins. The Broderbund set of Mad CDs I bought (cheap!) years ago had that feature, also.
    • Tyler Cowen cites the really only truly most important reason for becoming a full professor.
    → 2:27 PM, May 22
  • As within, so without

    When my mind and life get cluttered, so do my physical environments. When I lived on my own, it was the whole apartment. Now, it's pretty much confined to my home office. But as I celebrate the end of the semester and contemplate what to do with myself this summer, I scan the office and see much clutter. Starting on my far left and moving clockwise (that's left to right, for you folks who only know digital clock faces), I see:

    • My graphic novels and comics bookcase, groaning with unread material
    • Two small wicker baskets holding 1) an Airport Extreme router I've not been able to sell and 2) a stack of old MacWorld magazines, a MacBook for Dummies, and a binder of Take Control ebook printouts
    • On my desk, books to take back to the library
    • My seltzer can
    • My overflowing inbasket
    • My 10-year diary
    • My MacBook and laptop stand
    • My desktop PC and monitor with old CDs in the hutch and a 5-ft CD rack sitting atop a 2-drawer filing cabinet
    • A poster I've not had time or opportunity to put on the wall
    • Stand with a boombox and 2 big messy piles of CDs, with a turntable (unplugged, bereft) on the lower shelf
    • My banjo case and materials (restarted my lessons this week)
    • A box where I'm collecting books to take to BDFAR for trade
    • And let's stop there, shall we?

    Zoiks. Probably the first thing I should do, to put my mind in order, is to put my environment in order. As without, so within.

    → 6:21 PM, May 8
  • Too soon old, too late shmart...

    ...goes the old Yiddish proverb. And it works for the spring semester as well as for real life.

    • A simple 1-inch binder and two sets of five tabs were fantastic in helping me organize my two classes' syllabi, assignments, special handouts, and so on. I could carry the binder with me to work and school, I kept drafts of papers or sections of papers organized, and it just neatened up my work.
    • I also used the DIY Planner Two-Page Per Month calendar to keep at the front of the binder. I recorded due dates here. I also like being able to grok the month at a glance.
    • I used two large Moleskine cahiers as my notebooks for each class. This meant juggling two different notebooks, and I would occasionally pick up the wrong one. Next semester: use a Mead two-subject notebook and be done with it.
    • Some days I took lots of notes in class, other days few to none. Hence, I now have two half-empty Moleskine cahiers. Hence, using the Mead two-subject notebook to keep the damage to one notebook instead of two.
    • At the start of the semester, I also used the notebooks to record my reading notes. I found the notes helpful sometimes, especially as they fixed ideas in my head. However, as the semester ground on, I had less time available to record my thoughts and so that activity slowed and sputtered. Also, it was mainly useful to grasp the heart of what was discussed, note any unusual detail or anecdote, and skim the rest.
    • As always: there's more time at the start of the semester than there is at the end.
    • I've tried using the Little-and-often/ESS method and it worked sometimes. (It's also likely that I implemented these strategies wrongly--i.e., not often enough and not little enough--or didn't stick with them long enough.) When I'm starting a paper, I'll also timebox the research task or use the Now Habit's 30 minutes of quality work trick. But I'm still thinking too much about the method, and that interferes with doing the work. For example, I started using Cal's research paper database in Excel for an early paper and it was excellent for getting me started. But then I got into a time crunch and abandoned it. I'm still keeping the idea in my back pocket, though, as it's a killer way to organize bunches of citations.
    • For my last batch of assignments (a UI critique and a paper), I borrowed a leaf from Steve Pavlina: I picked an assignment and just worked full bore on it until it was done. (Go here and scroll down to the "Single Handling" section.) And when it was sufficiently done, I moved on to the next assignment and worked full bore on that until it was sufficiently done. And so on. (By "sufficiently done," I mean "good enough." I like keeping a paper around for a couple of days to cool off, review it, and polish things a little more, add more texture to thicken it, etc. I find this re-reading and polishing takes little time or brain energy.) In fact, I was astonished at how well I took to this method and how quickly I achieved results with it. I got two deliverables done well before the due dates and had an unhurried weekend to finish my taxes and do my readings for the week. It also alleviates the problem I have with setting artificial deadlines which I can see right through; with this method, there are no deadlines, just a sufficiently done project.
    • Start all major projects earlier. Don't wait for later. Be kind to your future self. 'Nuff said.
    • Parking in the deck behind the Post Office is great at 8:30, and it gives me plenty of time to grab a coffee before class. Yay! No more waiting for the bus! I didn't discover this till the middle of the semester. However, it does cost about $3 a pop and uses more gas than taking the bus, so I'll probably use this only now and then.
    • Having the upcoming week's work and readings done by the previous Sunday evening leads to peace and contentment when the week starts, and no rushing about at the last minute.
    • I had two folders for each class that would contain the week's readings; as with the cahier notebooks, I'd sometimes get the folders mixed up. Also, they'd contain more printouts than I really needed for one day's class. I'll fix this with a staggeringly simple tip I glimpsed on a bus passenger's lap one day: Label the folders by day instead of by class. That way, each day's work is pre-sorted, I don't need to think about which folder to take, and badda-bing--Bob's your uncle.
    • When working on an assignment: re-read or maybe even type out precisely the directions, the expectations, requirements, etc. I often go off on a tangent and make the process and the final product more complicated than it needs to be. I frequently re-read my last two assignments with the focus of a Talmudic scholar, ensuring that I was delivering exactly what was asked for and not something other than what was asked for.
    • I tried creating a Google Calendar schedule (like Proto-scholar's) that delineated my commute times, class times, work schedule, etc. I never went back to it. I like my daily planner and 2-page-per-month too much. But a recent idea of Cal's--the auto-pilot schedule--I find gobsmackingly simple and brilliant and why the hell didn't I think of it myself? In fact, Pavlina's "focus on one project at a time" melded nicely with a standard day/time to work on these projects. Making these kinds of decisions ahead of time really reduces the friction of getting this work done. Given that I work full-time in addition to taking two classes, I find it necessary to designate whole evenings to one class or the other. During crunch times, I may institute emergency measures. But I think in the fall, I'll designate general class-work for specific evenings and periods of weekend time, and then work in the special projects as needed.

    As I think of more, I'll add more.

    → 7:23 PM, Apr 14
  • Running...out...of...gas...

    Is it me, or should the spring semester have ended a week ago? Why are we dragging it out for another three weeks?

    I see my fellow students in class and around campus and we're all looking tired. I've done some good work in the latter half of this semester, but it's about put me into an early grave, and we're not done yet. I have a paper due Monday, and two more things to hand in for my other class. The final due date for those is May 5 but my goal is to have everything wrapped up by the end of April.

    I'm noticing the classic signs of burnout and exhaustion--it takes longer to do what used to be simple tasks, my attention span is short, and my energy is generally low except for what I need to power me through the day. Part of this malaise, no doubt, is due to the fact that I have to make up about 13 hours of lost time at my day job this weekend, covering the day I spend on campus and an afternoon at the eye doctor. (Mental note: schedule doctor appointments for first thing in the morning or wait till summer.)

    → 7:14 AM, Apr 12
  • From MFA to MSIS

    In talking to a friend, he remembered that this graduate school adventure started in early 2005, when I investigated getting an MFA in Creative Writing. The next thing he knew, I was at UNC working my ass off on a MSIS degree. How I got here from there went this way, in short steps and occasional large leaps:

    • I'd been dabbling and playing with creative writing for 20 years, and thought, in early 2005, that I wanted to commit myself to it, go back to school, read a lot, write a lot, and see if I had any talent. I felt it was time. I'd always told myself I'd never go back to school unless it was for something I was interested in; I'd never get a degree just to qualify myself for a job.
    • I talked to the head of NCSU's creative writing department about the program's various requirements and so on. I went so far as to revise some old stories, compile them, and send them to him for review. Never heard back.
    • Background to early 2005: I'd been unemployed for most of 2004, and was only an hourly worker at a tech-writing company. As much as I wanted to go to school and study writing, I realized that I didn't have the money to go back to school and that, after getting the MFA, I'd be back where I was at the start: working technical writing jobs that were increasingly unsatisfying and becoming more uncertain of the career's value as time wore on. Also, my career path had kept me on the traditional side of tech writing, away from XML, DITA, structured authoring, and so on. I was aging out.
    • I felt, consequently (and here's Leap One), that I needed to solidify my career options for at least the next 5-8 years. This meant eschewing an MFA and focusing on a degree that would provide me with a more promising and interesting career. But I didn't know what that would be. However, the wheels of higher education were now in motion, in my mind and imagination if nowhere else.
    • Eventually, in June 2005, I got a job that provided a steady income, dependable benefits (much needed at that time), and a place where I could lick my wounds after a wounding 18 months of illness, layoffs, and deep uncertainty.
    • To satisfy my writing needs, I searched out and joined a writer's group in early 2006, and stayed with them till September 2007, when school demands overtook me. That involvement was enough to get me revising old stories, writing some new ones, thinking about my creative process, and honing my critiquing skills.
    • A local RTP group on Lifehackery started up and I somehow heard of it, and went to a dinner meeting, where we introduced ourselves around, and talked about our productivity compulsions. One of the fellows was Abe Crystal, who said he was a PhD student at UNC in Information Science. Information wha? What's that? (Cue: Leap Two.)
    • I must have done some research because I fixated on attending UNC, getting a master's in IS, and collecting advice from whoever I could. I received excellent advice from a friend of a co-worker, who had graduated with an LS degree from UNC, and I followed her advice to the letter. (I really should post that advice sometime.) By June of 2006, I was a continuing ed student taking my first class, studying for the GRE, and wrestling with UNC's byzantine and antiquated graduate admissions process.
    • More background: My manager was entering school in the Fall of 2006 to get an MBA, and he urged me to take advantage of our company's tuition reimbursement program. That, and he wanted someone else to go through the pain with him of working full-time while going to school.
    • By the Spring of 2007, I was enrolled in UNC's SILS program. My manager urged me, quite rightly, to take two classes at a time. "You're gonna be old when you graduate, Mike, you need to get in as many classes as you can," he said. Well, setting aside the fact that I'll be old anyway, he was right. I'll probably write another post sometime on why taking two classes at a time is good for me.

    Today, in April 2008, I've nearly finished 24 hours of a 48-hour Master of Science in Information Science degree. I've not written a short story in a year or so. And I'm barely reading anything that doesn't have eleventy-million citations to its name. I have another 4 semesters to go.

    Best decision I've made in a long, long time.

    → 8:40 PM, Apr 5
  • Halving, doubling, and Virginia Woolf

    When I am asked, "Why did you decide to go back to school?" or "How in the world can you work a full-time job and take two classes at the same time?", I can often provide at least 43 separate answers. That is the blessing and curse of my loquacious gift, which makes essay-writing easy but a succinct answer impossible.

    I have a couple of good reasons I toss out about why I prefer taking two classes at a time: I often find points of unexpected connection between the classes, which I wouldn't find were I taking them one at a time; I'm going to be old by the time I get this degree, so let's hurry it up; I find the pressure of the second class provides time/energy constraints that force me to think creatively about my schedule and priorities; and so on.

    Those are all nice, quantitative answers. But there's another, bigger reason that also goes to the heart of why I came back to school in the first place. I can't remember where I read it, but it's a quote by Virginia Woolf that goes approximately thusly:

    After the age of forty, a novelist must either halve her output or double it.

    For whatever reason, that quote and its idea has stuck with me. If you've published or written a lot in your early career, Woolf's advice is to slow the output and create fewer, denser works. But if you've thought more than you've written, then you need to use your remaining time to better advantage.

    When I look at my last 25 years or so, I see that my output has been low. Others who look at my life may disagree, but for me, emotionally, I think I could have done more. Probably lots of people feel that way about their own lives.

    So, one of my reasons for going back to school was to boost my output and make as much of the time and energy left to me as I can. Yes, I'm racing around like a maniac, I'm frequently overwhelmed, and my task diary is a paper-based super-collider of conflicting tasks, projects, and personal obligations. But--and here's the punchline--I'm learning, writing, and producing a quantity and variety of material that, in my opinion, dwarfs what I have attempted on my own over the last 10 years. And since I have the energy and the stamina now to take it all on, I want to make the most of this time and this opportunity.

    → 8:22 PM, Mar 3
  • Speed Networking

    The SILS Alumni Association held a speed networking event earlier this week. It's the second one I've attended and, although fewer students showed up this year than last year, I thought it went very well. The "mentors" -- either SILS alums or local folks working in the IS/LS domains who have ties to SILS -- sat inside a U-shaped line of tables, while the students moved from chair to chair every 3 minutes at the ring of Pavlov's bell. Here are some thoughts on what I liked about it and why I think the experience was valuable.

    • It gets you talking to people. We're not, after all, the business or performing arts school. We're mostly a group of introverts, some of us more sociable than others, granted, but it's tough to get us talking to strangers. A 3-minute speed-networking event with the emphasis on communication and fact-finding levels the playing field wonderfully and I think gets people talking with an urgency they wouldn't have at a polite meet'n'greet.
    • You learn to start marketing yourself. With only 3 minutes total, I had to hone my spiel to something quick so that we could actually discover whether we had much to say to each other. It took me about 4 or 5 tries to get this right, and even then, I tweaked it based on the feedback I received. Unnatural, perhaps, but is a job interview more natural? The only way to get better is to practice, and this event provided that.
    • You learn some basic chat skills. See "talking to people" above. Because I'm IS (Information Science), and the majority of mentors there were LS (Library Science), I'd sometimes fall back on standard questions: "Tell me about your library," "What kind of work do you do," that kind of thing, to make them feel OK about talking to an obvious interloper. Alas, I was flummoxed when, just as I was finishing my screed, the young woman I was talking to smiled and asked, "Do you like working with children?" Ah, a children's librarian! We both laughed, but I'm embarrassed to say I never recovered my aplomb and fum-fuh'd till the bell rang.
    • Overview of the local field and the profession generally. By talking to lots of people working at different places, it's possible to gauge the health of the local market and get people's takes on the profession as a whole. Will there be jobs available when I eventually graduate? Where's the demand? What are some of the problems they're having to figure out? You can absorb very quickly a range of job descriptions and experiences. I also could feel myself, as I talked to folks, get excited or a little bored by the subject matter of the conversation. With no time to indulge in the deep thinking we INTJs like to wallow in, I reacted honestly to the subjects I'm more naturally interested in. (And yes, I am separating the message from the messenger here, not confusing one with the other.)
    • It's encouraging to be encouraged. I do occasionally feel doubt about why I'm at school, as I entered it on a leap of faith, with no assurance of what I'd be doing with this degree when I finally got it. But several people reassured me that the skills I've acquired over the last 20 years, added to my education and interests, will help me when I eventually move into whatever field I choose. Made me feel much better about my choice.
    → 7:07 PM, Feb 29
  • Prototyping; GUIdebook

    Found some interesting or otherwise time-passable things on the web related to prototyping and our discussion on Wednesday. A List Apart runs deep-dish articles on web design. This article shows how paper is good for tabbed interfaces, widgets, and usability testing. He also suggests keeping a glue stick handy.

    • A List Apart: Articles: Paper Prototyping

    Pen-based low-fi vs hi-fi; use while keeping the above paper prototypes in mind.

    • Sketching with a Sharpie - (37signals) - "Ballpoints and fine tips just don’t fill the page like a Sharpie does. Fine tips invite you to draw while Sharpies invite you to just to get your concepts out into big bold shapes and lines. When you sketch with a thin tip you tend to draw at a higher resolution and worry a bit too much about making things look good. Sharpies encourage you to ignore details early on."

    A neat idea if you want to keep your prototypes looking rough.

    • Napkin Look & Feel - "The Napkin Look & Feel is a pluggable Java look and feel that looks like it was scrawled on a napkin. ... Often when people see a GUI mock-up, or a complete GUI without full functionality, they assume that the code behind it is working. ... So the idea is to create a complete look and feel that can be used while the thing is not done which will convey an emotional message to match the rational one. As pieces of the work are done, the GUI for those pieces can be switched to use the "formal" (final) look and feel, allowing someone looking at demos over time to see the progress of the entire system reflected in the expression of the GUI."

    This is a really good post that links to Napkin and other sources to express what we heard in class, namely, the more "done" the prototype looks, the more finished the client expects the entire application to be.

    • Creating Passionate Users: Don't make the Demo look Done

    The SILK project grew out of someone's dissertation research. The current public release of DENIM runs on Mac, Win, and *nix.

    • DUB - DENIM and SILK - Research - "Through a study of web site design practice, we observed that web site designers design sites at different levels of refinement -- site map, storyboard, and individual page -- and that designers sketch at all levels during the early stages of design. However, existing web design tools do not support these tasks very well. Informed by these observations, we created DENIM, a system that helps web site designers in the early stages of design. DENIM supports sketching input, allows design at different refinement levels, and unifies the levels through zooming."

    Referred to in the List Apart article, this is a neat site that shows the evolution of OS and application GUIs from their inception to today. It has sections for splash screens, icons, the tutorials that were included to help us learn how to click with a mouse, and a timeline showing the slow progress of GUIs from the Lisa and GEOS on up to Leopard. The site appears to have run out of gas around 2005 or so. I have personal experience of GEOS (Commodore 64 & PC), Amiga, DOS 3-5, Windows 3.x, Mac (mid-80s-early 90s), and OS/2.

    • GUIdebook: Graphical User Interface gallery
    • GUIdebook > Timelines > Combined timeline
    → 6:01 PM, Feb 29
  • Links: file-naming conventions

    I remember reading a columnist in one of the Ziff-Davis mags, back in the mid-90s, lamenting the busting of the old 8.3 file-naming conventions that DOS imposed. With the new Win95 long filenames-with-spaces convention, he predicted that people would actually lose more files than find them again. He used as an example their production process, in which every directory name and every character in a filename carried a specific meaning in the workflow. That kind of discipline ensured that everyone knew what state the files were in. With longer filenames, he was afraid that users would be mainly writing reminders to themselves rather than helping out the next worker on the production line.

    Reading the identifiers article reminded me of a 43folders.com blog posting, and that led me to other postings related to how folks name files. The people commenting are mainly graphic designers and web designers, whose work involves tracking lots of little individual files that collectively make up a single job.


    This is from the developers' point of view. Read the original post but skim the comments to get an idea of what developers have to consider when creating files the users will depend on. The Old Joel on Software Forum - Restrictions on # of files in a Windows Directory?

    E.g., if it is problematic to have several thousand separate directory entries in one directory, I could envision a directory structure in which all user IDs ending in '0' go to a directory called c:\userdata0, user IDs ending in '1' go to a directory called c:\userdata1, etc. Or use more digits from the end of the user ID for greater granularity: c:\userdata00, c:\userdata01, etc.
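    That bucketing scheme takes only a few lines to sketch. This is my own hypothetical illustration of the idea (the function name and defaults are mine, not the forum poster's):

```python
def shard_dir(user_id, root=r"c:\userdata", digits=1):
    r"""Map a user ID to a bucket directory based on its trailing digits.

    With digits=1, IDs ending in '7' land in c:\userdata7 (10 buckets);
    with digits=2 you get c:\userdata00 through c:\userdata99 (100 buckets).
    """
    return root + user_id[-digits:]

print(shard_dir("4217"))            # c:\userdata7
print(shard_dir("4217", digits=2))  # c:\userdata17
```

    Because the scheme is a pure function of the ID, any worker on the production line can find the file again without consulting an index; adding digits is how you grow from 10 to 100 to 1,000 buckets.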

    Vox Populi: Best practices for file naming | 43 Folders

    But, just so I don’t lose you, do give me your best tips in comments: What are your favorite current conventions for naming files? How does your team show iterations and versions? Do you rely more on Folder organization than file names in your work? How have Spotlight, Quicksilver, and the like changed the way you think about this stuff?

    My god, there are 86 comments on this thread and many of them are detailed and illustrated....

    ...and then Lifehacker.com gets in on the fun. Some commenters say "don't include the date in the filename," since that info is already captured with the file and you can sort on it in most file managers. I include the date because I often share my documents with others, and the date in the filename is the quickest way for them to discern whether they have the latest copy. Ask the Readers: Filing naming conventions? Another very long posting that inspired the 43folders post above. It's interesting to note that designers all have certain types of info they want captured in the filename, such as the client name and draft iteration; where they put that info depends, probably, on who set up the system first, tradition, etc. What Do I Know - File Naming / Organization Methods?
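    For what it's worth, the date-in-the-filename convention works best as an ISO-8601 (YYYY-MM-DD) prefix, because plain alphabetical sorting then matches chronological order. A minimal sketch, with a helper function of my own invention:

```python
from datetime import date

def dated_name(base, ext, when=None):
    """Build a filename like 2008-02-10_report.doc, whose ISO-8601
    date prefix makes lexical sort order equal chronological order."""
    d = (when or date.today()).isoformat()
    return "{}_{}.{}".format(d, base, ext)

names = [dated_name("report", "doc", date(2008, 2, 10)),
         dated_name("report", "doc", date(2008, 1, 31))]
print(sorted(names))  # the January draft sorts before the February one
```

    A recipient who sorts the folder by name sees the drafts in order, with no reliance on file-system timestamps that copying or emailing can mangle.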

    Only 4 comments in this one, but they have good detail and pretty much mirror the other postings. Read this one to get a flavor of the longer screeds. File Naming and Archiving | 43 Folders

    A single post detailing another designer's setup at his workplace. Use a boilerplate folder setup and consistent, meaningful names | 43 Folders

    → 7:07 PM, Feb 10
  • Jumping the gun on a MacBook?

    Although UNC requires incoming freshmen to buy a laptop computer, and although some SILS classes require a laptop (I'm thinking here of the database or programming courses), by and large, I've found that I haven't really needed a laptop on campus. I prefer taking notes by hand on paper, and the campus is lousy with workstations where I can check my email, which is what most people do anyway. Most of my homework and papers I prefer to write on my home PC, simply because it's already customized for my peculiar needs. Nevertheless, since I entered the program, I felt a burning urgency to purchase a laptop--I'm falling behind! All the other kids have a laptop! I'm feeling left out!--and took advantage of a pretty good deal at the campus computer store to buy a black MacBook with the eerie glowing ghost-apple on the lid. I added an extra gig of RAM and donated the printer that came with it to a charitable organization. So, no worries there.

    I also bought several of the Take Control ebooks to learn some more about the Mac. I tried out various backpacks, briefcases, and sheathes. I bought a Bluetooth mouse. I dedicated a spot to it on my desk where it sits and recharges.

    And where it still sits, mostly unused. It's a fine machine, but I just haven't needed to use it.

    The new MacBooks are now arriving with Leopard, which means that's another expense I'll have when I decide to upgrade the OS. Fortunately, I've bought no other software to install on it, so the hard drive and OS are still pristine, making the upgrade easier, I should think. Thinking more calmly now, I should have waited to buy till Leopard was pre-installed on all MacBooks.

    It's clear to me now, looking back, that I had induced a panic state in myself over this issue and reason's sweet song would ne'er enter my ear. I took out a loan from the bank in order to pay for both my spring semester tuition and the MacBook, so paying that back every week is a constant reminder of getting too far ahead of myself.

    Update: I wrote the above over a couple of days last week. This past Saturday, I decided to reinstall XP on my home PC, after dithering on that decision for a while. The reinstall went fine--except that Windows couldn't see the second internal hard drive, which holds all of my install files for my other software. I verified that the BIOS could see the drive but XP remained willfully blind. I schlepped the PC to Intrex (where I'd bought the PC in 2006 or so) for them to diagnose and (I hope) fix.

    I didn't enter a panic state on this snafu, interestingly enough. I took the precautions of backing up my volatile data to my external USB drive and to the cloud, so they're accessible if I need them.

    And, need I say, I had a laptop--an underused MacBook on which I could check my mail, finish my homework assignment due on the following Monday, and store info on my paper that's due in 2 weeks. Funny how these things work out.

    Addendum:  Back up those drivers, kids! And print out your Device Manager settings! I should have inserted the motherboard CD and installed the RAID and sound drivers; that's why Windows couldn't see the second internal hard drive. OK, that goes on the master checklist for reinstalling Windows...

    → 7:28 AM, Feb 5
  • Drafting scenarios and stories

    This post discusses the following readings:

    • Gruen, D., Rauch, T., Redpath, S., & Ruettinger, S. (2002). The use of stories in user experience design. International Journal of Human-Computer Interaction, 14(3&4), 503-534.
    • Head, A. J. (2003). Personas: setting the stage for building usable information sites. Online, 27(4), 14-21.

    <<In class, we wrote sample story/scenarios, and I refer to a great story written by a classmate about a guy at a party who is covertly listening to his music while grudgingly assisting his wife with hosting a house party.>>

    I thought the story about the guy at the party trying to hide the earphone was great--it worked as a complete vignette, the character had a secret (which puts the reader on his side), and it has a nice curlicue at the end. It's complete in itself but could fit nicely inside a larger story about this character.

    OK, now *that* I would consider a story, more so than the scenarios we read in the IBMers' paper.

    I've been writing short stories off and on since college and did a couple of NaNoWriMo stints, so here's what I think about the narrative devices used to create stories that could be used for scenarios.

    CHARACTERS. Some of the best ways to create a character include starting with an archetype (the Scrooge type, the strong and silent type, the talkative type, the Type A type), someone you know, or a fictional character you know really well. As you write and spend time with the character, you'll get to know them better and their own personality emerges, especially as you put them in difficult situations.

    You can create an amalgam character or persona, but one person that has many different kinds of tags (like the primary persona in the Personas article we read) can seem a little unreal to me, very manufactured. At that point, I think you're checking stuff off a list rather than creating an imaginary character that *seems* real, which is the goal of fiction. I'd suggest starting simple and then adding stuff as it feels right.

    One of the age-old questions to ask about a character, to get your imagination primed, is what the character eats for breakfast. This is also a good opening question to loosen up interview subjects, BTW.

    PLOT. The IBMers don't talk about the mechanics of plotting, which is one of the toughest jobs in story-writing. A story's theme is what the story's about; the story's plot is this happened, then that happened, then this other thing happened.

    Samuel R. Delany has a technique he calls "thickening the plot," in which the writer describes the setting in detail and gets the character interacting with it. So in the party story, we see the character moving around the house, taking things to the kitchen, anything to disengage himself from the party. People trying to talk to him, him turning to hide the earpiece, all help to thicken the plot and ratchet the tension that he'll be discovered.

    RATCHETING THE TENSION. In the party story, the tension is, "Will he be discovered?" There's no such tension in the IBM stories because, really, what's at stake for the characters? Nothing much. In particular, that last story iteration they did was all Star Trek technobabble; there were too many characters (so no one person a reader could care about), and there was really no tension or emotion. (I'd say this is a danger of stories in the IBM method, in which lots of people start using the story as a dumping ground for their ideas and you start losing the main thread.)

    But tugging on heartstrings isn't what scenarios are supposed to do; they're mainly of use to engage your imagination so you see the whole problem space, not just a little piece of it. (The other advantage being they get the picture and expectations from inside your head into someone else's head.)

    The best IBM story was the one where the guy was installing software at 3 a.m. because the workers would be coming to do their jobs in a few hours. A ticking-bomb deadline is tried and true. I'd say that even the Madeline scenario <<a scenario provided by the professor, of someone using a health-care information system>> could use a ticking-bomb urgency, if the waiting room is crowded, people are being processed quickly, and the subject needs to hurry up so he can get back to work.

    GOALS AND OBSTACLES. This is plot. An interesting character in an interesting situation creates the plot naturally without too much intervention. In the case of scenarios, we could introduce massive power failures, ice storms, zombies, etc. but they don't really help us with our purpose, which is to design a good user experience. (Another case where stories diverge from scenarios.) I would call scenarios not stories but soap operas: just one damn thing after another, until the fadeout.

    That said, yes, the protagonist wants something and is frustrated by a stupid UI, a deadline, ice storm, zombies, etc. which means that something has to be at stake for him or her, and there have to be consequences for failure. In the party story, the husband gambled with multiple consequences of being discovered, which is what made it entertaining (another difference from scenarios: scenarios don't have to be entertaining, though they're more fun to read if they are). In the Madeline scenario, what are the consequences of not understanding the UI? Will I feel sorry for that character if they can't get the video working?

    Here endeth another of my verbose postings. Carry on.

    → 9:02 AM, Jan 31
  • Article critiques: scenarios, stories

    This post discusses the following readings:

    • Go, K., & Carroll, J.M. (2004). The blind men and the elephant: Views of scenario-based system design. interactions, 11(6), 44-53.
    • Gruen, D., Rauch, T., Redpath, S., & Ruettinger, S. (2002). The use of stories in user experience design. International Journal of Human-Computer Interaction, 14(3&4), 503-534.

    I thought the best thing about the Go and Carroll article was their listing of differences between scenarios and specifications (though it would have worked better as a table than as text) and their review of the literature surrounding the techniques. I also liked the breakdown of strategy/requirements/HCI planning to year/day/moments. Apart from those squibs, I thought the article was unbelievably dry and unimaginative (which is odd, considering they're talking about the importance of imagination in creating scenarios); for one thing, they introduce the "blind men and the elephant" story in the lead without following it up in the rest of the article. Do scenarios help us see the elephant? Or do they only show us pieces? By the end of the article, we don't know and the authors haven't told us. (I wonder if the editor made them tack it on.)

    I thought the Gruen, et al., article by the IBMers was more interesting and meaty; they seemed really in love with their new tool, which had apparently united disparate stakeholders within IBM as well as their clients. I also thought it was interesting how the stories could be decomposed for other audiences as well, down to the design, marketing, and documentation materials. They don't attempt to speculate as to *why* stories unite audiences with differing needs, but I'd guess that we're simply trained, from childhood onward, to think in terms of linear narrative. A page of prose describing someone solving a problem is easier to read and understand than a functional specification document, which requires a specialist to draft. Stories don't require specialists.

    Their descriptions of its use made it seem like a silver bullet, and I would have liked to know what, if any, limitations they encountered. How do they control their stories, to keep them from becoming distended or unbalanced when descriptions get too specific?

    I'd also say that what they're calling stories are not stories, but extended scenarios that use narrative devices like character, setting, plot, etc. The chief characteristic of a story is that the character is different at the end of the story than at the beginning. Their example scenarios don't have that quality; they're more like Star Trek problem stories: Picard is trapped on the holodeck--how do we get him out? No character in such stories really learns about himself or his life. The interest is mainly in seeing people spew technobabble and race against the clock.

    Likewise, the IBM scenarios attempt to trap someone in a problem and watch them squirm to get out. The interest is in watching this particular character squirm (would a different character behave differently in the same situation?) and noting the details of what they do to solve their problem.

    → 8:43 PM, Jan 30