Thursday, December 31, 2009

My Year in Media and Cities, 2009

Still inspired by Jason over at kottke.org, I'm continuing my practice of cataloging some of the things I’ve done and the places I’ve been over the past year. You can find 2008's list here. 


Here we go with 2009.


Cities I’ve visited, spending at least a day and a night in each locale
Manhattan, New York
West Orange, New Jersey
Allentown, Pennsylvania
Philadelphia, Pennsylvania
Baltimore, Maryland
Honesdale, Pennsylvania
Cleveland, Ohio
Detroit, Michigan
Milwaukee, Wisconsin
Skokie, Illinois
Chicago, Illinois
Washington, DC
Missoula, Montana
Kalispell, Montana
Glacier National Park, Montana
San Francisco, California
Berkeley, California
Yosemite National Park, California
Bronx, New York
Jerusalem, Israel


Movies I’ve seen on the Big Screen
Slumdog Millionaire
The Wrestler
The Class
Watchmen
X-Men Origins: Wolverine
Star Trek
UP
The Hangover
The Hurt Locker
Harry Potter and the Half-Blood Prince
500 Days of Summer
Inglourious Basterds
Brief Interviews with Hideous Men
A Serious Man
The Shining
Fantastic Mr. Fox
Avatar
 
Movies I’ve seen on the little screen
Burn After Reading
The Big Lebowski
One Flew Over the Cuckoo’s Nest
The Departed
Death Wish
Blue Velvet
North By Northwest
WALL-E
Baby Mama
Darjeeling Limited
The Graduate
The Dark Knight
Gunnin’ for that #1 Spot
The Hammer
Dirty Harry
Step Brothers
Waltz With Bashir
Rachel Getting Married
Being John Malkovich
Semi-Pro
Objectified
The Way We Get By
Tyson
Lost Highway
Adventureland
Donnie Darko
Anvil


Books I’ve read
Ender in Exile
Lolita
Batman: The Dark Knight Returns
Watchmen
Seven Seconds or Less
Baseball Prospectus 2009
God Save The Fan
White Teeth
On the Road
The Corrections
Going Deep
Armageddon in Retrospect
A Farewell to Arms
The Breaks of the Game
A Fraction of the Whole
Underworld
When You Are Engulfed in Flames
The Machine
A Supposedly Fun Thing I’ll Never Do Again
Best American Sports Writing 2009
The Discomfort Zone
I Drink for a Reason
The Soul of Baseball
60 Stories
   
Sporting Events I’ve attended 
Wednesday, January 14
New York Knicks 128, Washington Wizards 122
Madison Square Garden


Monday, February 2
Los Angeles Lakers 126, New York Knicks 117
Madison Square Garden
Kobe Bryant set a Madison Square Garden scoring record with 61 points


Wednesday, March 4
New York Knicks 109, Atlanta Hawks 105
Madison Square Garden


Wednesday, March 25
Los Angeles Clippers 140, New York Knicks 135, OT
Madison Square Garden


Thursday, April 30
New York Yankees 7, Los Angeles Angels of Anaheim 4
Yankee Stadium


Wednesday, May 13
Tampa Bay Rays 8, Baltimore Orioles 6
Oriole Park at Camden Yards


Thursday, May 14
Los Angeles Dodgers 5, Philadelphia Phillies 3, 10 innings
Citizens Bank Park


Wednesday, July 1
New York Yankees 4, Seattle Mariners 2
Yankee Stadium


Sunday, July 5
Oakland Athletics 5, Cleveland Indians 2
Progressive Field


Monday, July 6
Kansas City Royals 4, Detroit Tigers 3
Comerica Park


Tuesday, July 7
St. Louis Cardinals 5, Milwaukee Brewers 0
Miller Park


Wednesday, July 8
Atlanta Braves 4, Chicago Cubs 1
Wrigley Field


Thursday, August 27
Arizona Diamondbacks 11, San Francisco Giants 0
AT&T Park


Thursday, November 19
Syracuse Orange 95, California Golden Bears 73
North Carolina Tar Heels 77, Ohio State Buckeyes 73
Madison Square Garden


Tuesday, December 1
New York Knicks 126, Phoenix Suns 99
Madison Square Garden


Monday, December 7
New York Knicks 93, Portland Trail Blazers 84
Madison Square Garden


Saturday, December 19
Los Angeles Lakers 103, New Jersey Nets 84
Izod Center


TV Seasons I’ve watched
The Wire
Seasons 1-5


Mad Men
Season 3


Flight of the Conchords
Season 2


Arrested Development
Seasons 1-3


The Simpsons
Seasons 3-5


Bored to Death
Season 1


It’s Always Sunny In Philadelphia
Season 5


The League
Season 1


Sit Down, Shut Up
Season 1


Each new episode of: The Office, 30 Rock, The Simpsons, Parks and Recreation, Modern Family, Community
 
Plays I’ve attended
Saturday, July 11
Dizzy Miss Lizzie’s Roadside Revue presents The Oresteia


Sunday, December 6
Hamlet
Broadhurst Theatre


Concerts I've attended
Wednesday, February 11
Brett Dennen
Webster Hall


So, on average, I've:
  • visited a new place every ~18 days;
  • seen a movie on the big screen every ~21 days;
  • seen a movie on the little screen every ~13.5 days;
  • read a book every ~15 days;
  • attended a live sporting event every ~21 days;
  • watched a season of TV every ~21 days (not including all the partial seasons);
  • attended a play once every six months;
  • attended a concert once every 12 months;
  • and composed a blog post once every ~4.2 days (including this one).
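
For the curious, the math behind those averages is just 365 days divided by the item counts in the lists above. Here's a minimal sketch of that arithmetic in Python (the ~87 blog posts is an assumption inferred from the ~4.2-day figure, not an exact count):

```python
# Rough cadence math for the year-in-review tallies above.
# Item counts are taken from the lists in this post; "blog posts"
# is assumed from ~365 / 4.2, since no exact count is given.
counts = {
    "cities visited": 20,
    "big-screen movies": 17,
    "little-screen movies": 27,
    "books read": 24,
    "sporting events": 17,
    "TV seasons": 17,
    "plays": 2,
    "concerts": 1,
    "blog posts (assumed)": 87,
}

for item, n in counts.items():
    print(f"{item}: one every ~{365 / n:.1f} days")
```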


It's been a busy year.

Tuesday, December 29, 2009

Blog-iversary

I'm about four weeks late with this, but happy Blog-iversary to me. Once again, we'll celebrate by reviewing some of the highlights of The Daily Snowman from year two to three in this, the 185th Post Spectacular.


Here we go:
 


Best Embedded Videos



and



Best Investigative Series
Abandoned Car Watch, Parts One, Two and Three
Bakery Logo Plagiarism
 


Best Photos






 and



(Both from here.)
 

Best Sentences
  • "As times passes and first-hand knowledge of these wars fades, these memorials play a huge role in determining how the wars themselves are remembered."
  • "His barber made some type of mistake, and his head said, 'Except to Win.'”


Most Prominent Recognition of The Daily Snowman
Sorgi on Deadspin
Gunnin' for That #1 Spot on Ball Don't Lie


Best The Simpsons Posts
The Highly Literate Simpsons
The Simpsons' New Intro


Best Posts
A Quick Lesson on Being Interesting
How The New York Times Does Business


Best Weekly Paragraphs
"Made in America"--America, made. In many ways, the story that comes together in the pieces of this book is that of people taking up the two elemental American fables--the fable of discovery and the fable of founding--and making their own versions: their own versions of the fables, which is to say their own version of America itself. Who knows if it is John F. Kennedy delivering his Inaugural Address or Jay Gatsby throwing one more party who is more truly invoking John Wintrhop's "A Model of Christian Charity" from three centuries before? Is it Frederick Douglass or Hank Williams who has the most to tell us, not to mention Jefferson's ghost, about the real meaning of the Declaration of Independence? Doesn't Emily Dickinson, within her own Amherst walls, invent as complete a nation--loose in the wilderness in flight from all forms of restraint, be they those of God or man--as Ahab on the quarterdeck or Lincoln at the East Face of the Capitol?

and

Scott McCloud, in his cartoon treatise Understanding Comics, argues that the image you have of yourself when you're conversing is very different from your image of the person you're conversing with. Your interlocutor may produce universal smiles and universal frowns, and they may help you to identify with him emotionally, but he also has a very particular nose and particular skin and particular hair that continually remind you that he's an Other. The image you have of your own face, by contrast, is highly cartoonish. When you feel yourself smile, you imagine a cartoon of smiling, not the complete skin-and-nose-and-hair package. It's precisely the simplicity and universality of cartoon faces, the absence of Otherly particulars, that invite us to love them as we love ourselves. The most widely loved (and profitable) faces in the modern world tend to be exceptionally basic and abstract cartoons: Mickey Mouse, the Simpsons, Tintin, and--simplest of all, barely more than a circle, two dots, and a horizontal line--Charlie Brown.

Best Movie Reviews
Wolverine's Origins Are Surprisingly Boring
UP


Best Reading/Books Posts
The Changing Reader
Universal Authorship


Best Jewish Posts
Chabad Throws the Best Telethons
Old Jews Telling Jokes


Best Travel Posts    
Taking a Hike
Tour de Midwest

There've been about 6000 visitors to the site in the last year. Even though Ariel gets that every time he mentions Pearl Jam, I'm really quite overwhelmed by your support.

Thanks for reading.

Monday, December 28, 2009

Paragraph of the Week

From Ariel Levy's "Either/Or" in the November 30, 2009 issue of The New Yorker:

Unfortunately for I.A.A.F. [International Association of Athletics Federations] officials, they are faced with a question that no one has ever been able to answer: what is the ultimate difference between a man and a woman? "This is not a solvable problem," Alice Dreger said. "People always press me: 'Isn't there one marker we can use?' No. We couldn't then and we can't now, and science is making it more difficult and not less, because it ends up showing us how much blending there is and how many nuances, and it becomes impossible to point to one thing, or even a set of things, and say that's what it means to be male."

Sunday, December 27, 2009

Reviewing Avatar

[Spoilers for Avatar ahead.]

I'm not sure that this is necessarily a good practice[*], but I've fallen into the habit of reading some pretty detailed reviews of most of the TV programs and movies I watch (if you're interested, Alan Sepinwall's TV review blog is comprehensive and excellent--and, as a bonus, he's a Jersey guy). The crucial difference between movie reviews and the type of reviewing most often done for TV shows is that movie reviews generally are written for people who have not yet watched the film, while TV reviews usually are meant for people who have already watched the episode. I go out of my way not to read movie reviews before I see the film in question, because not only would this exacerbate the second question enumerated in the footnote below, but it would also cause me to see the movie itself with the perspective of a reviewer. Avatar, though, proved to be a weird movie to review.

Describing, in writing, a movie that the reader hasn't seen is a somewhat futile exercise even under the most favorable circumstances. (This is what is so--intentionally--frustrating/cool about the dozens upon dozens of pages of descriptions of fictional movies in Infinite Jest.) But Avatar is more problematic in this regard than just about any other movie I can think of. As has been well-documented, Avatar's plot is almost completely derivative--the watcher has seen this plot before, and she knows how it ends. And yet the visual beauty of the film is so stunning that watching it becomes a very powerful experience. But if the main selling point is the visual aspect, what's there to say about it?

It's the literal job, of course, of a group of reviewers to find something interesting to say about movies that the reader hasn't seen, and you can judge for yourself how well they succeeded (hint--the good ones manage to say something more than "Looks pretty, recommended"). More promising is a detailed analysis meant for people who've already watched the film. But even here, the story-telling flaws are so obvious--Stephen Lang's colonel character is flat, with no believable motivation; the heavy-handed narration is often superfluous and distracting; etc., etc.--that they don't really warrant mentioning. There's still room for more creative excursions, using the film as a starting point. (Personally, I'm waiting for Bethlehem Shoals to elaborate on his theory that Avatar is an allegory for the NBA.) But these types of writings don't really fit in with the classic genre of movie reviews.

So what's left? During our subway ride home, I managed to hold a ~40-minute conversation centered on Avatar with the three friends I saw the movie with. In retrospect, the most insightful points we made were hyper-focused analyses of small details. For example, we noticed that the repeated use of the phrase "I see you" really cheapened what could have been a meaningful line. The great lines of movie history--think "I drink your milkshake" in There Will Be Blood--are great because they're meaningful in the context of the moment. In Avatar, Cameron seems to favor the brute-force method: if the line isn't sufficiently expressive, repeat it throughout the movie until the audience realizes that it's supposed to be important. But is this a crucial point to note when discussing the film? Probably not--the movie is still well worth seeing, and this specific story-telling failure pales in comparison with the more obvious deficiencies. In the end, though, the analysis might still be worthwhile even if it's not crucial, because the insight about "I see you" both deepens my understanding of this film in a small way and is transferable to other films.

Sometimes I'm glad that I'm not a professional reviewer.

[*]
I have second thoughts about this type of habitual review reading because it elicits two main questions: 1. Is each episode of Community, for example, really so crucial that I need to read a review of it (the episode)? (Follow-up to question 1: Shouldn't I spend my time reading things more important and lasting than reviews of a decidedly mediocre sitcom?) 2. Am I becoming reliant on reviewers to think through my TV and movies for me? I think it's probably worthwhile, though, because reading well-written and thoughtful reviews provides a sort of crash-course in reading the movies and TV shows that the reviews review. I'm going to leave for another time the question of whether becoming a better TV watcher is a worthwhile goal.

Monday, December 21, 2009

Paragraph of the Week

From "Robert Kennedy Saved From Drowning," by Donald Barthelme:

For Poulet, it is not enough to speak of seizing the moment. It is rather a question of, and I quote, 'recognizing in the instant which lives and dies, which surges out of nothingness and which ends in dream, an intensity and depth of significance which ordinarily attaches only to the whole of existence.'

I usually refrain from commenting on these paragraphs, but this idea seems especially interesting in comparison to Don Gately's approach to suffering in Infinite Jest: in brief, that great suffering can be overcome by recognizing that each individual moment is tolerable.

Wednesday, December 9, 2009

Happy Belated

Besides the twin facts that it is no longer cool and that it at times seems to exist solely to allow people to communicate about their pretend farms, the real problem with Facebook is that it cheapens birthdays in all kinds of ways. Facebook takes all the effort out of remembering someone's birthday. Remembering the birthday of a friend used to mean something. No longer. Also, it's almost impossible to remember the birthday of a person who has held out on this whole Facebook fad. I'm sure there are more reasons why Facebook has ruined birthdays. And I'm aware that this is fairly well-trod ground we're treading here.

But I bring it up for a very specific reason: I missed my own blog's birthday this year. The Daily Snowman's blog-iversary is December 3. I completely forgot. Maybe I should make a Facebook profile for this blog.

I'm taking a moderately important test next week. Sometime after that, I'll compose a more festive blog-iversary post.

Sunday, December 6, 2009

Paragraph of the Week

From a recent Knicks Knation blog post by Frank Isola about the changing game experience of Madison Square Garden:

But what do you expect from the Garden, which no longer acts or resembles the Garden of old? They have a group of people who are constantly firing T-shirts into the crowd. It begins right before tip-off and never ends. They do that in Memphis, and for good reason. Such antics should be beneath MSG. And do we really need to hear the public address announcer tell the crowd to “Stand up and cheer for your Knicks?" What in the good name of John Condon is going on over there?

Friday, December 4, 2009

Happy Repeal Day!

Repeal Day, as always, is December 5. That is tomorrow.

Celebrate your freedom, America.

Monday, November 30, 2009

Paragraph of the Week

Sorry for the one day delay. Long weekends are strangely disorienting. Without further ado, here's a paragraph for you, from Greil Marcus and Werner Sollors' introduction to A New Literary History of America:

"Made in America"--America, made. In many ways, the story that comes together in the pieces of this book is that of people taking up the two elemental American fables--the fable of discovery and the fable of founding--and making their own versions: their own versions of the fables, which is to say their own version of America itself. Who knows if it is John F. Kennedy delivering his Inaugural Address or Jay Gatsby throwing one more party who is more truly invoking John Wintrhop's "A Model of Christian Charity" from three centuries before? Is it Frederick Douglass or Hank Williams who has the most to tell us, not to mention Jefferson's ghost, about the real meaning of the Declaration of Independence? Doesn't Emily Dickinson, within her own Amherst walls, invent as complete a nation--loose in the wilderness in flight from all forms of restraint, be they those of God or man--as Ahab on the quarterdeck or Lincoln at the East Face of the Capitol?

Friday, November 27, 2009

Every Object Tells a Story

The first class I ever took in college was called American Autobiography. The secret to writing college-level papers on autobiographies, it turns out, is to realize that memoirists do more than just write down the things that happen to them. There is always both an agenda and a designer. Here's an excerpt from a paper written for that class:
Franklin describes in great detail his initial experiences in Philadelphia. He was dressed in his work uniform, and was weary from a long voyage. He relates that "I was dirty from my journey; my pockets were stuffed out with shirts and stockings; I knew no soul, nor where to look for lodging" (pg. 92). He made his way to a bakery in order to purchase bread with what little money he had. Even such a simple task was difficult in an unfamiliar setting. A difference in dialect prevented him from effectively communicating to the baker which type of bread he preferred. Eventually he managed to purchase "three great puffy rolls" (pg. 93) but was forced to meander down the street "with a roll under each arm and eating the other" (pg. 93). He admits that he made "a most awkward, ridiculous appearance" (pg. 93).
Why does good ol' Ben Franklin go to such lengths to describe this seemingly embarrassing and un-educational experience? He confesses that "I have been the more particular in this description of my journey, and shall be so of my entry into that city, that you may in your mind compare such unlikely beginnings with the figure I have since made there." Aha. Franklin is nice enough to explain his agenda for the inclusion of this particular passage. But even if the memoirist declines to share her motivation, don't doubt for a second that she has one. Said James Young, a historian of memorials, "The motives of memory are never pure."

I just watched an hour-long version on PBS of a slightly longer documentary called Objectified, directed by Gary Hustwit. Objects, it would seem, are a whole lot like autobiographies: the really good ones make you forget that they were designed at all. But if we can look and think again about our objects, we'll realize that someone (hopefully) thought long and hard about how our objects work and look. The film includes a great interview with Jonathan Ive, a designer at Apple, who explained the thinking behind the little sleep indicator light on my MacBook. It should do its job, he explained, but then it should disappear when it no longer needs to indicate anything. And it's cool to see how he accomplished that: my 18-month-old laptop's sleep indicator light is still visible when the computer is not sleeping; but the latest MacBook iteration features an indicator light that just plain disappears when not in use.

That's just one example of how even the least significant feature of a product can be the result of deep thought and design. The movie is great in all types of ways, especially if you're interested in chair design (designers, apparently, love talking about chairs), in why form no longer follows function, in how designers are starting to think about issues of sustainability, or in attempts to bring a fresh perspective to all the things that fill our world.

Monday, November 23, 2009

The Changing Reader

As I mentioned briefly last week, I spent Friday at the first ever Footnotes conference, subtitled thus: New Directions in David Foster Wallace Studies. If you're interested, you can read what Twitter (or, mostly me) thought of the conference. And if you're at all interested, you should check out Nick's recap on The Howling Fantods!, which is way more thorough and informed than any review attempted by me could hope to be. But I do want to elaborate on one point that seemed to recur throughout the conference: readers, as "consumers" of media, have changed as much as, if not more than, media in the last 30-40 years. This is true, I posit, even for forms that have been around all those years, namely, fiction and television/film.

The ways in which postmodern literature and film differ from their predecessors are fairly well established. There is no shortage of information on these changes on your friendly neighborhood internet, but since I've already linked to this once before, here's The L.A. Times' Jacket Copy blog's list of postmodern attributes. But perhaps not quite as developed is a study of how readers and film-watchers have changed over that same time period.

Consider, for example, something as simple as establishing shots on television. If you watch something as recent as Seinfeld, you'll notice the preponderance of establishing shots. The exterior of Jerry's apartment, the diner: these are some of the show's most iconic images. Compare that to "Souvenir," the eighth episode of the third season of Mad Men. If you haven't seen it, here's Alan Sepinwall on the cinematography of this episode (sans significant spoilers):

Included in the stylistic template of "Mad Men" is a reluctance to use establishing shots. Though we occasionally see the outside of the Sterling Cooper building, most scenes don't get any kind of transitional image to tell you, "Okay, now we're moving from here to here" or "Okay, we're back here on the following morning." It's not always that noticeable because the show does such long scenes, but there were several sequences in "Souvenir" where we just followed either Betty or Pete throughout their day, bam-bam-bam - no establishing shots, no dissolves or other obvious transitions, just one quick cut after another of their frustrated, empty lives.

I'm not saying that a lack of establishing shots is postmodern or that Mad Men is postmodern because it lacks establishing shots. But, in a way, this episode only works for sophisticated TV viewers who can follow extremely quick cut jumps, a skill which may require having watched countless hours of television to understand the conventions of the medium.

So what similar examples are true of readers? We may be more distrustful of narrators than readers of forty years ago. This questioning of authority is connected with the fact that, to paraphrase DFW, we've been marketed to very effectively for the entirety of our lives. Which came first? Authors going out of their way to make narrators inherently untrustworthy or readers learning not to take the written word at face value?

And since this post was inspired by a conference called Footnotes, let's think about footnotes. Reading used to be a basically linear activity. Except for reference books, you started reading on the first page and kept going until the end. Unless, of course, the author incorporated footnotes and endnotes, like DFW did, in a conscious effort to disrupt the narrative flow. But the way we read on the internet is unrestricted and nonlinear. I click from one page to another, with half a dozen tabs open at once. I do most of my reading on devices that can perform a multitude of tasks. It's surprisingly rare that I read one thing at a time, without distractions. Blog posts and, especially, Twitter seem to cater to this limited-attention-span reading environment. How will our reading expectations affect contemporary and future literature?

Sunday, November 22, 2009

Paragraph of the Week

The New Yorker, Nov. 23, 2009. Adam Gopnik, "What's the Recipe?"

The woman who reads the fashion magazines isn't passively imagining the act of having; she's actively imagining the act of shopping. (And distantly imagining the act of wearing.) She turns down pages not because she wants to look again but because, for that moment, she really intends to buy that--for a decisive moment she did buy it, even if she knows she never will. Reading recipe books is an active practice too, even if all the action takes place in your mind. We reanimate our passions by imagining the possibilities, and the act of wanting ends up mattering more than the fact of getting. It's not the false hope that it will turn out right that makes us go on with our reading but our being resigned to the knowledge that it won't ever, quite.

Thursday, November 19, 2009

Live-Tweeting

I have some longer posts planned (including one about reading on an iPhone) but for your more immediate The Daily Snowman fix, follow me on Twitter to read about the Coaches vs. Cancer Classic from Madison Square Garden tonight and a conference titled Footnotes: New Directions in David Foster Wallace Studies tomorrow morning.

Saturday, November 14, 2009

Paragraph of the Week

I would like to try to share with you, every Sunday, the best paragraph I've read in the week preceding the particular Sunday in question. Our inaugural entry is from Jonathan Franzen's personal history, The Discomfort Zone:

Scott McCloud, in his cartoon treatise Understanding Comics, argues that the image you have of yourself when you're conversing is very different from your image of the person you're conversing with. Your interlocutor may produce universal smiles and universal frowns, and they may help you to identify with him emotionally, but he also has a very particular nose and particular skin and particular hair that continually remind you that he's an Other. The image you have of your own face, by contrast, is highly cartoonish. When you feel yourself smile, you imagine a cartoon of smiling, not the complete skin-and-nose-and-hair package. It's precisely the simplicity and universality of cartoon faces, the absence of Otherly particulars, that invite us to love them as we love ourselves. The most widely loved (and profitable) faces in the modern world tend to be exceptionally basic and abstract cartoons: Mickey Mouse, the Simpsons, Tintin, and--simplest of all, barely more than a circle, two dots, and a horizontal line--Charlie Brown.

Sunday, November 8, 2009

Judging Books By Their Covers

After determining through trial and error that seemingly every movie even remotely worth seeing in NYC was sold out, I stopped by a local neighborhood Barnes & Noble retail store. I collected about half a dozen books to purchase, carried them all over the shop, and settled on three that I would take home. (I sometimes feel bad buying new books because I usually have a healthy stack of books that I have not yet read. Including last night's troika, I now count ten: six novels, two memoirs, one short story collection, and one anthology. I justified the new three by their official B&N Bargain Priced stickers.)

In between the collection of six books and the selection of three, a small end-of-aisle section caught my eye. These books all belonged to a series called Penguin Classic Deluxe Editions, and deluxe they are. In addition to the jagged page construction, these books also include French flaps. But by far the coolest thing about these titles is the cover illustration.

Here's one as an example, drawn by Ruben Toledo, an artist who works most regularly in fashion.


I love the extension of the scarlet theme to the Prynnes' hair color.

It's not quite my favorite book cover ever, but this--and two others that Toledo designed, in addition to the other titles of the series--do a great job of making a classic more accessible. Toledo, in fact, identifies this as the goal of his project:
My only command from Penguin was to make art that would make youngsters want to read — to introduce these stories to a new public no matter what age. That’s why I think the fact that I had never read them was an asset, [combined with] the fact that they are all period stories, clearly set in another time and ruled by the mode of their time in history, yet are totally relevant to this Twitter world we live in.
I like how Twitter now defines our era. But besides that, I'd say that Toledo succeeds.

Wednesday, November 4, 2009

This Kid Loves Basketball

And spelling his last name and having fun and playing defense and making shots from right there and parks near his house and swings and basketball.




Via @jeskeets

Sunday, October 25, 2009

Destination Maps

In the fall of 2007 I took a class called Envisioning the 20th- and 21st-Century American City. The first assignment consisted of designing a map of "my New York City." I was still growing accustomed to navigating this metropolis--if my knowledge of the precise order of cross streets is not perfect now, it was barely functional then--and drew up a destination-based map of my New York, ignoring directional conventions in favor of organizing locations around which subway line I took to get there. It was a fun project, reminiscent--as P. Geyh noted in response to my subsequent analysis of this personal map--of medieval destination-based maps.

So it was pretty cool to find this three-year-old's view of the NYC subway. This is very similar to my project except way better designed (I remember that it took a surprisingly long time to print, cut out [with scissors], arrange, and tape [with masking tape] all the parts of my non-graphically designed map) and meant for a toddler.

But this type of destination-based map might be growing in importance. My dad always used to try to get me to pay attention to the route we were traveling in the car. I think he gave up once I got a GPS. And now, with just about everyone carrying an iPhone or a Blackberry, etc., step-by-step directions are always available. While awesome and convenient, this type of resource encourages our reliance on easily accessible information, to the detriment of retaining knowledge. I know people who can provide driving directions from just about any point on the east coast to just about any other point on the east coast. All of these people are 50 or older. I, for one, welcome our new destination-based map overlords.

Thursday, October 22, 2009

Universal Authorship

You know that line about everyone being a critic? It may be more true than you think. Denis G. Pelli and Charles Bigelow, two cool-seeming professors at NYU and the Rochester Institute of Technology, respectively, claim in a new article in something called Seed Magazine that we are quickly becoming a society of writers:
To quantify our changing reading and writing habits, we plotted the number of published authors per year, since 1400, for books and more recent social media (blogs, Facebook, and Twitter). This is the first published graph of the history of authorship. We found that the number of published authors per year increased nearly tenfold every century for six centuries. By 2000, there were 1 million book authors per year. One million authors is a lot, but they are only a tiny fraction, 0.01 percent, of the nearly 7 billion people on Earth. Since 1400, book authorship has grown nearly tenfold in each century. Currently, authorship, including books and new media, is growing nearly tenfold each year. That’s 100 times faster. Authors, once a select minority, will soon be a majority.
I'm not sure if the particular conclusion that "every person will publish by 2013" is accurate, but I think the details here aren't terribly important. Even the people I know who refuse to give blogging or Twitter a try are basically already authors, whether their chosen medium is a Google Chat status message or whatever Facebook is calling its status updates these days. This is a cool thing.
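
Just to see how the excerpt's numbers could produce a prediction like "every person will publish by 2013," here's a rough back-of-the-envelope sketch. The starting author count, growth rate, and world population are taken from the quote above; the projection itself is purely illustrative and is my assumption, not Pelli and Bigelow's actual model:

```python
import math

# Figures quoted above: ~1 million book authors per year by 2000,
# authorship (including new media) growing nearly tenfold per year,
# and ~7 billion people on Earth. The projection below is an
# illustrative assumption, not the article's actual model.
authors_per_year = 1_000_000
world_population = 7_000_000_000
growth_per_year = 10

years_needed = math.log(world_population / authors_per_year, growth_per_year)
print(f"~{years_needed:.1f} years of tenfold growth until everyone is an author")
# Prints ~3.8 years -- which is why a "within a few years" prediction
# follows naturally, even if the exact year depends on when the
# tenfold-per-year growth actually started.
```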

The most basic underlying mechanics are remarkably similar for each of these media (and are themselves remarkably similar to what we think of as traditional authorial pursuits): unlike a simple email or letter, in all these cases the author composes a thought for an audience that is largely undefined. It takes a certain measure of creativity, imagination, and empathy to be able to write effectively for an undefined audience. The author needs to be able to recognize the differing levels of background knowledge, reading ability, and cultural literacy for an amorphous reader. That's not an easy task. I'm not saying that these new kinds of authors will achieve a high level of this empathy--or that even the best authors aren't capable of being hideous people--but this shift to near-universal authorship in our society might prove to have an impact in areas far beyond which internet start-up is popular this week.

I'm curious also about how this changing pattern in authorship will affect reading and our languages. Will people become more serious, thoughtful readers if they better understand the writing process? Will our use of languages change more rapidly? How in the heck will dictionaries decide what constitutes credible sources for usage? I obviously don't know, but it'll be cool to find out.

Wednesday, October 21, 2009

Old Jews Telling Jokes

I'm not sure how I hadn't heard of Old Jews Telling Jokes until today. It just seems like the type of thing I would have heard about. Here's an example:



And the cool part is that the silly concept turns out to be something more than just a way to laugh at my grandparents. So says Sam Hoffman, the director, producer, and editor of this project:
I wrote before that these jokes shed a light on a culture, but they also reveal much about their tellers. Like playing the piano, telling a joke requires craft, artistry and style. Does the teller have the patience for the set-up? Can he or she remember the details that make the characters familiar? Will he or she commit to the voices and the accents and keep them consistent? Is there that innate sense of timing?
I agree.

And this is one more reason why you really should follow Roger Ebert on Twitter.

Friday, October 2, 2009

Writerly Pacing

As part of a super-long interview promoting his new book, Joe Posnanski takes a moment to describe pacing:
There is something about pacing in writing that has always fascinated me. You wish you could be there with every reader and say, "OK, this part you're supposed to read really fast. And this part, no, slow down, take your time on this part. And that part, yeah, just skim over that part." I suppose the writers who can get the readers to do that—to speed up and slow down instinctively—are the special ones. I don't have that talent, obviously, but it's something I do think about.
Is it possible that standard-length books of, let's say, 200-400 pages just aren't capable of holding a reader's full attention all the way through? I think it is possible, and probably even likely. 200 pages is a big time commitment. It's a little weird, though, to think that each reader would be interested in the same topics. And it's fascinating that even the author realizes that some parts of a book just aren't as interesting. But, again, I might be enthralled by a part of the book that the author himself just wasn't in love with. Morte D'Author, indeed.

Tuesday, September 29, 2009

Wanted: A New David/Dave

The past 12 months have been rough at times. I think that's what happens when your first foray into adult life coincides with a bad economy. But more than that, I've been missing my David/Dave. From the age of 12, I've always had a celebrity David/Dave to entertain, enlighten, and edutain me. I started off with Dave Barry, reading books and books along with as many archived columns as the internet could hold. Soon after, I transitioned to David Sedaris. This was nice, because he was actually producing new material at the time I was reading him. There was even a brief time when two of David Cross's stand-up albums were basically played on repeat in my apartment (check out maybe Cross's best-known bit at the 7:20 mark of this YouTube video with plenty of cussin'). Soon after, I discovered David Foster Wallace. And just when I had finished reading eulogies of DFW, I started watching The Wire, created by David Simon. But I finished The Wire in the early portion of 2009, and I've been Dave-less ever since, the first time since the late '90s that my favorite artist in the world hasn't been named David.

I'm looking for a new David/Dave.

I've done some light research using the internet. By this I mean, I typed the words "Dave" and "David" into Amazon's search engine, and I let the mildly creepy auto-complete function fill in the rest of my search. Leaving out redundancies (e.g., Dave Matthews Band and Dave Matthews) this is what it came up with:
  • Dave Matthews Band
  • Dave Ramsey
  • Dave Eggers
  • Dave Chappelle
  • Dave Brubeck
  • David Gray
  • David Guetta
  • David Bowie
  • David Sedaris
  • David Cross
  • David Archuleta
  • David Foster Wallace
  • David Crowder Band
  • David Baldacci
  • David Cook
Musicians are out because I don't like/understand music quite enough to obsessively follow any of them. Though I do kinda like David Bowie. That eliminates most of this list, and almost all of the Davids/Daves that I hadn't heard of before tonight. Dave Ramsey is some sort of financial writer, who has authored such works as The Total Money Makeover: A Proven Plan for Financial Fitness. David Baldacci writes crappy mass-market novels. I've seen enough of Dave Chappelle's work to be impressed, but I'm happy to leave it at that.

Which leaves us with Dave Eggers. A Heartbreaking Work of Staggering Genius is one of the best things I've ever read. But What is the What is the first book I can remember that led me to consciously decide to stop reading. (How We Are Hungry was pretty good.) But can a pantheon David/Dave member be responsible for a What is the What level misstep? I've heard good things about You Shall Know Our Velocity, so maybe that will be Eggers's last chance to achieve official David/Dave status.

The only David that I know of who may fit the bill is David Chase, creator of The Sopranos. I think I'm going to give him a shot, but if you know of any good Davids/Daves that are out there, please let me know. It's not easy being Dave-less.

Tuesday, September 22, 2009

Chabad Throws The Best Telethons

Chabad sure knows how to party. And fundraise. And publicize itself.

The organization managed to involve--and I'm not sure you could come up with an odder pairing--both Ron Artest and Triumph the Insult Comic Dog in its most recent telethon.

First, Artest. A philanthropist agreed to donate $1000 for every free throw completed in the span of one minute. So Ron Artest agreed to show up to a TV studio at 6:50 AM. He made 29 free throws and behaved himself.
"[Artest] grew up in Queens, so seeing a yarmulke or a Rabbi with a black hat wasn't National Geographic for him," Marcus told me. In all, and in contrast to his fierce on-court rep and off-court rap sheet, Artest was, Marcus says, "absolutely super menschy."
And here's Triumph:



The whole thing reminds me of this picture of Wilson Chandler and a Rabbi Grossman, which originally appeared here on this blog.

I am endlessly fascinated by interactions between NBA players and rabbis.

Friday, September 18, 2009

Google Throughout The Ages

For such a simple site, the Google main search page has gone through a fair number of changes since its launch in November 1998. Does anyone reading this remember that the wordmark above the search box originally had an exclamation point appended to the end? I certainly didn't. To be fair, the extraneous bit of punctuation was there for less than a year.

Also of note is the page highlighting PC Magazine's decision to award Google its Technical Excellence Award way back in 1999. Good call, PC Magazine.

The internet is old.

Thursday, September 17, 2009

The Highly Literate Simpsons

The HeiDeas blog has done all of us a tremendous service by cataloging all the linguistic jokes used on The Simpsons during the past four seasons. It's certainly worth perusing. Here's a good example to get you warmed up:
Episode: Funeral for a Fiend (2007)

Category: Idiom chunks, degree phrase

Sideshow Bob's psychiatric expert witness is giving evidence that SB was insane during his most recent attempt on the Simpsons' lives:

Psychiatrist: Robert was a peaceful boy, sickly and weak from a congenital heart defect. [He shows a picture of SB going to his prom in bed. The jury goes "Awwww!"] But then that Simpson boy started tormenting him, and he crossed over into dementia!
Sideshow Bob (defending himself): To what degree was this dementia blown?
Psychiatrist: Full! [Jury gasps.]

Tuesday, September 15, 2009

Why I Like Rock Band

I like Rock Band because it's fun. Most video games are. And they may even be good for you. But Rock Band is more than just fun and possibly a good teacher of problem-solving skills: it makes me appreciate music more. I've thought this for a while, but I had a hard time coming up with an appropriate food metaphor about it until I read Seth Schiesel's review of The Beatles: Rock Band in The NY Times:
It is an imperfect analogy, but listening to a finished song is perhaps like being served a finished recipe: you know it tastes great even if you have no sense of how it was created. By contrast, playing a music game like Rock Band is a bit closer to following a recipe yourself or watching a cooking show on television. Sure, the result won’t be of professional caliber (after all, you didn’t go to cooking school, the equivalent of music lessons), but you may have a greater appreciation for the genius who created the dish than the restaurantgoer, because you have attempted it yourself.
More than just introducing me to new artists and songs, Rock Band has taught me how to listen to music. I really never listened to music growing up--even in the car it was always sports or sports talk radio. Music was often a soundtrack to other things, but I rarely listened to it as an activity in its own right. So I had a lot to learn. I'm not saying that I appreciate music as much as someone who actually plays an instrument or has devoted countless hours to intent listening, but I think I'm getting the hang of the basics. I can certainly hear more than I could when I started playing Rock Band, which isn't necessarily saying much because until I started playing this game I listened to music almost exclusively for the lyrics. Which is the point. Slowly but surely, a video game is helping me understand why everyone in the world seems to like this music thing so gosh darn much.

Monday, September 14, 2009

When Athletes Speak

Michael Jordan--along with John Stockton, David Robinson, Jerry Sloan and C. Vivian Stringer--was inducted into the Basketball Hall of Fame last Friday evening. This year's Hall of Fame ceremony has garnered more attention than most years', and with good reason: it very well might be the most impressive class of inductees in the history of this type of thing. The ceremony was moved to a larger venue, ESPN marketed the event way beyond its efforts of years past, and even Jordan's choice of David Thompson as presenter was reported breathlessly by the media. But the real fireworks began when Jordan got up to speak.

Here's video of the address.



This speech has led to a mostly critical backlash, including this response from Yahoo!'s excellent NBA reporter, Adrian Wojnarowski:

This wasn’t a Hall of Fame induction speech, but a bully tripping nerds with lunch trays in the school cafeteria. He had a responsibility to his standing in history, to players past and present, and he let everyone down. This was a night to leave behind the petty grievances and past slights – real and imagined. This was a night to be gracious, to be generous with praise and credit.

I understand where Wojnarowski is coming from: Jordan is pretty universally regarded as the best basketball player of all time, and the settling of scores in this forum, from the greatest, is somewhat petty. But it's this very hyper-competitiveness that made him great. Here is Dime Magazine's explanation for why Jordan ranks number one in their list of the Top 25 Motherf*ckers of All Time:

Michael wasn’t the most talented guy to ever pick up a basketball, but he became the greatest motherf*cker of all time because of unparalleled mental toughness. People made careers out of trying to be “Jordan Stoppers,” but no one was ever able to actually live up to that title. After getting that crown, Gerald Wilkins got a 31 ppg helping from His Airness in the ‘93 Eastern Conference Semi’s. What other motherf*cker abused guys specifically set out to stop him like Mike? Even as he was approaching 40 in the Wizards phase of his career, he refused to show weakness and was still feared.

Would it be nice if the cutthroat on-court presence made way for a gregarious off-court persona? Sure. But I think that may be unreasonable to expect. Jordan wouldn't have dominated basketball to the extent that he did if he hadn't remembered every slight and mistreatment, using each as motivational fuel to continue dominating a sport he had already beaten nearly into submission.

David Foster Wallace discusses this general concept in his essay "How Tracy Austin Broke My Heart," collected in Consider the Lobster. He posits that world-class athletes are unable to say anything interesting about performing under pressure because they are hard-wired to remove all thoughts from their brains when a game or match is on the line. Here's a somewhat lengthy excerpt:

It is not an accident that great athletes are often called "naturals," because they can, in performance, be totally present: they can proceed on instinct and muscle-memory and autonomic will such that agent and action are one. Great athletes can do this even--and, for the truly great ones like Borg and Bird and Nicklaus and Jordan and Austin, especially--under wilting pressure and scrutiny. They can withstand forces of distraction that would break a mind prone to self-conscious fear in two.

The real secret behind top athletes' genius, then, may be as esoteric and obvious and dull and profound as silence itself. The real, many-veiled answer to the question of just what goes through a great player's mind as he stands at the center of hostile crowd-noise and lines up the free-throw that will decide the game might well be: nothing at all.

The price we pay, DFW claims, for athletic brilliance under pressure is boring interviews. And, I would claim, the price we pay for hyper-competitive athletes who basically never lose is a type of personality ill-suited to award acceptance speeches.

Friday, September 11, 2009

Inglourious Basterds

I was perhaps unreasonably excited for Inglourious Basterds. This happens just about any time one of my favorite writers/directors/actors/podcasters/bloggers takes on a project that means something to me. Will Ferrell making a movie about the ABA in the '70s? The concept sounded awesome at the time, even though that movie, it turns out, sucks. So when I heard that Quentin Tarantino was making a movie largely focused on American Jews, well yeah, this would be the type of thing that I would get excited about. But it turns out that the Nazi context slightly tempered my enjoyment of the film. Not necessarily because I consider it inappropriate to make entertaining movies about the Holocaust, but because any context at all makes Tarantino's hallmark violence and gore just a tad bit uncomfortable. It's a feeling I didn't get while watching any of his other movies, but it made this one a little awkward.

But the frame of reference isn't my biggest issue with this film. I'll let the excellent Matt Zoller Seitz explain what drives Tarantino movies:
Tarantino’s talk is not just the fuel of his movies: it’s the engine, the wheels and most of the frame. It’s where the real dramatic and philosophical action takes place. The gunshots, car crashes and torture scenes are punctuation.
And despite the mixed metaphor, I agree with his point wholeheartedly. To show you why he (and I) think this, here's a montage of this dialogue compiled by Seitz:



So we run into a problem when the majority of the dialogue in a Tarantino film is spoken in either French or German. The problem, specifically, is that I don't speak French or German, and I imagine that this is true for most of the American audience. So even though some pretty spectacular things are done with the subtitles--in addition to the incomparable Tarantino plot and camera-work and two Mexican standoffs--it's just not the same. Tarantino cuts the audience off here from what it most connects with. And it's this gap, more than any Holocaust squeamishness, that keeps this movie from being great. It comes close, but it's not great.

Monday, September 7, 2009

Taking a Hike

So here's the thing about spending four days hiking in a place as beautiful as Glacier National Park in Montana: you quickly run out of ways to describe the beauty of your surroundings. There are only so many times you can point out how nice the view is or how clear and blue the lake is before these observations become boring. Next time I'll bring a thesaurus.

But seriously, look at how nice these views are and how clear and blue these lakes are.





And I think the inability to describe the scenery combined with just the general human ability to grow accustomed to even the most breathtaking vistas led me to focus more on the hiking aspect of these trails than the destination. Sure, it was nice to reach the top of a mountain and look out at what we saw, but, in truth, there was no shortage of amazing sights from even the bottom of the mountain, not to mention the posters in the ubiquitous gift shops or, even, a simple google image search. This sounds cliche--and, in fact, it is a cliche--but you can learn a lot about yourself and your ability to persevere during the last four miles of a twelve mile hike after you've done more than 30 miles in the three days previous and you've been up since 7:00 AM after sleeping on several rocks and at least one tree root the night before and your knees and ankles are pounding and you really need to find a bathroom and no pit latrines don't count and you still haven't completely adjusted to the altitude and you haven't showered since Sunday morning and it is now Wednesday afternoon and it's hard to remember the original color of your ankles and calves because they are coated in dirt and you're so sick of eating food in bar form that you never want to see anything produced by Chewy or Nature Valley or Clif ever again. Or at least I did. This is the reason to hike.

***
The other interesting thing about Glacier National Park is how much effort is exerted to create a feeling of being in nature. You would think this wouldn't be necessary in a national park--but it is. The shuttle buses, obviously, are all low emissions. But even smaller touches--like benches, No Parking signs, support beams for pit latrines, and traffic barriers all constructed from logs instead of a sturdier, more man-made material--communicate the idea that this park values nature above man.




All smart places do this: Wrigley Field is old, but it purposely maintains this aesthetic of age. The same is true of Glacier National Park--metal park benches would feel out of place. But it's important to realize that these choices were conscious ones, even if they were also obvious ones.