Wednesday, February 26, 2014

Gone from the charts but not from our hearts

Out of the mouths of babes emerge comments that make me feel like an old fuddy-duddy.
 
One of the occupational hazards of being a photographer is that, occasionally, I run into kids who remind me that I’m no longer one of them, and they would know. The latest instance occurred a few weeks ago when I was grabbing some shots at an after-school youth program in Biddeford. I was chillin’ with a couple of fourth-grade girls in a study lounge – do kids still say “chillin’?” – when the conversation turned to music.
 
Music is a subject that’s right in my wheelhouse, so I thought we were covering safe terrain. But when I asked one of the girls what she enjoyed listening to, her answer left me positively flabbergasted.
 
“I listen to a lot of stuff, like Eminem and Shania Twain,” she said. “I like a lot of different kinds of music. I don’t really like heavy metal, though. It’s old people music.”
 
Wait, what?
 
It wasn’t a response I was prepared for. I was shocked. Stymied. Mystified. If my jaw had dropped any farther, I would have whacked myself in a sensitive man area, and spent the rest of the day cursing my fate, all in the voice of Mickey Mouse taking massive tokes off a helium balloon.
 
See, I’m a metal guy. Like the young lady, my tastes are varied – gimme some smooth jazz or classic rock any day of the week, brother – but when I hear a crunchy guitar riff that could singe eyebrows from across a football stadium, that could peel the paint from a sports car, that could melt the wax holding together Joan Rivers’ face ... well, it does something to me. Something primal. It stirs up my innards with an iron finger, stoking a joyous rebellion that would shame the warring Scotsmen of “Braveheart.” If it were in any way socially acceptable, I’d respond to the opening riff of Metallica’s “Master of Puppets” by felling a mountain lion with my bare hands, hoisting it above my head in the town square, and yelling “Freedom!” until my vocal cords were so much charred meat. Then I’d slink away for about a week, because that would be stupid.
 
But you know, I look at some of my metal heroes today, and while they still rock the house, you can see the crow’s feet and gray hairs sprouting like dandelions. Dave Mustaine, the frontman for Megadeth and a legendary drug fiend, is starting to look like a rumpled mail sack. Time makes fools of us all; that’s true for musicians and fans alike, except for the members of the Rolling Stones, whose organs are preserved under a thick crust of cocaine.
 
When the girl spooked me with her “old people” comment, that’s when it occurred to me: If I can make it to old age, I’ll be listening to this stuff in a retirement home. It’s hard to imagine how that would work.
 
A couple of years ago, during a different assignment, I found myself at one such retirement home shooting some photos of a pianist who had come to entertain the residents. The songs he played, naturally, were selected from the era in which that generation came of age; the set was heavy on Sinatra and Martin and some of the public domain ditties from the Great American Songbook, tunes that the house bands on late-night shows can play without cracking open the company wallet. It was fun. The piano guy was a good player, the residents were clearly enjoying themselves, and even as I was contorting myself to get various angles, my toes kept tappin’ to the rhythm. If I had stayed any longer I’d have played a few rounds of canasta in a pair of pink slippers.
 
That kind of music, naturally laid back and mellow, is a nice fit for the age group. More difficult to imagine is Piano Man ripping into a frenzied acoustic version of Slayer’s “Angel of Death.” I picture a cluster of elders leaning on their canes in a geriatric mosh pit, with Dolores from 309 hip-checking her friend Beverly into the snack table during the blistering solos. Then, when Slayer morphs into “Bring Your Daughter... to the Slaughter” by Iron Maiden, all the World War II veterans who fought at Iwo Jima can stage-dive off the beverage cart and crowd surf while dousing themselves in beer.
 
Seems like a far-fetched scenario. But when “old people” like me reach that age, there’ll at least be a handful of us still taking bands like Anthrax and Black Sabbath for a spin. This should make for some interesting musical requests during these special events; maybe future musicians who play the retirement home circuit will be dragging around Flying V guitars instead of settling gingerly behind the keys of a stately Steinway. They’d better be prepared. Not all of us are hip to Shania Twain.
 
Granted, metal’s a fringe genre. It mainly appeals to people who have unruly body hair and wear lots of leather. I’m sure that others in my age group who’ll be enjoying their twilight years will be perfectly content to whistle the day away to the soothing strains of Michael Bublé or Adele, and that’s fine – I’m a man, I can take it. But no amount of aging will ever convince me that Megadeth’s “Holy Wars” isn’t the greatest thing since the invention of Fluffernutter.
 
Mmm. Fluffernutter.
 
Maybe the young girl is right; maybe metal’s made the leap from empowering youth music to rusted artifact of a time gone by. Gives a guy some outside perspective on his life, at least. What’s comforting is that the accumulated years have given me some perspective of my own; and I know there’ll come a time, one day, when someone says to the girl’s grown-up counterpart, “You know, I don’t really like Shania Twain. It’s old people music.”
 
Won’t that be the day. And I. Can’t. Wait.

Saturday, February 22, 2014

How tweet it is

“Hashtag” used to be a word uttered solely by computer programmers and pale men with really bad acne. The tiny club of people who knew what it meant could have fit inside one of Shaquille O’Neal’s shoes. Now it’s a prominent member of the English lexicon, and we’ve got stupid Twitter to blame.
 
Not that I’m bitter.
 
Social media isn’t evil, in and of itself. It can be used for evil purposes, like bullying a classmate, or posting various pictures of the gross injury you sustained while pogo-sticking over a stack of empty Froot Loops boxes. But the medium itself is a fairly neutral thing. It can even be used for positive change. Sparking democratic revolutions in Middle Eastern countries? Check. Spreading inspirational messages falsely attributed to sensuous headshots of Johnny Depp? Check. These are things with the potential to improve people’s lives, despite the uncomfortable fact that Johnny Depp always looks like he’s five minutes away from directing a porno. I think it’s his hats. They’re porn hats.
 
But these instances are more common to traditional social media platforms like Facebook, which allow users to post comments, engage in discussions, and share links to videos of ferrets who headbang to Slayer tunes. 
 
Twitter’s a different animal. It’s the ugly stepchild of social media. In case you’ve spent the past few years hunting wild game with a tribe of pygmies, Twitter is a website that allows its users to post short messages – called, obnoxiously, “tweets” – that are limited to 140 characters in length. This is just long enough to share something banal and trivial, like “I enjoy ham,” and just short enough to discourage fully formed thoughts, like “I enjoy ham because it reminds me of Sunday dinners with my Uncle Horatio, and here’s four paragraphs about the mole on his neck.”
 
And that’s basically it. If Twitter had any less substance, it wouldn’t exist. It’s not completely devoid of usefulness; fine publications like the Journal Tribune can use it to post public notices. (Go team!) This scant functionality, though, is hardly enough to explain why it’s become such a ubiquitous obsession. It’s everywhere. The lingo it’s spawned is even creeping into the common vernacular, with news anchors and late-night TV hosts forced to conceal their disgust whenever they mention a “tweet” or a “hashtag” – the latter sounding like some kind of shady, arcane drug paraphernalia.
 
A hashtag, in Twitter terms (twerms?), is a word or phrase with a number sign affixed to the front of it, like #stateoftheunion, or #paulyshoresucks. Tacked on to the end of an annoyingly brief Twitter message, it groups a particular posting with other posts that have the same hashtag; this serves the dual purpose of facilitating searchability while giving a harsh noogie to a rapidly eroding English language. It depresses me that I know this. That a non-user like myself can define a hashtag in detail speaks to how thoroughly it’s permeated the social landscape. It spreads faster than STDs on “The Bachelor.” And in five minutes there’ll be a Twitter post about how gross that analogy was.
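For the morbidly curious, the entire technical miracle can be sketched in a few lines of toy Python. This is strictly my own illustration – the example tweets and the character check are invented for demonstration, and Twitter’s actual machinery is presumably fancier – but the core mechanism really is this mindless:

```python
# Toy sketch of what a hashtag does: grouping, and nothing else.
import re
from collections import defaultdict

TWEET_LIMIT = 140  # the infamous character cap, circa 2014

# Hypothetical example tweets (not real posts)
tweets = [
    "I enjoy ham. #stateoftheunion",
    "Encore was a mistake. #paulyshoresucks",
    "My rebuttal, in 140 characters or less. #stateoftheunion",
]

# File every tweet under each hashtag it contains
index = defaultdict(list)
for tweet in tweets:
    assert len(tweet) <= TWEET_LIMIT, "too many fully formed thoughts"
    for tag in re.findall(r"#\w+", tweet):
        index[tag.lower()].append(tweet)

# "Searchability": pull up everything filed under one hashtag
for post in index["#stateoftheunion"]:
    print(post)
```

That’s it. That’s the whole noogie to the English language: a glorified filing label.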
 
Astoundingly, there are distinguished people of letters who have taken to the form. Among the more notable is British comedian and Mr. Potato Head look-alike Stephen Fry. While he isn’t super well-known in the States – due in part to the fact that he’s never egged a Beverly Hills mansion from the back seat of the Kardashian limo – he’s a big deal across the pond, rounding out a career as a writer and director with an education in English literature from Cambridge. Dude’s got about four million Twitter followers, despite routinely avoiding phrases like “tiger blood” and “edible underwear.” On the Canadian radio show “Q with Jian Ghomeshi,” Fry explained that he’s drawn to the site’s format because it forces a writer to be creative, to pare away extraneous language and deliver a message succinctly. I don’t share his passion for brevity (obviously), but I see his point.
 
The sentiment is nice in theory. In practice, the most active and followed users tend to be celebrities – and not to generalize, but when you combine a dumb medium with a mostly dumb user base, the results are, well, dumb. Here’s a sampling of messages posted by various celebrities:
 
From Sherri Shepherd: “OMG just woke up and my ear is plugged – tried everything – yawning … pinching my nose, closing mouth and pushing out breath … won’t unstop”
 
From Paris Hilton: “No, no, I didn’t go to England; I went to London.”
 
From Miley Cyrus: “Good morning everyone. Life is good, I am laying in bed with my mommy right now scratching her bug bites.”
 
From Lil’ Kim: “I’m driving right now”
 
People actually spend time reading this stuff. What’s worse, people spend time writing it. Fry may be correct in that it forces a writer – at least some writers – to be creatively brief, but the flip side is that it also lets people off the hook. It actively encourages half-formed thoughts and ill-advised brain droppings. The resulting dreck is forced down our throats during news broadcasts, with anchors now reading tweets from viewers reacting to the big stories of the day. As if our understanding of world events will be deepened by the musings of Satanic_Kitty69, and his highly trenchant observation that “oBaMa shuld rspect the constitushun!!!11”
 
But what am I doing? Here I am using all these words when I should be excoriating Twitter in 140 characters or less. So here it is, my 137-character warning to this evil scourge:
 
“If I catch you around these parts again, I’ll wrestle you into a headlock and squeeze until your face is red as a basket of strawberries.”
 
Hmm. Maybe Fry was right. That was actually kinda fun.
 

Saturday, February 15, 2014

Time keeps on slippin'

If I had a time machine, I’d travel back to the heyday of the Roman Colosseum so I could watch toga-wearing schmucks use broadswords to fend off armor-plated lions.
 
Yup.
 
Picture the scene: I’m wearing a homemade robe fashioned out of a faded Care Bears beach blanket I used when I was four. There’s a fig leaf in my hair, which, for the purposes of this fantasy, exists as actual hair. I watch a few rounds of testosterone-percolating bloodsport. Then, when someone notices I’m wearing Reeboks and taking pictures with a fancy metal memory-maker, gifted to me by the gods of Olympus, I dash outta the joint, hop into my flying DeLorean, and punch it right back to 2014.
 
Or maybe I zip back about 150 million years and have lunch with a stegosaurus. Or travel to ancient Greece to watch the legendary Battle of Thermopylae. Or prevent Nicolas Cage from making “Ghost Rider.”
 
So many different things we could do with a time machine. Most of us have fantasized about it at some point in our lives. It’s one of those hypothetical questions that pops up, more often in youth perhaps, during dreamy, star-gazing conversations late at night with a close friend – one who had the foresight to bring along a few nips of brandy in a pocket flask. 
 
“What would you do with a time machine?” they ask, and you think, “Gee, I’d go back to my childhood and warn myself not to blow my entire allowance on Jawbreakers and Spider-Man comics. Then I’d take those savings, invest in Google stock, and spend the rest of my life on a yacht, sipping Sea Breezes from the navels of Florida State volleyball players.”
 
I don’t know whether this is unique to my own experience, or if it’s a more universal thing, but the people I’ve jawed with about this tend to think in terms of visiting the past rather than the future. Maybe they’re afraid to catch a glimpse of themselves in old age, or fear that the coming decades will see society dissolve into a dystopian hellscape, in which Mad Max-style renegades trade arms to terrorist groups in exchange for free bowling passes. Could be that there’s comfort in already knowing what the past has in store. 
 
It’s a pity, because time travel to the future is theoretically possible.
 
Over the past century, physicists have discovered that time is intimately linked with gravitational fields – you know, that old story. There’s a lot of math and technical mumbo jumbo associated with how this works, but the basic gist is that time flows differently in different parts of the cosmos; a year on Earth is different than a year on Mars, which is different than a year on Jupiter, etc. The stronger the gravity, the slower time flows relative to someone watching from farther away.
 
So if you want to send someone to the distant future, you start by rocketing their butt into outer space and having them orbit a celestial body with an insane amount of mass, like a black hole. (As New Jersey Governor Chris Christie can attest, gravity is stronger around massive bodies. Zing!) If our intrepid astronaut orbits the black hole for a couple of years – or what feels like a couple of years to him – he’ll return to find that, on Earth, many centuries have passed. In this way, by sacrificing a couple years of his life, a brave journeyman could potentially travel to the year 2514, where he’ll likely find that Jay Leno is in his eighth stint hosting “The Tonight Show.”
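For the three readers who want actual numbers, here’s my own back-of-the-envelope sketch – the textbook case of a clock hovering near a non-spinning black hole, not a serious mission plan. The whole slowdown comes from a single formula:

$$
\Delta t_{\text{Earth}} = \frac{\Delta t_{\text{traveler}}}{\sqrt{1 - r_s/r}}, \qquad r_s = \frac{2GM}{c^2},
$$

where $r_s$ is the black hole’s Schwarzschild radius and $r$ is how close our hero dares to hover. Stretching two traveler-years into five Earth-centuries requires $\sqrt{1 - r_s/r} = 2/500$, which works out to $r \approx 1.000016\,r_s$ – a hair’s breadth above the point of no return. (An actual orbit, rather than hovering on rockets, changes the formula somewhat, but the moral survives: the closer you park next to serious gravity, the faster everyone back home ages.)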
 
Travel to the past is trickier. It involves contradictions. Physics dorks like Brian Greene and Michio Kaku are fond of invoking what’s called “the grandfather paradox”: Let’s say you go back in time and kill your grandfather. Never mind why; maybe he stiffed you at Christmas. That means he never met your grandmother, which means your mother was never born, which means you never existed and couldn’t have gone back in time to kill your grandfather in the first place. If this is confusing, you can read about it in more detail in my upcoming memoir, “Musings of a Man Who Hasn’t Dated in a Long Time.”
 
What it boils down to is that time-traveling backward is, in all likelihood, impossible. That’s disappointing. It means I’ll never see the sprawling armies of Alexander the Great. Or watch John Hancock sign the Declaration of Independence. Or convince Francis Ford Coppola that two “Godfather” movies were quite enough, thank you.
 
But it’s still fun to think about. One of the neat things about the time travel question is that it teases our imaginations. It forces us to think of ourselves in terms of a broader historical context, and to wonder how things might have gone differently. If a butterfly flapping its wings in China triggers a ripple effect that produces a tornado in Oklahoma, just think what conscious human intervention in the past could accomplish. We travel to the late 1930s and stop Hitler from invading Poland, and back in the present day we find that Americans speak Swahili and commute to work on the backs of genetically modified pterodactyls. We stop Lincoln from being assassinated, and find that Lady Gaga is a strait-laced investment banker, and the Federal Reserve is overseen by a hard-drinking circus clown named Bubbles. World events are ripples on a pond, and what power we’d wield if we were the stone.
 
And now I pass the flask to you, dear reader, and ask you to consider the following: A crazy-eyed scientist shows up at your doorstep and hands you the keys to a flying DeLorean. What would you do?
 
Go ahead and mull it over a while. You’ve got all the time in the world.
 

Saturday, February 8, 2014

A faint impression

Charles Dickens loved to write about prostitutes. They were never the main subjects of his work, but the more you delve into his novels, the more you realize how consistently he features them as side characters, usually covered in chimney soot and dispensing the hard-boiled wisdom of the streets. One might expect that such denizens of the underworld would have fairly strong constitutions, able to pluck birdshot from a baron’s buttocks if push came to shove. But in “Oliver Twist,” when a prostitute receives some particularly shocking information, she faints on the spot. Just drops like a sack of dead haddock.
 
If even sex workers are prone to fainting, then what hope is there for humanity?
 
Well, if that’s the criterion, then a lot, as it turns out. Because nobody faints anymore. Not Southern belles, not children, not people who work with large quantities of meat. Nobody. At some point, it just stopped being a thing.
 
Kind of. I mean, if we’re being totally honest, then it does still happen from time to time, mostly among the elderly and people with nervous system disorders. According to WebMD, fainting accounts for about three percent of all emergency room visits. The other 97 percent were hit by projectiles thrown by Justin Bieber.
 
But that’s not the kind of fainting I’m talking about. I’m talking about the flushed swooning of London gentry, the breathless, “Gone With the Wind” kind of fainting that befalls a person struck by scandalous news. Over time, that particular affliction went the way of polio and scurvy, something experienced by the scowling Dust Bowl farmers of yellowed tintype photographs. Certainly nothing that would happen to someone reading tech blogs on their iPhone at a Chuck E. Cheese.
 
And this wasn’t a condition unique to movie damsels and Dickensian prostitutes. Nosiree. There’s an abundance of documented cases of this kind of thing, some of them found in the Journal Tribune’s very own archives.
 
In a couple of rambling run-on sentences published in the Biddeford Daily Journal in January 1914, a reporter – who I’d like to think was wearing a fedora – recounts two unusual occurrences at the close of hearings in a supreme court trial in Saco. The first was the burning of a hotel owned by the plaintiff. 
 
“(The) second,” writes the reporter, “was the fainting of the defendant while on the witness stand, Mr. Emerson becoming practically unconscious, his sudden illness necessitating his removal from the witness stand after his revival from the stupor into which the excitement had thrown him.”
 
A stupor! Holy cow! You know, I’ve been to quite a few court proceedings over the years. Sex crimes, murders, manslaughter. Not once have I been witness to an honest-to-goodness stupor. Fingers crossed, I guess.
 
So there it is, hard-and-fast proof that fainting spells were a real-life occurrence, not solely the stuff of melodramatic, bodice-ripping dime novels. But this raises a question that, by now, should be fairly obvious: Why are these types of fainting spells no longer as common? What happened?
 
I’ve read a fair amount about fainting over the past few days, which brings me close to having conducted actual “research,” a genuinely adult activity that makes me feel ashamed for decorating my windowsills with Ninja Turtles figurines. It would have been nice, during the course of this reading, to have discovered a satisfactory answer for why fainting now exists almost solely as an adjunct to some greater affliction. Such an answer, though, doesn’t seem to exist. The disappearance of Victorian-aristocrat-style fainting spells has slipped by relatively unnoticed.
 
Which means I can attempt to explain it myself. Cue knuckle-cracking noises.
 
Obviously part of it has to do with an increased awareness of how to maintain one’s general health; let’s face it, fainting is pretty lame, and with better access to healthful foods and medical care, nobody collapses anymore unless something serious and non-lame is happening, like a heart attack, or a roundhouse kick to the head from Chuck Norris. But while knowledge is power, I don’t think that’s the whole story.
 
My guess is that fainting’s diminished role owes largely to desensitization. You can’t shock someone into unconsciousness if nothing shocks them. In the days of corsets and frilly teacups, when children were taught handwriting by monocled old men at their family’s estate in Pigbuttshire, a normal life was sequestered from scandal and intrigue. So when something even mildly dramatic and unexpected happened, like a handsome man splitting the seam of his pants dismounting a horse, then sure, let the swooning begin.
 
Now, nobody’s sequestered from anything. At six years old, kids are watching YouTube videos of would-be stunt bicyclists getting sideswiped by minivans. People play video games in which the object is to pry a shotgun from a dead farmer’s hands so they can mow down an army of zombies. At this point, there’s nothing a person can see or hear that would cause them to spontaneously faint, because they’ve seen and heard it all already. 
 
Technology and access galvanize us against shock, but the tradeoff is that we lose our innocence before we’re old enough to remember what it’s like to have it in the first place. There’s something a little sad about that.

Dickens had a vivid imagination. But even he couldn’t have predicted a world in which a child would have a stronger constitution than a London prostitute.