Five years ago, Hollywood’s henchmen released the third film in the
Spider-Man trilogy, wrapping up Peter Parker’s whimsical tale and
ensuring that there would never be another Spider-Man movie, ever.
Until they made another one.
With new actors, a new script, a new director, and probably a new pastry
cart, the vague entity that is Hollywood decided the Spider-Man
franchise needed what’s called a “reboot.” That’s when a character or
movie series is given a re-imagining, a fresh take with a fresh new
approach and, yes, fresh pastries. (Never write a column before lunch.)
The movie industry has done this before with a superhero franchise, most
notably with Batman. The original series, which started out marginally
less than embarrassing, devolved into a hammy and pathetic costume party
populated by hollow-eyed actors grimly collecting paychecks. The new
series is a dramatic improvement, and unlike the Spider-Man reboot, has
the benefit of occurring more than ten minutes after the release of the
last.
Even so, you have to wonder when it’ll all end.
Not just the incessant
rebooting, which is getting annoying in its own right, but the superhero
craze in general. I wouldn’t say I’m necessarily looking forward to the
demise of the superhero film as a genre – I’ve seen more than a few,
and I generally enjoy them – but realistically, they can’t go on
forever. There are only so many times you can tell Spider-Man’s origin
story before audiences lose interest, and once you’ve exhausted the
storytelling possibilities for the more popular characters (Iron Man,
the X-Men, et al.), then which comic book heroes do you turn to for
cinematic inspiration? The Red Bee? Squirrel Girl? Who would enjoy those
movies? I don’t even enjoy admitting those characters are real.
Eventually, the superhero movie will go the way of that now-dormant genre from yesteryear: The western.
Westerns
used to be all over the place, and for a simple reason: They were cheap
to make. Grab a handful of character actors, give them giant hats and
horses, and stage some shootouts on an inexpensive ghost-town lot in the
California desert, and voila, you’ve got yourself a western. You can’t
flip the channel to AMC these days without catching a glimpse of these
remnants of cinema’s past, complete with scraggly beards and
tobacco-drenched spittoons. When I was growing up, there was rarely a
Saturday afternoon when I didn’t catch my father engrossed in one of these
relics, dutifully following the exploits of Sonny the Hardscrabble
Cattle Wrangler, or whoever was the gunslinger of the day. Forty years
from now, I imagine film buffs of the current generation will be found
similarly rapt, only instead of rooting for Clint Eastwood, they’ll be
cheering on a sexually ambiguous vigilante who wears his underwear on
the outside of his pants.
We have modern special effects to thank for the deluge of protagonists
who look like they’re made of candy. Without computer-generated effects,
there would have been no plausible way to do, say, Iron Man; Robert
Downey, Jr. would basically have been playing an aerial version of
RoboCop, and the rudimentary green-screen technology used for the flying
sequences would have prompted theater owners to hand out a barf bag
along with each ticket. Actually, considering the quality of the recent
Green Lantern movie, that may not be a bad idea anyway.
The problem is that the newfound ability to depict these superheroes
accurately is leaving the market saturated. And I say this as a huge fan
of some of these characters.
Yeah, I was a comic book kid. With
thick glasses and a body mass index that would have made Jonah Hill bow
and call me god, I pretty much had to be. My pre-teen years were
littered with dog-eared copies of Batman and Wolverine, and what
appealed to me then still appeals to me now: The adolescent, flamboyant
fantasies embodied in the grim scowls of dudes with masks and attitudes.
This stuff still tickles the ten-year-old boy in me, and I’m not
ashamed to admit it.
But, as with anything that tickles, you can’t breathe if it doesn’t
stop. I don’t want to see these movies come to an end; I just wish
Hollywood would pace itself. Too much of a good thing leads to wistful
nostalgia marathons on AMC. Just ask Clint Eastwood.
Friday, November 30, 2012
Saturday, November 17, 2012
World wide wed
A little over a week ago, voters of all affiliations breathed a sigh of
relief – even if their candidates lost, or referendum issues didn’t go
their way. The relief was borne of a desire to see an end to the
political bickering and bitterness that was a hallmark of the 2012
campaign, in which insults and accusations were flung more prolifically
than those spouted by professional wrestlers and Celebrity Deathmatch
contestants. The negative ads have ceased, the debate showdowns are
over, and the country is in recovery mode, catching its breath after a
headlong sprint toward closure and finality.
So you’d think the time for political analysis is over. Which is why I’m hesitant to talk about Question 1, the referendum in which Maine voters decided to allow gay and lesbian couples to marry. You’re sick of hearing about it, and I don’t blame you. I certainly don’t want to be accused of shooting after the buzzer, and besides, with the election season properly buried, resurrecting its corpse feels a bit like reanimating Frankenstein’s monster, only without the drooling and electroshock.
Indulge me, if you would.
For a news guy, Election Night is like the Olympics, minus the chlorine and speedos. I spent much of it with my attention split between news broadcasts and the Internet, monitoring progress as results trickled in. One of the web sites I tracked was Facebook – which is an interesting forum for debate in that it’s free of punditry and half-baked analysis from tired, over-caffeinated broadcasters in wrinkled suits. My feed was awash in opinions from friends and family, and their updates were an insight into the demographics comprising my little online circle – dominated, it seems, by moderate voters with an enthusiasm for the process, if not necessarily the results.
One woman shared a story that I feel bears repeating.
This woman – we’ll call her “Stacy” – was asked by her four-year-old daughter if she could accompany her mother to the polls on Election Day. Stacy agreed, and so brought the young one to her local polling place to give the toddler a peek into the voting process. Stacy told her daughter that, aside from voting for the nation’s president, she would also be voicing her opinion on a critical issue: Whether same-sex couples should be allowed to marry.
“What do you think?” she asked her daughter. “Should men be able to marry other men, and women marry other women?” The daughter asked if that meant a couple they knew would finally be able to tie the knot; Stacy said that, yes, it would. The child looked up at her mother with a delighted giggle and said, “Oh, I really hope other people pick ‘Yes!’”
“I have often thought how important it is to teach tolerance to my children,” said Stacy in her post, “but as you can see from this simple anecdote, children are intrinsically tolerant. They only learn to think otherwise from the role models in their lives. It did feel great, though, to nurture that inherent tolerance with which my beautiful daughter was born.”
That made me smile. And it allowed me to envision a future in which the next generation sees same-sex marriage as an immutable right, as unchangeable as the right of women to vote, blacks to marry whites, and speech to be free.
Opponents of same-sex marriage have repeatedly argued that such unions would somehow impinge on their own marriages, effectively devaluing them like a defunct currency. But here’s what happened to their marriages on the day after the election: Nothing. They woke up, kissed their spouses, ate buttery toast in their breakfast nooks, and plotted their days. Business as usual.
A marriage is a personal relationship. That is where its value lies. It is a bond that exists independent of the marriages of others, of widespread divorce, of politics and punditry. And now, in Maine, it exists independent of sexual orientation. When the Declaration of Independence established a person’s right to life, liberty, and the pursuit of happiness, it did so without an asterisk. It did so without discrimination.
Stacy’s daughter has not read the country’s founding documents, and would not be able to articulate that sentiment. But children often know what is fair, and they know it with purity of heart. It is adulthood that sours that purity and turns it against itself, masquerading as maturity and wisdom.
That child deserves a country that is tolerant and free. Same-sex marriage isn’t the skeleton key that opens the door fully on that reality. But it’s a start.
Monday, November 12, 2012
Yule be sorry
Already, it feels like the home stretch.
It shouldn’t. There are two months left before the end of the year, and so it seems premature to be setting sights on Christmas and beyond; it’s a bit like looking forward to St. Patrick’s Day at the beginning of January. Of course, depending on your predilection for Irish whiskey and pummeling hangovers, that may be a moot analogy.
It’s hard to tell which came first: Advertisers’ early promotion of holiday sales, or our own early excitement over the holidays themselves. Did advertisers sense our eagerness, or did they cause it? Either way, the colorful hubbub that closes out each year feels like a rock band pummeling their instruments during an anthem’s violent finale, only the finale lasts for two months and leaves you broke and bloated on sugar cookies.
Ultimately, whether it’s due to the whims of ad executives or our own anticipation, advertising has extended the season into a marathon. This year, I saw my first holiday commercial, a pitch for a department store, days before trick-or-treaters started their neighborhood skulking. Hearing sleigh bells in the middle of a zombie movie is a disorienting experience, and borderline uncomfortable. Christmas ads aired before Thanksgiving are premature; Christmas ads aired before Halloween are an abomination, ranking up there on the Offend-O-Meter next to public flatulence and Joe Biden’s hair.
Aired during the right time of year, holiday ads can provide some memorable moments, such as Santa Claus riding through the snow on a giant Norelco shaver, or polar bears finding comfort in an ice-cold bottle of Coke. Even though both of those ad campaigns have been off the air for years, people of certain generations remember them fondly, and speak about them each Christmas with a touch of nostalgia, as if the ads were friends who had moved away and no longer call. In a perfect world, these would be the only ads aired during the entire month of December, with exceptions granted to the embarrassing holiday-themed efforts of local car dealerships.
The problem with these ads, though, is timing, not content. If advertising and television executives adhered to a strict rule of not allowing holiday-themed spots to air before Thanksgiving, we would feel a lot less saturated with visions of sugarplums. Instead of rolling our eyes at the 847th airing of a Macy’s spot in which the same smiling girl wears the same sparkling coat that’s an astounding 40 percent off, we’d get an appropriate dose and move on to the next phase of our lives: Figuring out where and with whom we’ll get hammered on New Year’s.
There’s plenty to be said about the content, sure: The gross over-commercialization, the abuse of Santa’s reputation as a pitchman, the overall creepiness of elves. (Am I the only one who feels this way? I feel like hell would be making rocking horses on an assembly line sandwiched between two vacant-eyed elves. Maybe that’s just me.)
But we’re savvy. We’ve got our guard up for a commercialized holiday. We know that, for weeks leading up to Christmas, we’ll be assaulted with a panoply of jolly snowmen, glittery wrapping paper, and cherubic carolers singing about bargains.
What our guard isn’t equipped to handle is a holiday ad season that starts before Halloween. It offends sensibilities established through years of yuletide routines and rhythms. And it makes the holidays more exhausting than they already are. As it is, the day after Christmas feels like the first sweaty, air-gulping moments at the end of a three-legged sack race.
Plus, there’s the kicker: It makes time in general go by way too quickly. Five minutes ago it was Christmas 2011. Then I turned around to pick stray tinsel off my butt and boom, I was writing this rant.
And so I humbly fall to my knees, raise my hands to the sky, and beg advertisers not to get too eager this holiday season. I know you can’t wait to inform the public about killer deals on useless gadgets, but at least wait until the turkey gravy has had a chance to clog our veins.
This is that weird limbo before the storm hits us. Without the intrusion of squirm-inducing elves, we should at least have the chance to enjoy it.
Saturday, November 10, 2012
The great vacation dilemma
My last vacation almost killed me.
Nothing dramatic happened; I didn’t go skydiving with a bum parachute, or get attacked by zebras on an African safari, or run with scissors through downtown Lewiston. I did, however, experience more than two consecutive days off, and in many lines of work, that’s akin to a rapidly ascending deep-sea diver getting the bends: You get used to a certain level of pressure, and when it’s released, your body doesn’t know how to handle it.
In the modern civilized era, vacations have become as important to human survival as water and back massages. Imagine a world without them – vacations, not back massages. Imagine going to your job on Monday, plowing through your work week, feeling euphoric as punch-out time on Friday draws near, and squeezing in all your living, your memories, your trips to Cancun, in the meager two days usually reserved for laziness and football. You’d spend a lot of your time hyperventilating, and Aunt Maude would never get to see her grandkids.
It’s a pretty common complaint among Americans that vacation time is not plentiful enough. That’s not a complaint you hear in most other civilized countries.
In France, workers are guaranteed 37 days off per year. That’s a guarantee because it’s the law. Give an employee a scant 36 days and you are beaten about the face with baguettes and wheels of bitter cheese.
In Germany, you get 35 days. In Brazil, 34. Even our neighbors to the north, those proud Canadians, get 26 days, which they need, because they commute to work in temperatures colder than the vacuum of space. That’s gotta take it out of you.
We Americans? We average 13 days.
Thirteen days to travel out to Oak Ridge, Tenn., to take a tour of their X-10 nuclear reactor. Thirteen days to check out the Superman Museum in Metropolis, Ill. Want to see the world’s largest ball of twine in Branson, Mo.? Better equip your car with some turbo and hope you don’t get caught.
In all fairness to American companies (and the lawmakers who regulate them), we’re not capped at 13 days. There are opportunities to earn more, although the requirements for doing so are often pretty steep – like having to work fifteen years straight without a sick day while constantly petting the boss’ cat. That might earn you an extra half-day and a gift certificate to a pizza joint cited for code violations.
But imagine the feeling of starting a new job and knowing you won’t have to go through years of initiation to bank the necessary time to travel to Selkirk, Manitoba – home of a giant statue of Chuck the Channel Cat.
Now I don’t know about you, but the longer I go without a day off – and especially some extended time, even a long weekend – the more of a struggle it is to perform a job up to my own standards. A good analogy is physical exercise: When burning off cheesecake on a treadmill, you’re always running faster at the beginning of the workout than at the end. That’s because your workout muscles are rested and ready to roll. You attack the treadmill like Garfield attacks Odie.
Likewise, rested work muscles let a person attack their job harder, faster, and free of bizarre treadmill analogies. It can be argued that the work done fresh off a vacation is more efficient, and performed with a greater attention to detail.
And it doesn’t really matter what line of work you’re in. You could be completely in love with your job and still benefit from some prolonged hammock time. If your job is to lie on a bed of marshmallows and fig leaves and let fashion models rub your feet and fan you with palm fronds, you could still use a vacation – although at that point it might be hard to find a leisure activity that provides an adequate counterpoint. Maybe a trip to South Dakota to see the Mitchell Corn Palace.
Point being, Americans are in desperate need of more vacation time. So lawmakers, take note. Just look at some examples from around the world – Italy, for example, which tops the time-off list with a whopping 42 vacation days. Forty-two days! And have you ever seen an angry Italian?
Wait. Don’t answer that.
Wednesday, October 31, 2012
Tricks, Treats, and Tribulations
I remember dressing as Hulk Hogan for Halloween when I was about seven –
back when you could do that and still be cool. The centerpiece of the
costume was a gigantic rubber mask that smelled like freshly-washed
linoleum, with eyepieces roughly the size of atoms. At the time, I wore
thick, Coke-bottle glasses that fogged up easily, and so my
trick-or-treating escapade that night consisted of stumbling around in a
thick mist and wandering into telephone poles. The real Hulk Hogan
wouldn’t be doing that for another 20 years.
That may not sound like a fun adventure, but I was ecstatic. There’s something about donning ridiculous garb and going on a candy-hunt that transcends discomfort and near-blindness, and it’s why trick-or-treating has endured as a tradition. Without it, the lead-up to Thanksgiving would lack pizazz. If Thanksgiving is a boring sports game, Halloween is the gaudy fireworks display that precedes it.
Problem is, trick-or-treating is a more sanitized event than it used to be. When I was growing up – a timeframe which now seems like it took place in the Mesozoic Era – there was a sweet spot during which my friends and I were still young enough to enjoy going door to door for candy, but old enough to roam the safer suburban areas without our parents. This was an unprecedented level of independence that lasted for two, maybe three years, and it was strangely exhilarating. It’s what I imagine going on a spacewalk would feel like, with the exception that candy pretty much contributes to the opposite of weightlessness.
One Halloween in particular, during that sweet spot, I went trick-or-treating with a couple of friends from school. We were approaching our teen years, and though it remained unspoken, we all realized that time was running short: If we pushed it much further, we’d start coming home with bags full of electric shavers and acne cream instead of Smarties and Milky Ways. So it was with a sense of urgency that we embarked, sans parents, on a mission to collect as much loot from the neighborhood as we could.
If we had been chaperoned, the following would never have happened: While giggling over our growing stash and walking to the next house, a couple of older kids – kids who felt no compunction about asking for treats in crackling, pubescent voices – burst from behind a cover of shrubbery and shrieked, “Give us your candy!” I locked eyes with my friend (well, eye; he was a pirate) and without saying a word, we sped off into the night, ducking under fences and leaping over hedges, our bags of goodies wagging behind us like parachutes. Sometime later, when it was clear we had shaken our pursuers, we sat and rested with our backs to a parked car and caught our breath. After a moment, we looked at each other, smirked, and then burst out laughing.
Had our parents accompanied us, the masked candy thieves would never have attempted their nefarious plot. And if they had, somehow, been that stupid, our parents would have stopped them cold with their parent-ness. They would have been right to do it; it’s a parent’s job to protect their children, even from hormonal werewolves.
But I’m glad the folks weren’t around. First of all, the shrub-boys didn’t pose much of a threat; my friend and I knew that, even if they caught us, we could simply scream for help. It was a suburban neighborhood filled with trick-or-treaters, not an African jungle.
Mostly, though, I recall the night fondly. Outrunning dim-witted, would-be criminals and then laughing about it is one of those memories that, for me, adds to the luster of the holiday. Nowadays, kids tend to miss out on that kind of adventure: The kind that feels sorta scary and larger than life, but is tempered by the innate knowledge that the palpable concerns of the real world are buffered by what remains of childhood.
In a more dangerous and uncertain world, many kids won’t experience that. To their detriment.
Of course, despite all that, trick-or-treating is still a blast. If it’s become too monitored and strict, at least there’s the candy, and the crisp autumn air, and the unabashed escapism. Just one small parting piece of advice: Don’t go as Hulk Hogan. That’s so 1988.
Wednesday, October 24, 2012
Profane in the membrane
If you were walking through your neighborhood and heard someone let loose with a loud profanity, you’d bristle, right? Flinch? Maybe do a little scowling and muttering?
Sure you would. Nobody likes to hear that. And unless you’ve spent a lot of time in an army barracks – in a bunk between Andrew Dice Clay and Triumph the Insult Comic Dog – you’re probably not used to it. Despite the gradual, generations-long decay of manners, and the relegation of “etiquette” to a vague word belonging to antiquity, we still expect to walk down the street without hearing loudly bellowed cuss words. At least most of us do. If you live in a neighborhood where such things are common, you may want to consider moving, or maybe pitching a tent in the woods and living on squirrel meat.
Like it or not, though, people can say what they want. It’s that whole First Amendment thing. If the shouted profanities were sustained, and majorly disruptive to a large portion of the community, then the foul-mouthed culprit would be subject to arrest under various public disturbance laws. But the content of what people say is protected constitutionally. I’ve never regarded that document as flawless – why would a perfect document have to be amended so many times? – but the Founding Fathers definitely got that one right. Between that and the powdered wigs, they were clearly geniuses.
Which is what makes it all the more curious that a town in Massachusetts has approved an article allowing police to enforce a 1968 bylaw barring “profane” language in public.
In Middleborough, a town of about 23,000, saying the wrong thing in front of the wrong person could result in a $20 fine – quite the price to pay for stepping in dog droppings and calling it what it really is. It was at a town meeting in June, according to the Associated Press, that residents voted 183-50 to start enforcing the 40-plus-year-old bylaw, which was enacted in an era when “profanity” meant calling someone a “total bummer.”
And that’s part of what makes the vote so ridiculous: Who decides what’s profane? Sure, there are certain words we can all agree on. The F-Word, which in its versatility can refer to anything from joking around to fornication, is clearly a profanity. You wouldn’t want someone shouting that incessantly while standing in the town square – hard to pass that off as performance art.
Other words and phrases are harder to categorize, partly because one generation’s profanity is the next one’s casual endearment. Calling someone a son-of-a-bitch in 1950 would be shocking, a true insult to one’s beloved mother. Nowadays, half of its utterances are intended as friendly nicknames for good buddies: “Hey, you son-of-a-bitch, I haven’t seen you since that time you stepped in dog shit!”
Location can also be a big factor in how words are taken. And I’m not just talking about the misunderstandings and mistranslations that can occur between disparate languages and cultures. Look at the United States and Britain. Say the word “bloody” in the U.S., and you can pretty well narrow down the possible contexts: “The crime scene was unnervingly bloody.” “The boxer’s face was bloody after the bout.” “Hand me a tissue, I’ve got a bloody nose!”
Say “bloody” in Britain and you may, depending on the situation, be saying something that would make the Queen’s toes curl. And it’s a two-way street. The British slang word for “cigarette” would not go over well at the Gay Pride Parade.
So the good folks of Middleborough may well find that subjectivity is an issue in enforcing this quaint little profanity law. Fortunately, the office of Attorney General Martha Coakley issued a statement urging repeal of the bylaw, and suggested it not be enforced in the meantime, citing the constitutional right to free speech.
Hopefully, residents come around. But if they don’t, I may have to avoid Middleborough altogether the next time I’m in Massachusetts. With all the fines my mouth would earn me, I’d go broke in 15 minutes.
Monday, October 8, 2012
Master debaters (cue rimshot)
The most disappointing thing about Wednesday night’s debate is that there wasn’t any bloodshed.
You’d think there would have been, given the hype. For weeks, the news media was apoplectic in declaring that the showdown would come in second to the Apocalypse in terms of earth-shattering, universe-destroying events. Obama and Romney were supposed to stare hard into each other’s eyes until there was a violent rip in the fabric of spacetime.
Then the fated hour came, and they just kinda talked about stuff.
On the one hand, we shouldn’t be surprised; we go through this every four years. Like the Olympics, we work our way to a level of frenzied anticipation for something that, without fail, makes fence-painting feel as edgy and extreme as skydiving. Debates are where the unpredictable, gaffe-laden moments on the campaign trail die a pathetic, and highly controlled, death.
But there’s always the hope that something might happen to justify the bated breath and white knuckles – like a candidate being interrupted by an alien invasion, or suffering a stress-related nervous breakdown that makes them shed their clothes and dance the Macarena.
That’s about what it would take for the debates to live up to their billing – that, or an abandonment of the traditional formula in favor of a bare-knuckle brawl in the octagon.
“Handlers” have become a big part of this stale predictability. You’ve heard about the handlers. It’s a word that has become a part of the political lexicon, like “patriot,” or “poopyface.” The handlers are the ones who make sure their candidates stick to the talking points, stay on message, stand up straight, don’t snap their gum, etc. They take the wild fruit of the candidates’ positions, extract the essential ingredients, refine them into something digestible, and sell them to the American public in shiny, plastic packages.
And they’ve become so pervasive in American politics that it’s hard to tell whether it’s the candidates talking or their coterie of advisers. These days, it’s all about playing it safe. In a sense, they can’t really be blamed for that. It’s the YouTube era, after all, in which every little comment, every misstatement and mistake, is recorded and dissected to the point where the words themselves lose all meaning. As Mr. Romney can attest, the wrong comment in front of the wrong technology is a hangman’s noose.
All of this results in debates that are as scripted as a high school play. The reason we still hope for fireworks is because of debates past, before the handlers started spritzing their candidates with sanitizer. Maybe the most famous example of such fireworks came in the 1988 vice presidential debate between Lloyd Bentsen and Dan Quayle. Quayle, who compared his level of experience in the U.S. Senate to that of John F. Kennedy when Kennedy took office, was sideswiped by one of the most legendary political smackdowns of all time: “Senator,” said Bentsen, “I served with Jack Kennedy. I knew Jack Kennedy. Jack Kennedy was a friend of mine. Senator, you’re no Jack Kennedy.”
Boom! Eat it, Quayle!
As far as staying in the public consciousness, that moment ranks alongside Mike Tyson biting off part of Evander Holyfield’s ear. We’ve been waiting for another one like it for 24 years.
If the political system remains in its current state, of course, that simply won’t happen. The handlers will pose their mannequins, and it’ll be up to us to analyze the fingerprints. Pundits will take to the airwaves to painstakingly pick apart a sea of generic pandering and dubious claims; a panel of “experts” will try to tell us all about body language and eye contact. And we, the general public, will grow weary and flip the channel over to professional wrestling for some intellectual stimulation.
Which segues nicely into a truly American solution: Since so few of us listen to the content of the debates as it is, replace the format entirely with an American Gladiators-style slugfest. The winner will be the one who looks “more presidential” while beating his opponent senseless with a foam javelin.
It would be sensationalist and contribute nothing to the public discourse. But at least there’d be some bloodshed.
Friday, October 5, 2012
iRate
My first cell phone looked like the kind of device that movie villains use to detonate bombs.
It had an LCD display slightly more sophisticated than a calculator. It was square. It was bulky. It had a flimsy antenna with a plastic nub at the tip that waggled in the wind, and the only “entertainment” contained within was a game called Snake, which involved manipulating a line of pixels – the snake – around the screen to try to gobble up a dot. It made Pac-Man look like virtual reality.
And those were the good old days.
Perhaps the smartphone-using technophiles among you are scratching your heads. How, you may ask, could those have possibly been the good old days when we live in an age of phones that can stream videos, identify music on the radio, purchase Amazon bric-a-brac, and set the heat levels on your waffle iron?
There’s no doubting that smartphones are capable of doing a heck of a lot. Some of the tasks they can perform occasionally pass as useful. There’s an iPhone app, for example, that uses the gadget’s electronic eye to read text on real-world signs and posters and translate it into any language of our choosing. Pretty nifty. If you’re an American in Japan, for example, and you feel nature calling, you can use the app to identify the men’s and women’s restrooms, thus saving you the embarrassment of having a roomful of pantless people screaming at you in horror.
The problem with smartphones is easily demonstrated by a depressing and meandering anecdote. Every few weeks, a friend of mine has a game night in his home. The gang shows up, convenes around the kitchen table, and passes the evening with Cheese Doodles and board games – Settlers of Catan, Trivial Pursuit, and a bunch of other games that virtually guarantee none of us will ever experience the LA nightclub scene. This, if you’re as nerdy as we are, is a pleasant way to spend the evening with friends.
A month or two ago, however, the games remained in their boxes. The gang showed up, convened around the kitchen table, and pretty much just sat there, unable to decide on that night’s game. This would have been fine if the evening had been passed in pleasant conversation, but instead, everybody whipped out their phones, and it didn’t take long for a listless hush to fall over the room. Picture it: Seven or eight people seated around a table, silent, faces aglow with the light from their little screens.
Except for me. Because I refuse to buy a smartphone, the only thing I had to stare at was my bowl of Cheese Doodles.
Enter the smartphone era, in which real human contact – and peaceful rest for our overworked gray matter – plays second fiddle to streaming YouTube videos and instant recipes for organic gourmet tofu cupcakes. Such devices are a boon in a theoretical utopia where everyone has self control, and can resist the allure of playing Angry Birds while waiting in line for the Tilt-A-Whirl.
Unfortunately, most of us live in the real world, and in the real world, such distractions can cause problems.
Loren Frank, assistant professor of physiology at the University of California, San Francisco, was recently quoted in the New York Times as saying that downtime allows the brain to process experiences it’s had, helping to facilitate the learning process. And according to Times reporter Matt Richtel, researchers have found that exposing the developing minds of children and teens to these devices results in brains that are less able to sustain attention. Which may help explain why it takes repeated urging to get Little Timmy to mop up the Pepsi spill on the kitchen linoleum.
This portends unequivocal suckiness for the future. I envision a world of 15-second attention spans, glowing faces in front of glowing screens, and streets filled with flaming oil drums. (Or maybe that’s Terminator.)
Kids and teens might have the developing brains excuse. Adults have no such out. When they focus on their phones and nothing else, it’s just plain rude.
Friday, September 21, 2012
The cars behind the bus go wait, wait, wait
Driving behind a school bus is like trailing a really slow guy on the sidewalk who doesn’t realize you’re there and won’t let you pass him. The difference is that, in the latter situation, I have no compunction about dashing quickly into the street and bounding ahead of said slow man. Pulling a similar stunt in the bus scenario would not only put children needlessly at risk, but forever shame me with the label of Jerk Who Passed School Bus.
But man, is it tempting.
It would be one thing if the bus never stopped, like the bomb-rigged bus from “Speed” that would explode if it went too slow. Buses can gather some good velocity, but only with a head start of roughly, say, New Jersey. Any distance shorter than that, and a kid on a bike could outpace one and still have enough wind at the finish line for a game of hopscotch, or cyber Pokemon Hunger Games football, or whatever they play nowadays.
Of course, a school bus can never inch its way up to an acceptable speed, because bus routes now require that it make a stop about once every block and a half. When I pull onto a street on a school day afternoon and see that big square yellow butt staring at me, I know exactly what to expect: Stop. Let two children out. Inch forward at five miles per hour. Stop two feet down the road. Let three more children out. Repeat. I could do my taxes and choreograph the Ice Capades in the time it takes a bus to go down May Street.
Now there’s nothing that makes a person feel old like starting a sentence with the phrase, “Back in my day.” It implies something curmudgeonly, a cantankerous nostalgia for the way things used to be. But you know what? I’m a curmudgeon, so screw it: Back in my day, the bus never stopped that often. It didn’t have to. Our stops were spaced farther apart, and if one of us was unfortunate enough to live a quarter mile from one, we sucked it up and walked there.
When I was a wee lass taking the bus to middle school, I lucked out: The stop was right at the end of my street. It took about four minutes to walk there, sometimes a little more in icy weather. My friend Kevin, who met us there every morning, had his own nearby stop, but elected to walk to ours instead; as a close-knit group of friends, we felt no morning was complete without the full gang present to trade cards and share stories about girls and boogers. Kevin walked almost a full mile for this daily rite. He trudged up and over a hill so steep and massive, its legend earned it a name: Applesass Hill. And yes, he walked up Applesass Hill in the freezing cold and snow. The clichés are true.
Contrast that with today, when a child has to trek no farther than the neighbor’s bronze statue of a peeing angel. Now admittedly, I’m not a parent. Perhaps my feelings would be different if I were sending my own nine-year-old out into the freezing cold to wait for a ride on a rickety bus with no seat belts. But it seems that, with the proper guidance on how to be safe, letting a child walk even a ninety-second journey would be character-building. And that’s not to mention the exercise factor: In a country where childhood obesity is considered an epidemic, encouraging a kid to put one foot in front of the other hardly seems like the worst thing that could be done.
If the trend continues, then changes will have to be made to that age-old ditty about the wheels on the bus. “The wheels on the bus go ‘round and ‘round,” according to that childhood staple, but that’s no longer accurate. The wheels on the bus go ‘round. Then they stop while the rest of us wait.
But man, is it tempting.
It would be one thing if the bus never stopped, like the bomb-rigged bus from “Speed” that would explode if it went too slow. Buses can gather some good velocity, but only with a head start of roughly, say, New Jersey. Any distance shorter than that, and a kid on a bike could outpace one and still have enough wind at the finish line for a game of hopscotch, or cyber Pokemon Hunger Games football, or whatever they play nowadays.
Of course, a school bus can never inch its way up to an acceptable speed, because bus routes now require that it make a stop about once every block and a half. When I pull onto a street on a school day afternoon and see that big square yellow butt staring at me, I know exactly what to expect: Stop. Let two children out. Inch forward at five miles per hour. Stop two feet down the road. Let three more children out. Repeat. I could do my taxes and choreograph the Ice Capades in the time it takes a bus to go down May Street.
Now there’s nothing that makes a person feel old like starting a sentence with the phrase, “Back in my day.” It implies something curmudgeonly, a cantankerous nostalgia for the way things used to be. But you know what? I’m a curmudgeon, so screw it: Back in my day, the bus never stopped that often. It didn’t have to. Our stops were spaced farther apart, and if one of us was unfortunate enough to live a quarter mile from one, we sucked it up and walked there.
When I was a wee lass taking the bus to middle school, I lucked out: The stop was right at the end of my street. It took about four minutes to walk there, sometimes a little more in icy weather. My friend Kevin, who met us there every morning, had his own nearby stop, but elected to walk to ours instead; we were a close-knit group of friends, and it was hardly a complete morning without the full gang present to trade cards and share stories about girls and boogers. Kevin walked almost a full mile for this daily rite. He trudged up and over a hill so steep and massive, its legend earned it a name: Applesass Hill. And yes, he walked up Applesass Hill in the freezing cold and snow. The clichés are true.
Contrast that with today, when a child has to trek no farther than the neighbor’s bronze statue of a peeing angel. Now admittedly, I’m not a parent. Perhaps my feelings would be different if I were sending my own nine-year-old out into the freezing cold to wait for a ride on a rickety bus with no seat belts. But it seems that, with the proper guidance on how to be safe, letting a child walk even a ninety-second journey would be character-building. And that’s not to mention the exercise factor: In a country where childhood obesity is considered an epidemic, encouraging a kid to put one foot in front of the other hardly seems like the worst thing that could be done.
If the trend continues, then changes will have to be made to that age-old ditty about the wheels on the bus. “The wheels on the bus go ’round and ’round,” according to that childhood staple, but that’s no longer accurate. The wheels on the bus go ’round. Then they stop while the rest of us wait.
Thursday, September 20, 2012
1-00-1-00-1
During a recent call to York County Superior Court, it took about three solid minutes to get ahold of a person – an actual, honest-to-goodness human being. Three minutes doesn’t sound like a long time, but for someone accustomed to hearing “Hello?” after three or four rings, it’s an eternity.
The wait was because of what’s called a “phone tree,” which I believe was invented by medieval torturers looking to extract murder confessions from bloodthirsty barbarians. You’ve dealt with phone trees before. If you’ve ever called a courthouse, school, library, or law office, you’ve heard that automated message: “Thank you for calling the Office of Whoever. For a staff directory, press ‘one.’ To spend the rest of your life on the phone and never speak to a breathing Homo sapiens again, press ‘two.’”
If left unchecked, phone trees will slowly spread and wipe out humanity like the killer machines in “The Matrix.” Or at least they’ll make all of our phone calls profoundly annoying. They’re increasingly unavoidable, and the menu options are getting increasingly long. Any longer and they’ll be voiced by James Earl Jones and sold in bookstores.
A perfect case in point is the phone tree for the Massabesic school system. I called a few months ago trying to get ahold of a staff member at the high school, but dialing their number no longer gets you the actual school. Instead, the number connects you to a central hub, from which you can be transferred to the high school, middle school, elementary school, or the central district office. Convenient if you’re a robot, irritating if you’re flesh-and-bone.
Why can’t I just speak to a receptionist and ask to be connected to someone? That’s usually what ends up happening anyway, because the options on the menu never correspond to the actual help you need. I don’t know the extension of the person I’m trying to reach, I don’t need to call my child in sick, and I don’t need to speak to somebody in food services, although, really, try a little harder on the mashed potatoes. I just want to talk to the chemistry teacher so I can ask him about beakers and stuff.
It’s hard to pinpoint exactly when phone trees started taking over the world. They are to phones what Ryan Gosling is to movies: You never really noticed them coming until they were already there.
They certainly weren’t as prevalent ten or twenty years ago. Back then you would call a place, and a bored or polite-sounding person would gently guide you in the right direction; it was bliss, because when you’re talking to an honest-to-goodness receptionist, who communicates in human language instead of binary code, you can make your intentions understood quickly and succinctly. A person doesn’t have a list of options that you have to sift through. They have minds, and those minds are capable of assessing what you need and helping you to get there.
The only argument I’ve heard in favor of the phone tree system is that it saves receptionists time and effort. That’s all well and good, but if that’s the goal, it seems like we should at least wait until technological advances have made this less of a pain in the rump for callers – maybe when all robots have the cognitive ability of that big black computer that kicked so much butt on “Jeopardy!”
Until then, my head will remain firmly in the clouds, envisioning a pipe-dream utopia where people answer phones and robots stick to doing robot things, like making coffee or opening cans of dog food. We’re a long way off from Arnold Schwarzenegger’s “Terminator.” In the time we have left before that eerie reality, let’s talk. You may not be James Earl Jones. But you’re better than the alternative.