I remember dressing as Hulk Hogan for Halloween when I was about seven –
back when you could do that and still be cool. The centerpiece of the
costume was a gigantic rubber mask that smelled like freshly washed
linoleum, with eyepieces roughly the size of atoms. At the time, I wore
thick, Coke-bottle glasses that fogged up easily, and so my
trick-or-treating escapade that night consisted of stumbling around in a
thick mist and wandering into telephone poles. The real Hulk Hogan
wouldn’t be doing that for another 20 years.
That may not sound like a fun adventure, but I was ecstatic. There’s
something about donning ridiculous garb and going on a candy-hunt that
transcends discomfort and near-blindness, and it’s why trick-or-treating
has endured as a tradition. Without it, the lead-up to Thanksgiving
would lack pizazz. If Thanksgiving is a boring sports game, Halloween is
the gaudy fireworks display that precedes it.
Problem is, trick-or-treating is a more sanitized event than it used to
be. When I was growing up – a timeframe which now seems like it took
place in the Mesozoic Era – there was a sweet spot during which my
friends and I were still young enough to enjoy going door to door for
candy, but old enough to roam the safer suburban areas without our
parents. This was an unprecedented level of independence that lasted for
two, maybe three years, and it was strangely exhilarating. It’s what I
imagine going on a spacewalk would feel like, except that candy pretty
much contributes to the opposite of weightlessness.
One Halloween in particular, during that sweet spot, I went
trick-or-treating with a couple of friends from school. We were
approaching our teen years, and though it remained unspoken, we all
realized that time was running short: If we pushed it much further, we’d
start coming home with bags full of electric shavers and acne cream
instead of Smarties and Milky Ways. So it was with a sense of urgency
that we embarked, sans parents, on a mission to collect as much loot
from the neighborhood as we could.
If we had been chaperoned, the following would never have happened:
While giggling over our growing stash and walking to the next house, a
couple of older kids – kids who felt no compunction about asking for
treats in crackling, pubescent voices – burst from behind a cover of
shrubbery and shrieked, “Give us your candy!” I locked eyes with my
friend (well, eye; he was a
pirate) and without saying a word, we sped off into the night, ducking
under fences and leaping over hedges, our bags of goodies wagging behind
us like parachutes. Sometime later, when it was clear we had shaken our
pursuers, we sat and rested with our backs to a parked car and caught
our breath. After a moment, we looked at each other, smirked, and then
burst out laughing.
Had our parents accompanied us, the masked candy thieves would never
have attempted their nefarious plot. And if they had, somehow, been that stupid, our
parents would have stopped them cold with their parent-ness. They
would have been right to do it; it’s a parent’s job to protect their
children, even from hormonal werewolves.
But I’m glad the folks weren’t around. First of all, the shrub-boys
didn’t pose much of a threat; my friend and I knew that, even if they
caught us, we could simply scream for help. It was a suburban
neighborhood filled with trick-or-treaters, not an African jungle.
Mostly, though, I recall the night fondly. Outrunning dim-witted,
would-be criminals and then laughing about it is one of those memories
that, for me, adds to the luster of the holiday. Nowadays, kids tend to
miss out on that kind of adventure: The kind that feels sorta scary and
larger than life, but is tempered by the innate knowledge that the
palpable concerns of the real world are buffered by what remains of
childhood.
In a more dangerous and uncertain world, many kids won’t experience that, to their detriment.
Of course, despite all that, trick-or-treating is still a blast. If it’s
become too monitored and strict, at least there’s the candy, and the
crisp autumn air, and the unabashed escapism. Just one small parting
piece of advice: Don’t go as Hulk Hogan. That’s so 1988.
Wednesday, October 24, 2012
Profane in the membrane
If you were walking through your neighborhood and heard someone let
loose with a loud profanity, you’d bristle, right? Flinch? Maybe do a
little scowling and muttering?
Sure you would. Nobody likes to hear that. And unless you’ve spent a lot of time in an army barracks – in a bunk between Andrew Dice Clay and Triumph the Insult Comic Dog – you’re probably not used to it. Despite the gradual, generations-long decay of manners, and the relegation of “etiquette” to a vague word belonging to antiquity, we still expect to walk down the street without hearing loudly bellowed cuss words. At least most of us do. If you live in a neighborhood where such things are common, you may want to consider moving, or maybe pitching a tent in the woods and living on squirrel meat.
Like it or not, though, people can say what they want. It’s that whole First Amendment thing. If the shouted profanities were sustained, and majorly disruptive to a large portion of the community, then the foul-mouthed culprit would be subject to arrest under various public disturbance laws. But the content of what people say is protected constitutionally. I’ve never regarded that document as flawless – why would a perfect document have to be amended so many times? – but the Founding Fathers definitely got that one right. Between that and the powdered wigs, they were clearly geniuses.
Which is what makes it all the more curious that a town in Massachusetts has approved an article allowing police to enforce a 1968 bylaw barring “profane” language in public.
In Middleborough, a town of about 23,000, saying the wrong thing in front of the wrong person could result in a $20 fine – quite the price to pay for stepping in dog droppings and calling it what it really is. It was at a town meeting in June, according to the Associated Press, that residents voted 183-50 to start enforcing the 40-plus-year-old bylaw, which was enacted in an era when “profanity” meant calling someone a “total bummer.”
And that’s part of what makes the vote so ridiculous: Who decides what’s profane? Sure, there are certain words we can all agree on. The F-Word, which in its versatility can refer to anything from joking around to fornication, is clearly a profanity. You wouldn’t want someone shouting that incessantly while standing in the town square – hard to pass that off as performance art.
Other words and phrases are harder to categorize, partly because one generation’s profanity is the next one’s casual endearment. Calling someone a son-of-a-bitch in 1950 would be shocking, a true insult to one’s beloved mother. Nowadays, half of its utterances are intended as friendly nicknames for good buddies: “Hey, you son-of-a-bitch, I haven’t seen you since that time you stepped in dog shit!”
Location can also play a big factor in how words are taken. And I’m not just talking about the misunderstandings and mistranslations that can occur between disparate languages and cultures. Look at the United States and Britain. Say the word “bloody” in the U.S., and you can pretty well narrow down the possible contexts: “The crime scene was unnervingly bloody.” “The boxer’s face was bloody after the bout.” “Hand me a tissue, I’ve got a bloody nose!”
Say “bloody” in Britain and you may, depending on the situation, be saying something that would make the Queen’s toes curl. And it’s a two-way street. The British slang word for “cigarette” would not go over well at the Gay Pride Parade.
So the good folks of Middleborough may well find that subjectivity is an issue in enforcing this quaint little profanity law. Fortunately, the office of Attorney General Martha Coakley issued a statement urging repeal of the bylaw, and suggested it not be enforced in the meantime, citing the constitutional right to free speech.
Hopefully, residents come around. But if they don’t, I may have to avoid Middleborough altogether the next time I’m in Massachusetts. With all the fines my mouth would earn me, I’d go broke in 15 minutes.
Monday, October 8, 2012
Master debaters (cue rimshot)
The most disappointing thing about Wednesday night’s debate is that there wasn’t any bloodshed.
You’d think there would have been, given the hype. For weeks, the news media was apoplectic in declaring that the showdown would come in second to the Apocalypse in terms of earth-shattering, universe-destroying events. Obama and Romney were supposed to stare hard into each other’s eyes until there was a violent rip in the fabric of spacetime.
Then the fated hour came, and they just kinda talked about stuff.
On the one hand, we shouldn’t be surprised; we go through this every four years. Like the Olympics, we work our way to a level of frenzied anticipation for something that, without fail, makes fence-painting feel as edgy and extreme as skydiving. Debates are where the unpredictable, gaffe-laden moments on the campaign trail die a pathetic, and highly controlled, death.
But there’s always the hope that something might happen to justify the bated breath and white knuckles – like a candidate being interrupted by an alien invasion, or suffering a stress-related nervous breakdown that makes them shed their clothes and dance the Macarena.
That’s about what it would take for the debates to live up to their billing – that, or an abandonment of the traditional formula in favor of a bare-knuckle brawl in the octagon.
“Handlers” have become a big part of this stale predictability. You’ve heard about the handlers. It’s a word that has become a part of the political lexicon, like “patriot,” or “poopyface.” The handlers are the ones who make sure their candidates stick to the talking points, stay on message, stand up straight, don’t snap their gum, etc. They take the wild fruit of the candidates’ positions, extract the essential ingredients, refine them into something digestible, and sell them to the American public in shiny, plastic packages.
And they’ve become so pervasive in American politics that it’s hard to tell whether it’s the candidates talking or their coterie of advisers. These days, it’s all about playing it safe. In a sense, they can’t really be blamed for that. It’s the YouTube era, after all, in which every little comment, every misstatement and mistake, is recorded and dissected to the point where the words themselves lose all meaning. As Mr. Romney can attest, the wrong comment in front of the wrong technology is a hangman’s noose.
All of this results in debates that are as scripted as a high school play. The reason we still hope for fireworks is because of debates past, before the handlers started spritzing their candidates with sanitizer. Maybe the most famous example of such fireworks came in the 1988 vice presidential debate between Lloyd Bentsen and Dan Quayle. Quayle, who compared his level of experience in the U.S. Senate to that of John F. Kennedy when Kennedy took office, was sideswiped by one of the most legendary political smackdowns of all time: “Senator,” said Bentsen, “I served with Jack Kennedy. I knew Jack Kennedy. Jack Kennedy was a friend of mine. Senator, you’re no Jack Kennedy.”
Boom! Eat it, Quayle!
As far as staying in the public consciousness, that moment ranks alongside Mike Tyson biting off part of Evander Holyfield’s ear. We’ve been waiting for another one like it for 24 years.
If the political system remains in its current state, of course, that simply won’t happen. The handlers will pose their mannequins, and it’ll be up to us to analyze the fingerprints. Pundits will take to the airwaves to painstakingly pick apart a sea of generic pandering and dubious claims; a panel of “experts” will try to tell us all about body language and eye contact. And we, the general public, will grow weary and flip the channel over to professional wrestling for some intellectual stimulation.
Which segues nicely into a truly American solution: Since so few of us listen to the content of the debates as it is, replace the format entirely with an American Gladiators-style slugfest. The winner will be the one who looks “more presidential” while beating his opponent senseless with a foam javelin.
It would be sensationalist and contribute nothing to the public discourse. But at least there’d be some bloodshed.
Friday, October 5, 2012
iRate
My first cell phone looked like the kind of device that movie villains use to detonate bombs.
It had an LCD screen slightly more sophisticated than a calculator’s. It was square. It was bulky. It had a flimsy antenna with a plastic nub at the tip that waggled in the wind, and the only “entertainment” contained within was a game called Snake, which involved manipulating a line of pixels – the snake – around the screen to try to gobble up a dot. It made Pac-Man look like virtual reality.
And those were the good old days.
Perhaps the smartphone-using technophiles among you are scratching your heads. How, you may ask, could those have possibly been the good old days when we live in an age of phones that can stream videos, identify music on the radio, purchase Amazon bric-a-brac, and set the heat levels on your waffle iron?
There’s no doubting that smartphones are capable of doing a heck of a lot. Some of the tasks they can perform occasionally pass as useful. There’s an iPhone app, for example, that uses the gadget’s electronic eye to read text on real-world signs and posters and translate it into any language of your choosing. Pretty nifty. If you’re an American in Japan and you feel nature calling, you can use the app to identify the men’s and women’s restrooms, thus saving you the embarrassment of having a roomful of pantless people screaming at you in horror.
The problem with smartphones is easily demonstrated by a depressing and meandering anecdote. Every few weeks, a friend of mine has a game night in his home. The gang shows up, convenes around the kitchen table, and passes the evening with Cheese Doodles and board games – Settlers of Catan, Trivial Pursuit, and a bunch of other games that virtually guarantee none of us will ever experience the LA nightclub scene. This, if you’re as nerdy as we are, is a pleasant way to spend the evening with friends.
A month or two ago, however, the games remained in their boxes. The gang showed up, convened around the kitchen table, and pretty much just sat there, unable to decide on that night’s game. This would have been fine if the evening had been passed in pleasant conversation, but instead, everybody whipped out their phones, and it didn’t take long for a listless hush to fall over the room. Picture it: Seven or eight people seated around a table, silent, faces aglow with the light from their little screens.
Except for me. Because I refuse to buy a smartphone, the only thing I had to stare at was my bowl of Cheese Doodles.
Enter the smartphone era, in which real human contact – and peaceful rest for our overworked gray matter – plays second fiddle to streaming YouTube videos and instant recipes for organic gourmet tofu cupcakes. Such devices are a boon in a theoretical utopia where everyone has self-control and can resist the allure of playing Angry Birds while waiting in line for the Tilt-A-Whirl.
Unfortunately, most of us live in the real world, and in the real world, such distractions can cause problems.
Loren Frank, assistant professor of physiology at the University of California, San Francisco, was recently quoted in the New York Times as saying that downtime allows the brain to process experiences it’s had, helping to facilitate the learning process. And according to Times reporter Matt Richtel, researchers have found that the devices’ exposure to the developing minds of children and teens results in brains that are less able to sustain attention. Which may help explain why it takes repeated urging to get Little Timmy to mop up the Pepsi spill on the kitchen linoleum.
This portends unequivocal suckiness for the future. I envision a world of 15-second attention spans, glowing faces in front of glowing screens, and streets filled with flaming oil drums. (Or maybe that’s Terminator.)
Kids and teens might have the developing brains excuse. Adults have no such out. When they focus on their phones and nothing else, it’s just plain rude.