Thursday, December 31, 2009

Men’s Fashion Tip For December

For this month’s fashion tip I’m going to begin what may be an ongoing series on choosing a suit. There are a lot of factors that go into choosing a suit but, judging by the suits I see on men around town, not many people pay attention to them. Obviously that can have negative repercussions; a sharp looking suit will connote success and confidence, but a bad suit can be a significant handicap.

For this first suit-oriented tip I’m going to focus on venting. When buying a suit it’s important to look at the jacket and determine how many slits it has in the back along the bottom. These slits are called vents and there are basically three main types of venting: no vents, single vents, and double vents. (The same is true of most sport coats and other jackets.)

The most flattering of these styles is double-venting. A double-vented jacket will have two slits, just behind each hip. This style looks more European and typically has the greatest slimming effect on the body. It makes the wearer look more athletic and its lines suggest the ideal masculine shape. This kind of suit also suggests a classier look; because it requires more effort to make (more cutting, sewing, etc.) it looks more expensive (and sometimes, but not always, costs more too). All of these factors combine to give double-vented suits a sleek, flattering, modern aesthetic.

Though I prefer double vents, I more commonly see single-vented suits. These suits have a single slit in the center of the back, and the style is characteristic of looser, more billowy American suits (as opposed to European styles). When I recently asked (the incredibly talented tailor) Lady Danburry why this kind of suit is so common, she speculated that it is because they are easier, cheaper, and faster to make, especially by machines. Not surprisingly for suits that have emerged under those conditions, they are less flattering. Of course, the single vent can be used to make a suit slimmer and more tailored, but depending on body shape it usually won’t compete with double vents. (This style is particularly unflattering when paired with the great fashion travesty of the century: pleated pants.)

Finally, suits with no vents are often the least tailored. They’re basically cylinders cut out of fabric and are less flattering. In fairness, some would say they look more formal than single-vented suits (more like tuxedos, for example). They can evoke a particular kind of retro vibe as well; Cary Grant wears one in North by Northwest for example. Still, double and single-vented suits can look equally retro and formal under the right circumstances and have the advantage of actually being attractive too.

Ultimately, then, a double-vented suit is going to be the most stylish choice. Though people in professional and social settings probably won’t consciously note how many vents your suit has, they will be impressed by a man who looks especially sharp and well kept. It should also go without saying that a better looking suit leads to greater confidence, which is in turn a key ingredient for success. So the next time you decide to revamp your wardrobe, remember: venting matters and there is no fashion neutrality.

Monday, December 21, 2009

Kill Christmas!

Ever since the holiday season began I’ve been looking forward to writing a blog about why Christmas is inferior to other holidays. However, I think I basically made my case in my Thanksgiving post (and to a lesser extent my Halloween post), so here I’m going to explain why the best way to celebrate Christmas would be to kill its so-called “spirit.”

The spirit of Christmas, as I understand it, is tied to the life of Jesus Christ. It involves being charitable, loving, and (most importantly) dwelling on the things that He provided for us. As a Christian/Mormon, I obviously think all that stuff is great. However, it’s no surprise that the way we celebrate Christmas so often has little to do with actual Christianity; we buy stuff, stress out about family gatherings, put up lights, decorate sugar cookies, etc. Of course, most of our modern Christmas traditions are actually pagan activities (Christmas trees, for example, which have nothing to do with Jesus despite Christians' attempts at appropriation).

However, setting that issue aside, few Christmas activities actually promote reflection on Christian doctrine. In fact, the holiday is really wrapped up in the consumerist, pop cultural side of the event. When I think of Christmas I don’t think of the gore-fest that was Jesus’ life, I think of Norman Rockwell paintings and wrapping paper. I think of How the Grinch Stole Christmas or Frosty The Snowman. I think of the pride I got as a teenager from vanquishing our neighbors in an unofficial Christmas light war. In other words, the things that I associate with Christmas are fun, happy, and (most importantly) secular.

For what it’s worth, I prefer it that way. As grateful as I am for them, I don’t really want to think of Jesus’ horrific experiences at Christmastime. I prefer to simply enjoy the pleasantness of spending time with friends and family and save the religious meditation for later. When I’m watching Christmas commercials or looking at lights, I feel a little of the holiday excitement that is supposed to accompany the season. When I think about someone being tortured to death on a cross, I just get kind of sad.

Given the origins of the word “Christmas,” it seems likely that at one point the holiday was actually a somber religious affair. However, for better or worse, that really isn’t what it’s all about anymore. We can shake our fists at the superficiality of the holiday, or we can accept it and enjoy all the secular movies, foods, activities, etc. If we wanted, we could even set aside a different day for contemplating the life of Jesus. We could spend our time thinking about Jesus’ entire life, from how awful it would be to give birth in a filthy stable to the awesome idea of resurrection, and then continue to have a sweet secular celebration on December 25th.

Whether we set aside a different day or not, it’s unlikely that Christmas will become less secular. Instead, driven by our spending habits, it’ll probably continue to move away from the “Christmas spirit.” Fighting that fact just amounts to slapping a religious veneer on a technically pagan and effectively secular celebration. On the other hand, sweeping away that veneer would allow us to guiltlessly enjoy the simple pleasures of the season (like, say, the deliciousness of eggnog or the smell of evergreen), without extra preoccupation about sins and hell. Ultimately, it’d allow us to embrace the kitschy aesthetic of the season and open the door to a more meaningful religious holiday.

Saturday, December 19, 2009

A More Poetic Style

Tonight I've been writing a list of new Provo businesses of 2009 for Rhombus Magazine. In my opening paragraph I had originally written "...surprisingly managed to bring new life to our struggling downtown." However, I felt like that wasn't really very interesting to read because "bring new life" was boring and clunky. After I considered it for a minute, the image of someone using a defibrillator kept coming to mind. Thus I changed the phrase to "...surprisingly managed to defibrillate our struggling downtown." It's shorter and easier to read, and, I think, more interesting.

If I were writing a poem such realizations would be necessary, and one as simple as this wouldn't warrant any particular rejoicing. However, since I was writing an article I think it is an occasion worth taking note of. The first phrase was adequate and would have sufficed. In my own writing I'll usually go with whatever is adequate eight or nine times out of ten simply because I have to move on. Though different writers have different skill levels, I'd bet that most professional writers work similarly simply due to time constraints (though "adequate" from great writers is obviously better than "inspiring" from mediocre writers).

This all raises some questions: does it matter? Does an audience really care if a magazine author or journalist uses more interesting diction? Do they notice? (Am I even correct in assuming that my second choice was more interesting?) I'm not sure. Most writing I read isn't remarkably well written, nor is it terrible. It does, however, seem to place a premium on content. Language, most popular writing apparently assumes, isn't something that is supposed to call attention to itself or be anything but the background for the ideas it conveys.

However, a more poetic style (which I hope I was inching closer to when I happened to think of the word "defibrillate") is more exact. I think the verb I decided to use is more violent and vivid than the bland phrase I started out with. If it really is better, then my conclusion is that I hope to keep working toward a more poetic style in my future popular writing. By extension I'd argue that writers generally should do the same.

Of course, probably no writer would argue that he or she shouldn't try to find better words to use. However, I think that my argument is also that, by extension, "adequate" is actually less adequate than it seems. Though the basic idea gets across when writers settle for the first thing that comes to mind, the complexity of a concept is lost. In other words, ideas and language are closely integrated with one another and using inexact words conveys ideas inexactly.

Ultimately I think that popular writing shouldn't strive for transparency. It should strive for poetry. Though poetry itself is all but a dead art (which, as a writer and reader of poetry, I lament), the power of language is significant. Obviously few writers have the luxury to treat everything they write like a poem. Yet I know that in my case when I simply adopt the belief that writing should be poetic my work is profoundly changed. This, coupled with reading more poetic works (such as actual poetry) could substantially improve writing in many venues. Theoretically it could also inspire readers to become more passionate about what they're consuming, which in turn could (in some small way) help to reinvigorate the struggling writing industry.

Friday, December 18, 2009

Going Home Culture

Throughout my time as a college undergrad I was puzzled and slightly intrigued by my friends who had families nearby. As a Californian living in Utah, seeing my family was a two or three times a year affair. I’d go home during Thanksgiving and Christmas, and maybe during summer, but for the most part I didn’t see them much.

That, however, contrasted significantly with my friends from Utah who went home at least several times a month. In the case of my roommates I’d always notice them taking their laundry home and coming back from Sunday dinner with leftovers. I have to admit that in many ways I was envious of these friends; they obviously had the opportunity to cultivate a stronger relationship with their families, and they also saved money by eating their parents’ food and using their parents’ washing machine.

Though my family lived in California for years, just as I graduated they moved to Utah and I subsequently began to experience the decidedly different culture that exists among family members living within driving distance of each other. Of course, by this time I was recently married and no longer an undergrad, but I still began to do many of the things I’d seen my friends do before. I started taking advantage of my parents’ washing machine. Laura and I started going (and continue to go) to dinner at my family’s house nearly every week. Occasionally we go hang out with them or do things for family night.

As we’ve done this I’ve been surprised at the difference between having family far away and having them nearby. Though that difference was apparent to me before when I had experienced only one of those two options, I don’t think I could fully appreciate it until my family moved to Utah.

For example, as Laura and I prepare to visit her family for Christmas she mentioned how we might not want to bring piles of dirty laundry home (though we have done this in the past and may still in the future). We also don’t think of Laura’s parents’ home as a source of delicious free food.

More seriously, however, the kind of relationship that is fostered between family members is largely contingent on how often they communicate and through what medium. The actual conversations that we have with my family, for example, are affected by how much time we spend at their home. It’s also easy to see how the various things going on in our lives (school, work, stress generally, etc.) change the kind of relationship we have. On the other hand, a fair amount of the time we spend with Laura’s family is inevitably going to be reserved for catching up and/or reasserting the familiar connection we feel.

All of this is really just to say that distance is still a significant factor in the kind of relationships people can have. Though the Internet and other technologies continue to “shrink” the world, I don’t think they actually allow relationships to transcend or circumvent distance. I’m not arguing that one kind of relationship is better (be it long-distance or close by), but that no matter how many phone calls a person makes to their long-distance family they can never have the same kind of “going home” culture as a person with family nearby, and vice versa.

Wednesday, December 16, 2009

Sandra Bullock: Gradual Feminist or Closeted Misogynist?

A few nights ago I had the dubious pleasure of watching Sandra Bullock’s The Proposal, the second of her three films this year. Along with Bullock the film stars Ryan Reynolds and is a fairly formulaic romantic comedy: a man and a woman initially dislike each other, are forced to spend time in one another’s company, and finally manage to separate only to discover that they’ve fallen in love. It’s the same story over and over again. However, The Proposal throws a curve ball into the mix: Sandra Bullock plays the role of the older, more powerful professional, while himbo Reynolds is the plucky underling who falls in love as a result of his superior’s coercion. In other words, The Proposal reverses the typical romantic comedy gender roles.

Or does it? Though Sandra Bullock, as a vile and reviled publishing boss, is definitely playing against her type, she can’t really shake her Sandra Bullock-ishness. The role she’s been given is basically trying to be Meryl Streep in The Devil Wears Prada, but Bullock really just comes off the same way she does in every post-Speed movie. Though this tension between her supposed evilness and her obvious charm actually makes the movie more entertaining, it also begins to undermine the legitimacy of The Proposal’s gender reversal. Much like Miss Congeniality, this latest film casts Bullock in a man’s role but is narratively concerned with removing her feminist veneer to reveal the awkward, tomboyish Bullock archetype. In that sense it’s not unlike several of Barbara Stanwyck’s films that cast the golden era starlet as a spunky working girl who nonetheless ends up in a very traditional relationship.

The process by which Bullock’s feminist veneer is removed further raises questions about the feminist slant of the film. Though the initial gender reversal is laudable (Reynolds, for example, is accused of sleeping his way to professional success much as a woman might be in a less progressive movie) the film basically ends like any other romantic comedy, with the man proposing to the woman. (This is not a spoiler, as the inevitable end of any romantic comedy is a heterosexual coupling.) It’s trite, but in this case it’s also particularly disappointing because The Proposal had seemingly already done away with that particular convention when it had Bullock get down on her knees to propose to Reynolds earlier in the film.

In any case, The Proposal begins with what could be an interesting premise but slowly unravels everything it has going for it so that by the conclusion it’s just business as usual. Hollywood often takes a lot of flak for supposedly being “liberal” and trying to push a progressive agenda. When some (overly conservative) person watches the film, they probably lament the fact that it seems to endorse women’s rights and gender equality. However, and unfortunately for those of us who actually believe in those things, the film actually condemns them and advocates the gender disparity status quo that it might have been trying to dispel. In the end, then, The Proposal shows that men are in control and women are just schemers trying to find a husband.

Friday, December 11, 2009

Electron Deception Christmas Song

So when I started this blog I had actually intended to talk a lot about my band, Electron Deception. Over time, however, I've drifted away from that. Still, I wanted to do a shameless plug here for the Christmas song we just recorded. It's our fairly liberal interpretation of "Little Drummer Boy" and you can download it for free, here. While you're at it, feel free to check out the other bands' Christmas songs on the same website, as they are very cool too (if very different from our own).

In the coming days I hope to discuss Christmas music more generally and some of the interesting realizations that we (the band) had as we tried to actually write a Christmas song. Assuming I don't get distracted with other topics I'll post that next week.

Triceratops and Siamese Cats

Tonight I am amazed at the power of children’s movies to shape my adult ideas. More specifically, I am amazed that movies I saw when I was a kid are still determining how I see the world, even when I don’t realize it.

Earlier this evening Laura and I were talking about our clothing as kids and I mentioned that my mom made shirts for my siblings and me that had brontosauruses on them. I mentioned that at the time (and now, actually) brontosauruses seemed like pretty wimpy dinosaurs (I subsequently had my mom sew flames on mine, effectively turning it into a brontosaurus-dragon). Anyway, Laura made the point that the brontosaurus is a fairly gender-neutral dinosaur that would have fit well on either my shirt or my sister’s.

I think Laura is right, but why? A t-rex is probably a good example of a “boy” dinosaur, but I couldn't think of why a brontosaurus would be neutral (this reminds me of my post a while ago about school mascots). Then, as I was suggesting alternatives that I would have preferred, we figured it out. I said that I would have taken a triceratops over a brontosaurus, but that “I always kind of thought of triceratops as girl dinosaurs." That’s when Laura realized that all this went back to The Land Before Time.

Not coincidentally, my mom made our dinosaur shirts right around the time The Land Before Time came out. Though the reason she made them was to have all us kids match, she probably partly chose the dinosaur motif because of the movie. However, if my mom was motivated by the movie to make shirts for her children, she probably couldn’t have foreseen that that movie would continue to influence her children’s perceptions about dinosaur gender for the rest of their lives. Though I know there were both boy and girl triceratops, I don’t foresee myself ever associating that dinosaur with boys. If I ever have a son I'll probably give him a doll before a triceratops toy, not because I’m opposed to boys playing with girl toys but just because it would never occur to me to give a girl dinosaur to a little boy (whereas I’d be more likely to think critically and make creative choices about more obviously gendered toys).

Similarly, I know that Siamese cats are evil. Every time I see a Siamese cat on the street I try to steer clear of it, and usually I give it a mean look. Though I spent most of my adolescence and young adult life believing that Siamese cats were just among the more temperamental of cat breeds, I eventually realized I got my ideas about them from Lady and the Tramp.

Remember that part where the two Siamese cats sing “we are Siamese if you please. We are Siamese if you don’t please”? Honestly I can’t even remember anything else about that movie (or what the cats do that makes them so evil). As I’m typing this I’m also realizing that there may be some underlying racism in that part of the story, but that’s a topic for a different day. The point is that because of that movie I will forever imagine every Siamese cat I see singing those two lines. Though I’d feel comfortable describing myself as a cat person, I’d never bring one of those cats into my house.

I suspect that everyone has had similar experiences with media they consumed as a kid. I’m also sure that there are other examples of movies influencing my current attitudes that I'm not even aware of yet. If this story has a moral I suppose it would be “be careful what your kids watch.” That’s an important message, I guess, though for now I’m content to marvel at the fact that I know this happens but am relatively powerless (at this point) to do much about it.

Thursday, December 10, 2009

Don’t Buy Tacky Digital Picture Frames

Some things from the past are really tacky. Shag carpet, for example. And TVs with fake wood paneling. Though some of us love that stuff for its nostalgic quality, I don’t think anyone would ever call it “classy” or “quality.”

Today’s equivalent of the wood-paneled appliance is the digital, USB picture frame. I’m sure you’ve seen this item. Perhaps you even own one yourself. When they first came out the novelty of the idea nearly made me get one. Thankfully I didn’t make that mistake; if I had, I would probably be looking for some person I only half liked so I could give it to them as a Christmas present.

These frames are tacky for a lot of reasons. First off, they’re pretending to be something they’re not. Like it or not, displaying 4x6 photographs on a bedside table is going to get rarer and rarer. After all, digital pictures are not pieces of paper and they don’t need a picture frame to hold them. Hardly anyone takes anything but digital pictures these days, but because all of us were alive when analog photography ruled the earth we still tend to want to play by the old rules. Just because we’re taking digital photographs, we might think, doesn’t mean that we actually have to display them differently. That, however, is wrong. Like most new technologies, digital photography is not well suited to old-fashioned frames and it will eventually change how we think about pictures.

In many ways digital picture frames are a lot like those old radios, record players, or TVs that pretended to be large wooden pieces of furniture. You’ve probably seen these things in the homes of old people (my grandparents seemed to have a few). I think the idea was that all the new-fangled technology needed to be proper and respectable, so it had to look like a table or a cabinet. In retrospect that idea seems quaint and kind of silly. Similarly, the idea that you’d want to put a low quality monitor on a shelf to perpetually play a screen saver is rapidly looking outdated, passé, and downright ridiculous.

As if digital picture frames weren’t bad enough by themselves, many models actually leave the flash drive hanging out the side. Why the frames’ designers wouldn’t have realized that USB drives are longer than half a centimeter baffles me. Still, I’d say at least half the frames I’ve seen suffer from this problem and it always kills what little aesthetic appeal they have; it’s like wearing a polyester suit, and then going out with your fly down.

I predict that in the future we will have ways of viewing pictures that take into account technological advances and don’t try to make new things fit into old boxes. If I could vote, it’d be for something like Facebook’s photo system, where I get to choose which pictures I see. With the proliferation of the internet into new devices (gaming systems, phones, etc.) anything around the house could be used to display pictures. For example, while you’re not watching anything, your TV could continuously display pictures that are stored in one central location and accessible by any device. (Someone is probably reading this thinking that they like being able to have their digital photographs displayed like old fashioned ones and they don’t want something new. All I can say is too bad. After all, how often do you consume technology or media the way people did in the 80s, 70s, or 60s? The reality is that things change.)

Of course, predicting the future is a fool’s game, but learning from the past is not. It’s possible that digital picture frames will survive and be really cool for a long time. On the other hand, there are very few examples of old technologies (like picture frames) successfully being applied to new ones (like digital photography). So this holiday season avoid digital picture frames and don’t do the 2009 equivalent of installing wood paneling in your home. It’s a choice you’ll appreciate sooner rather than later.

Wednesday, December 9, 2009

Driving and Individual Liberty versus Potential Liberty

When I began writing this post it was going to be about how frustrating elderly drivers can be and how difficult it is to pass legislation regulating them. I had just listened to a program on NPR about making roads safer that had focused specifically on the elderly. As I’m sure is obvious to most people, many elderly drivers are a menace on the road and, according to this show, demonstrate a considerable drop-off in road safety after the age of 75. On the other hand, the problem with improving that situation is that the elderly consistently vote (unlike equally unsafe teenagers) and no legislator wants to upset old folks.

As I’ve considered this topic, however, it has occurred to me that the issue is really about freedom. Elderly people oppose laws that restrict their rights because those laws would allow them to do fewer things.

What’s interesting about this debate is the fact that the elderly, by virtue of their opposition to new laws, seem to be aware that those laws would limit their privileges. This in turn represents an acknowledgement that they may in fact be unsafe. Yet despite the fact that stricter rules would make roads safer, elderly people still oppose them.

Clearly, elderly drivers who oppose new laws value personal freedom over general safety. For what it’s worth I think that’s a fair position to take and one that I share in some instances. (For example, I hate airport security measures and would gladly accept the risk of terrorism if it meant a less invasive screening process.)

However, the issue isn’t as simple as restricting or maintaining elderly drivers’ right to their car keys. If the elderly were merely annoying on the road the issue wouldn’t be that difficult (freedom versus annoyance seems like an easy choice). Instead however, when elderly drivers drive recklessly they often hurt or kill other people. Obviously, if someone is dead or limb-less their freedom has also been significantly reduced. In fact, a dead person has quite a bit less freedom than an old person who can no longer drive.

Thus, the issue of elderly driving seems to revolve around two kinds of freedom that I like to think of as “real freedom” and “potential freedom.” In the first case, “real freedom” is the actual, demonstrable ability of a person or people to do something. An individual elderly driver, for example, will absolutely lose some freedom if laws are tightened, and it's easy to know exactly who would be affected by new laws. “Potential freedom,” however, affects nameless people whom statistics say would be saved. Once a new law is enacted it's usually impossible to identify specific people who have been affected. In other words, potential freedom abstractly affects someone, somewhere, somehow.

Though these labels are just my own invention I think the idea is an important one because it provides an excuse for real-world political and social action. An elderly driver might believe that they won’t hit someone, despite the fact that they have a slower reaction time, or that they won’t miss a stop sign even though their eyesight is bad. But they will lose their license if new laws go into place. Therefore, they oppose them.

The problem with this, however, is that it weighs two unequal things. The inability to drive would surely be frustrating, but losing the right to be alive is a much bigger deal. How many real, flesh-and-blood drivers should lose their licenses in order to save one nameless, hypothetical person? Statistically speaking, we know that people will die, but without a name or a face they seem less important than grandma and grandpa.

Personally, I believe that the elderly should have stiff regulations imposed on their driving. I don’t think that their inconvenience is worth unnecessarily risking lives. However, I recognize that it is difficult to balance freedom with safety, especially when we’ll never know whose lives are saved. Elderly driving is also only one example of this dilemma. Airport security is another, as are security measures in schools and public places, smoking and alcohol laws, weapons laws, and a multitude of other issues. (Come to think of it, the entire ideology behind libertarianism is based on an assertive, and I’d argue painfully simplistic, assessment of the relationship between real and potential freedom.)

In the end, then, this post is less about how frustrating elderly drivers are and more about the difficulty of ensuring that people are safe while they exercise their rights. In a country like the United States, where the word freedom is pretty much synonymous with “good,” these are hard questions indeed.

Friday, December 4, 2009

Max Hall and BYU’s Unsportsmanlike Facebook Group


In the past few days I’ve noticed some of my friends joining a Facebook group called “Max Hall said what everyone was thinking.” (I have quite a few friends in this group right now. If you’re one of them I’m about to condemn it in no uncertain terms. I hope this doesn’t upset or offend you as I value your friendship. Maybe this post can open a dialog. Who knows, perhaps you have some rebuttal for my argument that I haven’t thought of yet.)

 

Max Hall is, of course, BYU’s quarterback and the group refers to these comments (taken from the group’s page) that he made after the game last Saturday:

 

I don't like Utah. In fact, I hate them — I hate everything about them. I hate their program. I hate their fans. I hate everything. So, it feel good to send those guys home... I think the whole university and their fans and organization is classless... I don't respect them, and they deserve to lose

 

This article gives more context, but unfortunately it doesn’t mitigate the inflammatory nature of the remarks. Now, before I comment on anything else I should say that Hall made the comments right after a tough game during which there was undoubtedly a lot of adrenaline pumping through his system. I know that that sort of situation can cause people to say things they would otherwise keep to themselves and though I think Hall is an idiot for what he did, I can understand making a mistake. (On the other hand, Hall would no doubt like to go pro but if the emotion of a game prevents him from controlling himself he’s hardly NFL material.)

 

In any case, Hall’s comments were a mistake. He admitted as much and was also officially rebuked for them. What is much more troubling than Hall messing up is the fact that BYU students have created a group to honor and perpetuate his mistake. Though Hall’s comments reflect poorly on him and his school (which is my own alma mater too) the Facebook group endorses negativity and unsportsmanlike conduct. Hall at least had the weak excuse that he was riled up by the game. What excuse do BYU fans sitting at home on their computers have? That they’re ill-mannered jerks?

 

The comments on the Facebook group’s page vary. Some mention that the University of Utah’s football team was playing a dirty game. Others mention that U football players have similarly said insulting things about BYU. However, that type of excuse is so flimsy it’s laughable. Both teams played a dirty game and even if the U had worn brass knuckles onto the field BYU students should have taken the high road once the game was over. Do they really hate U fans? Seriously? Isn’t BYU all about service and showing Christ-like love? Even to people who chose to get an education at different universities? Ultimately, this whole thing makes BYU look like it’s filled with mean-spirited bullies.

 

I think it might be useful to imagine this whole episode as an inspirational sports movie in the vein of Rudy or Remember the Titans. On BYU’s side we have a fifth year senior who didn’t even play very well. Maybe the U deserved to lose, but Hall’s performance hardly justified a win. On the other side, the U had an 18-year-old freshman as their quarterback. As the announcers on the Mountain West Sports Network said during the game, he didn’t show that he was a freshman during the first half but it was apparent in the second. Nevertheless the Utes held BYU at bay the entire game until Max Hall finally got lucky and completed a pass to win. Then, despite the win, the much older Hall went on to ice the cake with an insult.

 

If this were a movie BYU would, without question, be the bad guy. Most sports movies include some brutish, mean antagonist and BYU almost perfectly fits that bill. The only problem is that Hall will never have to face the U again so there won’t be a rematch during which the older, more experienced bully is crushed by the resilient underdog. In other words, the U was Rocky. The U was Rudy. The U was every sports movie hero and BYU ended up playing the part of a stock bad guy.

 

I think this movie analogy is useful because the Facebook group supporting Hall has 2,065 members at the time I’m writing this. It makes me wonder: does everyone want to come off as a vindictive bastard? Do people like perpetuating the worst parts of a dirty game? Does this group strike any of its members as being somewhat at odds with the values they claim to believe in? I’m not saying that BYU fans (and many of my friends) are bastards or hypocrites, but the Facebook group certainly makes our school appear to be extremely bad sports. And really, that's not good for anyone.

 

My guess is that most people aren’t thinking very much about this issue. Rivalries are fun and the conflict they allow can be a much-needed outlet for a lot of people. Perhaps BYU supporters who have joined the group simply feel like they’re showing school spirit. However, I hope that BYU students and fans realize that sports rivalries are not worth being a fool over. Going to another school and (passionately) supporting that school’s team doesn’t make someone a bad person. One bad apple shouldn’t be used to judge the whole bunch (which BYU must appreciate, in light of Hall’s comments and the unfortunate actions of some BYU fans after the game). This seems obvious, but the existence of a group like this makes it seem like BYU students have lost sight of the things that really matter. If the world is really going to be the campus of BYU students, there isn’t room for hate groups.

Wednesday, December 2, 2009

Thanksgiving: The Best Holiday of the Year

Thanksgiving is unequivocally the best holiday of the year. Though it consistently ranks below Christmas on many people’s personal holiday rankings, it deserves more credit.

 

Of course, the most obvious reason to praise Thanksgiving is the way it is celebrated. Thanksgiving is practically synonymous with food and feasts. Admittedly, if you aren’t a fan of turkey, pies, mashed potatoes, etc. (and I know many people aren’t) Thanksgiving will probably hold significantly less appeal. On the other hand, Thanksgiving food is fairly varied and most people don’t hate all the traditional dishes. More importantly, perhaps, a Thanksgiving feast provides an opportunity for creativity. This year, for example, my family and I had the typical pies but Laura and I also made banana cake. Last year someone brought sautéed mushrooms. Though these kinds of food definitely weren’t on the Pilgrims’ menu, they are delicious and any holiday that provides an excuse to eat good food has a lot going for it. Ultimately then, even if you don’t love Thanksgiving food, you can still use the day as an excuse to make and eat a lot of other delicious things.

 

While food is certainly a big part of Thanksgiving, I would argue that it isn’t the most important part. Instead, the most significant part of Thanksgiving is being able to spend time with family or friends. Again, I know that not everyone enjoys this part of the holiday. I also know that there are many people who can’t get together with their families. However, the fact that the Wednesday before Thanksgiving is the biggest travel day of the year suggests that celebrating with loved ones is a typical or at least common thing to do. What’s more, whether you get along with your family or not, Thanksgiving is among the least stressful ways to get together. You don’t have to worry about things like presents, and the attention is never on one person as it might be during a wedding, birthday celebration, graduation, etc.

 

There are a number of other reasons to appreciate Thanksgiving. It takes place in the fall, for example, so it’s not as cold as Christmas but it’s not as hot as the Fourth of July. It also means a day off from work for most people and two days for many (and three for students and teachers). The list could go on and on, but the point is that Thanksgiving genuinely has more to offer than most holidays.

 

Of course, Thanksgiving is also good because of the ways that it isn’t celebrated. As I mentioned above, there are no presents as there are with Christmas. Obviously if you love getting (or giving) things that might seem like a negative thing, but realistically presents raise stress. You have to find just the right thing for everyone, worry about whether they’ll like it, and then act surprised/impressed/excited about what they gave you no matter how you actually feel. Sometimes none of those things are a big deal, but other times they cause serious tensions. At the very least Christmas requires everyone to put on a show, and by comparison Thanksgiving comes off as a much more sincere holiday.

 

Thanksgiving also isn’t commercialized the way that Christmas or Halloween are. To be honest, the commercial aspect of Christmas is really the only part of the holiday that makes me feel the seasonal “spirit”; I can’t, for example, get into Christmas unless I’ve seen tons of Christmas commercials. However, Thanksgiving’s charm is that you don’t have to “get into” the holiday in the first place. There is relatively little buildup and, consequently, no let down when the day actually comes. As an adult I can’t stress this point enough. Though I loved Christmas as a child it is inevitably disappointing as an adult when I know that there is no magical, Santa-related ending and that the best thing that will happen is that I’ll get to sleep in. Those things are great, but hardly worthy of weeks and weeks of anticipation. Thanksgiving, by contrast, includes the best parts about Christmas without the huge anticipation and subsequent let down.

 

I could go on and on about how much better Thanksgiving is than all other holidays (in the U.S.). Instead, however, suffice it to say that in these first days of December I’m reminded of how much more thankful I am for Thanksgiving than all other holidays.

Monday, November 30, 2009

November Fashion Follow-Up

Re-reading my November fashion tip I was struck by the fact that it seemed to suggest that the only good coats for winter were wool pea coats. While that type of coat (as well as coats based on it) is certainly cool and functional, there are also many other kinds of coats that can be worn effectively and stylishly.

 

One type of fabric that I personally have come to appreciate is canvas. For example I have a green, lightweight canvas jacket that I got from H&M. It’s not jean material, but it’s 100% cotton and has kind of a quasi-military look (except with a flattering, double-vented cut). Though I purchased this coat on a study abroad in England two and a half years ago, all the H&Ms I’ve been to have had equally stylish, if different, cotton jackets.

 

Cotton, of course, is not as warm as wool and it doesn’t wick water away like wool does. However, a cotton coat can be cheaper than a wool coat and if you’re just using it for short periods outside it shouldn’t be a big deal. My H&M jacket is definitely more autumn attire, but I still wear it during the winter, just with more layers.

 

The point here is that in addition to wool coats there are a number of other things that work. Ideally, everyone would own numerous pieces for every possible occasion. Since that isn’t the case it’s a good idea to get a few coats and jackets that have some flexibility. My H&M jacket, for example, is decidedly less formal looking than my black coat. I can still wear it to work, but I can also wear it out to see concerts or just to hang out with friends. If I need something more professional or formal I’ll go with the black coat.

 

Ultimately fabric choice is extremely important when choosing a winter coat. Find a vendor that has some variety and assess how you’ll be using the coat. In keeping with November’s fashion tip, I’d say that a pea coat is a good first choice because of its versatility, functionality, and style. However, if you’re branching out from there consider something that addresses a completely different need.

Wednesday, November 25, 2009

No Fences In Cedar Hills

Laura and I have been spending a lot of time lately at my parents' house in Cedar Hills, Utah. In the past, I've used Cedar Hills as a kind of poster city for the problems inherent in suburban living. While I continue to be shocked and disappointed at how damaging that lifestyle is (environmentally, socially, etc.), there is one unique thing that I really like about my parents' neighborhood: no fences.

When I first visited my parents' home after they moved there in 2007 I was at once disappointed and pleasantly surprised. First, I was disappointed because the houses in the neighborhood weren't especially interesting; though it didn't shock me, they're all pretty much like those in any newish housing development. However, what amazed me then and impresses me now is that there are relatively few partitions between the homes. Each house technically has a yard, but for the most part they all run together. The result is that the houses all kind of feel like they're in a park. This also has the added benefit of making the fairly small yards feel spacious.

Besides making people believe their yard is bigger than it actually is, there seem to be a number of benefits to not having fences. For example, it makes getting from point A to B much faster and more convenient; instead of going around the houses (on the frustratingly winding streets) a person could just cut through all the open yards. (Some people have tried to stop others from doing this, which is really unfortunate.) This also has the potential benefit of encouraging people to walk places instead of drive because the distance can be so much shorter on foot. (And I shouldn't have to mention all the reasons why walking is superior to driving.)

More abstractly, a lack of fences seems to be an argument for community interaction. While suburban architecture seems specifically designed to separate (and therefore isolate, and thus alienate) people, breaking down the barriers between properties symbolically breaks down the barriers between people. Inevitably residents will see and interact more often with their neighbors if there are no fences. I don't think it's much of a stretch to suggest that that interaction will engender greater empathy and interest among residents. Just last week while at my parents' house I noticed how I could actually see into the neighbor's house at night (and presumably they could see into my parents' house). Admittedly, that raises questions about voyeurism and the like, but it also served as a reminder that there were people all around me. I wasn't in an isolated little island of personal space. I was sharing an environment with other people. As a non-resident of the area I don't really have a relationship with the people in the neighborhood, but the lack of fences nevertheless seems to be a promising way to combat the social entropy that some claim is inherent in suburbia.

Unfortunately, fences are slowly going up in Cedar Hills. Some seem to be designed to keep pets in, while others look like they're simply there to prevent people from cutting across the yard. Though I hope it doesn't happen, I wouldn't be surprised if in five or ten years all the houses in the area had fences. Still, keeping the neighborhood open is a good idea. Suburban living is perhaps the least efficient and socially conscious lifestyle out there, but if tearing down fences doesn't make anything greener it at least encourages people to see themselves first and foremost as members of a community. Ultimately then, in this case and unlike other suburbs across the nation, Cedar Hills has figured something out that surprisingly begins to chip away at the problems of suburban life.

Friday, November 20, 2009

Writing Negative Reviews

This week I went back and read Electron Deception’s negative show review from April (sorry, I'm not linking to that). I was really trying to see if any of our efforts to drive it down in the Google search results had worked (they hadn’t), but it got me thinking about the purpose of reviewing things, especially local things.

 

Ultimately, as I’ve considered this topic, I believe that writers who decide to cover local events should actually avoid giving negative reviews, at least initially. If communities are going to thrive and produce things (arts, restaurants, etc.) people have to actually go out and support those things; negative reviews, on the other hand, prompt people to avoid local activities and can contribute to increased isolation. Taken to their extreme, they also eventually kill off local endeavors and leave people with few options other than chain restaurants, generic music, and mediocre art.

 

I’m not suggesting here that writers simply lie and say that everything local is good. Instead, I would hope that when covering a community, they focus on the positive. If a restaurant, musician, etc. isn’t great, simply choose to cover something else. If at some point something becomes so big and so ubiquitous that members of the community have begun weighing in on it with various opinions, then it might be time to start writing honest, hard-hitting reviews. I know that’s a nebulous distinction to make, but the vast majority of local entrepreneurs and artists aren’t becoming rich, famous, and powerful. Instead, they’re struggling to make ends meet and a single negative review can sometimes force them out of business.

 

Of course, negative reviews may prompt people to improve whatever it is they’re trying to sell. When Electron Deception got our scathing review we genuinely tried to address some of its more legitimate points. On the other hand, most of our efforts have never been heard by anyone; our shows often don’t have huge attendance and the review has certainly not done anything to improve that. In fact, despite our efforts, we think the negative review may have caused some people to not come see us, even after we’ve changed our sound considerably. If the people who wrote the review had wanted to improve the music scene they could have simply given their suggestions to us privately, let us work on them, and waited to see what happened. If they had done that we would have been happier of course, but local venues might also have made more money when we played there (because more people would have wanted to come out) and parties would have been livelier (for the same reason). However, as it was, the negative review only dampened the music scene across the board.

 

Electron Deception is just one small band and few people will probably remember us, but I think this incident is illustrative of the anti-community effects of negative reviews generally. If a writer, blogger, or journalist decides to condemn a local theater production or restaurant, for example, people will go elsewhere. If you live in New York where there are hundreds (or thousands) of local performances and food establishments all the time this probably isn’t as big a deal. However, if you live in a mid-sized town (like Provo, Utah, for example) people don’t necessarily have other options. A scathing review of a single local theater production can thus put an end to all or most local theater generally. A bad review of a local restaurant can drive people to chains (which typically serve terrible food). In both situations innovation is stifled, local economies suffer, and people interact less with each other.

 

Journalists certainly don’t have the responsibility to advertise for local businesses and events. Journalistic integrity is important and should be defended. (I would argue, however, that that doesn’t apply to the blogosphere, where many local reviews take place. A negative review on a blog seems closer to cyberbullying than journalistic honesty to me.) Still, if we want to live in vibrant places where diversity and creativity abound, it’s important to ask ourselves if what we’re writing is going to bolster that community, or stifle it.

Thursday, November 19, 2009

Eggo Waffles

In my freezer I have a big box of Eggo waffles. I like them. In fact, the reason the box is still in my freezer (several months after it was purchased) is because I keep saving them for a rainy day when I need something delicious to eat. Today however, I learned that my big box of Eggos is about to become something of a scarce and hot commodity. According to this article on MSNBC.com two factories that produce Eggos have recently had to shut down and one, in Tennessee, is still closed. What this means, of course, is that we’re about to enter a waffle famine and we should probably start praying for it to end.

 

Or should we? Though I like Eggos as much as anyone, I was surprised that the MSNBC article was actually able to find people who would be alarmed about the shortage (not to mention who plan to ration the waffles). I would easily consider Eggos breakfast junk food and though the shortage might not make me happy (I feel rather neutral about it), I have to admit that it’s probably only going to make people healthier.

 

Let’s consider: A single waffle has 90 calories, 35 of which are from fat. According to the box, a serving size is two waffles, so we have to double those proportions for a meal (somehow, however, two waffles are listed as having 190 calories, not 180). It’s also worth considering that most people are going to eat their waffles with butter and syrup, both of which have little or no nutritional value. That’s not to mention the various other preservatives and chemicals that go into pretty much all industrially produced foods.

 

On the other hand, an apple typically has around 70 calories, very few of which (if any) come from fat. According to this website an average banana might have 200 calories, but only 6 of those come from fat. Also, fresh fruit of any variety contains numerous vitamins, nutrients, and fiber that make up a healthy diet.

 

Now, Eggo waffles certainly aren’t the worst health offenders in the world, especially when it comes to processed food. However, they’re nowhere near as healthy as fruits, which also make great breakfasts and are among the only things easier to make than Eggos. In that light, it’s not hard to think of the Eggo shortage as less of a famine and more of a fortuitous turn of events. (At least for the country at large, if not for the workers who may be out of work until the plant reopens.) In any case, perhaps this will force people (including me) to return to a healthier lifestyle.

Wednesday, November 18, 2009

Men’s Fashion Tip For November

It’s cold and it’s time to choose a coat. Don’t make the mistake of choosing something that’s warm but ugly.

When choosing a coat it’s important to consider all the factors that have driven you to the store in the first place. If you’re buying a coat it’s probably cold outside, or about to get cold. However, while a coat that is functionally warm is important, it isn’t the only thing that matters. Just like any article of clothing, a coat says a lot about your personality, profession, avocations, etc. Accordingly, it's essential to balance function with form. What cuts flatter your body, for example? What colors are appropriate for the settings in which you’ll be wearing the coat? Which fabrics are going to be functional, affordable, and stylish? As you think about these questions keep in mind that surviving the cold may be the least important benefit of a coat. After all, when was the last time you ate a meal solely because you needed the sustenance to survive? Thankfully most of the readers of this blog are not starving to death, and they should likewise consider that a winter coat is only ostensibly about avoiding hypothermia.

My preference when it comes to winter coats is to go with something classic. Wool coats tend to be warm and going with something like a pea coat will almost always look sophisticated and urbane (ironically perhaps, given the pea coat’s origins as attire for sailors). There are also numerous coats out there that are based on the idea of a pea coat, but which offer varying degrees of modern flare. These kinds of coats also usually have the added benefit of coming in black or dark colors, which is an absolute must if you live in a place cold enough to actually need a winter coat.

On the other hand, one of the greatest fashion travesties of our day is the development of synthetic fabric coats, or perhaps more precisely, the way that these kinds of coats have developed. Though the 1970s and 80s saw some interesting designs that have moved into the realm of retro-chic, most modern synthetic coats seem to be crafted under the delusion that bodies should be round and fashion is irrelevant. Perhaps the epitome of this phenomenon is Gore-Tex, which is theoretically a good idea but almost always used in coats that are too puffy and unflattering. (Gore-Tex was appropriately parodied on this episode of Seinfeld.) While Gore-Tex is a stand-out offender, modern synthetic coats generally tend not to be well fitted. They also often have a baffling array of colors or shades of the same color. (Again, theoretically sound, but it always just ends up looking passé.) In the end, ask yourself a few basic questions about your coat: does it make overtures toward your body shape? If you plan to wear it anywhere besides your backyard, does it communicate the image you’ve been cultivating? How will it cohere with the rest of your wardrobe? What is its level of formality? (This is also the great weakness of non-wool coats: they're generally too casual for anything but leisure activities. A pea coat, on the other hand, looks great with jeans or business attire.)

Ultimately, if you happen to be riding a dogsled to the North Pole this year, go out and find a coat that was made using the latest and greatest thermal technology. However, if you’re a typical guy who will just be wearing your coat as you walk to school or work (or maybe just out to your car), keep in mind that fashion is at least as important as function (if not more so). And never forget, there is no fashion neutrality.

Saturday, November 14, 2009

Joaquin Neighborhood, Provo

A few years ago I lived in a house on 5th North in Provo. One of the most memorable things that happened during my time in that house was the demolition of the Joaquin School, which was just across the street. It was pretty amazing to see an enormous excavator ripping whole trees out by the roots and knocking down three-story walls in a single swipe.

 

As impressive as the destruction was, it was also fairly tragic. Though the Joaquin School itself hadn’t been used for a few years at that time, the very large grassy area around it was used as a park by the community. The property also had numerous large trees surrounding it. Those trees that were too big to simply pull out by the roots were cut down later. Bafflingly, the construction company even cut down the old trees that lined the strip of grass between the sidewalk and the street.

 

As sad as it was to see this parcel of land decimated, the plan was for a construction company to build a new student housing development on it. The development was supposed to provide living space for a large number of students and, at least at the time, promised to include some new open grassy areas for people to use.

 

That was in 2006. Now it’s 2009, and disappointingly the tragedy has been compounded by the fact that in three years the demolition is the only thing that has happened. According to this Daily Herald article, the construction company went into bankruptcy and the project stalled. It’s a familiar story in this recession-ravaged world, but it also raises a number of questions: why, for example, did they go in and destroy everything if they didn’t have the money to begin building? Why did they tear out and cut down trees that weren’t even on the main construction site and wouldn’t have been in the way until serious building began (if ever)? Why does everything that has taken place make it seem like the company has a vendetta against the community and good things in general?

 

In all fairness, the company has done a few things. Since they demolished a public space they have found enough money to A) erect a chain link fence around the property, B) spray paint “no parking” signs in big red lettering on the (public) sidewalks at certain points around the property, and C) mow down any vegetation that takes root, ensuring that all that remains visible is broken asphalt and dirt. These actions have left the area looking barren and fit for a post-apocalyptic movie shoot. Understandably, tall dry grass poses a fire hazard in the summer, and if people were constantly entering the property one of them could get hurt and sue. Still, the actions of the construction company have left the Joaquin Neighborhood characterized more by blight than by beauty.

 

Business and construction are complex things, and I don’t mean to oversimplify the issue (though I know I am), but it is important to consider that while the various entities involved have been wrangling over the legality of their positions, it’s the actual residents of the neighborhood who have paid the real price. It’s also probably safe to say that the parent company, Arrowstar Construction, and its president, Wayne Ross, aren’t based in the Joaquin neighborhood. Their interest is profit, and it’s too bad they behaved like little boys in a school sandbox with no foresight. Hopefully they’ll get on the ball, because if I were a property owner in the Joaquin neighborhood I’d think more and more about a lawsuit each day.

Friday, November 6, 2009

Guy Fawkes’ Day


Why do people keep celebrating Guy Fawkes Day?

 

Like most Americans I had forgotten that the fifth of November is a day to celebrate a 17th-century Roman Catholic British rebel who tried to blow up Parliament. Surprisingly, however, a fair number of my friends on Facebook reminded me of that fact, posting all sorts of Fawkes-related status updates. Though many of my friends are in college and need only the slightest excuse to throw a mid-autumn bonfire, the perennial resurgence of Fawkes’ name made me wonder why people keep returning to him.

 

In the United States Guy Fawkes went mainstream as a result of the film V for Vendetta, which tells the story of a Fawkes impersonator living in the near future who tries to destroy a dystopian British government. Though a few British history buffs probably knew about him before the movie, the majority of this year’s November fifth celebrations date back only to 2006, when the movie came out. A select few may date back to the 1980s, when Alan Moore’s graphic novel on which the movie was based was published, but that’s hardly any closer to the 17th century. (If you’re one of the handful of Americans who knew about Fawkes before V for Vendetta, pat yourself on the back for your erudition, but realize that that’s not evidence of Fawkes’ pre-1980s popularity.)

 

V for Vendetta is a great film (I haven’t read the graphic novel yet, but I hear it’s also good), but just because it’s well made doesn’t necessarily mean that watching it gives people any legitimate connection to Guy Fawkes. For all the typical viewer knows, Guy Fawkes could simply have been another fictional plot device. In other words, though Guy Fawkes happens to be real, he might as well not have been, because almost no one (in the U.S.) knows anything about him beyond the movie.

 

Of course, the fact that most Americans who celebrate Guy Fawkes Day are doing so because of a movie isn’t a bad thing, but it is fairly rare. How many people do you know who watch a piece of media and suddenly initiate a holiday around it? The closest phenomenon I can think of is Festivus, the made-up holiday popularized by Seinfeld. However, even Festivus celebrations are pretty rare these days and only happen among diehard fans, and then probably only because Seinfeld continues to air in syndication for several hours every day. Accordingly, it’s surprising that three years after V for Vendetta (which was a hit but not an overwhelming one) its all-but-fictional holiday continues to receive recognition.

 

The robustness of Guy Fawkes celebrations suggests that there is something about the commemoration itself that fulfills people’s needs. If most Americans’ (at least initial) understanding of the day is based on V for Vendetta, then the day has rousing revolutionary origins. It’s about people overthrowing their oppressors, and the celebrations I’ve been to have played this aspect up to varying degrees. Significantly, however, no one has ever actually begun a revolution after one of these celebrations. As far as I know, the movie also didn’t prompt much revolutionary action, despite being considered a biting political allegory. (After watching the movie I felt ready to go fight a revolution myself, but within a few minutes that feeling had subsided. I had apparently gotten it out of my system.)

 

My conclusion is that the revolutionary aspect of Guy Fawkes appeals to people because they are in some way dissatisfied with their (political) social structure, and that Guy Fawkes celebrations provide an outlet to vent such frustrations. Admittedly, a lot of people simply want to have a bonfire and burn an effigy, but it would be negligent to deny the connection between the celebration’s enactment and its historical connotations (whether historical in the sense of the original Guy Fawkes or of V for Vendetta). More insidiously, however, these celebrations also empower the very elements with which people are dissatisfied; much as watching the movie riled viewers up but also exorcised their need for revolution, so do Guy Fawkes celebrations.

 

It should be obvious where I’m going with this: subversion and containment. There is perhaps no other phenomenon that is a better illustration of subversive behavior being contained than Guy Fawkes Night activities. If Halloween illustrates this theory playing out with regard to people’s moral values, Guy Fawkes events are more socio-political, which is actually closer to Stephen Greenblatt’s original description of subversion and containment (in Elizabethan England).

 

Ultimately, these celebrations in the United States will probably disappear as people forget about V for Vendetta. Their presence (along with that of the graphic novel and the movie) reveals that there is indeed a revolutionary thread running through society. However, as long as people “remember, remember the fifth of November,” that thread will likely continue to be contained.

Thursday, November 5, 2009

Halloween: The Worst Holiday of the Year

Halloween was last week, and while that probably made candy-hungry children happy, the fact is that Halloween is actually the lamest holiday of the year.

 

The biggest reason that Halloween ranks near the bottom of my holiday list (in a distant last place after April Fools’ Day) is the dressing up. Now, many of my Facebook friends have recently posted status updates criticizing people’s tendency to use Halloween to look like sluts. Specifically, these friends have pointed out how many women use the holiday to look like sexy nurses, sexy witches, sexy maids, or other variations of sexiness. To this I would also add guys who choose to wear virtually nothing on Halloween (albeit not always in a “sexy” way).


I concede that sexy/scanty costumes can unfortunately objectify the wearer, but I would also say that I’m not terribly bothered by them. If you want to look like a slut and objectify yourself, that’s your choice. What I do think is silly is the need for a holiday to do so. Why not just dress like that all the time? Or why not do so in conjunction with other holidays, where dressing up isn’t an end in itself? (What better way to celebrate the Fourth of July than with sexy nurses and guys in loincloths?)

 

Slightly more seriously, the persistence of dressing up seems to suggest that many people have latent or repressed desires that they express on Halloween. The prominence of sexual costumes indicates that many of those repressed desires are sexual, though other Halloween staples like zombies or robots equally represent social insecurities simmering beneath the surface of people’s psyches. It’s possible to look at Halloween in this context through the lens of Stephen Greenblatt’s theory of subversion and containment: people subvert social norms through their Halloween behavior, but such momentary subversion allows them to continue participating in those norms the rest of the time (a rough, inadequate summary of the theory, I admit). Still, it seems like it might be better to try to reconcile desire with belief. For example, instead of condemning sex or allowing technology to pervade our lives (while simultaneously expressing trepidation about those issues through costume), we could figure out a lifestyle that would balance our values with our desires.

 

There are other reasons that Halloween doesn’t make sense as well. Take, for instance, its complete detachment from its harvest origins. Since few people participate in harvests, or have any connection to agriculture, it makes little sense to celebrate a harvest holiday. In light of the kinds of costumes that people tend to wear, we could at least move it to a warmer month. (August is hot and doesn’t have an abundance of holidays.) In reality, however, the point is that the meaninglessness of Halloween actually detracts from the pleasure it brings. In his book The Society of the Spectacle, Guy Debord includes a chapter called “Spectacular Time” in which he introduces a concept called pseudo-cyclical time. Basically he makes the point that society has moved to an industrialized, arbitrary, and falsely seasonal concept of time. Though he doesn’t get into Halloween, the celebration of such a holiday clearly contributes to the thesis that society may merely be a parody of, and consequently less fulfilling than, what it once was.

 

The reasons that Halloween is both outdated and inadequate could go on and on.  Hopefully, however, the notion of “holidays” like Halloween—where we get to take a break from our labor, standard behavior, and the values predicated on these things—will eventually become unnecessary.  

Wednesday, October 28, 2009

Men’s Fashion Tip For October

With its cool temperatures, brisk breezes, and cloudy skies, October is the perfect month for…accessories! If you’re a typical guy, the word “accessories” might conjure images of handbags, jewelry, or any number of items more commonly used by women. However, while those things are indeed accessories, there are also a number of men’s items that can add charm, sophistication, and class to an outfit. This month, given the cooler temperatures, I’ll focus specifically on hats (and implicitly on scarves as well). 

 

The primary use of hats and scarves, at least in cold weather, is probably to retain heat, so the first trick to choosing an accessory is to make sure it actually does what it’s supposed to. That may seem obvious, but for the hat or scarf to fit the outfit it also has to look like it provides warmth. For example, I recently saw a guy wearing a Panama hat on a cold and cloudy day. Because Panamas are typically worn in warm weather near coasts, the guy actually looked colder and more out of place than he would have with a bare head. The same could be said about many of the loose, thin scarves that are currently out there; they might look cool, but before adding one to an ensemble make sure it looks autumnal or winter-esque too. This is a pretty basic rule, but we have to start somewhere.

 

The next two rules for choosing a hat or a scarf go hand in hand: make sure your selection exhibits cohesion with the rest of the outfit, and make sure to bring a sense of history to your choice. First, cohesion means ensuring that the accessory you’ve chosen blends well with the colors, cuts, and (most importantly) formality of the rest of the outfit. While matching colors and cuts should be fairly obvious, formality may not be. With the exception of the baseball cap, hats tend to dress an outfit up. That can be great, but keep in mind that wearing a beret or a flat cap with grubby clothes can look incongruous. Though it’s not necessary to wear a suit and tie with most hats, it is worth trying to spread formal and informal pieces throughout the outfit so you don’t look like a casual dude with a bit of gentleman perched atop your head.

 

Similarly, bringing a sense of history to your hat and scarf selections is essential.  Before choosing what kind of accessory you’re going to wear (or buy), ask yourself when that item was commonly worn. Most likely the item’s heyday was not today, due to the waning use of accessories and the increasingly casual nature of men’s fashion in the western world. However, depending on which time period you’re drawing from you can sometimes get away with a slightly anachronistic piece. 

 

For example, for a long time I felt that fedoras were a serious fashion faux pas. Though some famous people manage to look decent in a fedora from time to time, I usually saw them on Indiana Jones fans’ heads. Fatally, these fans often paired their hats with t-shirts, or (worse) sneakers. The problem with this combination isn’t simply that it illogically mixes levels of formality, but also that during the fedora’s golden age t-shirts were thought of as undergarments. Consequently, a fedora would never have been paired with a t-shirt, historically speaking, and even if most people haven’t watched enough film noir to consciously know that, they will still recognize a person who has no sense of history when it comes to choosing an accessory.

 

Recently, however, I have seen a few (admittedly older) men wearing fedoras with more somber suits, and I was surprised at how sharp these men looked. I’m not sure that a young person could pull the same thing off (again, unless you’re a celeb on the red carpet), but the incident drove home the point that with the right outfit you can get away with a lot. 

 

Still, history is important, and many kinds of hats only work as costumes. For example, unless you’re a magician, a flamboyant celebrity, or the star of your local production of A Christmas Carol, you probably shouldn’t be wearing a top hat. Assuming the wearer makes a careful selection, an unofficial rule of thumb I like to follow is this: the hat style you’re considering needs to have been common (i.e., not a costume) during the lifetime of someone who is alive right now. That gives you at least 70 or 80 years into the past to draw on, and potentially even more.

 

Picking an accessory is tricky, not least because wearing this type of clothing is increasingly becoming an anomaly.  However, asking yourself about an item’s functionality, formality, and historical origins should at least provide a basis to work from as you go out into the cold winter.  And remember, there is no fashion neutrality.   

Monday, October 19, 2009

MLA

A few days ago Laura and I discovered that we could look up many of Provo’s historic buildings online. Even more surprising was the fact that for one of our favorite old houses, the Reed Smoot House, we could actually look at the paperwork filed in the 1970s that allowed the house to become a historical landmark.

 

Among the many interesting things in this paperwork were some research citations, which I was astounded to see were done in MLA format. As an English major and then a master’s student, I’m very familiar with MLA, having had to use it to cite all of my research. Initially, then, I was vaguely pleased that I was able to immediately recognize MLA formatting on a document more than 30 years old. Quickly, however, I became even more astonished that MLA had changed so little in all that time. Though there were slight differences, it was basically the same.

 

This experience points out to me the idiotic backwardness of all things MLA today. In the 30+ years since the Smoot House papers were filled out, the world has experienced a digital revolution that has arguably changed it as much as the printing press did centuries before. Texts themselves are different, people’s access to information is different, and the way we think about information is different. Unfortunately (or fortunately, if you’re a lazy old scholar), MLA has chosen to disregard all of those changes and pretend that typewriters are still an author’s weapon of choice. This is somewhat akin to monks continuing to make illuminated manuscripts into, say, the 1600s; nice, but not really relevant anymore.

 

As a teacher of writing I often tried to get my students to understand that citation styles served a very real purpose.  Superficially, they obviously give different researchers a codified way of understanding and retracing others’ footsteps.  More importantly, however, they reflect the values of their discipline; each style implicitly makes an argument about what information is important, and where that information should go.  This, in turn, reflects the values of the people working in the particular discipline.

 

Today, however, texts themselves are bucking the values that MLA embodies. When was the last time you saw a website with an author? Sure, certain websites include authors for particular articles, but who is the custodian of the entire site? Where is the website “published”? How do you cite a YouTube video that has been embedded on a Facebook profile and was originally pirated from a DVD’s extra features?

 

The point is that in the digital age much of the information that MLA values is either nonexistent or irrelevant. That doesn’t mean a reader doesn’t need any citation information; it just means that the information they need is different. Accordingly, why not just hyperlink everything? It would be easier, less disruptive to the reading experience, and potentially much more fruitful. Any old English professors who couldn’t handle this could be forced into early retirement for becoming irrelevant.

 

Obviously I’m fairly critical of MLA, but I say these things as someone who uses it all the time and who feels fairly comfortable with it. I see its usefulness, but nearly all of the research that I’ve done would be better suited to a more contemporary citation style. I also believe that the kinds of people who use MLA, people who are supposed to be on the cutting edge of cultural criticism and analysis, are oddly stuck in the past when it comes to documenting their research.

Thursday, October 15, 2009

Fictional Café: BLT

It’s been a while since I’ve posted anything new about Laura and Jim’s Fictional Café, so here are some new pictures.  This is the delicious BLT.  It’s extra delicious because Laura fries the bread in the bacon grease and because she puts some special sauces and dressings on it.  Needless to say, it is in no way healthy or vegetarian-friendly. 

 

However, recently, Laura has tried making it somewhat less likely to clog an artery.  She’s taken to grilling the bread on the George Foreman, and, as you may notice in the second picture, we’re using turkey bacon.  I have to say, these changes don’t necessarily make the sandwich taste better, but they might make it sell better in a café where people want to eat healthily.  The result, then, is that if someone were to order this sandwich that person would be given the choice of bread-grilling styles, bacon types, and sauces.  


Wednesday, October 14, 2009

Picking Apples (Pictures)

I thought I'd ended the "Picking Apples" posts with Part 3, but I figured I'd post a few pictures anyway.  All of these pictures are of apples we picked from the second tree (see "Part 1").  Also, there were a bunch that wouldn't fit in bags, so I never really got a good picture of all the apples together.  These pictures probably represent between 1/3 and 1/2 of all the apples we've picked.