Monday, November 30, 2009

November Fashion Follow-Up

Re-reading my November fashion tip I was struck by the fact that it seemed to suggest that the only good coats for winter were wool pea coats. While that type of coat (as well as coats based on it) is certainly cool and functional, there are many other kinds of coats that can be worn effectively and stylishly.


One type of fabric that I personally have come to appreciate is canvas. For example, I have a green, lightweight canvas jacket that I got from H&M. It’s not denim, but it’s 100% cotton and has kind of a quasi-military look (except with a flattering, double-vented cut). Though I purchased this coat during a study abroad in England two and a half years ago, all the H&Ms I’ve been to since have had equally stylish, if different, cotton jackets.


Cotton, of course, is not as warm as wool, and it doesn’t wick water away like wool does. However, a cotton coat can be cheaper than a wool one, and if you’re only using it for short periods outside that shouldn’t be a big deal. My H&M jacket is definitely more autumn attire, but I still wear it during the winter, just with more layers.


The point here is that in addition to wool coats there are a number of other options that work. Ideally, everyone would own numerous pieces for every possible occasion. Since that isn’t the case, it’s a good idea to get a few coats and jackets that have some flexibility. My H&M jacket, for example, is decidedly less formal looking than my black coat. I can still wear it to work, but I can also wear it out to see concerts or just to hang out with friends. If I need something more professional or formal I’ll go with the black coat.


Ultimately, fabric choice is extremely important when choosing a winter coat. Find a vendor that has some variety and assess how you’ll be using the coat. In keeping with November’s fashion tip, I’d say that a pea coat is a good first choice because of its versatility, functionality, and style. However, if you’re branching out from there, consider something that addresses a completely different need.

Wednesday, November 25, 2009

No Fences In Cedar Hills

Laura and I have been spending a lot of time lately at my parents' house in Cedar Hills, Utah. In the past, I've used Cedar Hills as a kind of poster city for the problems inherent in suburban living. While I continue to be shocked and disappointed at how damaging that lifestyle is (environmentally, socially, etc.), there is one unique thing that I really like about my parents' neighborhood: no fences.

When I first visited my parents' home after they moved there in 2007, I was at once disappointed and pleasantly surprised. First, I was disappointed because the houses in the neighborhood weren't especially interesting; though it didn't shock me, they're all pretty much like those in any newish housing development. However, what amazed me then and impresses me now is that there are relatively few partitions between the homes. Each house technically has a yard, but for the most part they all run together. The result is that the houses all kind of feel like they're in a park. This also has the added benefit of making the fairly small yards feel spacious.

Besides making people believe their yard is bigger than it actually is, there seem to be a number of benefits to not having fences. For example, it makes getting from point A to B much faster and more convenient; instead of going around the houses (on the frustratingly winding streets) a person could just cut through all the open yards. (Some people have tried to stop others from doing this, which is really unfortunate.) This also has the potential benefit of encouraging people to walk places instead of drive because the distance can be so much shorter on foot. (And I shouldn't have to mention all the reasons why walking is superior to driving.)

More abstractly, a lack of fences seems to be an argument for community interaction. While suburban architecture seems specifically designed to separate (and therefore isolate, and thus alienate) people, breaking down the barriers between properties symbolically breaks down the barriers between people. Inevitably, residents will see and interact with their neighbors more often if there are no fences. I don't think it's a stretch to suggest that that interaction will engender greater empathy and interest among residents. Just last week while at my parents' house I noticed how I could actually see into the neighbor's house at night (and presumably they could see into my parents' house). Admittedly, that raises questions about voyeurism and the like, but it also served as a reminder that there were people all around me. I wasn't on an isolated little island of personal space. I was sharing an environment with other people. As a non-resident of the area I don't really have a relationship with the people in the neighborhood, but the lack of fences nevertheless seems to be a promising way to combat the social entropy that some claim is inherent in suburbia.

Unfortunately, fences are slowly going up in Cedar Hills. Some seem to be designed to keep pets in, while others look like they're simply meant to prevent people from cutting across the yard. Though I hope it doesn't happen, I wouldn't be surprised if in five or ten years all the houses in the area had fences. Still, keeping the neighborhood open is a good idea. Suburban living is perhaps the least efficient and socially conscious lifestyle out there, but even if tearing down fences doesn't make anything greener, it at least encourages people to see themselves first and foremost as members of a community. Ultimately then, in this case and unlike other suburbs across the nation, Cedar Hills has figured something out that surprisingly begins to chip away at the problems of suburban life.

Friday, November 20, 2009

Writing Negative Reviews

This week I went back and read the negative review of Electron Deception’s show from April (sorry, I’m not linking to that). I was really trying to see if any of our efforts to drive it down in the Google search results had worked (they hadn’t), but it got me thinking about the purpose of reviewing things, especially local things.


Ultimately, as I’ve considered this topic, I believe that writers who decide to cover local events should actually avoid giving negative reviews, at least initially. If communities are going to thrive and produce things (arts, restaurants, etc.), people have to actually go out and support those things; negative reviews, on the other hand, prompt people to avoid local offerings and can contribute to increased isolation. Taken to their extreme, they also eventually kill off local endeavors and leave people with few options other than chain restaurants, generic music, and mediocre art.


I’m not suggesting here that writers simply lie and say that everything local is good. Instead, I would hope that when covering a community, they focus on the positive. If a restaurant, musician, etc. isn’t great, simply choose to cover something else. If at some point something becomes so big and so ubiquitous that members of the community have begun weighing in on it with various opinions, then it might be time to start writing honest, hard-hitting reviews. I know that’s a nebulous distinction to make, but the vast majority of local entrepreneurs and artists aren’t becoming rich, famous, and powerful. Instead, they’re struggling to make ends meet, and a single negative review can sometimes force them out of business.


Of course, negative reviews may prompt people to improve whatever it is they’re trying to sell. When Electron Deception got our scathing review we genuinely tried to address some of its more legitimate points. On the other hand, most of our efforts have never been heard by anyone; our shows often don’t have huge attendance, and the review has certainly not done anything to improve that. In fact, despite our efforts, we think the negative review may have caused some people to not come see us, even after we’ve changed our sound considerably. If the people who wrote the review had wanted to improve the music scene they could simply have given their suggestions to us privately, let us work on them, and waited to see what happened. If they had done that we would have been happier, of course, but local venues might also have made more money when we played there (because more people would have wanted to come out) and parties would have been livelier (for the same reason). As it was, the negative review only dampened the music scene across the board.


Electron Deception is just one small band and few people will probably remember us, but I think this incident is illustrative of the anti-community effects of negative reviews generally. If a writer, blogger, or journalist decides to condemn a local theater production or restaurant, for example, people will go elsewhere. If you live in New York, where there are hundreds (or thousands) of local performances and food establishments all the time, this probably isn’t as big a deal. However, if you live in a mid-sized town (like Provo, Utah, for example), people don’t necessarily have other options. A scathing review of a single local theater production can thus put an end to all or most local theater generally. A bad review of a local restaurant can drive people to chains (which typically serve terrible food). In both situations innovation is stifled, local economies suffer, and people interact less with each other.


Journalists certainly don’t have a responsibility to advertise for local businesses and events. Journalistic integrity is important and should be defended. (I would argue, however, that that doesn’t apply to the blogosphere, where many local reviews take place. A negative review on a blog seems closer to cyberbullying than journalistic honesty to me.) Still, if we want to live in vibrant places where diversity and creativity abound, it’s important to ask ourselves if what we’re writing is going to bolster that community or stifle it.

Thursday, November 19, 2009

Eggo Waffles

In my freezer I have a big box of Eggo waffles. I like them. In fact, the reason the box is still in my freezer (several months after it was purchased) is because I keep saving them for a rainy day when I need something delicious to eat. Today, however, I learned that my big box of Eggos is about to become something of a scarce and hot commodity. According to this article on MSNBC.com, two factories that produce Eggos have recently had to shut down, and one, in Tennessee, is still closed. What this means, of course, is that we’re about to enter a waffle famine and we should probably start praying for it to end.


Or should we? Though I like Eggos as much as anyone, I was surprised that the MSNBC article was actually able to find people who are alarmed about the shortage (not to mention people who plan to ration their waffles). I would easily consider Eggos breakfast junk food, and though the shortage might not make me happy (I feel rather neutral about it), I have to admit that it’s probably only going to make people healthier.


Let’s consider: a single waffle has 90 calories, 35 of which are from fat. According to the box, a serving size is two waffles, so we have to double those proportions for a meal (somehow, however, two waffles are listed as having 190 calories, not 180). It’s also worth considering that most people are going to eat their waffles with butter and syrup, both of which have little or no nutritional value. That’s not to mention the various other preservatives and chemicals that go into pretty much all industrially produced foods.


On the other hand, an apple typically has around 70 calories, very few of which (if any) come from fat. According to this website, an average banana might have 200 calories, but only 6 of those come from fat. Also, fresh fruit of any variety contains numerous vitamins, nutrients, and fiber that make up a healthy diet.
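
If you want to see the comparison in one place, here’s a quick back-of-the-envelope calculation. (This is just a minimal Python sketch using the figures cited above, not official nutrition data, and it assumes the per-waffle fat calories simply double for a two-waffle serving.)

    # Share of calories from fat for each breakfast option.
    # Figures are the ones cited in this post, not official nutrition data.
    foods = {
        "Eggo waffle (1)": (90, 35),
        "Eggo waffles (2, as listed on the box)": (190, 70),  # fat assumed to double
        "Apple": (70, 0),
        "Banana": (200, 6),
    }

    for name, (calories, fat_calories) in foods.items():
        share = 100 * fat_calories / calories
        print(f"{name}: {calories} calories, {fat_calories} from fat ({share:.0f}%)")

By that rough math, more than a third of a waffle’s calories come from fat, versus about 3% for a banana and essentially none for an apple.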


Now, Eggo waffles certainly aren’t the worst health offenders in the world, especially when it comes to processed food. However, they’re nowhere near as healthy as fruits, which also make great breakfasts and are among the only things easier to make than Eggos. In that light, it’s not hard to think of the Eggo shortage as less of a famine and more of a fortuitous turn of events. (At least for the country at large, if not for the workers who may be out of work until the plant reopens.) In any case, perhaps this will force people (including me) to return to a healthier lifestyle.

Wednesday, November 18, 2009

Men’s Fashion Tip For November

It’s cold and it’s time to choose a coat. Don’t make the mistake of choosing something that’s warm but ugly.

When choosing a coat it’s important to consider all the factors that have driven you to the store in the first place. If you’re buying a coat it’s probably cold outside, or about to get cold. However, while a functionally warm coat is important, warmth isn’t the only thing that matters. Just like any article of clothing, a coat says a lot about your personality, profession, avocations, etc. Accordingly, it’s essential to balance function with form. What cuts flatter your body, for example? What colors are appropriate for the settings in which you’ll be wearing the coat? Which fabrics are going to be functional, affordable, and stylish? As you think about these questions, keep in mind that surviving the cold may be the least important benefit of a coat. After all, when was the last time you ate a meal solely because you needed the sustenance to survive? Thankfully, most of the readers of this blog are not starving to death, and they should likewise consider that a winter coat is only ostensibly about avoiding hypothermia.

My preference when it comes to winter coats is to go with something classic. Wool coats tend to be warm, and something like a pea coat will almost always look sophisticated and urbane (ironically perhaps, given the pea coat’s origins as attire for sailors). There are also numerous coats out there that are based on the idea of a pea coat but offer varying degrees of modern flair. These kinds of coats usually have the added benefit of coming in black or dark colors, which is an absolute must if you live in a place cold enough to actually need a winter coat.

On the other hand, one of the greatest fashion travesties of our day is the development of synthetic fabric coats, or perhaps more precisely, the way these kinds of coats have developed. Though the 1970s and 80s saw some interesting designs that have since moved into the realm of retro-chic, most modern synthetic coats seem to be crafted under the delusion that bodies should be round and fashion is irrelevant. Perhaps the epitome of this phenomenon is Gore-Tex, which is theoretically a good idea but is almost always used in coats that are too puffy and unflattering. (Gore-Tex was appropriately parodied on this episode of Seinfeld.) While Gore-Tex is a stand-out offender, modern synthetic coats generally tend not to be well fitted. They also often have a baffling array of colors or shades of the same color. (Again, theoretically sound, but it always just ends up looking passé.) In the end, ask yourself a few basic questions about your coat: does it make overtures toward your body shape? If you plan to wear it anywhere besides your backyard, does it communicate the image you’ve been cultivating? How will it cohere with the rest of your wardrobe? What is its level of formality? (This last question highlights the great weakness of non-wool coats: they’re generally too casual for anything but leisure activities. A pea coat, on the other hand, looks great with jeans or business attire.)

Ultimately, if you happen to be riding a dogsled to the North Pole this year, go out and find a coat that was made using the latest and greatest thermal technology. However, if you’re a typical guy who will just be wearing your coat as you walk to school or work (or maybe just out to your car), keep in mind that fashion is at least as important as function (if not more so). And never forget: there is no fashion neutrality.

Saturday, November 14, 2009

Joaquin Neighborhood, Provo

A few years ago I lived in a house on 5th North in Provo. One of the most memorable things that happened during my time in that house was the demolition of the Joaquin School, which was just across the street. It was pretty amazing to see an enormous excavator ripping whole trees out by the roots and knocking down three-story walls in a single swipe.


As impressive as the destruction was, it was also fairly tragic. Though the Joaquin School itself hadn’t been used for a few years at that time, the very large grassy area around it was used as a park by the community. The property also had numerous large trees surrounding it. Those trees that were too big to simply pull out by the roots were cut down later. Bafflingly, the construction company even cut down the old trees that lined the strip of grass between the sidewalk and the street.


As sad as it was to see this parcel of land decimated, the plan was for a construction company to build a new student housing development on it. The housing development was supposed to provide living spaces for a large number of students, and, at least at the time, promised to include some new open grassy areas for people to use.


That was in 2006. Now it’s 2009, and disappointingly the tragedy has been compounded by the fact that in three years the demolition is the only thing that has happened. According to this Daily Herald article, the construction company went into bankruptcy and things stalled. It’s a familiar story in this recession-ravaged world, but it also raises a number of questions: why, for example, did they go in and destroy everything if they didn’t have the money to begin building? Why did they tear out and cut down trees that weren’t even on the main construction site and wouldn’t have been in the way until serious building began (if ever)? Why does everything that has taken place make it seem like the company has a vendetta against the community and good things in general?


In all fairness, the company has done a few things. Since they demolished a public space, they have found enough money to A) erect a chain link fence around the property, B) spray paint “no parking” signs in big red lettering on the (public) sidewalks at certain points around the property, and C) mow down any vegetation that takes root, ensuring that all that remains visible is broken asphalt and dirt. These actions have left the area looking barren and fit for a post-apocalyptic movie shoot. Understandably, tall dry grass poses a fire hazard in the summer, and if people were constantly entering the property one of them could get hurt and sue. Still, the actions of the construction company have left the Joaquin Neighborhood characterized more by blight than by beauty.


Business and construction are complex things, and I don’t mean to oversimplify the issue (though I know I am), but it is important to consider that while the various entities involved have been wrangling over the legality of their positions, it’s the actual residents of the neighborhood who have paid the real price. It’s also probably safe to say that the parent company, Arrowstar Construction, and its president, Wayne Ross, aren’t based in the Joaquin neighborhood. Their interest is profit, and it’s too bad that they behaved like little boys in a school sandbox with no foresight. Hopefully they’ll get on the ball, because if I were a property owner in the Joaquin neighborhood I’d think more and more each day about a lawsuit.

Friday, November 6, 2009

Guy Fawkes Day


Why do people keep celebrating Guy Fawkes Day?


Like most Americans, I had forgotten that the fifth of November is a day to celebrate a 17th-century Roman Catholic British rebel who tried to blow up Parliament. Surprisingly, however, a fair number of my friends on Facebook reminded me of that fact, posting all sorts of Fawkes-related status updates. Though many of my friends are in college and need only the slightest excuse to throw a mid-autumn bonfire, the perennial resurgence of Fawkes’ name made me wonder why people keep returning to him.


In the United States Guy Fawkes went mainstream as a result of the film V for Vendetta, which tells the story of a Fawkes impersonator living in the near future who tries to destroy a dystopian British government. Though a few British history buffs probably knew about him before the movie, the majority of this year’s November fifth celebrations date back only to 2006, when the movie came out. A select few may date back to the 1980s, when the Alan Moore graphic novel on which the movie was based came out, but that’s hardly any closer to the 17th century. (If you’re one of the handful of Americans who knew about Fawkes before V for Vendetta, pat yourself on the back for your erudition, but realize that that’s not evidence of Fawkes’ pre-1980s popularity.)


V for Vendetta is a great film (I haven’t read the graphic novel yet, but I hear it’s also good), but just because it’s well made doesn’t necessarily mean that watching it gives people any legitimate connection to Guy Fawkes. For all the typical viewer knows, Guy Fawkes could simply have been another fictional plot device. In other words, though Guy Fawkes happens to be real, he might as well not have been, because almost no one (in the U.S.) knows anything about him beyond the movie.


Of course, the fact that most Americans who celebrate Guy Fawkes Day are doing so because of a movie isn’t a bad thing, but it is fairly rare. How many people do you know who watch a piece of media and suddenly initiate a holiday around it? The closest phenomenon I can think of is Festivus, the made-up holiday popularized by Seinfeld. However, even Festivus celebrations are pretty rare these days and only happen among diehard fans, and then probably only because Seinfeld continues to air in syndication for several hours every day. Accordingly, it’s surprising that three years after V for Vendetta—which was a hit but not an overwhelming one—its all-but-fictional holiday continues to receive recognition.


The robustness of Guy Fawkes celebrations suggests that there is something about the commemoration itself that fulfills people’s needs. If most Americans’ (at least initial) understanding of the day is based on V for Vendetta, then the day has rousing revolutionary origins. It’s about people overthrowing their oppressors, and the celebrations I’ve been to have played this aspect up to varying degrees. Significantly, however, no one has ever actually begun a revolution after one of these celebrations. As far as I know, the movie also didn’t prompt much revolutionary action, despite the fact that it was considered a biting political allegory. (After watching the movie I felt ready to go fight a revolution myself, but within a few minutes that feeling had subsided. I had apparently gotten it out of my system.)


My conclusion is that the revolutionary aspect of Guy Fawkes appeals to people because they are in some way dissatisfied with their (political) social structure, and that Guy Fawkes celebrations provide an outlet to vent such frustrations. Admittedly, a lot of people simply want to have a bonfire and burn an effigy, but it would be negligent to deny the connection between the celebration’s enactment and its historical connotation (whether historical in the sense of the original Guy Fawkes or of V for Vendetta). More insidiously, however, these celebrations also empower the very elements with which people are dissatisfied; much like watching the movie riled viewers up but also exorcised their need for revolution, so do Guy Fawkes celebrations.


It should be obvious where I’m going with this: subversion and containment. There is perhaps no other phenomenon that is a better illustration of subversive behavior being contained than Guy Fawkes Night activities. If Halloween illustrates this theory playing out with regard to people’s moral values, Guy Fawkes events are more socio-political, which is actually closer to Stephen Greenblatt’s original description of subversion and containment (in Elizabethan England).


Ultimately, these celebrations in the United States will probably disappear as people forget about V for Vendetta. However, their presence (along with that of the graphic novel and the movie) reveals that there is indeed a revolutionary thread running through society. Still, as long as people “remember, remember the fifth of November,” that thread will likely continue to be contained.

Thursday, November 5, 2009

Halloween: The Worst Holiday of the Year

Halloween was last week, and while that probably made candy-hungry children happy, the fact is that Halloween is actually the lamest holiday of the year.


The biggest reason that Halloween ranks at the bottom of my holiday list (in a distant last place, behind even April Fools’ Day) is the dressing up. Now, many of my Facebook friends have recently posted status updates criticizing people’s tendency to use Halloween to look like sluts. Specifically, these friends have pointed out how many women use the holiday to look like sexy nurses, sexy witches, sexy maids, or other variations of sexiness. To this I would also add guys who choose to wear virtually nothing on Halloween (albeit not always in a “sexy” way).


I concede that sexy/scanty costumes can unfortunately objectify the wearer, but I would also say that I’m not terribly bothered by them. If you want to look like a slut and objectify yourself, that’s your choice. What I do think is silly is needing a holiday to do so. Why not just dress like that all the time? Or why not do so in conjunction with other holidays, where dressing up isn’t an end in itself? (What better way to celebrate the Fourth of July than with sexy nurses and guys in loincloths?)


Slightly more seriously, the persistence of dressing up seems to suggest that many people have latent or repressed desires that they express on Halloween. The prominence of sexual costumes indicates that many of those repressed desires are sexual, though other Halloween staples like zombies or robots equally represent social insecurities simmering beneath the surface of people’s psyches. It’s possible to look at Halloween in this context through the lens of Stephen Greenblatt’s theory of subversion and containment: people subvert social norms through their Halloween behavior, but such momentary subversion allows them to continue participating in those norms the rest of the time (a rough, inadequate summary of the theory, I admit). Still, it seems like it might be better to try to reconcile desire with belief. For example, instead of condemning sex or allowing technology to pervade our lives (while simultaneously expressing trepidation about those issues through costume), we could figure out a lifestyle that balances our values with our desires.


There are other reasons that Halloween doesn’t make sense as well. Take, for instance, its complete detachment from its harvest origins. Since few people participate in harvests, or have any connection to agriculture, it makes little sense to celebrate a harvest holiday. In light of the kinds of costumes that people tend to wear, we could at least move it to a warmer month. (August is hot and doesn’t have an abundance of holidays.) In reality, however, the point is that the meaninglessness of Halloween actually detracts from the pleasure it brings. In Guy Debord’s book The Society of the Spectacle he includes a chapter called “Spectacular Time” in which he introduces the concept of pseudo-cyclical time. Basically, he makes the point that society has moved to an industrialized, arbitrary, and falsely seasonal concept of time. Though he doesn’t get into Halloween, the celebration of such a holiday clearly supports the thesis that society may merely be a parody of what it once was, and consequently less fulfilling.


The reasons that Halloween is both outdated and inadequate could go on and on. Hopefully, however, the notion of “holidays” like Halloween—where we get to take a break from our labor, standard behavior, and the values predicated on these things—will eventually become unnecessary.