Wednesday, October 28, 2009

Men’s Fashion Tip For October

With its cool temperatures, brisk breezes, and cloudy skies, October is the perfect month for…accessories! If you’re a typical guy, the word “accessories” might conjure images of handbags, jewelry, or any number of items more commonly used by women. However, while those things are indeed accessories, there are also a number of men’s items that can add charm, sophistication, and class to an outfit. This month, given the cooler temperatures, I’ll focus specifically on hats (and implicitly on scarves as well). 


The primary use of hats and scarves, at least in cold weather, is probably to retain heat, so the first trick to choosing an accessory is to make sure it actually does what it’s supposed to.  That may seem obvious, but for the hat or scarf to fit the outfit it also has to look like it provides warmth.  For example, I recently saw a guy wearing a Panama hat on a cold and cloudy day. Because Panamas are typically worn in warm weather near coasts, the guy actually looked colder and more out of place than he would have with a bare head. The same could be said about many kinds of loose, thin scarves that are currently out there; they might look cool, but before adding one to an ensemble make sure it looks autumnal or winter-esque too. This is a pretty basic rule, but we have to start somewhere. 


The next two rules for choosing a hat or a scarf go hand in hand: make sure your selection exhibits cohesion with the rest of the outfit, and make sure to bring a sense of history to your choice.  First, cohesion means ensuring that the accessory you’ve chosen blends well with the colors, cuts, and (most importantly) formality of the rest of the outfit. While matching colors and cuts should be fairly obvious, formality may not be. With the exception of the baseball cap, hats tend to dress an outfit up. That can be great, but keep in mind that wearing a beret or a flat cap with grubby clothes can look incongruous. Though it’s not necessary to wear a suit and tie with most hats, it is worth trying to spread formal and informal pieces throughout the outfit so you don’t look like a casual dude with a bit of gentleman perched atop your head. 


Similarly, bringing a sense of history to your hat and scarf selections is essential.  Before choosing what kind of accessory you’re going to wear (or buy), ask yourself when that item was commonly worn. Most likely the item’s heyday was not today, due to the waning use of accessories and the increasingly casual nature of men’s fashion in the western world. However, depending on which time period you’re drawing from you can sometimes get away with a slightly anachronistic piece. 


For example, for a long time I felt that fedoras were a serious fashion faux pas. Though some famous people manage to look decent in a fedora from time to time, I usually saw them on Indiana Jones fans’ heads.  Fatally, these fans often paired their hats with t-shirts, or (worse) sneakers.  The problem with this combination isn’t simply that it illogically mixes layers of formality, but also that during the fedora’s golden age t-shirts were thought of as undergarments. Consequently, a fedora would never have been paired with a t-shirt, historically speaking, and even if most people haven’t watched enough film noir to consciously know that, they will still recognize a person who has no sense of history when it comes to choosing an accessory. 


Recently, however, I have seen a few (admittedly older) men wearing fedoras with more somber suits, and I was surprised at how sharp these men looked. I’m not sure that a young person could pull the same thing off (again, unless you’re a celeb on the red carpet), but the incident drove home the point that with the right outfit you can get away with a lot. 


Still, history is important and many kinds of hats only work as costumes. For example, unless you’re a magician, a flamboyant celebrity, or the star of your local production of A Christmas Carol, you probably shouldn’t be wearing a top hat. Assuming you make a careful selection, an unofficial rule of thumb I like to follow is this: the hat style you’re thinking of needs to have been common (i.e. not a costume) during the lifetime of someone who is alive right now. That gives you at least 70 or 80 years into the past to draw on, and potentially even more.


Picking an accessory is tricky, not least because wearing this type of clothing is increasingly becoming an anomaly.  However, asking yourself about an item’s functionality, formality, and historical origins should at least provide a basis to work from as you go out into the cold winter.  And remember, there is no fashion neutrality.   

Monday, October 19, 2009


A few days ago Laura and I discovered that we could look up many of Provo’s historic buildings online.  Even more surprising was the fact that for one of our favorite old houses, the Reed Smoot House, we could actually look at the paperwork that was filed in the 1970s to allow the house to become a historic landmark. 


Among the many interesting things on this paperwork were some research citations, which I was astounded to see were done with MLA formatting.  As an English major and then master’s student, I’m very familiar with MLA, having had to use it to cite all of my research.  Initially, then, I was vaguely pleased that I was able to immediately recognize MLA formatting on a document more than 30 years old.  Quickly, however, I became even more astonished that MLA had changed so little in all that time.  Though there were slight differences, it was basically the same.


This experience points out to me the idiotic backwardness of all things MLA today.  In the 30+ years since the Smoot House papers were filled out the world has experienced a digital revolution that arguably has changed the world as much as the printing press centuries before.  Texts themselves are different, people’s access to information is different, and the way we think about information is different.  Unfortunately (or fortunately if you’re a lazy old scholar), MLA has chosen to disregard all of those changes and pretend like typewriters are still an author’s weapon of choice.  This is somewhat akin to monks continuing to make illuminated manuscripts into, say, the 1600s; nice, but not really relevant anymore.


As a teacher of writing I often tried to get my students to understand that citation styles served a very real purpose.  Superficially, they obviously give different researchers a codified way of understanding and retracing others’ footsteps.  More importantly, however, they reflect the values of their discipline; each style implicitly makes an argument about what information is important, and where that information should go.  This, in turn, reflects the values of the people working in the particular discipline.


Today, however, texts themselves are bucking the values that MLA embodies.  When was the last time you saw a website with an author?  Sure, certain websites include authors for particular articles, but who is the custodian of the entire site?  Where is the website “published”?  How do you cite a YouTube video that has been embedded on a Facebook profile and was originally pirated from a DVD extra features disc?


The point is that in the digital age much of the information that MLA values is either nonexistent or irrelevant. That doesn’t mean that a reader doesn’t need any citation information, it just means that the information they need is different. Accordingly, why not just hyperlink everything?  It would be easier, less disruptive to the reading experience, and potentially much more fruitful.  Any old English professors who couldn’t handle this could be forced into early retirement for becoming irrelevant.


Obviously I’m fairly critical of MLA, but I say these things as someone who uses it all the time and who feels fairly comfortable with it.  I see its usefulness, but nearly all of the research that I’ve done would be better suited to a more contemporary citation style.  I also believe that the kinds of people who use MLA, people who are supposed to be on the cutting edge of cultural criticism and analysis, are stuck oddly in the past when it comes to documenting their research.

Thursday, October 15, 2009

Fictional Café: BLT

It’s been a while since I’ve posted anything new about Laura and Jim’s Fictional Café, so here are some new pictures.  This is the delicious BLT.  It’s extra delicious because Laura fries the bread in the bacon grease and because she puts some special sauces and dressings on it.  Needless to say, it is in no way healthy or vegetarian-friendly. 


However, recently, Laura has tried making it somewhat less likely to clog an artery.  She’s taken to grilling the bread on the George Foreman, and, as you may notice in the second picture, we’re using turkey bacon.  I have to say, these changes don’t necessarily make the sandwich taste better, but they might make it sell better in a café where people want to eat healthily.  The result, then, is that if someone were to order this sandwich that person would be given the choice of bread-grilling styles, bacon types, and sauces.  

Wednesday, October 14, 2009

Picking Apples (Pictures)

I thought I'd end the "Picking Apples" posts with Part 3, but I figured I'd post a few pictures.  All of these pictures are of apples we picked from the second tree (see "Part 1").  Also, there are a bunch that wouldn't fit in bags, so I never really got a good picture of all the apples together.  These pictures probably represent between 1/3 and 1/2 of all the apples we've picked.  

Monday, October 12, 2009

Picking Apples (Part 3)

If I ate nothing but the apples that Laura and I recently picked I bet I could live for a month without really ever feeling hungry.  (I’m sure I’d feel a lot of other things with that kind of diet, but the point is that I’d survive).  The thing is, I don’t really need all these apples to survive; though I’m not rich I’ve always had enough money to buy food.  That means that the apples are all extra food.  Nice, but not necessary. 


As I’ve thought about the vast amount of extra food that I’ve acquired in the last few days I’ve found my thoughts turning to people who don’t have food at all.  As I understand it, there are many parts of the world plagued by hunger, malnutrition, and famine.  Accordingly, while I picked apples I began to wonder why we couldn’t send all our surplus fruit to those places.


There are, of course, practical problems with sending something like apples to places like Africa.  Though it could probably be done, preserving the fruit for that long would probably be inefficient and costly (and other kinds of food would be much better to ship).  Likewise, planting apple trees in locations where there are problems with hunger would also not work because of the climate.  Thus, my conclusion was that the apples I picked could not realistically contribute to ending world hunger.


Or could they?  While there are many ways to alleviate world hunger without local apples, the fact that they may provide extra food leads me to believe that they might contribute by freeing up funds that would otherwise be spent on food.  So, for example, if I’m able to eat apples for breakfast for the next week, I have theoretically saved some money by not buying my normal food, which could then be donated to those who don’t have food (a familiar idea to my fellow Mormons out there). 


While my own charitable contributions could no doubt help many people, I can’t help but wonder if this idea could make a huge dent in world hunger.  Let’s estimate that the typical city block has 300 residents (that’s way too high if you live in the suburbs but probably low if you live in a high rise).  Now, let’s imagine that each of those people ate two apples a day for breakfast for a week.  If each person normally spent one dollar on breakfast, that means each person has saved seven dollars by eating apples.  Collectively, the block has saved $2100.  Now imagine if most of the blocks in a given city tried that.  Imagine if most of the cities in the United States tried it.  Very quickly there would be millions (or probably billions) of dollars.


This plan would be incredibly easy to implement.  300 people eating two apples a day for seven days would require 4200 apples.  Let’s say each tree that was planted ended up being really scrawny and only produced 300 apples.  That would mean that the block of 300 people would need 14 trees.  The typical residential city block has more than 14 yards, so that would be fewer than one tree per yard.  In reality, however, each block would probably need far fewer trees (though I don’t see why they wouldn’t want even more).  Obviously it would take a few years for the trees to produce, but when they did the results could literally change the world.   
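For anyone who wants to check my back-of-the-envelope math, here’s a quick sketch of the calculation (all the figures — 300 residents, one-dollar breakfasts, two apples a day, 300 apples per scrawny tree — are the hypothetical numbers from the paragraphs above, not real data):

```python
import math

# Hypothetical figures from the thought experiment above
residents = 300          # people on a typical city block
breakfast_cost = 1       # dollars normally spent per breakfast
days = 7                 # one week of apple breakfasts
apples_per_day = 2       # apples eaten per person per day
apples_per_tree = 300    # yield of a deliberately scrawny tree

# Money freed up by the block in one week of apple breakfasts
savings = residents * breakfast_cost * days

# Apples needed and trees required to grow them
apples_needed = residents * apples_per_day * days
trees_needed = math.ceil(apples_needed / apples_per_tree)

print(savings)        # 2100
print(apples_needed)  # 4200
print(trees_needed)   # 14
```

Plug in your own block’s numbers and the conclusion holds up: even with pessimistic yields, a handful of trees covers a whole block.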


As was the case with my previous apple picking post, I know that this boils a complex problem down to an oversimplified solution.  I’m also not under the delusion that it is about to happen.  Still, it should more or less work.  It would require innovation and experimentation and would only bring about change over a long time, but it points out that the possibility of solving world problems actually exists.  My apple picking experiences have reminded me, then, that what we (myself especially) lack is simply the will to do even the simplest things to make a difference.

Saturday, October 10, 2009

Picking Apples (Part 2)

My recent apple picking experiences have led me to ponder the roles of urban vegetation.  These experiences have also reminded me of a presentation I attended a while ago at the old Gallery 110 in Provo.  The presentation was on urban gardening, and while I felt the presenter focused too much on symbolic acts of urban guerrilla environmentalism that ultimately couldn’t effect any change, it was a useful experience for pointing out that most city plants exist only for beautification purposes.


The apple trees that Laura and I recently found and harvested are a good example of this fact.  The trees provide shade and look nice, and for most people that’s apparently all they’re good for.  Similarly, Provo and BYU campus have numerous plum trees all over, but they’re trimmed to minimize how much fruit they produce because the fruit is seen as a nuisance that dirties sidewalks and streets.  And of course, most lawns, trees, and bushes in urban environments bear no fruit at all and simply exist for people to enjoy. 


I appreciate plants that beautify, but picking bag after bag of apples recently has led me to consider more seriously why urban vegetation can’t be visually pleasing and productive.  Would Provo’s plum or apple trees be any less pretty, for example, if they were pruned to maximize their fruit yield instead of being trimmed to eliminate it?  As I run and walk the streets of Provo (and, in the past, other cities I’ve lived in) I always notice new saplings planted in parks and yards.  Occasionally these are fruit trees, but usually they aren’t.  Why is that?  What is to be lost by planting fruit trees more often than simply “attractive” trees?  There may be slightly more research required to maintain a fruit tree (though any nursery worker should be able to answer questions), and of course someone will eventually have to make the effort to pick the fruit, but those seem like small prices to pay for the reward of having delicious homegrown food (that is also essentially free).  


There are opportunities to turn urban settings into productive gardens right now.  Again, Provo’s fruit trees provide a good example.  Right now someone has to go around and trim them, so if they’re putting forth the effort anyway why not spend a little more time and prune them for their fruit?  Of course, this would take some extra training and know-how, but the fruit itself could easily offset the cost.  For example, cities could plant fruit trees along boulevards and pedestrian paths, and then charge local residents a small fee to come and pick the fruit on certain days of the year.  It’d be like going to a farmers market, except that the “vendors” (i.e. the trees) would be spread throughout the city and people would get to pick the fruit themselves.  Cities could make a little extra cash, people would get fresh organic food, and there wouldn’t be rotting undersized fruit all over sidewalks.  Everybody would win.


I know that’s a simplified assessment of the situation, and I’m not advocating some kind of public orchard system.  Instead, I think that urban vegetation could serve many communities much better than it currently does.  I don’t think (as some do) that every square inch of dirt in a city needs to be planted with carrots and lettuce (though I’m not opposed to that idea), but it seems that many communities currently put forth a lot of effort to maintain their plants.  Why not focus that effort so it produces both beauty and fruit?

Picking Apples (Part 1)

A couple of weeks ago Laura and I went on a walk and passed an apple tree in front of a small apartment complex.  We probably wouldn’t have noticed it, except that the ground around the tree was covered in decomposing apples.  When we looked up we were surprised to see that the tree was still full of fruit.  Laura picked one and tasted it, and when she said it was delicious we decided to come back with bags and pick more.  After all, we figured, the number of apples on the ground suggested that no one else was interested in them. 


Because the apples weren’t necessarily ours, we felt the need to be sneaky.  The first time we went back there was a guy in front of the apartments barbequing, so we pretended we weren’t there to steal his apples and kept walking.  Then, later that night after it was dark, we stealthily returned and proceeded to pick more apples than we knew what to do with.


Just about the time our many bags began to overflow the barbeque guy came back out and gave us a suspicious look.  I don’t blame him; Laura was ten feet up in the tree and I was down below catching apples in my messenger bag.  However, what he said surprised me.  Instead of asking us to leave he told us that he “wouldn’t eat the apples.  Box elders got into them.  Take them if you want, but eat them at your own risk.” 


Now, it’s possible the guy just wanted us to leave and so he made up an excuse to scare us away.  However, if that was his goal he failed because Laura had already eaten some of the apples and was just fine.  Also, I had never heard of anyone dying or becoming sick due to box elders.  Nevertheless our bags were full (we probably had a bushel or more), so we left.


Since then I’ve tried to look up the dangers of box elders, but to no avail.  More importantly, Laura and I have made numerous loaves of apple bread, apple pies, and eaten many raw apples with no negative repercussions.  If we found a bug-infested apple, we threw it in the compost pile (which is also a secret that I’m hoping will totally decompose before the landlord notices it). 


In the end I’m left to wonder why the barbeque guy was afraid of the apples in front of his apartment (if he was serious about them being bad, which I believe he was).  Since then Laura and I have actually found another apple tree at another apartment complex and picked even more apples.  (In fact, we picked at least twice as many from that tree and had to make two arduous trips to carry them all home.)  Like the first tree, this one was surrounded by fallen, decomposing apples.  Clearly, no one wanted them, and I felt like it was almost my duty to pick all the remaining good apples and eat them so they wouldn’t go to waste.  While I’m glad I benefited from these trees, there was more than enough for everyone in the vicinity, if anyone had cared to look. 


While I’ll explore some of the larger thoughts I had about this experience in subsequent posts, the most basic conclusion that I’ve come to is to not be afraid to take a risk (in this case that risk was the very slight chance of becoming sick from eating a bad apple, but I suppose there are broader applications to that idea, which I’ve commented on before).  The other basic idea that I think emerges here is that there are free, delicious, and abundant resources all around us all, ripe for the taking.  

Friday, October 9, 2009

Broadcast Journalism is a Joke

If you have a Netflix subscription you may have noticed the recent ad on the envelope for CNN’s HLN.  The ad features headshots of three women—Jane Velez-Mitchell, Nancy Grace, and Joy Behar—with information about their respective shows.  As ads go, it’s an unremarkable bit of information that probably won’t prompt many Netflix users to tune into CNN.


What is remarkable, however, is that the ad looks like it should be plugging a movie parody of broadcast journalism, rather than the real thing.  Most obviously, all the women are wearing dramatic makeup and have huge hair.  While style is admittedly a matter of personal taste, I’d at least expect news anchors to go with something that doesn’t look like a fashion disaster from 1995.  If that weren’t enough, the photos of the women have actually been heavily airbrushed and doctored.  If there were a man in the midst of these pictures we could compare the doctoring between genders and perhaps come to a conclusion about the media’s biased portrayal of women.  That isn’t the case, however, and instead the pictures look like glamour shots of washed up 1970s porn stars.


Compounding the awful headshots is the ad’s general failure to establish a serious aesthetic through its design choices.  For example, the bold, sans serif font along the top is a little too bold.  It’s also too “tall” as a font and gives the impression that the words are being screamed, rather than simply asserted.  Likewise, the logo for Jane’s show, “Issues with Jane Velez-Mitchell,” looks like it’s advertising NASCAR, not news.  Nancy Grace and Joy Behar’s respective fonts aren’t any better and convey a “daytime talk show” aesthetic rather than that of an evening news program.  As I look over the ad I’m finally struck by the fact that even the worst visual assignment I ever received as a teacher of rhetoric was more successful. 


While the ad is annoying, the real problem is that it serves as a kind of diagnostic on the state of TV news.  The programs it advertises begin at 7 PM and end at 10 PM; that’s primetime when CNN should have its highest ratings.  Accordingly, it would make sense that CNN would air its most important, serious material during that time.  Instead, I can only assume from this ad that CNN airs a lightweight combination of opinion editorializing and infotainment. 


Consequently, culpability for this absurdly ineffective ad does not lie solely with the women it depicts nor the inept graphic designers who put it together.  Instead, the entire profession of broadcast journalism should be blamed.  The ad suggests that “news” really isn’t the focus of the news.


While that’s all fine (after all, networks do have to get ratings), it seems odd that TV news has continued its march toward entertainment when real journalism in newspapers, on the radio, or online, is literally collapsing all around us.  Some of the country’s oldest and most venerable newspapers have folded this year, and there are tens of thousands fewer reporters now than before the recession.  That means that there are fewer outlets for people to read legitimate journalism.  If TV news like the kind on my Netflix envelope can be written off as a joke (which I believe it absolutely can), that means that there is simply less journalism out there and people are less informed. 


Obviously the news world is changing right now, but it’s unfortunate that in that environment CNN is either incapable or unwilling to advertise serious reporting.  There are many people in this country that would like to be informed about world events (and who need to be informed), but if this recent Netflix ad proves anything it’s that TV has completely failed as a part of the fourth estate.

Saturday, October 3, 2009

Asking the Right Questions

A few days ago I had an article in Rhombus about gay rights and Mormons.  I'm just going to put a link here in case anyone isn't on Facebook (and therefore didn't see my link there) or simply hasn't seen it.  I usually plan on having different content for my blog and the stuff I submit to Rhombus (and elsewhere), but this is a topic that I think warrants discussion.

So, you can read that article here.  

Also, a number of people left comments on the Rhombus website.  Some are intelligent, a number are less so, but it'd take forever if I tried to respond to all of them in one sitting on the Rhombus website.  Therefore, I will occasionally try to respond to them here, on this blog.