Wednesday, March 19, 2014

Support is ending for Windows XP - Microsoft Windows

Boo Hiss Microsoft.

Postscript.

This is apparently a more significant deal than I'd imagined.  Our tech guy at work tells us that if we have XP, we'd better move to something else, one way or another, within the next couple of weeks.

The computer I'm on right now runs XP.   

Tuesday, March 18, 2014

The Rebirth Of Rye Whiskey And Nostalgia For 'The Good Stuff' & Beer and Prohibition.

Always exploring the history of things, including social and material history, our eye was caught recently by a couple of items which relate to 20th Century history, specifically the history of alcohol and Prohibition.  For example, there's this item:

NPR's "The Salt" ran this recent item:  The Rebirth Of Rye Whiskey And Nostalgia For 'The Good Stuff' : The Salt : NPR

I'm not really a whiskey fan, but at least locally whiskey has been in the news a lot recently, and here we have this NPR example.  I think I've read that whiskey consumption is on the decline nationally, but given the news coverage, you'd never know it.

As noted, I'm not really a big whiskey fan.  Right now, however, we actually have four bottles of different types of whiskey upstairs in the cupboard, probably a personal all-time record.  We don't have a liquor cabinet, and don't need one, so the cupboard suffices, even if that oddly places the whiskey right next to the breakfast cereal.  We have so much because of Christmas, and we're likely to have the present four bottles for a really long time.  We have Wyoming Whiskey, a bottle of Pendleton and a bottle of single malt Irish whiskey, so not only do we have a record amount, we actually have a record variety as well.  You can probably fairly easily tell by the novelty of this that we're not exactly living the "Mad Men" life around here.

In spite of not liking it much, I know something about it, and that's probably because of law school.  It isn't like we were living out the Pogues' "Streams of Whiskey" there, but there was a single malt Scotch whiskey revival going on at the time, so we became exposed to it a bit, and being inquisitive, I learned something about the makeup of whiskey.*  It's sort of an interesting topic.

The big American whiskey is bourbon.  The reason for this is found in the history of transportation, oddly enough.  Bourbon is a corn-based whiskey, and it was distilled on the eastern frontier early on.  While beer was really a staple during colonial times, hardy frontiersmen distilled a lot of whiskey. Why?  Because it keeps better than corn on the cob does.  And it's relatively easy to transport in barrels, and there's always a market.  It wasn't, therefore, that frontier farmers were making thousands of gallons of "corn likker" to get sloshed, although there was some sloshing going on, but rather that it's easier to keep whiskey in the barn than it is to keep a pile of corn. It doesn't attract mice either.

As with all things which people make, a simple necessity became an art, and bourbon was born.  It's been the American whiskey for a couple of centuries or more.

Frankly, I can't stand it as a rule.  Even the best bourbons generally taste to me like something that ought to be fueling a jet, but bourbon has been what Americans mean by "whiskey" for a very long time.  And it's been in the news here recently, as Wyoming now has its own distillery, which makes "Wyoming Whiskey".

A bottle of Wyoming Whiskey.

Wyoming Whiskey is a new brand of whiskey that's distilled in the tiny Hot Springs County town of Kirby.  It came about, according to what I've read, when the Meads purchased farm ground in the area in order to have a steady supply of corn for their cattle operation, and then hit upon the idea of distilling whiskey in the county.  Hot Springs County is otherwise famous for, well, hot springs, and is of course the location of Thermopolis, which features the same.

When Wyoming Whiskey was released, the first batch (there have been only two to date) was big news. To my huge surprise, my wife actually signed us up for two bottles. She doesn't even drink whiskey except on extraordinarily rare occasions.  But we ended up with two of the very first bottles.

I like it, to my surprise. But the public reaction has been interesting.  Whiskey Magazine rated it as first rate, which is interesting in part because up until I read that in the Casper paper, I didn't know that there was a Whiskey Magazine.  Who subscribes to that. . . and why?  Anyhow, their reviewer thought it great.  Amongst people I generally run into, however, it seems a lot of people hate it.

Why is that?  I don't know for sure, but I have my theories.  In part, Wyomingites are a hard sell on anything, and that may be a lot of it.  But I have also noticed, in talking to people, that the people who don't like it generally like bourbon, and people who do like it, like me, don't drink bourbon much.  My suspicion, therefore, is that people acclimated to bourbon, and who enjoy it, like the jet fuel nature of the taste. As I don't like bourbon, that's probably why I think Wyoming Whiskey is okay.  But if they have to rely on people like me to buy it, they're in big trouble, as the chances of me buying enough of it to be felt economically are nonexistent.  Put another way, I think that bourbon drinkers expect bourbon to taste like bourbon, rather than the lower-proof, milder, and softly mineral taste that this has.

Canadian Whiskey, I should note, is just blended bourbon.  Whiskeys are blended in order to take the harsh taste out of them, and blending is very common with all types of whiskeys.  Canada grows a lot of corn, and at some point, somebody must have hit upon the idea of borrowing American whiskey as a product. They probably did it, tasted the product and said something like "Ack!!!. . Grgemhph!  Eh?  Where's the water?"  So they blended it.

Unlike almost every bourbon, there are a couple of Canadian whiskeys I like, namely Crown Royal and Pendleton. That's it. The rest make me gag.  Again, it doesn't matter, as I buy so little that they don't care what I think, but those two aren't bad. And Pendleton, which is named after Pendleton, Oregon, has a really neat bottle with a Steamboat-like rider on it. Presumably the University of Wyoming, which owns that trademark, is making a few bucks off of that.

Crown Royal, by the way, is owned by the alcohol giant Diageo, which also owns Bushmills (Irish whiskey), Guinness and a zillion other brands.

Bourbon basically got its start on the western slopes of Appalachia, and that's no surprise, as that region was first settled by the "Scots-Irish," i.e., that demographic that immigrated from Ireland but which was actually Scottish, having been placed in Ulster as a buffer against the native Irish.  The Scots and the Irish both have a very long history of whiskey distilling, and whiskey is basically a Celtic concoction in the first place. So, they were simply using a process with which they were already familiar.  The word "whiskey" is itself a corruption of the Gaelic term uisce beatha/uisge beatha, which means "water of life," sort of an odd description, if you think about it.

Scotch and Irish whiskeys are very closely related, which is odd, as Scotch is, in my view, horrid, while Irish whiskeys can be good, or can be horrid.  I think that this has something to do with the water. Both types are grain whiskeys, and can be made from any of the grass grains or a blend of them, but Scotch is made from bog water, and Irish whiskey is made from water that flows from limestone-sourced springs.  My personal theory is that this makes Scotch taste and smell like diesel fuel, as the water in Scotch peat bogs also has, well, peat in it. And, besides, anyone familiar with bogs knows that cows love bogs, and we all know, or should know, what cows love to do in bogs.  It explains a lot.

One of the grains that can be in Irish whiskey or Scotch whiskey is rye.  I did an item here on rye bread a while back, which I really like, but I've never had rye whiskey.  An odd thing about rye whiskey, which relates to the theme of this blog, is that it has a pretty bad reputation, and it has that reputation because of a historical event: Prohibition.

As noted in the item above, rye was actually a premium whiskey before Prohibition.  During Prohibition, however, bootleggers took up labeling bad whiskey as rye in order to fraudulently peddle the bad stuff to people who remembered the good stuff. As a result, "rye" came to be associated with nasty cheap booze, a reputation that came on fairly fast and stuck up until recently.  Rye was such a shorthand for bad whiskey that Bill Mauldin had his Joe character, in the Up Front cartoons, joke that his "old woman" would be comforted by the fact that he had "give up rye whiskey and .10 cent ceegars", an ironic statement for an infantryman.  Recently, however, rye has been making a comeback, the quality rye apparently still being out there.

As I like rye bread, I'd be curious whether I'd like rye whiskey, but I'm too cheap to buy it, so I'll have to keep wondering or be fortunate enough to attend some social event where somebody serves it.  Liking rye bread probably doesn't translate into liking rye whiskey in any event, as I like corn but hate bourbon.

Related to the Prohibition story and Rye, Prohibition also did in breweries.  And here too there's both an interesting story, and interesting recent developments.

Late 19th Century New York beer I've never heard of.  Apparently the plan in the picture is to drink a bunch of beer and then drive the cart, which is undoubtedly a very bad idea.

Beer has an even older presence in North America than whiskey, because beer was a staple in the British Isles from some point in antiquity up until some point in the 20th Century.  And this was true not just of the British Isles, but of an entire belt of countries in northern Europe. Basically north of the Rhine, in the British Isles, and up to the Baltic, the average drink was beer.  South of the Rhine it was wine.  Once you got out into Poland and Russia this was no longer true, and if there was a staple drink, I don't know what it was. Certainly a lot of vodka was being consumed out in those regions, but I don't think it would be as if people sat down to dinner and had a big heaping glass of vodka.  At least I hope not.  Beer was brewed everywhere in Europe, but as a staple it's basically associated with these regions, and it's best from these regions.  Likewise, probably every location in Europe ferments some wine, but it's associated with southern Europe for a reason.

A lot of the reason for that, by the way, is climatic.  So perhaps it's not too surprising that the beer-brewing regions also saw the development of some other spirits.  Anyhow, the English brought beer to North America.  Indeed, the Mayflower put in when it did not because that location seemed ideal, but because the ship had run out of beer, a genuine problem.

In the 19th Century there were a vast number of local breweries in the US.  I doubt very much that an accurate count of how many there were is known.  Prior to refrigeration for rail cars being worked out, which happened in the second half of the 19th Century, beer could not easily be shipped, so breweries needed to be local, or there was no beer.  Refrigeration in rail cars meant that beer could be shipped by rail for the first time, and shortly thereafter pasteurization of beer, a process most of us now associate with milk, began to be employed, which meant that beer could be stored for some time without refrigeration.  Light is the enemy of beer, and the dark bottle that's so familiar to everyone also played a role in beer storage, the aim being a vessel that could store beer, allow the customer to see it, and also keep out the damaging effects of light.

Rail car refrigeration meant that beer could be transported long distances for the first time, and that gave rise to the first big breweries in the US, the Anheuser-Busch brewery in St. Louis being the first such example.  Nonetheless, all the way up to the Volstead Act in 1919, there were a lot of local breweries.  I don't know how many may have existed in Wyoming, or co-existed at any one time, but at least Casper and Sheridan did have breweries.  Casper's pre-Prohibition brewery was the Hilcrest Brewery, named after the Hilcrest spring which still provides cooler water for Casperites today.  None of the Wyoming breweries survived Prohibition.  Hilcrest's brewery building still stands, just as it did in 1919, being a three story brick building, but it's an electronics store now.  When I was a kid, it was a potato chip plant, packaging Cook's Potato Chips, the kind we all bought locally.

 Trade card for Wiedemann Beer. This is a company that I've never heard of, but it turns out, they survived Prohibition, and they're still around.

It's widely claimed that Prohibition did in the quality of American beer, and that when breweries re-emerged from Prohibition, the beer wasn't what it once was. There were certainly a lot fewer breweries, and that any managed to survive is amazing.  Some did, however, and rapidly went back into brewing.  According to Europeans at least, American beer was pretty bad, and real beer fans maintained that to be the case as well, which made for a small market, up until the late 1970s, for import beers, which were regarded as very exotic.**  The trend toward brewing uniformity actually increased after Prohibition ended, which is odd, in that the large commercial brewers began to purchase the smaller ones, a trend which continues to this day, although they no longer tend to wipe out the distinctive natures of the individual breweries as they once seemed to.

This is because of the rise of the "micro brews."  Defining what a micro brew is, is difficult.  But some time in the late 1970s very small breweries began to develop, with very distinctive beers, in reaction to the blandness of American beers.  This started slowly, but after it got rolling, it really got rolling.  When I was a kid in the 1960s and 1970s, around here, the beers that you saw in the summer when men went fishing, etc., were Coors (really a regional beer), Olympia, Hamms and maybe Rainier.  Of course, Budweiser, which was and is the American giant (now owned by a Belgian company), was around, but it seemed that at least amongst the men I knew, none of them ever drank it.  There were some other brands, of course, but those are the ones you tended to see.  Starting with Anchor Steam, however, small breweries began to make major inroads into the large brewers' markets, brewing beers with strong distinctive flavors, sometimes brewed with old fashioned methods.  Anchor Steam, New Belgium, Odell, Sam Adams, and any number of other brewers rose up in this fashion, some becoming pretty big in the process, and there seems to be no end in sight to the revival of small breweries and the multiplicity of beer types.***  Recognizing a declining market when they see it, the big breweries have gotten into the act themselves and have come out with "micro brew" type beers, even though they're from big breweries.

Probably with that in mind, and returning to the theme of our post here, Coors just recently introduced a beer that they claim is a "Pre-Prohibition" style lager.  Being unable to pass up something which claims to be an historic exploration, I bought a six pack and then looked it up.  Indeed, it might at least partially answer the question that I had.  According to the information on the beer, the recipe for it was discovered by Coors employees in Golden in a part of their brewery they no longer use. That there is such a quarter in their brewery surprises me, but perhaps it shouldn't, as the Golden brewery long ago expanded beyond the walls of its original facility.  Anyhow, having found the old recipe, which dates to the immediate Pre-Prohibition era, they determined to make it.  At first they only offered it on tap, but now they're selling it in bottles.

One beer, of course, can't tell us what all beers were like prior to the Volstead Act, but this one is revealing.  Coors has long been a major local beer here, and it's not bad.  It's a really light beer, and so Coors was well positioned to move into the "light beer" market when it came about, although I've always wondered if that hurt their regular beer sales, which aren't much different.  But it's never been my favorite.  Their Pre-Prohibition beer, sold as "Batch 19," on the basis that Prohibition came in that year, 1919, is much different.  It's stronger, in terms of alcohol content, and it has a lot more flavor.  I like it, but I suspect that it won't appeal to die hard Coors fans.  It might appeal, however, to micro brew fans.

If Batch 19 indicates what American beer was like prior to Prohibition, what we could take away from that is that at least some American beers were German-style lagers, but with a stronger taste. Sort of a collision between German lager and British ale.  For beer fans, therefore, the Volstead Act probably was sort of a small beer burning of the library at Alexandria, temporarily.

__________________________________________________________________________________

*Streams of Whiskey is one of several sodden tunes by the excellent Irish band the Pogues, which sadly no longer exists as a band.  The band celebrated a certain boozy view of things which undoubtedly would have disastrous effects on a person's health if actually followed, for example:
Last night as I slept
I dreamt I met with Behan
I shook him by the hand and we passed the time of day
When questioned on his views
On the crux of life's philosophies
He had but these few clear and simple words to say

I am going, I am going
Any which way the wind may be blowing
I am going, I am going
Where streams of whiskey are flowing
Not content to limit the commentary to whiskey, the song also provides:
Oh the words that he spoke
Seemed the wisest of philosophies
There's nothing ever gained
By a wet thing called a tear
When the world is too dark
And I need the light inside of me
I'll walk into a bar
And drink fifteen pints of beer

I am going, I am going
Any which way the wind may be blowing
I am going, I am going
Where streams of whiskey are flowing
More than one Pogues song was a modern, hard-core, hard-edged Irish drinking song, and the primary force behind the music, Shane MacGowan, acquired a reputation as a hard drinker as a result.  It's interesting to note, therefore, that at least one interview with a close associate of MacGowan's has related that he did not, in this period, actually drink all that much, but that as a result of the music people insisted on buying the band drinks wherever they were.

While the Pogues no longer exist as a band, all the band members are still with us, suggesting that they didn't drink as much as the songs might indicate, and they have independent music careers.

**Having said that, complaints against American beer go all the way back to the colonial period, when British soldiers complained about the bad quality of American beer compared to English beer.

***I wonder if the micro brew explosion is beginning to run its course, however.  When it started, in the 1970s, the goal was "good beer."  Micro breweries still claim that as their goal, but in recent years a weird, and probably bad, trend has developed in which the exploration they're engaged in is really toward making stronger and stronger beers, in terms of alcohol content, which isn't the same as good beer.

For some reason it's often missed that a lot of really excellent beers, particularly those of the British Isles, are very low in alcohol content.  This makes sense to me, as the beer was brewed to be consumed in a pub, at a "session."  Beers of that type are called "session beers."  Session beers are very common British Isles beers, and are low alcohol as a rule.  Guinness Stout, for example, which defines "stout," is only a little over 4% alcohol.  It almost qualifies as a "light beer" by American standards.  Even the post-Prohibition Coors, widely regarded as a classic American beer in some quarters, was pretty low in alcohol content in the classic "Banquet" variety.

German beers, on the other hand, have always been higher in alcohol content, for reasons that are completely lost to me. Even so, they probably rounded out somewhere in the 5% neighborhood.  Now, however, American microbreweries are rushing to brew what they call "IPAs," or "India Pale Ales."  IPAs were a type of beer originally brewed by British breweries solely for consumption in India, and were shipped unfinished, with high alcohol content and lots of hops, on the theory that this would keep them from spoiling on the long, and extremely hot, trip to India.  At one time, however, the style had some slight popularity in the UK, when some accident required an unfinished batch to be sold on the docks when it couldn't be shipped.  It's become popular with microbreweries, however, and so now they're all rushing to brew very bitter, very high alcohol content, and very icky beers.  This has expanded into other offerings, such as stouts, where high alcohol stouts are now offered as well, when historically stouts are actually low alcohol.  This trend is taking micro-brews out of the "good beer" category into some weird high alcohol arms race, which may mean that they've about run their exploratory course, which was, perhaps, inevitable. That may mean, however, simply a return to the era of the local brewery.

----------------------------------------------------------------------------------------------------------------------------------
Epilogue:

This past week the extent to which local brewing has returned to Wyoming became apparent to me when I ran across a couple of breweries or brewpubs I was previously unaware of.  The first couple were in Gillette, where I drove by one restaurant that advertised it was the home of a brewpub and then, later that same day, walked past a storefront on Gillette Avenue that advertised that it would soon be home to the Gillette Brewing Company.

Today, in the paper, the Wonder Bar, which has been around for decades, is advertising its bar-brewed beer, indicating that it is indeed brewing on the premises.  I knew, as indicated above, that it could, but I wasn't sure that it was.  It is.

Anyhow, quite a change.  Soon, it would appear, every substantially sized town in Wyoming is likely to have a brewpub.

Epilogue II

If this story references another which includes "nostalgia for the good stuff," perhaps some recollection of the bad stuff is also warranted, which is provided this week by a story in the Casper Star Tribune.  The Tribune reports:

Wyoming men who are alcohol-dependent earn about 5 percent less than co-workers who don’t have a problem with alcohol.
They also are somewhat less likely to be in the workplace at all.
These are two of the findings from a report compiled by the University of Wyoming Survey and Analysis Center for the Wyoming Department of Health.
The UW report concluded that alcohol is more of an economic burden on society in Wyoming than tobacco or drug abuse.
The study estimated that elimination of alcohol abuse would save $843 million a year, based on 2010 costs. Costs were for health care, lost productivity, crime and accidents.
Elimination of tobacco would save $689 million per year, and the elimination of illegal drugs $391 million per year. “Illness studies are routinely used by government agencies to justify and prioritize prevention, intervention, and research programs,” the report said.
Nanette Nelson, associated research scientist at the UW center, said she and her colleagues were surprised that alcohol was the most costly. “We thought we would see tobacco to be the front-runner,” she said.
As alcohol is a legal drug, it's easy to forget how much of a burden on society it really is.  It's also easy to forget that those advancing Prohibition, prior to 1919, were not really wacky. They had a valid point.  At that time, in a lot of places, the "saloon trade" was completely unregulated.  To open a bar, you just opened one.  We've never gone back to that.  Indeed, the impact of alcohol has been smaller post-Prohibition than it was pre-Prohibition, as Prohibition did have a lasting social impact. Still, the burden imposed by alcohol today remains real.

Epilogue III

Examples of local breweries from the regional past:

June 12

1890  The brewery in Laramie sold its first beer.  Up until Prohibition, small local breweries were extremely common in the United States.  Attribution:  Wyoming State Historical Society.
 From Today In Wyoming's History.

Epilogue IV

An NPR article on the explosion of small breweries across the U.S.   This demonstrates the increase in small breweries, but it's considerably below the number I'd expect.  I read a while back that Denver now has something like 200 brew pubs, which would suggest the number of small breweries is higher than reported here.

Old Picture of the Day: Milk Delivery Truck

Not your granddaddy’s river

Indian Territory, fur traders and forts along the Platte

Letter from Ireland

Wednesday, March 18, 1914. "Among the things that Wyoming may be thankful is that it is not on the borderland of barbarous Mexico". Enduring jingoism.

British Wilson, border news?

Wilson was in fact an anglophile, but his government certainly wasn't dominated by the British.

And Mexico barbarous?

Some old headlines are oddly contemporary, as are some jingoistic views, we have to say.  This almost sounds like a Trump rally, as over the weekend he declared that some migrants aren't human.


Barbarous?

Is that a football helmet in the cigarette ad, or a pilot's helmet?


And the brown bottle thing is correct:


The Boomerang was less dramatic, but it did have an interesting item on pipe smoking at a St. Patrick's Day party.



Sunday, March 16, 2014

Student Loans, maybe we're looking at that the wrong way too.

Having just expounded on declining law school enrollment, why maybe the legal community ought not to worry too much about it, and why, if it is going to worry, it perhaps ought to reconsider its approach to the problem, if there is one, I'll separately note this topic.

Recently, there's been a lot of commentary about how students amassing debt to go to law school can't pay it off with the depressed wages they're now receiving as lawyers. 

The question this raises is this. Why give student loans to people who want to go to law school?

That may sound harsh, but in a flooded market, why fund failure?  Indeed, why do we give loans to go to art school, or just about any degree program we can think of?

Student loans, as a species, stem from the GI Bill, which very successfully allowed a lot of demographics to go to college for the first time in our nation's history.  This, it is often noted, resulted in a huge economic boon to the country, repaying the country in economic gain again and again.  The thought, correctly, was that student loans would do the same thing.

They did, but that also resulted in a vast expansion of fields of study, and over time, we graduated so many into the general population that it's really no longer true.  Or at least it's not true for all fields.  Perhaps the time has arrived to give loans where our society needs them.

So, perhaps it's time to fund people to go into engineering or the sciences.  For certain targeted minorities, law still makes sense.  But it doesn't make sense to give loans out to everyone who, at age 18, decides they want to go to college in any field.  It can end up hurting them, and it doesn't seem to be benefiting society at large.

This may seem harsh, but perhaps it's not as harsh as funding somebody all the way through a Master's in Art when there's no job to be had, and then asking them to pay it back on wages they won't be making.

Declining Law School Enrollment. Maybe we're looking at it the wrong way.

Yesterday I, and most likely every other member of the Wyoming State Bar, received a letter from the current interim dean of the College of Law, who I hope becomes the permanent dean.  Having met her, she's an impressive individual.  Part of her impressive nature is that she's very honest, and will answer a question directly and honestly.  She noted that her young attorney assistant probably cringes over what her answers are to certain questions.

The letter was soliciting donations for scholarships at UW's College of Law, and it mentioned some interesting facts and figures.  One of the most impressive, which I was already somewhat aware of, is that law school applications nationwide have fallen 50% over the last few years.  Her figures noted a recent high of 100,000 applications to the nation's law schools, and that the number is now under 50,000. The New York Times indicated back in January that it was more like 30,000, although that's three months ago, so perhaps it's changed.

This is causing a lot of consternation at law schools, including our state's, but perhaps we ought to take a step back and consider a couple of things about this.  In other words, this might not be a bad thing for anyone, actually.

There were plenty of warnings that this was going to occur, and prescient observers of the law noted that we were reaching this point some time ago. Truth be known, the era that law schools and the law have been living in was a freakish anomaly to start with.  For most of our modern history, law hasn't been dominated by big multi-state law firms with hundreds of lawyers, and for most of it, it hasn't been a path to riches either.  The recent history, say 1970 to the current era, was a bit of a bizarre period in our greater economic history, one which saw the rise and fall of a lot of "entrepreneurial" activities and the attempt to convert the practice of law into something like one.  That recruited a lot of people to the practice of law, but that this would fall off should have been seen as inevitable.

Inevitable or not, most state bars have failed to catch up with it, and failed to appreciate it. They worry, along with the law schools, but their focus is basically without useful aim.  They've also failed to appreciate that while the giant multi-state firms are still with us, in the Internet age they're significantly endangered.

Also with the Internet has come an era when, in spite of what people may want to believe, the average citizen of the globe is much more educated on everything than he once was, and that includes legal matters.  It isn't the case that everyone now has the training that lawyers have, but a lot more people know a lot more stuff than they used to. This makes lawyers less of a needed commodity.

So, what happened is that the ranks of the law were swelled by an economic anomaly that occurred in the 1970 to about 2000 time frame which is now over, and likely over for good.  At the same time, developments in technology have made the need for lawyers smaller.  In other words, there's been an oversupply of lawyers graduating into practice every year, and now there's an overpopulation of us.

Law schools have, unfortunately, worked to make this worse, as they've generally backed (although, as far as I know, the UW law school had no direct role in it) the UBE, which makes a license, and hence a degree, "portable".  Our new interim dean was frank that she would have supported it, even though she understands why many practitioners do not support this change, as it makes the degree more attractive.  That's the same view that almost all law academics take, as they believe it aids them in recruiting students, since a student who is recruited to, say, Wyoming can come here knowing that his degree will let him practice in, say, his native Colorado while still keeping his options open.

Of course, what that also means, for younger lawyers, and indeed even for older ones, is that the population of practicing lawyers in Wyoming, North Dakota and Montana, all UBE states, has effectively swelled to include not only the combined population of lawyers from those states, but that of the much more densely populated Colorado as well.  I'd guess there are more lawyers in Denver alone than in all of Wyoming, but now we're more or less in competition with them. As a result, our incomes will go down, and so will theirs, making everyone's ability to keep on keeping on somewhat impaired.

A disaster right?

Well. . . . somewhat, but perhaps there's something else we should consider.

Law schools don't exist for their own sake, but to serve the career aims of their students and the needs of the general public. The public is telling the legal community that they have enough lawyers.  That's not a tragedy, that's a good thing, really.

And perhaps it's not that glum for students.  According to the ABA, when it's not crying in its double latte over the sad tragic fate of gigantic East Coast white shoe firms, the practice of law has been suffering from lawyers abandoning the practice in record numbers, and has been since before this started. Also, according to the ABA and various state bars, lawyers are suffering from an internal existential crisis like never before, and one that sets them apart from nearly every other career.  There's been a lot of hand wringing about it, but pretty much nobody has been able to come up with any better ideas about this than to suggest that lawyers stop and smell the roses, or perhaps add to their burdens by taking on cases for free (pro bono).

Maybe part of that overall set of problems is explained by the practice recruiting a lot of entrants who had no actual interest in the work to start with.  A 50% decline in applications would suggest that an awful lot of potential recruits were fairly easily dissuaded and went on to something else (it'd be interesting to know what).  If that's right, it would probably mean the remaining 50% are pretty darned dedicated and know what they want to do, and that's good for everyone.

And if that's the case, maybe the time has come simply to cut back in law school classes.

The dean's letter indicates that UW actually hasn't suffered from a decline in applications, and that theirs have gone up. That should be a bit of a relief, but they're worried anyway.  I'm not too surprised, however. With declining enrollment, students can be choosier in where they apply, and it's a good school.  So far, people who graduate from it are, I think, largely finding employment within a year or so (although I haven't studied that, so don't take my word for it).  The dean also noted the role of scholarships in keeping that number up, however.

But what if law schools nationwide simply cut back on the number of students they were taking in, and hence the graduates they were putting out?  Why not?

Historically, UW graduates about 60 to 80 students per year.  It's done this for a very, very long time, even though the population of the state has increased over time.  A population increase, of course, would seemingly translate into an increased local need for lawyers, but it's also been the case since about 1991 that a very large number of graduates are going elsewhere, some years well over half.  That pretty strongly suggests that the school is graduating way in excess of the state's requirements, and knows it.  Indeed, the emphasis on Wyoming's law has declined over the past two decades. 

So what if UW, for example, were to cut the graduation target number down to 40 students, or even 30?  Sounds extreme, to be sure, but if a person looks at the graduating classes of the 40s and 50s, they're smaller than that.  Critics would note, and correctly, that perhaps a modern law school can't economically graduate classes that small, as it would mean the infrastructure and academic body would be too small to support the education. That may well be correct, but at the same time, perhaps it's time to consider that the Internet (that thing again) may mean that the old brick and mortar, wood pulp and cardboard, requirements that seem to exist might be a bit obsolete.  Perhaps this is completely in error, but it might be worth looking at.

In the greater sense, nationally, some law schools just need to go, and that's all there is to it.  The bottom couple of tiers could simply disappear with no harm done to the nation at large and frankly no injury done to the real lives of those who would have applied there. Some of those schools are going to die anyhow, and frankly, they deserve to.

I suppose it might also be time for law schools to make good on a canard that they've shoveled out for decades, that a law degree can be "used for a lot of things".  No, it cannot, for the most part.  But law schools still claim this.  They can make it true, however, by doing what Harvard did years and years ago and creating a law school degree that's tacked onto a second course of study.  Indeed, as it's generally the case that in most Common Law countries the study of law is not a graduate course of study, but a baccalaureate course, perhaps it's time to consider making this mandatory.

That could be done in one of two ways.  One way would be to simply take the approach of other Common Law nations and not require an undergraduate degree to study law.  A four-year course of study would work just as well, for most students, if they were required to study more than law, maybe.  Or maybe not.  But if done in that fashion, the prospective lawyer could be required to take a real minor (not "pre-law" or some such thing).  Something like business, or the sciences, or engineering.  These degrees might actually be transferable to something else, at least at first, if the student couldn't find employment or decided to abandon law.

My own undergraduate degree is in Geology, and I know quite a few other lawyers, oddly enough, who have the same undergraduate degree.  And I've known a few who had engineering degrees, one who is a doctor of chemistry, and so on. Some worked in their fields, or in others such as banking or journalism, so I don't want to be overly critical here.  There is more diversity than a person might suppose, but in some cases there flat out isn't, and those people are then in a bad spot when they start out, if they need to take a fork in the road.  In spite of what their profs may have said, most employers are not very interested in hiring a JD with a background in pre-law to work at a non-law job.

Another way to approach this, however, and one which interests me a bit more, is to provide the opportunity for students to add to their degrees in the Harvard-like fashion, and apparently UW is working on doing that.  In that way, a graduate could come out of school, like Mitt Romney, with a JD and an MBA, for example.  In this part of the country, the same ought to be looked at in terms of something combined with the energy industries.

But let's not stop there.  The Dean's letter urges action, in the form of donations for scholarships, to the law school.  But maybe the solution isn't to encourage students to keep applying to law school.  Frankly, enticing students into such a dicey situation as we presently have seems like a poor idea, for the most part, and perhaps we're just better off letting the free market dictate who goes and who doesn't, although I can think of a single exception where I would like to see more done (that being in regard to enrolled Tribal members from the Wind River Reservation, which I think is underserved by the law, and where an increased population of native lawyers, I think, would be a very good thing).

But, in not stopping there, perhaps it's time for State Bars, rather than hand wringing, to acknowledge their part in this and to address it.

One way they could address it would be to dump the UBE and even the Multistate Bar Exam.  Have real state exams, and don't grant reciprocity to out of state lawyers.  While I have nothing against them personally, anyone practicing law in Wyoming knows that in litigation a person is nearly as likely to run into a Colorado lawyer as a Wyoming one.  Some are Wyoming expatriates, and others just opportunistic, but if we didn't grant other states reciprocity, and there's no good reason at all that we should, the same work would go to people who make their homes here. The long term impact of that would be to boost the practice in rural regions and smaller states, and to somewhat hurt it in big urban areas, which are hurting anyhow.  Yes, that's very provincial, but it's not unreasonably so, in that arguing that a lawyer should actually be a member of the bar where he routinely practices seems like a rather good idea.

Another concept, even though the impact would be relatively minor, would be to reverse the recent trend of pretending that jurists can serve into their extremely advanced old age.  There have been efforts in Wyoming to eliminate the retirement age for state judges, for example, and there is no retirement age for Federal judges as it is.

Recently a Circuit Court judge retired here and, in his remarks, noted that he felt it was important to retire when a person still had all their faculties.  I fully agree.  To this end, perhaps we should take a page from the history of the U.S. military.

At one time, the military had no retirement age, and as a result the services kept a lot of men who had reached a state of infirmity.  Sometime after the Spanish-American War, the services began to address this and put in a retirement requirement at age 65.  Later it was lowered to 60.  It's 60 now, with some exceptions.  The service also, as noted in our earlier thread about retirement, has gone from a 30 years of service requirement to a 20 years of service requirement for early retirement.  This serves to keep the services younger and more physically fit than they would be if a longer period of service were required and no age cap were present.  Indeed, I'm sure there'd be Army officers in their 70s and 80s if there was no such requirement.  An 80 year old Army officer would have learned his trade 60 years ago, or more or less at the time of the Korean War, when World War Two weapons were still the norm.

Well, an 80 year old judge learned his trade at the same time, before much of the modern law came into being and before the modern research tools existed.  Why not acknowledge that?  We could create a Federal retirement cap at age 65, and do the same for state judges.  That would serve the interest of the public, and frankly it'd open up a few positions, and open them up more frequently, as well.

We might wish to consider the same for lawyers who work for the government, although I'd generally note that they do take advantage of retirement, and recently they've tended to have the last laugh about that with regard to private lawyers, who often seem unable to do so.

Not that all of this, or any of it, will stem the tide of declining enrollment.  But maybe that's just the tide going back out, and we shouldn't worry much about it.  Some schools will go out with that tide, but perhaps that's an effect of the cause we can do little about.

Saturday, March 15, 2014

Standards of Dress: Clerical dress

Recently, I did a thread on changes in standards of dress for average people, or more particularly, those living in cities and towns.  We looked at how those standards have changed greatly over the past century, and how, even though the dress of the early 20th Century, or at least male dress, still looks familiar, it was much more formal, day to day, than it is now.

Here we look at a more specific topic, clerical dress.

Clerical dress, i.e., the clothing of priests, pastors, rabbis, etc., has seemingly changed less, in comparison to other vocations, which is not to say that it hasn't changed at all.  This probably makes sense, given their roles.

In looking at this topic, this is one area where we really have to start with the present standards, which are the only ones most people are really familiar with, and work backwards.  This reveals some interesting trends, but it also tends to show how stable this particular area of dress is.  And to start off here, we really have to look at the Roman Catholic Priest.

For the most part, in North America, and indeed in most of Europe, the dress of Christian clerics falls into two camps, one of which takes its inspiration from the standards of the Catholic Church, and the other of which takes its standards from business wear.   Almost never, but not quite never, do Christian religious take a standard from elsewhere, although there are a few notable exceptions, such as The Salvation Army.  There are solid reasons based in tradition and even theology for this, but we won't really get into that, as that's a topic for some other forum.

The clothing of Catholic Priests is governed by regulations within the Church pertaining to that.  Generally, Catholic Priests must wear black, and they must wear a shirt that accommodates a Roman Collar.

Catholic Priest in Europe, courtesy of Wikipedia.  This Priest is wearing a cassock, a type of dress which is unusual in the United States.  The Priest's clothing features the Roman Collar.

The actual origin of the Roman Collar is disputed, and even the name "Roman Collar" isn't universal.  Some claim a Reformation origin for the collar, but the better evidence takes it back to ancient times, with some attribution to its having served a purpose associated with medical emergencies during the Medieval Black Plague.  No matter; in the modern world, black dress with Roman Collar is the regulated norm for Catholic Priests.  Roman collars are also the norm for Orthodox Priests in North America.  And they are the norm for Protestant denominations that have an origin associated with the Catholic Church, such as the Lutheran and Episcopal Churches.

Lutheran Priest with Roman Collar, but with checked sports coat.  In the sports coat, he departs from what would be the Catholic standard.

Roman Collars today are also frequently worn by ministers in denominations that have no close association in origin with the Catholic Church, however. In these instances, those denominations have adopted the wider practice of other Christian denominations or, sometimes, the individual ministers have.

Given this, it's probably surprising to learn that Roman Collars, while an ancient style of clerical dress, haven't always been the rule in North America, to the extent that they currently are.  Indeed, while at one time Roman Collars were the rule in Europe, in North America Catholic priests' clothing regulations caused them to be dressed in apparel of the type worn by secular businessmen, this being the norm until the mid 19th Century. The reason for this is that prejudice against Catholics was so strong that the Church did not wish for clerics to stick out too much, lest they be harmed by anti-Catholics. We have to keep in mind here that, prior to the American Civil War, bias against Catholics was so strong in the United States that it defined some political parties.

This type of prejudice began to wane after the Mexican War and Civil War, in which Catholic Irish Americans played such a significant role, and even though decades would pass before strong anti-Irish sentiment would no longer be regarded as acceptable, it did mean that the Roman Collar returned to Catholic clerics in North America by the second half of the 19th Century.

This didn't mean, however, that clerical dress became identical to what we commonly see today. At that time the cassock, a long outer garment somewhat resembling a frock coat, was the clerical norm for most denominations using the Roman Collar.  This remained the case well into the 20th Century, but during the 20th Century a coat based on the single breasted man's business suit coat became increasingly common.


Catholic Priest, mid 20th Century, wearing cassock.

Fairly typical wear for Priests, mid 20th Century.

This trend has continued into the present era, where cassocks are now rare, but where the Roman Collar with simple black suit jacket is common.  For Catholic priests, the remaining clothing is always black, unless they occupy a higher ecclesiastical rank.  For other denominations, however, this is not necessarily so, and you will sometimes see shirts of various colors, with blue seemingly being the most common.

Roman Collars have become so common in North America that they have spread to Orthodox and Eastern Rite denominations here, which was not always true.  The Roman Collar does not have as long a history in these denominations as in the ones discussed above, those denominations having had very traditional clothing of their own, which is still worn where they exist in large numbers.  Those watching the recent dramatic events in Ukraine have seen Priests wearing this clothing out in the streets, in support of Ukraine. Typically news reports indicate that they are "Orthodox Priests," but chances are just as high that they may be Ukrainian Greek Catholic Priests, there being no ready way for an average person here to be able to tell the difference by simple observation.

 Greek Orthodox Priest, mid 20th Century, in Jerusalem.  Well into the 20th Century similar dress would have been the norm in North America for Eastern Rite and Orthodox clergy.

Perhaps before going on from here it would be good to note that in at least the Orthodox and Catholic Faiths, the clothing Priests wear is governed by regulation, and so it varies but little. Chances are high, though I don't know for certain, that this is also the case with at least the Episcopal church.

Amongst the regulated clothing, for many years, was a requirement that headgear be worn.  Some of the photographs set out above demonstrate that.  At one time Catholic Priests wore distinctive headgear on a daily basis,  and in some localities on some occasions they still do.  But for average parish priests this passed away in the 1960s.  At that time, for those areas still requiring it, the requirement in North America was for a hat of a formal type, such as a fedora, so the former requirement of a distinctive hat had passed away, for the most part.  Orthodox Priests have a much more distinctive headgear that survived well into the 20th Century and may still be a requirement for some Orthodox denominations, but I'm not familiar enough with their situation to be certain.

None of this has addressed vestments, which Priests and other religious wear during services, and which would make up a lengthy separate topic.  Suffice it to say, the denominations mentioned above all wear vestments, and while these remain clearly identifiable over time, you can tell the era in which they were made by stylistic differences that occur over time.

Catholic Priest offering Mass, World War Two.  Vestments are being worn, Priest on far right is wearing a cassock.  The distinctive headgear shown would indicate, I think, that three of these men are Bishops.

 Episcopal Priest with recently married couple, mid 20th Century.

For those denominations where Roman Collars are not worn, and shirt and tie is, they have basically tended to follow the more conservative end of business dress over the years.  This continues to the present time, making them one of the few groups that routinely wears formal wear in their official capacity.
Protestant minister discussing problems with his congregation after services, in what appears to be a cold setting in Maine, 1940s.

Presbyterian minister, mid 20th Century.

With all this emphasis on clothing and how it was worn, and what it generally means (I've skipped pretty much all the information pertaining to higher Church ranks), one surprising thing to learn is that in the United States, distinctive religious clothing has been nearly wholly omitted on occasion for some specific roles, such as military chaplains.  American chaplains wear the standard military uniform of their branch of service.

U.S. Army Chaplain, Civil War.

Confederate officer, holding position as officer and Chaplain, Civil War.

U.S. Army Chaplain, World War One. This photo shows that at the time at least some Army chaplains wore an open collar coat, which was not the service norm, with Roman Collar.

More typical World War One appearance for a U.S. Army chaplain with stand up collar service coat.

Col. William R. Arnold, Chief of Chaplains during World War Two, and a Roman Catholic Priest.

Their uniforms have always featured distinctive insignia, and in field conditions you will still see some specific items being worn while they are performing their official roles. But by and large, they look a lot like other servicemen.  This does not tend to be the case for other nations.

 British Chaplain, wearing Roman Collar, in World War One.

 Catholic, Protestant and Jewish Chaplains, U.S. Army, World War Two.

U.S. Army chaplain, in dress uniform, World War Two.

So far, of course, I've written only about Christian clerics.  In the time frame covered by this blog, it would seem that some discussion of at least Jewish clerics would also be in order.  My problem here, however, is that to the extent I'm familiar with their dress, I'd only be a danger in discussing it.

The Jewish faith is, of course, presently divided into various branches, and it would seem that dress in general varies among the branches.  I've seen photographs of rabbis in the mid 20th Century, for example, that are simply indistinguishable from men in typical business attire of the day.  Others have very distinctive dress. So, given that, I can only assume custom and practice varies by branch. As is well known, Hassidic Jews today wear very distinctive dress in general, so perhaps, rather than make any more errors than I already have, I should leave that topic alone.

So far I've also omitted any discussion of the dress of female religious.  Generally, up until perhaps the 1970s or so, most female religious were nuns, and perhaps globally that may still be true.  The Catholic Church, Orthodox Churches, Episcopal Church and Lutheran Churches all have religious orders for women, which most people simply refer to as nuns.

 Nuns on Long Island sea shore, 1940s

Nuns traditionally wore distinctive dress which is referred to as a "habit".  While these vary, nuns of all denominations wore some variety of distinctive dress, with most habits resembling one another very generally.  It's interesting to note that orders dedicated to hospitals were once so common in Europe that for a long time European nurses wore clothing that strongly resembled habits, and a common term for a nurse in Europe is "sister."  The German word for a nurse is Krankenschwester, or "sick sister".

This is an area that has changed enormously post 1960.  While there are still orders of nuns in all of these faiths that wear habits, the largest population of nuns in North America was by far in the Catholic Church, which generally greatly diminished the requirements for habits after the early 1960s, at which point many orders simply did away with them.  Not all did, and interestingly those which have retained them tend to be amongst those which remain the strongest today.

Thursday, March 13, 2014

Retirement



If you are in business, or read business news, or listen to any type of commentary at all, you're going to hear a lot about retirement.  For that matter, if you live to be 50 years of age, and I certainly hope you do, if you are not already, you're going to at least think about retirement when the American Association of Retired Persons sends you mail, implicitly suggesting that by age 50 you've amassed so much wealth that you are going to retire. And the topic appears in professional journals all the time. The most recent issue of the ABA Journal, for example, has its cover story on the topic of retirement.

For a lot of Americans, indeed most Americans, that's a bit of a cruel joke.  Most folks can't retire at 50, and most never have been able to do so.  Beyond that, however, at about that age you'll start to notice an interesting dichotomy of stuff on retirement, some of which is really scary, and some of which is somewhat delusional.

On the delusional side, I'm always slightly amazed by the body of material that seeks to make you feel guilty about retiring, of which there is a fair amount (and, no, I'm 50 and not anywhere near retiring).  This stuff suggests that when you are of retirement age, say in your 60s, you probably ought to do one of two things:

1.  You ought to be starting a new job/retirement/business that reflects your long hidden dreams and talents, or which expresses that series of dreams, talents and values you've developed in your years of work; or

2.  You ought to be able to use your retirement to live a wild life of traveling abandon and adventure.

I know you've seen this stuff.  You are retired, according to the television advertisement, and now you somehow own Monument Valley. Wow.

Or you are retired and open a vineyard in Tuscany.  Jeepers. . . your work really worked out for you big time.

Or you can now open a company that competes with Microsoft. . . or manufactures jackets for kittens, or whatever.  These portrayals are so common that one brokerage company actually made fun of them, in a clever way, with a befuddled individual who needs advice saying something like "A vineyard?  Come on!"  Which brings me to the second type of retirement portrayal, which is that if you are in the Middle Class, forget it, it won't be happening.  You're doomed.  Not to sound too glum, but that portrayal is probably much closer to the mark.

All of which makes looking at retirement in a historical context both worthwhile and interesting.  Maybe even productive.  That is, how did we get here?  Something has been occurring in recent years, to be sure, and this topic is in the news a lot one way or the other, whether it simply be a well known local person retiring, or warnings that most people in the near future will never be able to retire.

One thing we might note right off the bat, however, is that the common canard about "people living longer" simply isn't true.  People do not live any longer presently than they ever have.  As we addressed in the post about life spans, the very widespread notion that "people live longer today" is based upon a misunderstanding of statistics.  People don't live longer, they simply do not die from some untimely event, whether that be disease, violence, or injury, as frequently as they once did.  Indeed, they do not die by some of these causes (violence, death at birth, etc.) nearly as frequently as they once did, by a huge margin.  That means more live out their allotted years, so to speak, than was once the case.  Put another way, not too many people would regard falling off of a hay rake and getting dragged to death as a natural way to go, but more than a few teenagers experienced that sort of death up until relatively recently.

But this fact does inspire the two reactions noted above.  On one hand, the combination of better medicine, much less physically arduous labor, increased surplus income and the exceptionalist expectations of the Baby Boom generation has led to a sort of expectation that the old won't ever really grow old, and that we should expect to be touring Naples on bicycles up until our 90s.  And for a few, that is darned near true.  My mother didn't tour Naples on a bike, but she did ride one around town up until just a few years ago, when old age finally really caught up with her.  On the other hand, the same increase in the number of people who grow old, combined with massive societal changes in the past century, inspires legitimate fears in many that their declining years will be impoverished and difficult.

Most people now are used to the idea of there at least being something called retirement.  And while that concept goes back surprisingly far, retirement as an actual practice for most people does not.  Indeed, for most people, and I mean for most people on Earth, it didn't become a possibility until the late 19th Century.

Prior to the late 19th Century, retirement for average people just didn't exist.  Part of the reason why, particularly in North America, is that in the much more rural economies of years past, there wasn't an economic ability for it, and there was certainly no state sponsored retirement of any kind.  Farmers basically worked on the land until they passed away, with it being the rule that, if they owned their land (and most North American farmers did), they passed the farm on to one of their children.  If they grew too infirm to work it, that passing on effected their retirement, basically.  They'd still be there, even if they could no longer work as much, or indeed at all.


This practice, by the way, is still pretty common with agricultural families.

In other lines of work, the same could also be true, however.  In any sort of family operation, the older male would generally keep working at it as long as he could and if there was somebody to pass it on to, he did.


 Blacksmith, and not a young one.

Where this opportunity didn't present itself, men and women with families, and that was most men and women, might eventually move in with one of their children for their retired years.  So, an old lawyer, like John Adams (also a farmer), or Clarence Darrow, might work up until his death, and many did.  But some might also pass beyond the ability to practice and retire, moving in with a family member and closing their practice.

Of course, some people became wealthy, but prior to the late 19th Century that didn't equate with retiring as a rule.  For some it did, of course, but that tended to mean that they had lives of varying degrees of abundance or leisure, depending upon the amount of wealth.  That doesn't vary much from now, except that a much, much smaller percentage of the population achieved wealth prior to World War Two.  There are, of course, exceptions.

So, with that being the case, how did modern retirement come about? Well, two ways.  War and Social Revolution.

That's a slight exaggeration, but only slight.

The first real retirements we can find, in the modern sense, start off with various armies.  How armies were raised and manned varied around the world in the 18th and 19th Centuries, but it's about that time that retirement systems for soldiers started to come into play.  Originally, there were none. Indeed, as shocking as it may now seem, in many European armies of the 18th Century soldiers were conscripted for life or near life terms, if they were conscripted at all.  Short term conscription for most European armies (the Russians excepted; Russian soldiers were conscripted for a term of 25 years) came in during the 19th Century, and for solid military reasons.  In the 18th Century, however, even British soldiers, who were volunteers, joined for life.

American soldiers, few in number until World War Two, never joined for life and always joined for a short term, but in both instances, there was no such thing as retirement.  If a soldier was retired, it was because he became too infirm or injured to keep on soldiering.  Every country recognized a system for retiring soldiers in that situation, but only that one. So, showing that things can reverse direction, the lot of an 18th Century soldier was worse in this fashion than that of a Roman soldier.  Roman soldiers actually could retire, with a grant of land.

The impact of this, however, was to put a lot of old enlisted men into service.  You can find plenty of pre Civil War photographs of U.S. soldiers, for example, who are ancient.  They didn't get paid well enough to retire on savings, and they didn't always have families, so they had to keep on working. There was no age cap on service, and they ultimately mustered out only through infirmity or death.

That was a bad thing not only for the soldiers, but for the armies as well.  To take the American example, getting 20 year olds (and the Army generally would not enlist teenagers until the 20th Century) to spend the month of November in the snow, in Wyoming, eating moldy bacon is one thing.  Getting 60 year olds to do that, and to keep functioning, is quite another.  Now, a lot of 19th Century 60 year olds were perfectly capable of doing that, and even more are now, but in a profession in which a career man was at it for decades, he had been badly injured and seriously ill at some point along the way, making it all the tougher.  Indeed, according to one statistical analysis I've seen, the majority of American men lived with some chronic condition by age 40.  Probably the majority now do as well, but at that time, you just endured it.  And enduring it wears you down.

The Army, indeed all armies, recognized this, and they all began to introduce retirement systems.  In the U.S., the Army first allowed officers to retire after 40 years of service, a policy adopted after the Civil War.  Soon thereafter, the policy was expanded to include enlisted men.  Other countries adopted similar policies.

The Last Muster.  Pen and ink depiction of British Army pensioners, in uniform.

This served two purposes.  One was that it simply recognized decades of service.  But it also recognized that younger men made better soldiers for a variety of reasons.  One was, of course, physical.  The original retirement system, which effectively retired U.S. soldiers at about 60 years of age, recognized that by that time they probably were physically pushing the limits of their service abilities.

World War One poster noting the physical abilities of generations of soldiers.

Indeed, this was so much the case that Theodore Roosevelt, during his presidency, encouraged the early retirement of officers who were no longer physically fit by requiring officers to go on long rides (ninety miles) on their mounts.  All officers were expected to know how to ride in that era, and he tested them on it.  By that time, many older ones couldn't endure it, and accordingly they were retired.  Even officers in the Navy were given the choice of going on a very long horseback ride, a very long bicycle ride, or a very long hike.

The other purpose was an intellectual one, however.  By the late 19th Century the military sciences were advancing rapidly, and the Army began to recognize that keeping old officers in place impaired its ability to adapt.  The American army was legendary for keeping men in the same ranks for eons, and by the early 20th Century this was recognized to be a bad idea.  Sixty year old captains who had held the same command for fifteen years were much less likely to appreciate newly introduced weapons than, for example, a captain in his twenties might be.  The Army accordingly dropped the retirement threshold to 30 years of service, effectively encouraging, but not requiring, men to retire early, at three quarters pay, in their 50s.

Even this proved to be problematic at the start of World War Two, and the Army, recognizing a need to adapt to a change in the nature of war, dropped the threshold to twenty years.  This, it must be noted, was an early retirement option.  To obtain full retirement a soldier had to stay in for 40 years, as they still do, but he could take half pay and retire at 20 years.  Less attractive at first to enlisted men than to officers, this provided a means to encourage retirement for officers whom the service wanted out of the way, and the service soon found other ways to encourage it further.

 While the U.S. armed forces did indeed encourage a lot of older soldiers to "move on" at the start of World War Two, combat attrition in World War One had been so high in the Commonwealth nations that this poster actually was aimed at drawing back in World War One soldiers, noting that many in their 40s and 50s still had plenty of vigor for later service.  Unlike the U.S. Army, the British used a fair number of older officers during the war, as did the Germans.

This created the modern service retirement system we still have in place.  The system spread out of the armed forces and into nearly every type of uniformed service we have today.  Policemen, for example, generally can retire after 20 years of service.  It has even spread beyond uniformed service in some instances, and some other sorts of government workers have retirement systems of this type.

Retirement in other fields is a somewhat more recent phenomenon.  It's a product of the industrial revolution really.  Industrial employment, like military service, chewed men up.  It also organized them.  And this organization both created opportunities, and threats, depending upon how they were handled.  And it also removed men from the rural support system in which they'd previously lived.  If a blacksmith was injured in his small town occupation, chances are that his sons, or brothers, could take over for him, and if he had to stay home, no doubt in poverty, at least there was a home to go to.  When he grew old and could no longer work, the same was true, and chances were high that there was a fireside to stay near, as the younger men went out to work.

Once industrial labor arose, this was no longer true.  Early industrial laborers were displaced from farms and small towns to a very large extent.  As a result, they were disoriented, rootless, and in some ways at the mercy of their environment.  Ultimately, they came to agitate for protection, cognizant of the dangers of their work and what that meant for them personally.

This created a wide variety of responses, but one of them ultimately came to be socially sponsored retirement.  Men could not work in heavy industry indefinitely, but when they left those occupations they had nothing to fall back on.  Something had to replace the family supported home to retire to, and that came to be retirement, either government sponsored or employer funded, both of which served to keep the social wolf from the door.

Early moves towards wider retirement started in the early 20th Century, with the first proposal for Social Security being advanced in the Progressive Party campaign of Theodore Roosevelt.  Roosevelt's Bull Moose campaign failed, but the Great Depression gave new force to the argument, and Social Security, a fairly radical proposal by historical standards, became reality under Franklin Roosevelt.  By that time, heavy industry had privately adopted pensions in many instances.  World War Two, which increased the incentive for private industry to supply benefits during a period in which wages were frozen, boosted the practice further.
 
Female industrial laborer, World War Two. Labor had been agitating for benefits beyond increased wages since the late 19th Century, but it was World War Two that really changed the nature of health care and retirement in the United States.

This gave us the situation we had in the middle of the 20th Century, and which lasted until at least the 1970s.  By and large, in private employment in the US, most occupations offered pensions of some sort.  This promised workers the ability to retire at age 65.  In addition to that, Social Security became available at age 62, with full benefits payable at age 65.  For those in service occupations, retirement came to be available after 20 or 30 years.  Frequently, those who had service employment went on to a second career, in light of the fact that they were retiring fairly young.

So far so good, but starting in the late 1980s, something began to break down in this system.  Now, while that system hasn't completely broken, concern over it is widespread.  What happened?

Well, for one thing, work stability declined in the private sector while seemingly solidifying in the government sector, at least up until very recently.  Many long time government employees, I'd note, are having the last laugh now after years of derision from those in the private sector.  I've heard this more than once, for example, from government lawyers who are nearing retirement and now see the situation of their private sector fellows, who often chided them for sticking it out in "low paying" (which were really lower paying, not low paying) positions for decades.  Now the government sector lawyers are able to retire, while many of those in higher paying private practices cannot.  Indeed, one comment of that type just appeared on an ABA website about retirement.

In the private sector, manufacturing jobs became highly unstable, if they didn't disappear completely, and this spilled over into white collar occupations as well.  Much has been made of the fact that employees can't enter an occupation and plan on sticking it out for their entire career, for one reason or another.  One of the things little noted about that is that with that instability has come the evaporation of retirement plans.  Retirement plans only make sense for long term employees, not short term ones.

So, is this system broken?  Put another way, is it unrealistic?

That's hard to answer, but retirement is rapidly becoming something that is not nearly as certain for many people as it once was.  Social Security wasn't designed to provide a fancy retirement, just to keep people from falling into poverty in retirement, and it wasn't meant to cover 100% of the people who paid into it either.  Indeed, it still doesn't cover 100%, but in an era when medicine has made early deaths less common, more people now live into their old age and advanced old age.

But another aspect of this may simply be that expectations about retirement became unrealistic.  Truth be known, much of our concept of retirement is retirement as envisioned by the World War Two and Boomer generations, which was never the historical norm.  The abnormal economies of the 1940s through 1960s gradually led people to expect a sort of luxurious retirement, replete with a new home far away from where they'd worked.  Historically, however, in the 60 or so years prior to that, retirement just meant retirement in place and in scale.  People tended to live decades in one house, which they'd paid off well before they retired.  When they retired, they stayed home, rather than traveling the globe or dreaming of planting vineyards in Tuscany.

But in order to do that, a person has to have paid their debt down to next to nothing, or nothing, by the time they're in their 60s at least, if not their mid 50s.  Otherwise, they're going to have to have a pretty significant income in retirement.  That is unrealistic.

So, what does all of this mean?  Hopefully it doesn't mean that retirement has returned to its absolute historical norm, i.e., nonexistent.  But it does mean that the golden age of retirement is most likely over, at least for the foreseeable future, and in the type of economy we have now.  Social Security is already being readjusted to creep back toward its more historical demographic status, and ages of entitlement have started to go up.  I strongly suspect the same will start occurring in government retirements as well, which are now strained.  Twenty year plans, where they exist, will probably disappear in favor of thirty year plans that only allow a draw once the recipient hits age sixty, much like Army Reserve retirements now work.  For those retiring in the future, a paid off home with a garden in the backyard is probably a lot more likely than trips to France and vineyards in Tuscany.



Postscript

The New York Times has an article on retirees today noting that those who want to keep on working often have a hard time finding a job that suits them, and that those who have retired often find they like it better than they supposed they would.

I'm glad to read that, really.  While it's contrarian of me to say so, I find troubling the general view in our society that it's great if people past retirement age can keep working, and that they really should.  Should they, if they can retire?  I'm not so sure.  It's discouraging to think that the value of a person is measured only by their ability to work, and that everyone's adult years must all be spent employed.  That says something about us, I think, as a society.

While it's also contrarian of me to mention it, this sort of taps into the theme of one of the Super Bowl advertisements from this year, in which actor Neal McDonough discusses how in the US we get two weeks off per year for vacation (which is inaccurate for most Americans; most don't take all their vacation, so they take less than that) while the French take August off (which is also inaccurate, as the French also tend to extend their vacations with a general strike from time to time).  We're informed we're a can do sort of people, and at the end it's suggested that our reward for that is a Cadillac.

Well, I have nothing against Cadillacs, but that advertisement sort of makes you wonder if you should go for a Peugeot instead and take August off.