Tuesday, March 18, 2014

Old Picture of the Day: Milk Delivery Truck

Not your granddaddy’s river

Indian Territory, fur traders and forts along the Platte

Letter from Ireland

Wednesday, March 18, 1914. "Among the things that Wyoming may be thankful is that it is not on the borderland of barbarous Mexico". Enduring jingoism.

British Wilson, border news?

Wilson was in fact an anglophile, but his government certainly wasn't dominated by the British.

And Mexico barbarous?

Some old headlines are oddly contemporary, as are some jingoistic views, we have to say.  This almost sounds like a Trump rally, as over the weekend he declared that some migrants aren't human.


Barbarous?

Is the helmet in the cigarette ad a football helmet, or a pilot's helmet?


And the brown bottle thing is correct:


The Boomerang was less dramatic, but it did have an interesting item on pipe smoking at a St. Patrick's Day party.



Sunday, March 16, 2014

Student Loans. Maybe we're looking at those the wrong way too.

Having just expounded on declining law school enrollment, on why the legal community perhaps ought not to worry too much about it, and on why, if it is going to worry, it ought to reconsider its approach to the problem (if there is one), I'll separately note this topic.

Recently, there's been a lot of commentary about how students amassing debt to go to law school can't pay it off with the depressed wages they're now receiving as lawyers. 

The question this raises is this. Why give student loans to people who want to go to law school?

That may sound harsh, but in a flooded market, why fund failure?  Indeed, why do we give loans to go to art school, or just about any degree program we can think of?

Student loans, as a species, stem from the GI Bill, which, very successfully, allowed a lot of demographics to go to college for the first time in our nation's history.  This, it is often noted, resulted in a huge economic boon to the country, repaying it in economic gain again and again.  The thought, correctly, was that student loans would do the same thing.

They did, but that also resulted in a vast expansion of fields of study, and over time, we graduated so many into the general population that it's really no longer true.  Or at least it's not true for all fields.  Perhaps the time has arrived to give loans where our society needs them.

So, perhaps it's time to fund people to go into engineering or the sciences.  For certain targeted minorities, law still makes sense.  But it doesn't make sense to give loans out to everyone who, at age 18, decides they want to go to college in any field.  It can end up hurting them, and it doesn't seem to be benefiting society at large.

This may seem harsh, but perhaps it's not as harsh as funding somebody all the way through a Master's in Art when there's no job to be had, and then asking them to pay it back on wages they won't be making.

Declining Law School Enrollment. Maybe we're looking at it the wrong way.

Yesterday I, and most likely every other member of the Wyoming State Bar, received a letter from the current interim dean of the College of Law, who I hope becomes the permanent dean.  Having met her, she's an impressive individual.  Part of her impressive nature is that she's very honest, and will answer a question directly and honestly.  She noted that her young attorney assistant probably cringes at her answers to certain questions.

The letter was soliciting donations for scholarships at UW's College of Law, and it mentioned some interesting facts and figures.  One of the most impressive, which I was already somewhat aware of, is that law school applications nationwide have fallen 50% over the last few years.  Her figures noted a recent high of 100,000 applications to the nation's law schools and that the number is now under 50,000. The New York Times indicated back in January that it was more like 30,000, although that's three months ago, so perhaps it's changed.

This is causing a lot of consternation at law schools, including our state's, but perhaps we ought to take a step back and consider a couple of things about it.  In other words, this might not be a bad thing for anyone, actually.

There were plenty of warnings that this was going to occur, and prescient observers of the law noted that we were reaching this point some time ago. Truth be known, the era that law schools and the law have been living in was a freakish anomaly to start with.  For most of our modern history law hasn't been dominated by big multi state law firms with hundreds of lawyers, and for most of it it hasn't been a path to riches either.  The recent history, say 1970 to the current era, was a bit of a bizarre period in our greater economic history which saw the rise and fall of a lot of "entrepreneurial" activities and the attempt to convert the practice of law into a sort of one.  That recruited a lot of people to the practice of law, but that this would fall off should have been seen as inevitable.

Not so inevitable, apparently, that most state bars have caught up with it; they've failed to appreciate it. They worry, along with the law schools, but their focus is basically without meaningful aim.  They've also failed to appreciate that while the giant multi state firms are still with us, in the Internet age they're significantly endangered.

Also with the Internet has come an era when, in spite of what people may want to believe, the average citizen of the globe is much more educated on everything than he once was, and that includes legal matters.  It isn't the case that everyone now has the training that lawyers have, but a lot more people know a lot more stuff than they used to. This makes lawyers less of a needed commodity.

So, what happened is that the ranks of the law were swelled by an economic anomaly that occurred in the 1970 to about 2000 time frame which is now over, and likely over for good.  At the same time, developments in technology have made the need for lawyers smaller.  In other words, there's been an oversupply of lawyers graduating into practice every year, and now there's an overpopulation of us.

Law schools have, unfortunately, worked to make this worse, as they've generally backed (although, as far as I know, the UW law school had no direct role in it) the UBE, the Uniform Bar Examination, which makes a license, and hence a degree, "portable".  Our new interim dean was frank that she would have supported it, as it makes the degree more attractive, even though she understands why many practitioners do not support the change.  That's the same view that almost all law academics take, as they believe it aids them in recruiting students: a student who is recruited to, say, Wyoming can come here knowing that his degree will let him practice in, say, his native Colorado while still keeping his options open.

Of course, what that also means, for younger lawyers, and indeed even for older ones, is that the population of practicing lawyers in Wyoming, North Dakota and Montana, all UBE states, has effectively swelled to include not only the combined population of lawyers from those states, but that of the much more densely populated Colorado as well.  I'd guess there are more lawyers in Denver alone than in all of Wyoming, but now we're more or less in competition with them. As a result, our incomes will go down, and so will theirs, making everyone's ability to keep on keeping on somewhat impaired.

A disaster right?

Well. . . . somewhat, but perhaps there's something else we should consider.

Law schools don't exist for their own sake, but to serve the career aims of their students and the needs of the general public. The public is telling the legal community that they have enough lawyers.  That's not a tragedy, that's a good thing, really.

And perhaps it's not that glum for students.  According to the ABA, when it's not crying in its double latte over the sad tragic fate of gigantic East Coast white shoe firms, the practice of law has been suffering from lawyers abandoning it in record numbers, and has been since before this started. Also, according to the ABA and various state bars, lawyers are suffering from an internal existential crisis like never before, and one that sets them apart from nearly every other career.  There's been a lot of hand wringing about it, but pretty much nobody has been able to come up with any better ideas than to suggest that lawyers stop and smell the roses, or perhaps add to their burdens by taking on cases for free (pro bono).

Maybe that overall set of problems is partially explained by the practice recruiting a lot of entrants who had no actual interest in the work to start with.  A 50% decline in applications would suggest that an awful lot of potential recruits were fairly easily dissuaded and went on to something else (it'd be interesting to know what).  If that's right, it would probably mean the remaining 50% are pretty darned dedicated and know what they want to do, and that's good for everyone.

And if that's the case, maybe the time has come simply to cut back on law school class sizes.

The dean's letter indicates that UW actually hasn't suffered from a decline in applications, and that theirs have gone up. That should be a bit of a relief, but they're worried anyway.  I'm not too surprised, however. With declining enrollment students can be choosier in where they apply, and it's a good school.  So far, people who graduate from it are, I think, largely finding employment within a year or so (although I haven't studied that, so don't take my word for it).  The dean also noted the role of scholarships in keeping that number up, however.

But what if law schools nationwide simply cut back on the number of students they were taking in, and hence the number of graduates they were putting out? Why not?

Historically, UW graduates about 60 to 80 students per year.  It's done this for a very, very long time, even though the population of the state has increased over time.  A population increase, of course, would seemingly translate into an increased local need for lawyers, but it's also been the case since about 1991 that a very large number of graduates are going elsewhere, some years well over half.  That pretty strongly suggests that the school is graduating way in excess of the state's requirements, and knows it.  Indeed, the emphasis on Wyoming's law has declined over the past two decades. 

So what if UW, for example, were to cut the graduation target down to 40 students, or even 30?  Sounds extreme, to be sure, but if a person looks at the graduating classes of the 40s and 50s, they're smaller than that.  Critics would note, and correctly, that perhaps a modern law school can't economically graduate classes that small, as it would mean the infrastructure and academic body would be too small to support the education. That may well be correct, but at the same time, perhaps it's time to consider that the Internet (that thing again) may mean that the old brick and mortar, wood pulp and cardboard requirements that seem to exist might be a bit obsolete.  Perhaps this is completely in error, but it might be worth looking at.

In the greater sense, nationally, some law schools just need to go, and that's all there is to it.  The bottom couple of tiers could simply disappear with no harm done to the nation at large and frankly no injury done to the real lives of those who would have applied there. Some of those schools are going to die anyhow, and frankly, they deserve to.

I suppose it might also be time for law schools to make good on a canard that they've shoveled out for decades: that a law degree can be "used for a lot of things".  No, it cannot, for the most part.  But law schools still claim this.  They can make it true, however, by doing what Harvard did years and years ago and creating a law degree that's tacked onto a second course of study.  Indeed, as it's generally the case that in most Common Law countries the study of law is not a graduate course of study, but a baccalaureate course, perhaps it's time to consider making this mandatory.

That could be done in one of two ways.  One way would be to simply take the approach of other Common Law nations and not require an undergraduate degree to study law.  A four year course of study would work just as well, for most students, if they were required to study more than law, maybe.  Or maybe not.  But if done in that fashion, the prospective lawyer could be required to take a real minor (not "pre law" or some such thing).  Something like business, or the sciences, or engineering.  These degrees might actually be transferable to something else, at least at first, if the student couldn't find employment or decided to abandon law.

My own undergraduate degree is in Geology, and I know quite a few other lawyers, oddly enough, who have the same undergraduate degree.  And I've known a few who had engineering degrees, one who is a doctor of chemistry, and so on. Some worked in their fields, or in others such as banking or journalism, so I don't want to be overly critical here.  There is more diversity than a person might suppose, but in some cases there flat out isn't, and those people are then in a bad spot when they start out, if they need to take a fork in the road.  In spite of what their profs may have said, in most spots employers aren't very interested in hiring a JD with a background in pre law for a non law job.

Another way to approach this, however, and one which interests me a bit more, is to provide the opportunity for students to add to their degrees in the Harvard-like fashion, and apparently UW is working on doing that.  In that way, a graduate could come out of school like Mitt Romney, with a JD and an MBA, for example.  In this part of the country, the same ought to be looked at in terms of something combined with the energy industries.

But let's not stop there.  The Dean's letter urges action, in the form of donations for scholarships, to the law school.  But maybe the solution isn't to encourage students to keep applying to law school.  Frankly, enticing them into such a dicey situation as we presently have seems like a poor idea, for the most part, and perhaps we're just better off letting the free market dictate who goes and who doesn't, although I can think of a single exception where I would like to see more done (that being in regard to enrolled Tribal members from the Wind River Reservation, which I think is underserved by the law, and where an increased population of native lawyers, I think, would be a very good thing).

But, in not stopping there, perhaps it's time for State Bars, rather than hand wringing, to acknowledge their part in this and to address it.

One way they could address it would be to dump the UBE and even the Multistate Bar Exam.  Have real state exams, and don't grant reciprocity to out of state lawyers.  While I have nothing against them personally, anyone practicing law in Wyoming knows that in litigation a person is nearly as likely to run into a Colorado lawyer as a Wyoming one.  Some are Wyoming expatriates, and others just opportunistic, but if we didn't grant other states reciprocity, and there's no good reason at all that we should, the same work would go to people who make their homes here. The long term impact of that would be to boost the practice in rural regions and smaller states, and to somewhat hurt it in big urban areas, which are hurting anyhow.  Yes, that's very provincial, but not unreasonably so, in that arguing that a lawyer should actually be a member of the bar where he routinely practices seems like a rather good idea.

Another concept, even though the impact would be relatively minor, would be to reverse the recent trend of pretending that jurists can serve into extremely advanced old age.  There have been efforts in Wyoming to eliminate the retirement age for state judges, for example, and there is no retirement age for Federal judges as it is.

Recently a Circuit Court judge retired here and, in his remarks, noted that he felt it was important to retire when a person still had all their faculties.  I fully agree.  To this end, perhaps we should take a page from the history of the U.S. military.

At one time, the military had no retirement age, and as a result the services kept a lot of men who had reached a state of infirmity.  Sometime after the Spanish American War, the service began to address this and put in a retirement requirement at age 65.  Later it was lowered to 60, where it remains, with some exceptions.  The service also, as noted in our earlier thread about retirement, has gone from a 30 years of service requirement to a 20 years of service requirement for early retirement.  This serves to keep the force younger and more physically fit than it would be if there were a long period of service and no age cap.  Indeed, I'm sure there'd be Army officers in their 70s and 80s if there were no such requirement.  An 80 year old Army officer would have learned his trade 60 years ago, or more or less at the time of the Korean War, when World War Two weapons were still the norm.

Well, an 80 year old judge learned his trade at the same time, before much of the modern law came into being and before all of the modern research tools existed.  Why not acknowledge that?  We could create a Federal retirement cap at age 65, and do the same for state judges.  That would serve the interest of the public, and frankly it'd open up a few positions, and open them up more frequently, as well.

We might wish to consider the same for lawyers who work for the government, although I'd generally note that they do take advantage of retirement, and recently they've tended to have the last laugh about that in regard to private lawyers, who often seem to be unable to do so.

Not that all of this, or any of it, will stem the tide of declining enrollment.  But maybe that's just the tide going back out, and we shouldn't worry much about it.  Some schools will go out with that tide, but perhaps that's an effect of the cause we can do little about.

Saturday, March 15, 2014

Standards of Dress: Clerical dress

Recently, I did a thread on changes in standards of dress for average people, or more particularly, those living in cities and towns.  We looked at how those standards have changed greatly over the past century, and how, even though the dress of the early 20th Century, or at least male dress, still looks familiar, it was much more formal, day to day, than it is now.

Here we look at a more specific topic, clerical dress.

Clerical dress, i.e., the clothing of priests, pastors, rabbis, etc., has seemingly changed less in comparison to other vocations, which is not to say that it hasn't changed at all.  This probably makes sense, given their roles.

In looking at this topic, this is one area where we really have to start with the present standards, which are the only ones most people are really familiar with, and work backwards.  This reveals some interesting trends, but it also tends to show how stable this particular area of dress is.  And to start off here, we really have to look at the Roman Catholic Priest.

For the most part, in North America, and indeed in most of Europe, the dress of Christian clerics falls into two camps, one of which takes its inspiration from the standards of the Catholic Church, and the other of which takes its standards from business wear.   Almost never, but not quite never, do Christian religious take a standard from elsewhere, although there are a few notable exceptions, such as The Salvation Army.  There are solid reasons based in tradition and even theology for this, but we won't really get into that, as it's a topic for some other forum.

The clothing of Catholic Priests is governed by regulations within the Church.  Generally, Catholic Priests must wear black, and they must wear a shirt that accommodates a Roman Collar.

Catholic Priest in Europe, courtesy of Wikipedia.  This Priest is wearing a cassock, a type of dress which is unusual in the United States.  The Priest's clothing features the Roman Collar.

The actual origin of the Roman Collar is disputed, and even the name "Roman Collar" isn't universal.  Some claim a Reformation origin for the collar, but the better evidence takes it back to ancient times, with some attributing to it a purpose associated with medical emergencies during the Medieval Black Plague.  No matter: in the modern world, black dress with Roman Collar is the regulated norm for Catholic Priests.  Roman collars are also the norm for Orthodox Priests in North America.  And they are the norm for Protestant denominations that have an origin associated with the Catholic Church, such as the Lutheran and Episcopal Churches.

Lutheran Priest with Roman Collar, but with checked sports coat.  In the sports coat, he departs from what would be the Catholic standard.

Roman Collars today are also frequently worn by ministers in denominations that have no close association in origin with the Catholic Church, however. In these instances, the denominations have adopted the wide practice of other Christian denominations or, sometimes, the individual ministers have.

Given this, it's probably surprising to learn that Roman Collars, while an ancient style of clerical dress, haven't always been the rule in North America to the extent that they currently are.  Indeed, while at one time Roman Collars were the rule in Europe, in North America Catholic Priests' clothing regulations caused them to dress in apparel of the type worn by secular businessmen, this being the norm until the mid 19th Century. The reason for this is that prejudice against Catholics was so strong that the Church did not wish for clerics to stick out too much, lest they be harmed by anti-Catholics. We have to keep in mind here that, prior to the American Civil War, bias against Catholics was so strong in the United States that it defined some political parties.

This type of prejudice began to wane after the Mexican War and Civil War, in which Catholic Irish Americans played such a significant role, and even though decades would pass before strong anti-Irish sentiment would no longer be regarded as acceptable, it did mean that the Roman Collar returned to Catholic clerics in North America by the second half of the 19th Century.

This didn't mean, however, that clerical dress became identical to what we commonly see today. At that time the cassock, a long outer garment somewhat resembling a frock coat, was the clerical norm for most denominations using the Roman Collar.  This remained the case well into the 20th Century, but during that century, a coat based on the single breasted man's business suit coat became increasingly common.


Catholic Priest, mid 20th Century, wearing cassock.

Fairly typical wear for Priests, mid 20th Century.

This trend has continued into the present era, where cassocks are now rare, but where the Roman Collar with simple black suit jacket is common.  For Catholic priests, the remaining clothing is always black, unless they occupy a higher ecclesiastical rank.  For other denominations, however, this is not necessarily so, and you will sometimes see shirts of various colors, with blue seemingly being the most common.

Roman Collars have become so common in North America that they have spread to Orthodox and Eastern Rite denominations here, which was not always true.  The Roman Collar does not have as long a history in these denominations as in the ones discussed above, those denominations having had very traditional clothing of their own, which is still worn where they exist in large numbers.  Those watching the recent dramatic events in Ukraine have seen Priests wearing this clothing out in the streets, in support of Ukraine. Typically news reports indicate that they are "Orthodox Priests", but chances are just as high that they may be Ukrainian Greek Catholic Priests, there being no ready way for an average person here to tell the difference by simple observation.

 Greek Orthodox Priest, mid 20th Century, in Jerusalem.  Well into the 20th Century similar dress would have been the norm in North America for Eastern Rite and Orthodox clergy.

Perhaps before going on from here it would be good to note that in at least the Orthodox and Catholic Faiths, the clothing Priests wear is governed by regulation, and so it varies but little. Chances are high, but I don't know for certain, that this is also the case with at least the Episcopal Church.

Amongst the regulated clothing, for many years, was a requirement that headgear be worn.  Some of the photographs set out above demonstrate that.  At one time Catholic Priests wore distinctive headgear on a daily basis, and in some localities on some occasions they still do.  But for average parish priests this passed away in the 1960s.  By that time, for those areas still requiring it, the requirement in North America was for a hat of a formal type, such as a fedora, so the former requirement of a distinctive hat had passed away, for the most part.  Orthodox Priests have much more distinctive headgear that survived well into the 20th Century and may still be a requirement for some Orthodox denominations, but I'm not familiar enough with their situation to be certain.

None of this has addressed vestments, which Priests and other religious wear during services, and which would make up a lengthy separate topic.  Suffice it to say, the denominations mentioned above all wear vestments, and while these remain clearly identifiable over time, you can tell the era in which they were made by stylistic differences.

Catholic Priest offering Mass, World War Two.  Vestments are being worn; the Priest on the far right is wearing a cassock.  The distinctive headgear shown would indicate, I think, that three of these men are Bishops.

 Episcopal Priest with recently married couple, mid 20th Century.

For those denominations where Roman Collars are not worn, and shirt and tie are, ministers have basically tended to follow the more conservative end of business dress over the years.  This continues to the present time, making them one of the few groups that routinely wears formal wear in their official capacity.
Protestant minister discussing problems with his congregation after services, in what appears to be a cold setting in Maine, 1940s.

Presbyterian minister, mid 20th Century.

With all this emphasis on clothing and how it was worn, and what it generally means (I've skipped pretty much all information pertaining to higher Church ranks), one surprising thing to learn is that in the United States, distinctive religious clothing has been nearly wholly omitted on occasion for some specific roles, such as military chaplains.  American chaplains wear the standard military uniform of their branch of service.

U.S. Army Chaplain, Civil War.

Confederate officer, holding position as officer and Chaplain, Civil War.

U.S. Army Chaplain, World War One. This photo shows that at the time at least some Army chaplains wore an open collar coat, which was not the service norm, with Roman Collar.

More typical World War One appearance for a U.S. Army chaplain with stand up collar service coat.

Col. William R. Arnold, Chief of Chaplains during World War Two, and a Roman Catholic Priest.

Their uniforms have always featured distinctive insignia, and in field conditions you will still see some specific items being worn while they are performing their official roles. But by and large, they look a lot like other servicemen.  This does not tend to be the case for other nations.

 British Chaplain, wearing Roman Collar, in World War One.

 Catholic, Protestant and Jewish Chaplains, U.S. Army, World War Two.

U.S. Army chaplain, in dress uniform, World War Two.

So far, of course, I've written only about Christian clerics.  In the time frame covered by this blog, it would seem that some discussion of at least Jewish clerics would also be in order.  My problem here, however, is that to the extent I'm familiar with their dress, I'd only be a danger in discussing it.

The Jewish faith is, of course, presently divided into various branches, and it would seem that dress in general varies among the branches.  I've seen photographs of rabbis in the mid 20th Century, for example, whose dress is simply indistinguishable from typical business attire of the day.  Others have very distinctive dress. So, given that, I can only assume custom and practice varies by branch. As is well known, Hassidic Jews today wear very distinctive dress in general, so perhaps, rather than make any more errors than I already have, I should leave that topic alone.

So far I've also omitted any discussion of the dress of female religious.  Generally, up until perhaps the 1970s or so, most female religious were nuns, and perhaps globally that may still be true.  The Catholic Church, Orthodox Churches, Episcopal Church and Lutheran Churches all have religious orders for women, which most people simply refer to as nuns.

 Nuns on Long Island sea shore, 1940s

Nuns traditionally wore distinctive dress referred to as "habits".  While these vary, all nuns of all denominations wore some variety of distinctive dress, with most habits resembling one another very generally.  It's interesting to note that orders dedicated to hospitals were once so common in Europe that for a long time European nurses wore clothing that strongly resembled habits, and a common term for a nurse in Europe is "sister."  The German word for a nurse is Krankenschwester, or "sick sister".

This is an area that has changed enormously post 1960.  While there are still orders of nuns in all faiths that wear habits, the largest population of nuns in North America was by far in the Catholic Church, which generally greatly diminished the requirements for habits after the early 1960s, at which point many orders simply did away with them.  Not all did, and interestingly those which have retained them tend to be amongst those which remain the strongest today.

Thursday, March 13, 2014

Retirement



If you are in business, or read business news, or listen to any type of commentary at all, you're going to hear a lot about retirement.  For that matter, if you live to be 50 years of age, and I certainly hope you do, you're going to at least think about retirement, if you are not already, when the American Association of Retired Persons sends you mail, implicitly suggesting that by age 50 you've amassed so much wealth that you are going to retire. And the topic appears in professional journals all the time. The most recent issue of the ABA Journal, for example, has its cover story on the topic of retirement.

For a lot of Americans, indeed most Americans, that's a bit of a cruel joke.  Most folks can't retire at 50, and most never have been able to do so.  Beyond that, however, at about that age you'll start to notice an interesting dichotomy of stuff on retirement, some of which is really scary, and some of which is somewhat delusional.

In terms of delusional, I'm always slightly amazed by the series of materials that seek to make you feel guilty about retiring, of which there is a fair amount (and, no, I'm 50 and not anywhere near retiring).  This stuff suggests that when you are of retirement age, say your 60s, you probably ought to do one of two things:

1.  You ought to be starting a new job/retirement/business that reflects your long hidden dreams and talents, or which expresses that series of dreams, talents and values you've developed in your years of work; or

2.  You ought to be able to use your retirement to live a wild life of traveling abandon and adventure.

I know you've seen this stuff.  You are retired, according to the television advertisement, and now you somehow own Monument Valley. Wow.

Or you are retired and open a vineyard in Tuscany.  Jeepers. . . your work really worked out for you big time.

Or you now can open a company that competes with Microsoft. . . or manufactures jackets for kittens, or whatever.  These portrayals are so common that one brokerage company actually made fun of them, in a clever way, with a befuddled individual who needs advice stating something like "A vineyard?  Come on!"  That brings me to the second type of retirement portrayal, which is that if you are in the Middle Class, forget it, it won't be happening.  You're doomed.  Not to sound too glum, but that portrayal is probably much closer to the mark.

All of which makes looking at retirement in a historical context both worthwhile and interesting.  Maybe even productive.  I.e., how did we get here?  Something has been occurring in recent years, to be sure, and this topic is in the news a lot one way or the other, whether it simply be due to a well known local person retiring, or warning news about most people in the near future never being able to retire.

One thing we might note here, however, right off the bat, is that the common canard about "people living longer" simply isn't true.  People do not live any longer presently than they ever have.  As we addressed in the post about life spans, the very widespread notion that "people live longer today" is based upon a misunderstanding of statistics.  People don't live longer; they simply do not die from some untimely event, whether disease, violence, or injury, as frequently as they once did.  Indeed, they do not die by some of these causes (violence, death at birth, etc.) nearly as frequently as they once did, by a huge margin.  That means more live out their allotted years, so to speak, than was once the case.  Put another way, not too many people would regard falling off of a hay rake and getting dragged to death as a natural way to go, but more than a few teenagers experienced that sort of death up until relatively recently.

But this fact does inspire the two reactions noted above.  On one hand, the combination of better medicine, much less physically arduous labor, increased surplus income and the exceptionalist expectations of the Baby Boom generation has led to a sort of expectation that the old won't ever really grow old, and that we should expect to be touring Naples on bicycles up until our 90s.  And for a few, that is darned near true.  My mother didn't tour Naples on a bike, but she did ride one around town up until just a few years ago, when old age finally really caught up with her.  On the other hand, the same increase in the number of people who grow old, combined with massive societal changes in the past century, inspires legitimate fears in many that their declining years will be impoverished and difficult.

Most people now are used to the idea of there at least being something called retirement.  And while that concept goes back surprisingly far, retirement as an actual practice for most people does not.  Indeed, for most people, and I mean for most people on Earth, it didn't become a possibility until the late 19th Century.

Prior to the late 19th Century retirement for average people just didn't exist.  Part of the reason why, particularly in North America, is that in the much more rural economies of years past, there wasn't an economic ability for it, and there was certainly no state sponsored retirement of any kind.  Farmers basically worked on the land until they passed away, with it being the rule that, if they owned their land (and most North American farmers did), they passed the farm on to one of their children.  If they grew too infirm to work it, that passing on effected their retirement, basically.  They'd still be there, even if they could no longer work as much, or indeed at all.


This practice, by the way, is still pretty common with agricultural families.

In other lines of work, the same could also be true, however.  In any sort of family operation, the older male would generally keep working at it as long as he could and if there was somebody to pass it on to, he did.


 Blacksmith, and not a young one.

Where this opportunity didn't present itself, men and women with families, and that was most men and women, might eventually move in with one of their children for their retired years.  So, an old lawyer, like John Adams (also a farmer), or Clarence Darrow, might work up until his death, and many did.  But some might also pass beyond the ability to practice and retire, moving in with a family member and closing their practice.

Of course, some people became wealthy, but in the pre late 19th Century era, that didn't equate with retiring as a rule.  For some it did, of course, but it tended to mean that they had lives of varying degrees of abundance or leisure, depending upon the amount of wealth.  That doesn't vary much from now, except that a much, much smaller percentage of the population achieved wealth prior to World War Two.  There are, of course, exceptions.

So, with that being the case, how did modern retirement come about? Well, two ways.  War and Social Revolution.

That's a slight exaggeration, but only slight.

The first real retirements we can find, in the modern sense, start off with various armies.  How armies were raised and manned varied over the world in the 18th and 19th Centuries, but it's about that time that retirement systems for soldiers started to come into play.  Originally, there were none. Indeed, as shocking as it may now seem, in many European armies of the 18th Century soldiers were conscripted for life or near life terms, if they were conscripted.  Short term conscription for most European armies (the Russians excepted; Russian soldiers were conscripted for a term of 25 years) came in during the 19th Century, and for solid military reasons.  In the 18th Century, however, even British soldiers, who were volunteers, joined for life.

American soldiers, few in number until World War Two, never joined for life and always joined for a short term, but in both instances, there was no such thing as retirement.  If a soldier was retired, it was because he became too infirm or injured to keep on soldiering.  Every country recognized a system for retiring soldiers in that situation, but only that one. So, showing that things can reverse direction, the lot of an 18th Century soldier was worse in this fashion than that of a Roman soldier.  Roman soldiers actually could retire, with a grant of land.

The impact of this, however, was to keep a lot of old enlisted men in service.  You can find plenty of pre Civil War American photographs of U.S. soldiers, for example, who are ancient.  They didn't get paid well enough to retire on savings, and they didn't always have families, so they had to keep on working. There was no age cap on service, and they ultimately mustered out by infirmity or death.

That was a bad thing not only for the soldiers, but for the armies as well.  To take the American example, getting 20 year olds (and the Army generally would not enlist teenagers up until the 20th Century) to spend the month of November in the snow, in Wyoming, eating moldy bacon is one thing.  Getting 60 year olds to do that, and to keep functioning, is quite another.  Now, a lot of 19th Century 60 year olds were perfectly capable of doing that, and even more are now, but in a profession in which a career man was at it for decades, you had been badly injured and seriously ill at some point by that time, making it all the tougher.  Indeed, according to one statistical analysis I've seen, the majority of American men lived with some chronic condition by age 40.  Probably the majority now do as well, but at that time, you just endured it. And enduring it wears you down.

The Army, indeed all armies, recognized this, and they all began to introduce retirement systems.  In the U.S., after the Civil War, the Army first allowed officers to retire after 40 years of service.  Soon thereafter, this policy was expanded to include enlisted men. Other countries adopted similar policies.

The Last Muster.  Pen and ink depiction of British Army pensioners, in uniform.

This served two purposes.  One is that it simply recognized decades of service.  But it also recognized that younger men made better soldiers for a variety of reasons.  One was, of course, physical.  The original retirement system, which effectively retired U.S. soldiers at about 60 years of age, recognized that by that time they probably were physically pushing the limits of their service abilities.

World War One poster noting the physical abilities of generations of soldiers.

Indeed, this was so much the case that Theodore Roosevelt encouraged the early retirement of officers who were no longer physically fit, during his presidency, by requiring officers to go on long rides (ninety miles) on their mounts.  All officers were expected to know how to ride in that era, and he tested them on it.  By that time, many older ones couldn't endure it, and accordingly they were retired.  Even officers in the Navy were given the choice of going on a very long horseback ride, a very long bicycle ride, or a very long hike.

The other purpose was an intellectual one, however.  By the late 19th Century the military sciences were advancing rapidly, and the Army began to recognize that keeping old officers in place impaired the ability to adapt.  The American army was legendary for keeping men in their same ranks for eons, and by the early 20th Century, this was recognized to be a bad idea.  Sixty year old captains who had held the same command for fifteen years were much less likely to appreciate newly introduced weapons than, for example, a captain in his twenties might be.  The Army accordingly dropped the service requirement for retirement to 30 years, effectively encouraging, but not requiring, men to retire early, at 3/4 pay, in their 50s.

Even this proved to be problematic at the start of World War Two, and the Army, recognizing a need to adapt to a change in the nature of war, dropped the requirement to twenty years.  This, it must be noted, was an early retirement option. To obtain full retirement a soldier had to stay in for 40 years, as they still do. But he could take half pay and retire at 20 years.  Less attractive at first to enlisted men than to officers, this provided a means to encourage retirement for officers whom the service wanted out of the way, which it soon found other ways to additionally encourage.

 While the U.S. armed forces did indeed encourage a lot of older soldiers to "move on" at the start of World War Two, combat attrition in World War One had been so high in the Commonwealth nations that this poster actually was aimed at drawing World War One soldiers back in, noting that many in their 40s and 50s still had plenty of vigor for later service.  Unlike the U.S. Army, the British used a fair number of older officers during the war, as did the Germans.

This created the modern service retirement system we still have in place. The system spread out of the Armed Forces and into nearly every type of uniformed service we have today.  Policemen, for example, generally can retire early at 20 years of service.  It's even spread out of uniformed service in some instances, and some other sorts of government workers have retirement systems of this type.

Retirement in other fields is a somewhat more recent phenomenon.  It's a product of the industrial revolution, really.  Industrial employment, like military service, chewed men up.  It also organized them.  And this organization both created opportunities, and threats, depending upon how they were handled.  And it also removed men from the rural support system in which they'd previously lived.  If a blacksmith was injured in his small town occupation, chances were that his sons, or brothers, could take over for him, and if he had to stay home, no doubt in poverty, at least there was a home to go to.  When he grew old and could no longer work, the same was true, and chances were high that there was a fireside to stay near, as the younger men went out to work.

Once industrial labor arose, this was no longer true.  Early industrial laborers were displaced from farms and small towns to a very large extent.  As a result, they were disoriented, rootless, and in some ways at the mercy of their environment.  Ultimately, they came to agitate for protection, cognizant of the dangers of their work and what that meant for them personally.

This created a wide variety of responses, but one of them ultimately came to be socially sponsored retirement.  Men could not work in heavy industry indefinitely, but neither could they leave those occupations with anything to fall back on. Something had to replace the family supported home to retire to, and that came to be retirement, either government sponsored or employer funded, both of which served to keep the social wolf from the door.

Early moves towards wider retirement started in the early 20th Century, with the first proposal for Social Security being advanced in the Progressive Party campaign of Theodore Roosevelt.  Roosevelt's Bull Moose campaign failed, but the Great Depression gave new force to the argument and Social Security, a fairly radical proposal by historical standards, came to be reality under Franklin Roosevelt.  By that time, heavy industry had privately incorporated retirement in many instances. World War Two, which increased the advantages to private industry of supplying benefits during a period in which wages were frozen, boosted it further.
 
Female industrial laborer, World War Two. Labor had been agitating for benefits beyond increased wages since the late 19th Century, but it was World War Two that really changed the nature of health care and retirement in the United States.

This gave us the situation we had in the middle of the 20th Century, and which lasted until at least the 1970s. By and large, in private employment in the US, most occupations offered pensions of some sort.  This promised workers the ability to retire at age 65.  In addition to that, Social Security became available at age 62, with full benefits payable at age 65.  For those with service occupations, retirement came to be available after 20 or 30 years.  Frequently, those who had service employments went on to a second career, in light of the fact that they were retiring fairly young.

So far so good, but starting in the late 1980s, something began to break down in this system. While the system hasn't completely broken, concern over it is widespread.  What happened?

Well, for one thing work stability declined in the private sector, while seemingly solidifying in the government sector, at least up until very recently.  For long time government employees, I'd note, many are having their last laugh now after years of derision by those in the private sector.  I've heard this more than once, for example, from government lawyers who are nearing retirement, having often been chided by their private sector fellows for sticking it out in "low paying" (which were really lower paying, not low paying) positions for decades.  Now the government sector lawyers are able to retire, while many of those in higher paying private practices cannot.  Indeed, one comment of that type just appeared on an ABA website about retirement.

In the private sector, manufacturing jobs became highly unstable, if they didn't disappear completely, and this spilled over into white collar occupations as well.  Much has been made of the fact that employees can't enter an occupation and plan on sticking it out for their career, for one reason or another.  One of the things little noted about that is that with that instability has come the evaporation of retirement plans.  Retirement plans only make sense for long term employees, not short term ones.

So, is this system broken?  Put another way, is it unrealistic?

That's hard to answer, but retirement is rapidly becoming something that is not nearly as certain for many people as it once was.  Social Security wasn't designed to provide a fancy retirement, just to keep people from falling into poverty in retirement, and it wasn't meant to cover 100% of the people who paid into it either.  Indeed, it still doesn't cover 100%, but in an era when medicine has made early deaths less common, more people now live into their old age and advanced old age.

But another aspect of this may simply be that expectations about retirement became unrealistic.  Truth be known, much of our concept of retirement is retirement as envisioned by the World War Two and Boomer generations, which was never the historical norm.  The abnormal economies of the 1940s through 1960s led people, gradually, to an expectation of a sort of luxurious retirement, replete with a new home far away from where they'd worked.  Historically, however, in the 60 or so years prior to that, retirement just meant retirement in place and in scale.  People tended to live decades in one house, which they'd paid off well before they retired.   When they retired, they stayed home; they didn't travel the globe or dream of planting vineyards in Tuscany.

But in order to do that, a person has to have paid their debt down to next to nothing, or nothing, by the time they're in their 60s at least, if not their mid 50s.  Otherwise, they're going to have to have a pretty significant income in retirement.  That is unrealistic.

So, what does all of this mean?  Hopefully it doesn't mean that retirement has returned to its absolute historical norm, i.e., non existent.  But it does mean that the golden age of retirement is most likely over, at least for the foreseeable future, and in the type of economy we have now.  Social Security is already being readjusted to creep it back to its more historical demographic status, and ages of entitlement have started to go up.  I strongly suspect that will start occurring in government retirements as well, which are now strained.  Twenty year plans, where they exist, will disappear in favor of thirty year plans that only allow a draw once the recipient hits age sixty, much like Army Reserve retirements now work.  That'll probably continue as well.  For those retiring in the future, a paid off home with a garden in the backyard is probably a lot more likely than trips to France and vineyards in Tuscany.



Postscript

The New York Times has an article on retirees today noting that those who want to keep on working often have a hard time finding a job that suits them, and that those who have retired often find they like it better than they had supposed.

I'm glad to read that, really.  While it's contrarian in nature, I find questionable the general view in our society that it's great if people past retirement age can keep working, and that they really should. Should they, if they can retire?  I'm not so sure.  It's discouraging to think that the value of a person is measured only in their ability to work, and that for everyone it must be the case that all their adult years must be employed.  That says something about us, I think, as a society.

While it's also contrarian of me to mention it, this sort of taps into the theme of one of the Super Bowl advertisements from this year, in which actor Neal McDonough discusses how in the US we get two weeks off per year for vacation (which is inaccurate for most Americans; most don't take all their vacation, so they take less than that) while the French take August off (which is also inaccurate, as the French also tend to extend their vacations with a general strike from time to time).  We're informed we're a can do sort of people, and at the end it's suggested that our reward for that is a Cadillac.

Well, I have nothing against Cadillacs, but that advertisement sort of makes you wonder if you should go for a Peugeot instead and take August off.

Wednesday, March 12, 2014

The Big Speech: The Dream of the Rood

The Dream of the Rood: Translation by Richard Hamer

Translation of the dramatic Old English poem

Wyoming Jambalaya




Antelope summer sausage, frozen seafood package (shrimp, imitation crab, calamari) and red bean and rice mix.  Not bad.

Law Office Space: How Much Is Enough? - Attorney at Work

Mid Week at Work: The Civil Air Patrol.

Photographs of the Civil Air Patrol during World War Two. The CAP was made up of civilian volunteers organized into an auxiliary of the Army Air Corps for the purpose of patrolling the coasts.  They detected over 100 submarines during the war.  The organization exists today as an auxiliary of the USAF and performs search and rescue operations.