Showing posts with label Yeoman's First Law of History. Show all posts

Wednesday, September 13, 2023

Hard soled sandals.

New fossil footprint evidence suggests that humans wore hard soled sandals over 80,000 years ago.

No surprise.

Friday, June 30, 2023

Can't win for losing. Supreme Court Strikes Down Affirmative Action.

For the reasons provided above, the Harvard and UNC admissions programs cannot be reconciled with the guarantees of the Equal Protection Clause. Both programs lack sufficiently focused and measurable objectives warranting the use of race, unavoidably employ race in a negative manner, involve racial stereotyping, and lack meaningful end points. We have never permitted admissions programs to work in that way, and we will not do so today.

At the same time, as all parties agree, nothing in this opinion should be construed as prohibiting universities from considering an applicant’s discussion of how race affected his or her life, be it through discrimination, inspiration, or otherwise. See, e.g., 4 App. in No. 21–707, at 1725–1726, 1741; Tr. of Oral Arg. in No. 20–1199, at 10. But, despite the dissent’s assertion to the contrary, universities may not simply establish through application essays or other means the regime we hold unlawful today. (A dissenting opinion is generally not the best source of legal advice on how to comply with the majority opinion.) “[W]hat cannot be done directly cannot be done indirectly. The Constitution deals with substance, not shadows,” and the prohibition against racial discrimination is “levelled at the thing, not the name.” Cummings v. Missouri, 4 Wall. 277, 325 (1867). A benefit to a student who overcame racial discrimination, for example, must be tied to that student’s courage and determination. Or a benefit to a student whose heritage or culture motivated him or her to assume a leadership role or attain a particular goal must be tied to that student’s unique ability to contribute to the university. In other words, the student must be treated based on his or her experiences as an individual—not on the basis of race.

Many universities have for too long done just the opposite. And in doing so, they have concluded, wrongly, that the touchstone of an individual’s identity is not challenges bested, skills built, or lessons learned but the color of their skin. Our constitutional history does not tolerate that choice.

The judgments of the Court of Appeals for the First Circuit and of the District Court for the Middle District of North Carolina are reversed.

It is so ordered.

After a series of decisions in cases that liberal pundits had worked themselves into self-inflicted angst over, in which the Court didn't realize their fears, the Court finally did realize one of them and struck down affirmative action in university admissions, something it warned it would do 25 years ago.

The reason is simple. Race-based admission clearly violates US law and the Equal Protection Clause. That was always known, with the Court allowing the exception in order to attempt to redress prior racism.  As noted, it had already stated there would be a day when this would end.  The Court had been signaling for years that it would do this.

Indeed, while not the main point of this entry, it can't help but be noted that when the Court preserves a policy like this one, as it did last week with the also race-based Indian Child Welfare Act, liberals are pretty much silent on it.  There are no howls of protest from anyone, but no accolades either.  Political liberals received two (expected, in reality) victories from the Court in two weeks that they'd been all in a lather regarding. They seemed almost disappointed to have nothing to complain about, until this case, which gave them something.

Predictably, the left/Democrats reacted as if this is a disaster.  It isn't.  Joe Biden instantly reacted.  Michelle Obama, who has a much better basis to react, also made a statement, pointing out that she was a beneficiary of the policy, which she was.  That's fine, but it doesn't mean the policy needed to be preserved in perpetuity.

At some point, it's worth noting, these policies become unfair in and of themselves.  Not instantly, but over time, when they've redressed what they were designed to.  The question is when, and where.  A good argument could be made, for example, that as for the nation's traditionally largest minority, African Americans, this policy had run its course.  In regard to Native Americans?  Not so much.

Critics will point out that poverty and all the ills that accompany it still afflict African Americans at disproportionate levels, and that's true. The question then becomes why these policies, which have helped, don't seem to be able to bridge the final gap.  A whole series of uncomfortable issues are then raised, which the right and the left will turn a blind eye to. For one thing, immigration disproportionately hurts African Americans, which they are well aware of.  Social programs that accidentally encouraged the break-up of families and single parenthood hit blacks first, and then spread to whites, helping to severely damage American family structures and cause poverty.  Due to the Civil Rights Movement, African Americans became a Democratic base, which was in turn abandoned by the Democrats much like Hard Hat Democrats were, leaving them politically disenfranchised.  Black membership in the GOP has only recently increased (although it notably has), as the black middle class and traditionally socially conservative black community has migrated towards it, but that migration was severely hindered by the legacy of the Southern Strategy, which brought Southern (and Rust Belt) Democrats into the party and with them populism and closeted racism.

While the left will howl in agony over this decision, it won't really do anything about the ongoing systemic problems that isn't solidly grounded in the 1960s and 70s, and for that matter probably moribund.  Pundits who are in favor of institutionalizing every child during the day will come out mad, but they won't dare suggest that immigrants take African American entry-level jobs.  Nobody is going to suggest taking a second look at social programs that encourage women of all races to marry the government and fathers to abandon their offspring, something that Daniel Patrick Moynihan, a Democrat, noted in regard to the African American family before the trend spread to the white family.  The usual suspects will have the usual solutions and the usual complaints, none of which are working to push a determinative solution to this set of problems.

Hardly noted yet, we should observe, is that this decision, just like Obergefell and Heller, will have a longer reach than people now seem to appreciate.  If college affirmative action is illegal, then similar race-based programs (save for ones involving Native Americans, who are subject to the Indian Commerce Clause) are as well. And maybe so are gender-based ones, including ones that take into account the ever-expanding phony categories of genders that progressives add to every day.  In other words, if programs that favor minority admission into university are invalid, probably Federal Government policies that favor women-owned companies over others are as well.

Indeed, they should be.

Societies have an obligation to work towards equality before the law, and before society, for all.  But the essence of working on a problem is solving it.  The subject policy was successful for a long time, but this institutionalized favoritism was no longer working to a large degree and, in some instances, was impacting others simply because of their race.  It's not 1963, 1973, or 1983 any longer.  New thoughts should be applied to old problems.

Some of those new thoughts, frankly, should address to what extent we must continue to have an 1883 view of the country, as if it has vast unpopulated domains to settle that it needs to import people to fill.  Another might be, however, that American society really has fundamentally changed on race even within the last 20 years.  While racism remains, and the Obama and Trump eras seem to have boiled it back up, for different reasons, a lot of street-level racism really is gone.  For one thing, seeing multiracial couples with multiracial children no longer causes anyone to bat an eye, and that wasn't true as recently as 20 years ago.  We may be a lot further down this road than anyone suspects.

Sunday, February 12, 2023

Neanderthal Crab Bakes.

Neanderthals living 90,000 years ago in a seafront cave, in what’s now Portugal, regularly caught crabs, roasted them on coals and ate the cooked flesh, according to a new study.

From CNN.

No surprise. Why wouldn't they have roast crabs?

Neanderthals eating crabs 90,000 years ago.  Okay, actually, these folks are in Raceland, Louisiana in 1938, but it's the same thing, probably right down to the beer.  The messiness of eating crab is shown by the newspapers, and that also explains why those looking in the subject cave can tell Neanderthals ate crabs.

This provides, by the way, one more reason that being a vegetarian is nuts.  You don't toss out diets that we've been acclimated to for eons.


Friday, January 6, 2023

Neanderthals and their advanced brains.

A science headline on a paper just out yesterday:

Homo sapiens and Neanderthals share high cerebral cortex integration into adulthood

From a synopsis by the authors of the study:

A surprising result

The results of our analyses surprised us. Tracking change over deep time across dozens of primate species, we found humans had particularly high levels of brain integration, especially between the parietal and frontal lobes.

But we also found we're not unique. Integration between these lobes was similarly high in Neanderthals too.

I know it sounds flippant, but I'm not surprised.  I would have expected our brains, and Neanderthal brains, to be just about the same.  And that's because I also believed this:

There's another important implication. It's increasingly clear that Neanderthals, long characterized as brutish dullards, were adaptable, capable and sophisticated people.

I, of course, maintain that Neanderthals weren't a different species at all, but simply a subspecies of our species.

Sunday, January 1, 2023

Sustainable fashion.

A new study reveals the following:

Humans have been using bear skins for at least 300,000 years, suggests study

This is not surprising, of course.


Tuesday, November 22, 2022

Lex Anteinternet: Evidence for the cooking of fish 780,000 years ago...A few observations.



A few odds and ends on this story:
Lex Anteinternet: Evidence for the cooking of fish 780,000 years ago...:   Evidence for the cooking of fish 780,000 years ago at Gesher Benot Ya’aqov, Israel Yup.  And. . .  The early Middle Pleistocene site of Ge...

By most reckonings, the humans, and they were humans, who were grilling up the carp were not members of our species, Homo sapiens.

They likely would have been Homo heidelbergensis or Homo erectus, the former having at one time been regarded as a subspecies of the latter.

No matter; these people were a lot closer to you than you might imagine.  Their brain capacity, for one thing, was not far off that of modern humans, at about 1,200 cc.  FWIW, the brain capacity of archaic Homo sapiens was actually larger than that of current people, members of the subspecies Homo sapiens sapiens. Our current brains are pretty big, in relative terms, at about 1,400 cc, although Neanderthals' were bigger, at 1,500 cc.

About the "archaic" members of our species, it's been said that they're not regarded as their own species; they have been "admitted to membership in our species because of their almost modern-sized brains, but set off as ‘archaic' because of their primitive looking cranial morphology".1  Having said that, some people say, no, those are Homo heidelbergensis.  It can be pretty difficult to tell, actually, and as has been noted:

One of the greatest challenges facing students of human evolution comes at the tail end of the Homo erectus span. After Homo erectus, there is little consensus about what taxonomic name to give the hominins that have been found. As a result, they are assigned the kitchen-sink label of “archaic Homo sapiens.”

Tattersall (2007) notes that the Kabwe skull bears more than a passing resemblance to one of the most prominent finds in Europe, the Petralona skull from Greece. In turn, as I mentioned above, the Petralona skull is very similar to one of the most complete skulls from Atapuerca, SH 5, and at least somewhat similar to the Arago skull.

Further, it is noted that the Bodo cranium from Africa shares striking similarities to the material from Gran Dolina (such as it is). This suggests that, as was the case with Homo erectus, there is widespread genetic homogeneity in these populations. Given the time depth involved, it is likely that there was considerable and persistent gene flow between them. Tattersall (2007), argues that, since the first example of this hominin form is represented by the Mauer mandible, the taxonomic designation Homo heidelbergensis should be used to designate these forms. This would stretch the limits of this taxon, however, since it would include the later forms from Africa as well. If there was considerable migration and hybridization between these populations, it could be argued that a single taxon makes sense. However, at present, there is no definitive material evidence for such migration, or widespread agreement on calling all these hominins anything other than “archaic Homo sapiens.”2

Regarding the appearance of the first ancestors of our species:

When comparing Homo erectus, archaic Homo sapiens, and anatomically modern Homo sapiens across several anatomical features, one can see quite clearly that archaic Homo sapiens are intermediate in their physical form. This follows the trends first seen in Homo erectus for some features and in other features having early, less developed forms of traits more clearly seen in modern Homo sapiens. For example, archaic Homo sapiens trended toward less angular and higher skulls than Homo erectus but had skulls notably not as short and globular in shape and with a less developed forehead than anatomically modern Homo sapiens. archaic Homo sapiens had smaller brow ridges and a less-projecting face than Homo erectus and slightly smaller teeth, although incisors and canines were often about as large as that of Homo erectus. Archaic Homo sapiens also had a wider nasal aperture, or opening for the nose, as well as a forward-projecting midfacial region, known as midfacial prognathism. The occipital bone often projected and the cranial bone was of intermediate thickness, somewhat reduced from Homo erectus but not nearly as thin as that of anatomically modern Homo sapiens. The postcrania remained fairly robust, as well. To identify a set of features that is unique to the group archaic Homo sapiens is a challenging task, due to both individual variation—these developments were not all present to the same degree in all individuals—and the transitional nature of their features. Neanderthals will be the exception, as they have several clearly unique traits that make them notably different from modern Homo sapiens as well as their closely related archaic cousins.3

Well, what that tells us overall is that we were undergoing some changes during this part of the Pleistocene, that geologic epoch lasting from about 2,580,000 to 11,700 years ago.

And that, dear reader, points out that we're a Pleistocene mammal.

It also points out that we don't yet have a really good grasp of when our species fully came about.  We think we know what the preceding species was, but we're not super sure when we emerged from it.  And of course, we didn't really "emerge," but just kind of rolled along, mother and father to children.

Which tells us that Heidelbergensis may have been pretty much like us, really.

Just not as photogenic.

On that, it's also been recently noted that the best explanation for the disappearance of the Neanderthals, now widely regarded as a separate species that also emerged from Heidelbergensis, is that they simply crossbred themselves out of existence.  Apparently they thought our species was hotter than their own.

Assuming they are a separate species, which I frankly doubt.

There were definitely morphological differences between Heidelbergensis and us, but as we addressed the other day in a different context, everybody has a great, great, great . . . grandmother/grandfather who was one of them.

And another thing.

They ate a lot of meat.

A lot.

I note that because it was in vogue for a while for those adopting an unnatural diet, i.e., vegetarianism, to claim that this is what we evolved to eat.

Not hardly.  With huge brains, and cold weather burning up calories, we were, and remain, meat eaters.

Footnotes:

1.  Bae, C. J. (2013). Archaic Homo sapiens. Nature Education Knowledge 4(8):4.

2.  James Kidder, The Rise of Archaic Homo sapiens.

3.  11.3: Defining Characteristics of Archaic Homo sapiens.

Sunday, November 20, 2022

Evidence for the cooking of fish 780,000 years ago at Gesher Benot Ya’aqov, Israel

 


Evidence for the cooking of fish 780,000 years ago at Gesher Benot Ya’aqov, Israel

Yup.  And. . . 

The early Middle Pleistocene site of Gesher Benot Ya’aqov, Israel (marine isotope stages 18–20; ~0.78 million years ago), has preserved evidence of hearth-related hominin activities and large numbers of freshwater fish remains (>40,000). 

People like to eat fish, and save for the oddballs who like to eat sushi, for which there is no explanation, they like their fish cooked.

Most places, people like to eat carp too.  For some odd reason, there's a prejudice against carp in at least the Western United States, but elsewhere, not so much.

So, our human ancestors 780,000 years ago. . . put another carp on the barbi. . . 

Monday, October 31, 2022

Why on earth would this be surprising in any fashion?

 Regarding a set of Neanderthal remains found in Siberia:

When Skov started comparing the genomes from Chagyrskaya, he got the surprise of his career. Two individuals, an adult male and a teenage female, shared half of their DNA, a situation that could occur only if they were siblings or a parent and child. To determine the relationship, the researchers examined mitochondrial DNA — which is maternally inherited and would therefore be identical between siblings and between a mother and child, but not between a father and child. This differed between the male and female, suggesting that they were father and daughter.

This is a huge whopping surprise?

Friday, September 9, 2022

Donkeys


Donkeys transformed human history as essential beasts of burden for long-distance movement, especially across semi-arid and upland environments. They remain insufficiently studied despite globally expanding and providing key support to low- to middle-income communities. To elucidate their domestication history, we constructed a comprehensive genome panel of 207 modern and 31 ancient donkeys, as well as 15 wild equids. We found a strong phylogeographic structure in modern donkeys that supports a single domestication in Africa ~5000 BCE, followed by further expansions in this continent and Eurasia and ultimately returning to Africa. We uncover a previously unknown genetic lineage in the Levant ~200 BCE, which contributed increasing ancestry toward Asia. Donkey management involved inbreeding and the production of giant bloodlines at a time when mules were essential to the Roman economy and military.

Abstract, The genomic history and global expansion of domestic donkeys.

Thursday, August 4, 2022

Footprints dating back 12,000 years have been found in salt flats at Hill Air Force Base in Utah.

More evidence showing that human beings had spread well into the continent much earlier than had, until recently, been supposed.

The area was, at the time, a wetland. The footprints appear to be those of women and children.

Tuesday, January 18, 2022

The antiquity of the species.

Scientists say a Homo sapiens fossil found in Ethiopia in the 1960s is at least 233,000 years old, which would make it 36,000 years older than the previous estimate.

No surprise whatsoever.

Which means that the species is at least 300,000 years old, even if nobody is going to admit that.

Probably older.

No big surprise.

As it's probably more like 500,000.


Saturday, December 18, 2021

A couple of interesting items. . . to ponder.

View from the S H Knight (geology) Building in 1986.

Recent research has indicated that humans reached the Faeroe Islands at least 300 years prior to the Vikings doing so.

This doesn't surprise me a bit; apparently it's been more or less known for some time, and it's what I would have expected, but new studies, involving obtaining DNA from the bottom of a lake, have proven it conclusively.

Evidence of really old sheep defecation was found down there.  Maybe sort of gross sounding in a way, but really cool nonetheless.  So not only was early colonization much earlier than guessed at, it was true colonization.  I.e., we know about this place and we're bringing our sheep.

Really cool, in my opinion, is that part of the groundbreaking research was done by Dr. Lorelei Curtin of the University of Wyoming. She is a post-doctoral researcher at the university's Department of Geology and Geophysics, of which I'm a graduate.

She specializes, I'd note, in climate research, and another study just out notes that global cooling seems to be brought about by global warming, something I was taught when a student in that department some 35 or so years ago.

Graduates of the other department that I'm a graduate of, the College of Law, have not pegged me out on the pride meter much as time has gone on, but the Department of Geology and Geophysics is different.

Well, go Pokes.

I'll note this as well. The Vikings first settled Iceland starting in 874 and Greenland around 980.  I'm guessing that the latter date is correct, but I'll bet that somebody was on Iceland by 874. Rather obviously, the Vikings weren't great at recording who exactly was already there when they arrived, as the Faeroe Islands discovery more or less proves.

Saturday, December 4, 2021

Milk, Big Lunches, and Coffee

Milk, it does a body good. . . or not.

The ability of some, but not all, human populations to digest milk as adults is supposedly only about 6,000 years old.


I am not a huge milk fan, frankly.  Some people really are.  There are adults who like milk so much that they'll buy it and regularly consume it, although they seem to mostly do that at home, and overall milk drinking really drops off among adults.

It's not that I detest it either.  I just quit drinking it pretty much as a teenager, and I don't like it enough to resume drinking it. . . if I could.  I don't recall the last time I simply drank a glass of milk, but it would be a long time ago.  The last time I regularly did, I'm pretty sure, was in basic training, as they served it in little pint cartons, just like schools used to do.

Milk has to be digested.


Our species is supposedly about 300,000 or so years old, which probably means it's closer to 500,000 years old, if not 600,000 or more.  We only got around, they say, and as noted above, to drinking cow's milk about 6,000 years ago, supposedly, which probably means it's been a little longer than that.

Now, if you are a mammal, you are evolved to drink milk. . . as a baby.  Milk, all milk, contains lactose, a sort of sugar, and babies produce lactase in order to be able to digest it.  But human adults, in our default state of nature, don't.

In fact, no adult mammal does.  Not even cats, which will drink milk for the fat in it. Cats can't taste sweet, by the way, so they're not experiencing milk like you do when they drink it.  It's more like gulping down a liquid bratwurst for them.

About 6,000 or so years ago a mutation for lactase persistence started showing up in European genetics.  In reality, lactose intolerance isn't so much a genetic deficiency as lactase persistence is a genetic advantage.  What was pretty clearly going on is that people were keeping cattle so they didn't have to go out and hunt them (wild cattle still existed in Europe. . . indeed they existed on much of the globe), and at some point, perhaps out of desperation, they started drinking their milk.  That was probably a bold move, but we'd note that the only pastoral people on earth who remain lactose intolerant are African pastoralists, who will, if things get desperate, bleed cattle for protein, which is sort of similar in a way.

According to the BBC, the first humans to take up drinking milk probably were rather flatulent, but if my own experience means anything, they probably felt a little sick to their stomach.

I.e., as an adult I've become somewhat lactose intolerant.

It's a bummer.

It just happened over the last couple of years, which surprises me.  My lactase production as an adult must have always been somewhat weak, but only very recently has this become a problem.  But it has now.

It's breakfast where the problem really shows up.

As best as I can determine, in my amateur but scientific fashion, I still produce some lactase.  I can and do eat cheese, for example. And usually things cooked with milk in them, which I frankly don't eat a lot of, don't bother me.  I'll note that I'm also mildly allergic to eggs, and this is true of them as well.

But putting milk on cereal has become enough of a problem that I have to use it very sparingly, and even then it's sometimes a problem.  And there's one egg/milk casserole dish that my wife occasionally makes that is practically a no-go for me.  Just too darned much.

Pass the cheese-slathered leftovers, please.

I'm not much of a breakfast eater anyway, now that I'm in my older years.  I tend to eat it, as I hardly eat lunch at all, and that way I don't get too hungry during the day if I'm doing something, although truthfully, if I don't eat at all, it usually doesn't matter.

That's probably because I have an office job most days, and sitting around on your butt isn't expending much in the way of calories.

That's self-evident, I think, but to a lot of people it doesn't seem to be, or it is only in a sort of chasing-the-tail fashion.

I note that as it seems that about 100% of the European American population in the United States is on some sort of a "diet".  I just commented on this.  This affliction doesn't seem to wander into other ethnicities, insofar as I'm aware, but for European Americans, at least middle class and upper class ones, it holds true.

Americans have long had a problem with magical thinking about diet and medicine, as it's easier than actually accepting the science of things.  I.e., rather than think "yikes, I'm getting sick and might need to see the doctor," it's easier to buy "essential oils" or some other crap off the Internet.  If you don't die, you can proclaim it cured you, and if you die, you won't be around to make that point.

Diet works the same way.

The basic biology of diet is pretty simple.  You expend so many calories just existing, and if you work beyond that, you'll expend more.  Sitting around in an office doesn't expend many calories.  So if you don't want to gain weight, the first principle would be not to eat too much.
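That arithmetic can be sketched directly. This is a back-of-the-envelope illustration only, using the commonly cited (and admittedly rough) approximation that a pound of body fat stores about 3,500 calories; the intake and burn figures are hypothetical, not anyone's actual numbers:

```python
# Rough energy-balance arithmetic: calories in minus calories out,
# converted to approximate weight change using the commonly cited
# (rough) figure of ~3,500 kcal per pound of body fat.

KCAL_PER_POUND = 3500  # widely used approximation, not a precise constant

def weekly_weight_change(daily_intake_kcal, daily_burn_kcal):
    """Approximate pounds gained (+) or lost (-) per week."""
    daily_surplus = daily_intake_kcal - daily_burn_kcal
    return daily_surplus * 7 / KCAL_PER_POUND

# A hypothetical office worker burning ~2,000 kcal/day but eating
# three full meals totaling ~2,500 kcal/day:
print(round(weekly_weight_change(2500, 2000), 1))  # → 1.0 pound gained per week
```

Small daily surpluses compound: the point of the sketch is that a sedentary job plus farmhand-sized meals adds up even when no single meal looks excessive.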

The second one would be not to eat an unnatural diet.  If it comes prepackaged in cardboard, it's probably unnatural.

Anyhow, that's simple enough, but that means you'd have to eat less, and for a lot of people, that's a real bummer.  Most people like food, and most people like some food that is way high calorie.

The big problem is, however, that most people don't work for their food in the physical sense, the most classic definition of work (but not the only one) being W = F·d, with F = ma; that is, work is force, itself mass times acceleration, applied over a distance.  No, most people don't do that.
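For reference, the standard physics statement, in the simple constant-force, straight-line form that the walking example needs:

```latex
% Force is mass times acceleration; work is force applied over a distance.
F = m a, \qquad W = F \cdot d = m a d
```

Walking to work moves a mass over a distance against resistance; sitting in a car seat moves the same mass but the engine, not the eater, supplies the work.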

Take even the period just prior to World War One, which wasn't that long ago in real terms.  There was some prepackaged food in the form of canned goods, and there was food that people canned themselves. And there were salted and brined meats as well. But by and large, what most married people experienced involved quite a bit of work. 

If you lived where I do, for example, there's a strong chance that you walked to work, if you were a man or one of the minority of women who were employed outside the home.  Some were driving by 1921, but a lot were still walking, and in 1911, most were walking. That's work under the physics definition.

Married women, or women in a married household who were adults, typically went to the grocers and the meat market every day during the day. So that was more work. 

In contrast, now most people simply drive to the grocery store and get what they need, which involves work for the car, but not for the eater.

Added to that, in addition, quite a few people worked to some degree, at least by having a garden, for their food.

Some of this still goes on, but by and large people are highly acclimated to doing very little physical work for their food.

This isn't really new. Since the mid 20th Century this has been an increasing trend, and by the late 20th Century, when, it might be noted, people really started putting on weight, it was much like it is now.  The odd thing is, however, that people have never really gotten away from large-scale food consumption.

Eating three full meals a day makes sense if you are a farm hand in 1910, but not much if you are an office worker in 2021.  For that matter, even aboriginal people rarely eat that much, and that's what our bodies mostly think we are, with some regional evolutionary adaptations for agriculture.  If you don't have those adaptations at all, what your body thinks is that you might not eat today. . . or tomorrow, but you'll be okay when you kill that deer the day after.  But pass on that milk . . .

Or if your ancestors, let's say, lived in the Mediterranean, your body probably thinks you'll get three squares with lots of grains and cheese, but you're also going to be spending almost all day hiking around with your goats.

Your body never thinks that you are going to eat a hearty breakfast, drive to work, and eat a lunch as big as most people's dinners in prior eras, then drive home and eat an even bigger dinner.

That's what a lot of people actually do, but very few people are ready to admit it.

I'm 5'6" tall and I weigh 165 lbs (normally).

I did a post on this quite a while back, from an historical perspective.  That post is here:

Am I overweight? Well, that might depend on the century.

I realize now that I actually messed up in that post, as the chart linked there involved only women.  By it, my weight, 165, would have been overweight for a woman, barely, but my guess is that it wouldn't have been for a man.  According to current figures, however, I'm overweight to the tune of 10 lbs.
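That "10 lbs" figure checks out against the standard US-units body mass index formula (BMI = 703 × weight in pounds ÷ height in inches squared, with 25 as the conventional overweight threshold). A quick sketch, using the height and weight given above:

```python
# BMI in US units: 703 * weight (lbs) / height (inches) squared.
# A BMI of 25 is the conventional "overweight" threshold.

def bmi(weight_lbs, height_in):
    return 703 * weight_lbs / height_in ** 2

height = 66   # 5'6" in inches
weight = 165

print(round(bmi(weight, height), 1))   # → 26.6, just into the "overweight" range

# Weight at which BMI would be exactly 25:
threshold = 25 * height ** 2 / 703
print(round(weight - threshold, 1))    # → 10.1 lbs over the threshold
```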

Now, a lot goes into that, and I'll admit that I should lose some weight, even though I don't think I'm really all that much overweight.  Be that as it may, if I ate breakfast every day, and then followed up with a full meal at noon, and went home to eat dinner, I'd be very much overweight.  I'd guess at least another 20 lbs higher.

It's a matter of physics and metabolism.

On this, being overweight is not a sign of some moral failing.  I'm continually surprised when people assume it is.  Indeed, when Chesterton had a pending cause for canonization, there were some who noted that he was overweight.

Seriously?  That's why he shouldn't be canonized?

Coffee, it does a mind good

Some recent reports hold that drinking coffee significantly reduces the risks of dementia later in life, by which they presumably mean drinking a caffeinated beverage. The headlines were on coffee, however.  That's good news for me, as I drink a full pot of coffee before I go to work.

I've witnessed dementia up close and personal as my mother acquired it.  I'll be frank, it worries me, but then you have to play the cards you are dealt.  It doesn't occur on my father's side of the family at all.  Having said that, for the most part, most of the men in my family haven't tended to live much past. . . my current age. Once again, you play the cards you are dealt.

Having said that, it's probably the case that not too much can be drawn from the latter.  My father acquired a persistent internal infection that we don't really know the origin of, but which was probably related to his gall bladder and some ineffective early medical treatment (he knew what he had, but his physician didn't seem to completely grasp what was going on).  He inherited late in life gall bladder problems, by all appearances, from his mother, who also had them, and died from them.  However, they both had a bit of a fondness for certain foods that didn't help that, and I don't really have the same sort of sweet tooth they did.  My father's father died in his late 40s, but he had high blood pressure, which I don't.  My father's brother is in his late 80s and doing great, so hopefully. . . 

Anyhow, I drink a lot of coffee and I'm glad for the news.

Wednesday, October 6, 2021

Blog Mirror: Southern Rockies Nature Blog: Who Will Make Me These Old Skis?

Southern Rockies Nature Blog: Who Will Make Me These Old Skis?: Skis from 1300 years ago ( Secrets of the Ice. ) I have always enjoyed messing around with old cross-country ski gear. In high school, I pic...

Way cool.

And probably the user would recognize my Fischer 99s more readily than the skis out on the cross country track these days. 

Saturday, September 25, 2021

23,000 year old human footprints found in New Mexico.

And that's a big deal.

It's a big deal, as it pushes the human presence in North America way back beyond what had been previously suspected in a spectacular example of Holscher's First Law of History.

And if humans were in New Mexico 23,000 years ago, so far back that the Ice Age land bridge isn't a very good explanation for how humans got here from Asia, it means they arrived at least some time prior to that. After all, you can't walk from Point Barrow to New Mexico in a day.

Monday, February 15, 2021

The ruins of a "high production" brewery have been unearthed in Egypt. . .

 or rather rediscovered.  A British archeologist discovered it earlier, but apparently didn't appreciate what it was.

And what it was, was a brewery capable of producing 22,400 liters of beer at a time.

22,400 liters at a time.

That's quite a lot.

The significant thing, moreover, is that the brewery dates to 3,000 BC, some 5,000 years ago. So this means that Egypt was mass producing beer, at least from this facility, 5,000 years ago.

What would it have been like?  Well, that's a big guess.  There have been some efforts to brew it, but that involves a lot of guessing in and of itself.  All we can say for sure is that they brewed it, and apparently quite a lot of it.

Friday, December 11, 2020

Neanderthals buried their dead. Duh.

 

A new discovery has released the shocking news that Neanderthals buried their dead, reported as if this thought hadn't occurred to us before, which is odd, as we've already uncovered at least one Neanderthal grave with the deceased covered in flowers.

Here's more shocking news.  Neanderthals were people.

We're unfortunately in a "splitter" era in terms of Linnaean classifications.  There have always been two camps on such things: lumpers, who hold that species are big groups with a fair amount of diversity, and splitters, who hold the opposite.

I'm a lumper, as the lumpers are correct.

The classic definition of a species is that two of its members can interbreed and produce fertile offspring.  We're well aware that's the case with "our species" and Neanderthals, as most Europeans and, we've now learned, some Africans are packing around Neanderthal DNA.

But we should have known that all along, and indeed some biologists and archeologists long held that.  In fact, at one time our species was referred to as Homo sapiens sapiens, Cro-Magnon man as archaic Homo sapiens (or something like that), and Neanderthals as Homo sapiens neanderthalensis.  Looked at that way, there was one species that had at least three subspecies, maybe more, but those are the ones we knew about.

And that definition is correct. 

So now we've confirmed, not discovered, that Neanderthals buried their dead.  Of course they did.  We already know, due to some of the injuries they had and recovered from, that they cared for their injured as well.  

They were simply people after all.  Every bit as smart, or dull-witted, as we are.  

More significantly, burying the dead implies that they believed in an afterlife.  Their art may imply that as well.  Which likewise shouldn't surprise us.

Our ancestors. . . pretty much like us.

Saturday, October 17, 2020

Pandemics go way back. D'uh.

Somehow, science didn't appreciate that the Great Plague wasn't the first plague, as the Washington Post is now reporting.

Ancient teeth show history of epidemics is much older than we thought

How could we not have realized that?

Sunday, July 26, 2020

Misunderstanding demographics

The professor returns for a discussion on demographics.

One of the most common features of social science is completely misunderstanding the topic of demographics. Indeed, we ought to note that nearly all demographic predictions are wrong in retrospect.

One thing we might start off with is something we noted long ago in our first law of history, which holds:


This was presented, of course, in terms of the historical past, but it's also true of the present.  We might, in fact, need to present another rule, which would be:
By the time media picks up on a story, it's already well advanced.
Such is certainly the case here.

Pop social scientists have been worried about the "population bomb" since at least the 1960s, or even much earlier if you go back to Malthusian angst.  But the truth of the matter is that professional demographers have known for decades that the population curve is predicted to begin declining this century.  Usually they run projections through the end of the century and no further, for good reason: demographic predictions, as noted, are notoriously inaccurate.  But the ultimate decline in the human population is a scientific probability so well established that it's not worth debating.

Indeed, from a scientific perspective, the prediction that the population is going to keep growing and growing all over the globe, which has been popular in apocalyptic books, has been known to be flat-out wrong for at least two decades, if not longer.  Demographers began to revise their population predictions downward well over a decade ago to take into account the much more rapid decline, that's right, decline, in population that was already beginning to become a feature across the globe.  Apparently nobody really took note, however, until a couple of weeks ago, when they did it again, as the well established trend is accelerating and global population decline will therefore set in much more quickly than originally thought.

Indeed, in spite of the "what we're doing now to ourselves" concerns, some of which are indeed very valid, this is something that's been occurring since the turn of the prior century and was a matter of angst then.  Observers in the United States, for example, worried as far back as the early 20th Century that the white, or rather the White Anglo Saxon Protestant, demographic birth rate was dropping off so fast that it meant demographic death, while they also worried about the black birth rate (and the Catholic one), which was not dropping at that time.  Such concerns ultimately gave rise to the likes of Margaret Sanger, who called blacks "weeds" and promoted abortion as a way of arresting their population increase.  Only this past week did Planned Parenthood, the organization that she founded, change the name of one of their installations once it became too politically imprudent to continue to honor a woman who was a racist promoting abortion to keep black numbers down.

In Germany there was such a concern about the drop-off in the birth rate that the Nazi Party went to extreme lengths, icky propaganda, and icky programs to try to reverse the decline.  Ultimately, they even took to kidnapping children who they felt could pass for German and handing them out to German families.  Most never made it back to their original parents.  More than one country in that time frame launched programs to try to increase the number of children that women were having, all to no effect.

The reason that we note all of this is that the decline is a demographic fact that predates pharmaceuticals to prevent or abort birth.

It's closely tied to economics, something that's well known in one way, and the source of misplaced concern in another.  Simply put, and dating back prior to the real incorporation of women into the workplace, as advanced economies managed to raise the bulk of their populations into the middle class, birth rates fell.  As this is happening all over the world, and quickly, birth rates are accordingly falling quickly.

Indeed, poor societies tend to have high birth rates, although this isn't always true. The classic reason for this is that the poor have tended to depend on their children for support in old age, although in the United States, where most of the poor are actually at an economic level that would have been regarded as lower middle class a century ago, this isn't the reason.  In that case the law of unintended consequences has operated to economically incentivize childbirth while simultaneously operating to destroy marital bonds in the same demographic.

Which takes us to our next point, which perhaps should be a law as well, that being
Old understandings of conditions continue to be believed well after they are no longer correct.
Here that old understanding is the same one that operates classically to produce high birth rates in poor demographics, but with a societal application.

That is, it's widely believed that a declining birth rate is a societal disaster, as large numbers, indeed an ever increasing number, of new workers are necessary to support an aging, retired population.

That's complete nonsense.

It's nonsense, as it doesn't contemplate the fact of ever increasing societal wealth, which has been a decade-by-decade feature of economies ever since the beginning of the Renaissance.  It also doesn't take into account the advance of technology.

In fact, it's interesting to note that the very same societies and journals that are now worried about the human population decreasing are also proposing a Universal Basic Income because technology is, they assert (probably correctly), going to put so many people out of work.  So we're simultaneously worrying that a decrease in the population means there are fewer workers to support an aging population, while also worrying that entering generations of workers are going to be put out of work by technology.

If you consider that both things may be operating at one time, what it should lead you to believe is that: 1) a declining population of workers is a good thing, as work is also declining; and 2) advances in technology are making society wealthier, and that's a good thing, as it supports everyone, including the old.

Of course, that ought to also lead you to question the American policy of massive immigration rates, which is designed to offset our population decline, which would otherwise have set in during the 1970s.  We frankly aren't going to have a place for the workers we're bringing in, if technologists are correct, so what we're doing is importing future unemployment.  Indeed, as those same entrants come from less technologically advanced nations, the argument can be made that their future labor, which they depend on for a livelihood, will shortly be needed more where they are than where they are going.

All of which means:

1.  This isn't a future economic problem of any sort.
2.  It doesn't mean that future generations of elderly will be without economic support.
3.  It's actually good for the environment and the standard of living in every sense.

That doesn't mean, however, that there aren't some things worth considering, and perhaps being a bit worried about.

In that context, we'll take a look at bits of an essay by Fr. David Longnecker.  Fr. Longnecker is more classically concerned about this, and I've discounted those concerns above, but that doesn't mean there aren't societal things worth considering nonetheless, even if some should be rejected.

We'll start with his essay, which was titled:

THE ABOLITION OF MAN…LITERALLY
First of all, that's dramatic, but that isn't occurring.  People aren't going away.  There are a lot of us, and a long term decline in population would have to go on for at least two centuries before there's a real problem.  The BBC article that Fr. Longnecker linked in noted that Japan and Spain could see a 50% reduction in their populations by 2100, but if we consider that Japan's population is 126,500,000, nearly as high as Russia's, that would hardly amount to a disaster.  That would place Japan's population at around 60,000,000, which is still far too high to make Japan a really nice place to live.  Added to that, in human terms, Japan's pre World War Two population was 90,000,000, which the Japanese regarded as so dense that it required, in their view, starting colonies in Asia in order to export its population.

Spain's population is about 47,000,000, about the same as pre World War Two Germany's.  Spain isn't big.  If it had 23,000,000, it'd still be pretty darned crowded.
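The 50% figures above are easy to sanity-check with back-of-envelope arithmetic.  Here's a minimal sketch, illustrative only: real demographic projections are cohort-based, not a constant exponential rate, and the 2020 baseline and 80-year window are my assumptions, not anything from the BBC article.

```python
# Back-of-envelope: what constant annual growth rate would halve a
# population between 2020 and 2100?  Real projections are cohort-based;
# this is just compound-interest arithmetic run in reverse.

def annual_rate(start_pop: float, end_pop: float, years: int) -> float:
    """Constant annual growth rate turning start_pop into end_pop over `years` years."""
    return (end_pop / start_pop) ** (1 / years) - 1

def project(pop: float, rate: float, years: int) -> float:
    """Population after compounding `rate` for `years` years."""
    return pop * (1 + rate) ** years

japan_2020 = 126_500_000
rate = annual_rate(japan_2020, japan_2020 / 2, 80)

print(f"Rate needed to halve Japan by 2100: {rate:.2%} per year")  # about -0.86%
print(f"Japan in 2100 at that rate: {project(japan_2020, rate, 80):,.0f}")
```

The point the arithmetic makes is the one in the text: halving by 2100 takes a sustained decline of less than one percent per year, a slow drift rather than a collapse.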

You get the point.

The US, I'd note, hasn't seen such a decline. Our population is predicted to be higher in 2100 than it currently is, although not enormously so, as we keep importing a population.  The US is already so densely populated in some areas that regions that were once regarded as really idyllic are horrifying crapholes, Los Angeles being a prime example, followed by the rest of California.  We keep doing that for the reasons noted above, and also because we find it convenient to engage in a version of slavery light, in which we import the poor so that we don't have to pay our own poor a living wage.

Okay, so that only reinforces what I've already said.  So nothing to worry about, right?

Well, societally there is, which is mostly not what we're doing.  And here's where Longnecker, first among the really erudite critics, points out, succinctly, things that are problematic.

We first note:

What else is driving the lower birth rate? Young people are not choosing to marry and have children. Not only that, an increasing number are choosing not to make love. They can’t be bothered.
Is that really correct?

Well, part of it is, and part of it isn't. 

We've noted already that the widely held perception that people are marrying later and later isn't really true.  The young marriage ages that people generally are considering are actually usually demographic flukes that apply to unique economic conditions. 

Indeed, we've analyzed that all here in depth, and shown that marriage ages have remained remarkably stable since the Middle Ages:

Shockingly young! Surprisingly old! Too young, too old! Well, nothing much actually changing at all. . . Marriage ages then. . . and now. . and what does it all mean?

Added to this is the statistical problem of how couples that "cohabitate" or "live together" are regarded and counted.  Throughout the Christian era, until very recently, this was strongly frowned upon culturally, but at the same time it occurred much more often than might be supposed.  If examples given, for instance, in such well researched works as Kristin Lavransdatter are considered, such arrangements were fairly common among the gentry, if again strongly disapproved of, although they usually didn't retain an illegitimate status indefinitely.  They broke up or resulted in marriage.  They were extremely common among British minor nobility in the 18th Century and early 19th.  They'd become so common among the British industrial working class that the concept of Common Law Marriage was introduced in order to deal with the situation in a formal way.

When no fault divorce spread through the English-speaking world and then around the globe, the common law marriage died off. But the behavior remains.  This causes a statistical problem: if these same couples were regarded as they once were, marriage would in fact be much more common.  The interesting thing is that with the death of the common law marriage a certain elevation in the concept of formal marriage occurred, although divorce, which was once fairly uncommon, remains very common.

Fr. Longnecker wouldn't want to be seen as endorsing common law marriage, and he is really dealing with a social issue in advance of dealing with it from the Catholic and Christian perspective, but it's important to note here that our concept of what's occurring may be inaccurate.  Having said that, the second and third parts of his observation are correct.

Young people are not choosing to marry and have children. Not only that, an increasing number are choosing not to make love. They can’t be bothered.
People remaining childless has become quite common, although in recent years, at least by informal observation, that trend is reversing. Even where it is reversing, however, it isn't as if couples are normally choosing to have large numbers of children.  And, while Catholics would generally regard it as a good development, the epidemic of sex outside of marriage that dropped down into the teens and twenties in the 70s, 80s, and 90s has really reversed.  Not only are married couples not having as many children, unmarried people aren't engaging in sex as often as they were, in spite of the constant entertainment industry and societal pressure that urges them to.

Indeed, on the last item things are really thick with irony.

There's been a lot of questions about all of this in recent years, with theories ranging from the sociological to the biological.  

Sociologically, the impact of work seems to be a definite factor in some societies, particularly the United States and Japan.  The US has become as obsessed with careerism as Japan has, and the emphasis in the US has really shifted since World War Two from finding a good job or career to support a family, to a career being the be-all and end-all of everything.  This has been something that's impacted both men and women.  

Fr. Longnecker notes:
The reasons are complicated but among them are the aggressiveness of the modern feminist. High powered career focussed women are not interested in marriage and babies and many young men are not interested in this type of woman so the guys just opt out.
And he may in fact be right, but only in part.  It's not only that the feminist ethos has become hostile to men, which in some instances it is, and always has been, but that the emphasis on careers and work in Western society has spread to women, when originally it pertained only to men, and not in the same way.

This can sound like we're saying something we're not.  We're not saying women shouldn't work.  Rather, what we're saying is that there came a shift over time in which both men and women were sold a line of propaganda that held real worldly fulfillment came through careers.  That was always baloney.  

Indeed, people have always normally taken up whatever work they take up in order to get by in life, which for most people meant providing for their family.  The thought of a spouse interfering with a career was really foreign to most people. Rather, the opposite, often applying only to men in earlier times, was that a career became a necessary burden and even a sacrifice to support a family.  The family came first in the equation.  Now a lot of people simply forgo families as a hindrance to a career, the irony being that careers are just jobs, and it turns out that the majority of people don't like their jobs.

Indeed, before we move on, one thing that all of this raises is the topic of "temporary marriages", something that now exists throughout Western society but which is basically only acknowledged, oddly enough, in Iran.

Islam has a religious institution of temporary marriage, although it's really rare in almost every Islamic society save for Iran.  In Iran, which retains a fairly advanced Western economy, it's not rare, and it's even somewhat encouraged.

The Islamic institution of the temporary marriage acknowledged that humans have a sex drive while, at the same time, young people often have goals which are contrary to contracting a permanent marriage.  In Christian cultures divorce was traditionally disallowed, which obviously isn't the case in Western cultures now, but in Islamic and Judaic societies it never was.  Sex outside of marriage is frowned upon in nearly every culture and religion, and very much so in the Abrahamic religions.  Temporary marriages made temporary couples' actions licit.  They could have sex and not stray from morality, even though they were likely to split up later.

Following the sexual revolution, the cultural leaders in the Western world encouraged and nearly demanded that everyone engage in premarital sex, and only those with strong religious feelings will openly regard it as wrong now, in spite of the social devastation that the change has brought about.  What's been missed is that as this has occurred, a lot of Westerners basically engage in something equivalent to temporary marriages.  No formal marriage is (usually) contracted, but people cohabitate in conditions in which it's nearly acknowledged that it's all just temporary and has at least a partial goal of satisfying urges.  Obviously, a child is permanent, and therefore their creation is diligently avoided in those arrangements.  

But it is more than that.  The heavy emphasis on work has been shown, in Japan, to cause such a level of fatigue that people are just not interested.

The example of Japan gives us another factor as well, although our own culture also does.

Pornography is a huge Japanese industry. There's something really odd here in that Japan produces a massive amount of pornography of all types, right down to the cartoon level.  That stuff is being used for something.

When the word "homosexual" was first coined over a hundred years ago, it pertained to men whose sexual impulses were self-directed.  The word's meaning has changed, rather obviously, over the years, but at that time it was believed that the other meaning, the one it now has, flowed naturally from the first meaning into the practice of the second.  That may sound odd, but it's not completely illogical, as it is in fact the case that some people become so focused on the first that all conventional impulses are overridden, and for others it leads them into really odd acts.  There would be few (although there are some) who would maintain that this is the case today, but the mental pathway for those assumptions wasn't completely illogical.

Part of the reason that they weren't completely illogical is that, by and large, people's impulses had to be conventionally directed by nature.  Pornography was in printed form up until it was introduced into film, and in both instances obtaining it had to be done at least somewhat publicly, and often illegally in earlier eras.  We've already dealt with how that changed after the introduction of Playboy magazine, which introduced a really skewed version of femininity, and we've traced that history and its interactions with pharmaceuticals and the sexual revolution already.  Indeed, we have multiple posts on it.  What the Internet has done is to make pornography free. 

And what that has done is turn a lot of men, more or less, into the original definition of homosexuals. Rather than have to deal with a woman who is a person, and will have moods, problems, get sick, get mad, have expectations, and the like, they just opt for a photographed (or cartoon) harem that doesn't do any of those things and is only interested in sex whenever men are.

And men will be more interested, as a rule, than women.  And as the old knowledge that we're no longer supposed to acknowledge holds, women are much moodier, and perhaps simply have much keener and sharper feelings, than men do.  Nearly any married man has had to learn, or at least try to learn, how to deal with feelings and reactions that he can't really fathom.  Simply electing to turn on the desktop and opt for photos of a series of massive boobed prostitutes (as that's really what they are) is easier.  At some point, it becomes not only easier, but a habit, and then some cross over into what the original definition of homosexuality was.  Most will not, but all will suffer some decay because of it, and indeed nearly our entire society has.  Frustrated men who would have made decent husbands and fathers in earlier eras become loners who really have only the companionship of their workmates, in fairly large numbers.  Fr. Longnecker briefly addresses this in his comment here:
Another contributing factor to the falling birth rate is the twisted approach to sex caused by pornography. An article in London’s Daily Mail explains the research done on the effect porn and masturbation have on male libido. The short version is, guys find porn more stimulating than the real thing and self abuse easier than building a real relationship.
Finally, it can't help but be noted that part of the explanation, though not all of it, for the decline in population is pharmacological, and that the law of unintended consequences applies to that as well.  We've addressed that before a couple of times here:

The Chemical News: "New Study Links Birth Control Pill to Brain Differences, but Don't Panic", "Breast Cancer Warning Tied To Hair Dye", "Hair Dyes and Straighteners May Raise Breast Cancer Risk for Black Women". Go ahead and panic.


And here:

We like everything to be all natural. . . . except for us.


Truth be known, if the "pill" were introduced today, it's unlikely that it would be around long.  Lawyers would drive it out of existence through lawsuits if government regulators didn't.  But now we're fully used to it, and it's not going away.  Neither, however, are its impacts.

It's been repeatedly maintained, more or less, that this set of drugs is "safe" and that there's no significant overall impact on the human biome. But we know that some of this isn't true, as the first item linked above notes.  Birth control pills are known to cause disruption in female thinking, alter brains, and cause cancer.  What they do to men isn't known, but the routine assertion that they do nothing is at least questionable.

Something is causing an increase in male sterility.  We don't know what it is. And while it may simply be a reflection of the Strauss-Howe generational theory at work (men become what women want them to be), it might not be.  Scientifically, it's been shown that younger generations of men are weaker than currently older ones.  And a lot, but certainly not all, of younger men now are much lighter and, dare we say it, more effeminate than their predecessors.  It's currently popular to speak of "Toxic Masculinity", but much of that was simply masculinity and, in an era not all that long ago, what women sought out.

That latter fact may be, as noted, an example of the Strauss Howe factors at work.  But it might also be the influences of chemicals in our environment as well.  And those chemicals may be having long term effects on both men and women in unnatural ways.

Well, does it matter?

It does, but not for the reasons that people are worried about.  It matters because families are the root of any decent society, whereas the individual and the individual's whims aren't.  People aren't made to live the lives of rootless economic samurai, and they aren't happy doing so.  If we're altering ourselves chemically, that's definitely a bad thing.

So once again, we should at least pause and think.

Sunday, December 8, 2019

The Frozen Puppy

Lex Anteinternet: The Eastern Shoshone consider cannabis: In one of the many posts that I start and never finish, I had in my draft posts a item that was from the Irish Times on Irish physicians lam...

The overall problem, however, is that distinguishing between hemp and marijuana isn't really completely possible, as the difference between the two is somewhat like the difference between wolves and wolfy dogs.  Is that a dog, or a wolf?  It's hard to tell.

Which leads me to a science item, having nothing to do with hemp or marijuana, but oddly illustrating the point in a way.

Scientists, last year, but only revealed within the last week or so, discovered an 18,000 year old puppy in a lump of frozen mud in Siberia.  It's very well preserved.  It's a male.

They've sequenced its genes and can't tell if it's a wolf or a dog.

That's not really that surprising, and this conundrum has happened before with really old canine remains. Early dogs were nearly wolves.  The first canines that hung out in human camps were wolves.  Shoot, for all we know the very first canine to be incorporated into a human society as a pet may have been a wolf puppy.

Now, that doesn't argue, as some folks will, that humans should keep wolves as pets.  Wolves are wild animals, and even acclimation to humans doesn't make them pets.  They're still wolves.  But the distance between the first dog and wolves isn't very far.  At some point, that distance must have been nearly nonexistent.