Tag Archives: Anthropology

An excellent study of human psychology, evolution, modes of thinking. Read this book.

It is possible to view the human experience, and the evolution of Homo sapiens, and the development over time of human society and culture, from a number of different perspectives, all of which are, of course, wrong. That is what scholars of Homo sapiens do. They produce misleading, biased, or otherwise poor descriptions or explanations pertaining to humans and their history, one after the other, and try to make others believe them. That is really just human storytelling (and storytelling is clearly an important part of the human experience). This endeavor becomes scholarly when the various storytellers test their stories against each other, and against facts or observations made outside the context of the creation of the story, and thus, over time, produce an increasingly refined, still wrong, but less wrong, version.

The first chapter of The Importance of Small Decisions (Simplicity: Design, Technology, Business, Life) by Michael J. O’Brien, R. Alexander Bentley, and William A. Brock, which discusses the evolution of scholarly thought about the origin of agriculture, provides an example of this process of evolution of understanding in the context of the growth of knowledge.

This book is an analysis of the relationship between human choices, human culture, human society, and the context in which those forces generate outcomes that may or may not have been expected. The analysis starts with one of the most important questions asked, and usually ignored, about human history: How is it that humans came up with agriculture so many times, over a short period (a few thousand years?), more or less all at once, in regions that had zero chance of any kind of interaction? The most significant transformation in human history happened independently at that time, but not before, without any apparent single or simple cause. But there were causes. They had to do with the environment, demographics, and circumstance. They happened to humans much like similar species-species (plant-animal or animal-animal) relationships evolved in hundreds of thousands of cases across life on this life-rich planet. Individual human decisions were involved, culture was causative and transformed, and society changed and constrained, potentiated and proscribed. It was all very complicated. But when it came down to individual human decisions, they mattered in ways that you would never expect or predict, because such things are utterly unpredictable.


The Story of Ollie and his Flashlights

Ollie Andersen and his wife lived much of the summer in a cabin in northern Minnesota, where Ollie fished, watched birds, and spent considerable effort keeping his boat in repair, while his wife made canned goods and embroidery to bring to the market a few times a year down in Walker, not to make money, but to sell for the Leech Lake Area Benefit Association, her favorite local charity.

One day Ollie came up to the cabin after a couple of weeks down in the cities, and his mail box, out on the county road, was full of junk mail and a few good pieces of mail. Ollie had noticed over recent months that more and more mail was coming to the cabin address, and on more than one occasion he found several days' worth either soaked because a bad rain had blown into the box, or knocked out by the wind and strewn around in the ditch by his drive. So, he decided, about mid-August, on a plan to do something good and healthy for himself and deal with the mail box problem at the same time. Every evening, after dinner, Ollie would walk up the drive, out to the county road, and check on the mail.

Now, you have to understand a few pertinent facts.

A Thanksgiving Day Story: Fear, Loathing, Feasting, Family

What is Thanksgiving?

Thanksgiving is a feast. But what is a feast? Anthropology is all about examining ourselves through the lens of other cultures. Or, at least, that’s what we used to do back in the good old days. Let’s have a look at this great American holiday from this perspective and see what we see.

ALERT: Two very good deals on two very good books

Every single regular reader of this blog has read or intends to read Stephen Jay Gould’s The Panda’s Thumb: More Reflections in Natural History. I just noticed that the Kindle version of it is available for $1.99, and I assume this is temporary. I already had the book on dead-tree matter, but I picked this up because ebooks are searchable! You will want one too.

Every single regular reader of this blog SHOULD want to read, or should have already read, Mary Doria Russell’s excellent binary set including The Sparrow: A Novel and Children of God. (The Sparrow is first, COG second.)

Right now, and I assume very temporarily, The Sparrow is also available for $1.99.

A quick word about the Sparrow series. It has been classified as science fiction. Others have said, no, it is not science fiction, it is philosophy and spirituality. A lot of church groups read it because of its religious meaning and implications.

That is really funny because there isn’t a drop of religiosity in this series. There is a priest, but it is a priest mainly operating in a post-religion world. This series is primarily anthropology fiction, which happens to be set in a science fiction theme, and if anything, it deconstructs the central role of religious institutions and makes them look as potentially lame, as potentially nefarious, and as potentially impotent as the other institutions. Or, really, as products of human behavior as anthropologists understand it, the outcome of a mix of self-interested behavior, bonding or revulsion, racism and in-group vs. out-group thinking, the power of institutions, ritual, tradition, class, and exploitation. Set, of course, in the background of co-evolution of morphology of predator and prey. There is also a linguistic theme addressing meaning creation (or lack thereof: ouch), development of mind and behavior, language learning, and so on.

You have to read them, and now you can get one of them for two bucks! (Unfortunately COG seems to be regular price.)

Let me add this too, just noticed it, could be of interest for two bucks: The Science of Star Wars: The Scientific Facts Behind the Force, Space Travel, and More!.

Odd Ancient South African Human “Ancestor” Is Young

You’ve heard of Homo naledi, the strange “human ancestor” (really, a cousin) found a while back in South Africa. There were many skeletal remains in a cave, in the kind of shape you’d expect if they had crawled into the cave and died there, not much disturbed. They look enough like other members of our genus, Homo, to be called Homo, but if we assume that increase in brain size is the hallmark of our species, they seem to be an early grade.

Over the last ten years, we have come to appreciate the fact that our genus may have differentiated into multiple species that did not have a large brain after all, and Homo naledi is one of the reasons we think that. And, just as the “Hobbit” of Indonesia (Flores) has recently been re-dated to be a bit older than people thought, Homo naledi is now dated to be a bit later than people may have thought.

Schematic of the Rising Star cave system. Picture: Marina Elliott/Wits University

For me, this is an “I told you so” moment. First, I understand, as do most of my colleagues (but not all), that a regular change over time in a trait in one lineage does not magically cause a parallel change in another lineage (though the co-evolution of a single trait in a similar direction along parallel lineages is certainly possible). So, there was no reason to require that all later period hominins be like all other later period hominins in those later-emerging traits. Also, since no one has ever adequately explained what the heck our big brains are for, I don’t subscribe to the presumption that evolution will always produce the big brain just because our own big brains insist that they are really cool. So, a late small-brained hominin in our genus, existing long after the split with us, is actually somewhat expected.

Then, there is my sense of age based on the things I’ve seen in the area’s caves.

Geologist Dr Hannah Hilbert-Wolf studying difficult to reach flowstones in a small side passage in the Dinaledi Chamber. Picture: Wits University
Some time ago, Lee Berger took me around some of the caves he had been poking around in (long before this hominin was discovered) and showed me several animals that had crawled into the caves, probably looking for water during an arid period (this is already a fairly dry area). They had died in place and become mummified. In other caves, I’ve seen similar things, like a troop of baboons that somehow got into a cave with no known entrance and died, as well as bats that died in situ and mummified against the rock they died on.

On another occasion, Ron Clarke, another anthropologist working in the area, showed me the famous “Little Foot” which is a fossil that represents that mummy-to-stone transition, while mostly sitting on the surface of the floor(ish) of a very deep and inaccessible cave. Meanwhile, I’d been working with my friend and colleague Francis Thackeray, and he demonstrated to me how many of the diverse bits and pieces we find of australopithecines are actually probably part of individual skeletons, but discovered and excavated at very different times. These are creatures that got in the cave somehow, and were only somewhat disarticulated after death.

The whole “crawled into the cave” mode of entering the fossil record, and its presumed variant, “fell to one’s death in the cave” is different from the previously presumed process of “leopard kills you, drags you onto a tree branch hanging over a cave entrance and your bones fall into the cave” means of becoming a fossil. It is of course possible, even likely, that both of these processes occurred at various times and places.

Homo naledi, according to Lee Berger, may represent a third way of getting into one of these famous caves. He suggests that the hominins themselves dragged the dead bodies of each other into the caves, as a form of treatment of the dead. That is a spectacularly controversial claim, of course, since with a small brain how can you have a god, and without a god, how can you have ritual or burial? Of course, elephants treat their dead specially sometimes, and their brain is right where it is supposed to be on the famous mouse-to-elephant curve of brain size. And, I’d bet a dozen donuts that even though Homo naledi has a small brain compared to, say, yours or mine, it is probably a good measure above that comparative curve. It was a primate, after all.
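For the curious, here is a minimal sketch of how that comparative curve works, using Jerison’s classic encephalization quotient (EQ), where an EQ near 1 sits on the mouse-to-elephant line and values above 1 sit above it. The Homo naledi figures below are illustrative assumptions on my part (the brain value is in the ballpark of the published endocranial volumes mentioned later in this post; the body mass is a guess), not measurements.

```python
# Jerison's encephalization quotient: EQ = brain / (0.12 * body**(2/3)),
# with brain and body masses in grams. EQ ~ 1 is "on the curve" for mammals;
# EQ > 1 is above it.

def encephalization_quotient(brain_g: float, body_g: float) -> float:
    """Observed brain mass over the brain mass expected for that body size."""
    expected_brain_g = 0.12 * body_g ** (2 / 3)
    return brain_g / expected_brain_g

# Rough textbook-scale figures; the H. naledi entries are assumptions.
examples = {
    "mouse (0.4 g brain, 20 g body)": (0.4, 20),
    "elephant (4800 g brain, 4.5 t body)": (4800, 4_500_000),
    "modern human (1350 g brain, 65 kg body)": (1350, 65_000),
    "H. naledi (assumed 560 g brain, 45 kg body)": (560, 45_000),
}

for label, (brain, body) in examples.items():
    print(f"{label}: EQ ~ {encephalization_quotient(brain, body):.1f}")
```

Run with those assumed figures, the Homo naledi line comes out well above 1, which is the sense in which even a “small brained” hominin is still a big-brained mammal.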

left to right: Marina Elliott, Maropeng Ramalepa and Mpume Hlophe. Picture: Wits University/Wayne Crichton
But I digress in several directions; let’s get to the point. The site of Rising Star Cave, South Africa, where Homo naledi was discovered, is now dated. These things are always subject to revision and updating, but for now, it seems like we have a pretty good estimate of the age of this incredible site.

The site dates to some time between about 236,000 and 414,000 years ago. That means that the site overlaps with the approximate age of the earliest probable modern humans. Here are the details from the abstract of the paper, published this morning:

New ages for flowstone, sediments and fossil bones from the Dinaledi Chamber are presented. We combined optically stimulated luminescence dating of sediments with U-Th and palaeomagnetic analyses of flowstones to establish that all sediments containing Homo naledi fossils can be allocated to a single stratigraphic entity (sub-unit 3b), interpreted to be deposited between 236 ka and 414 ka. This result has been confirmed independently by dating three H. naledi teeth with combined U-series and electron spin resonance (US-ESR) dating. Two dating scenarios for the fossils were tested by varying the assumed levels of 222Rn loss in the encasing sediments: a maximum age scenario provides an average age for the two least altered fossil teeth of 253 +82/–70 ka, whilst a minimum age scenario yields an average age of 200 +70/–61 ka. We consider the maximum age scenario to more closely reflect conditions in the cave, and therefore, the true age of the fossils. By combining the US-ESR maximum age estimate obtained from the teeth, with the U-Th age for the oldest flowstone overlying Homo naledi fossils, we have constrained the depositional age of Homo naledi to a period between 236 ka and 335 ka. These age results demonstrate that a morphologically primitive hominin, Homo naledi, survived into the later parts of the Pleistocene in Africa, and indicate a much younger age for the Homo naledi fossils than have previously been hypothesized based on their morphology.
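If the bracketing logic in that abstract goes by quickly, here is the arithmetic spelled out, a minimal sketch using only the numbers quoted above: the U-Th age of the oldest flowstone lying over the fossils sets the floor, and the top of the maximum-age US-ESR scenario for the teeth sets the ceiling.

```python
# Reconstructing the 236-335 ka bracket from the figures in the quoted abstract.

# Maximum age scenario for the two least altered teeth: 253 +82/-70 ka.
# The fossils can be no older than the top of that range.
upper_bound_ka = 253 + 82        # 335 ka

# U-Th age of the oldest flowstone overlying the fossils: the bones were
# already in place when it formed, so they can be no younger than this.
lower_bound_ka = 236

print(f"Homo naledi deposition: between {lower_bound_ka} and {upper_bound_ka} ka")
```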

"Neo" skull of Homo naledi from the Lesedi Chamber. Photo credit: Wits University/John Hawks
“Neo” skull of Homo naledi from the Lesedi Chamber. Photo credit: Wits University/John Hawks
In addition to this date, it is reported that there are more fossil remains, from another cave called Lesedi Chamber. Here is the paper for that, which reports “… Further exploration led to the discovery of hominin material, now comprising 131 hominin specimens, within a second chamber, the Lesedi Chamber. The Lesedi Chamber is far separated from the Dinaledi Chamber within the Rising Star cave system, and represents a second depositional context for hominin remains. In each of three collection areas within the Lesedi Chamber, diagnostic skeletal material allows a clear attribution to H. naledi. Both adult and immature material is present. The hominin remains represent at least three individuals based upon duplication of elements, but more individuals are likely present based upon the spatial context. The most significant specimen is the near-complete cranium of a large individual, designated LES1, with an endocranial volume of approximately 610 ml and associated postcranial remains. The Lesedi Chamber skeletal sample extends our knowledge of the morphology and variation of H. naledi, and evidence of H. naledi from both recovery localities shows a consistent pattern of differentiation from other hominin species.”

Since both articles are OpenAccess, you can see them for yourself. Kudos to the authors for publishing in an OpenAccess journal.

And now, back to my original digression. One gets a sense of how landscapes and land forms develop, and while this can be misleading, it is not entirely absurd to postulate rough comparative ages for things you can see based on other things you’ve seen. I had assumed from the way they were described originally that the Rising Star hominins would not be millions of years old. Even though Little Foot (found by Clarke) was millions of years old and essentially on the surface (of a deeply buried unfilled chamber), I guessed that over a million-year time scale, the Rising Star material would either become diagenetically inviable as fossils or buried in sediment, or both. But over hundreds of thousands of years? That was plausible to me. In fact, I figured the remains to possibly be even younger, and had a date half the suggested age been calculated, I would not have been surprised.

The evolution of our thinking about human evolution went through a period when we threw out all of our old conceptions about a gradual ape-to-human process, replacing that with a linear evolutionary pattern with things happening in what was then a surprising order, with many human traits emerging one at a time long before brains got big. There was some diversity observed then, but the next phase of our thinking involved understanding a dramatic diversity of pre-Homo (the genus) life forms followed by the essential erasure of variation with the rise of Homo erectus and the like. Over the last decade and a half, we are now realizing that while the later members of our genus probably did cause, or at least were associated with, a general decrease in that early diversity, later diversity arose anyway, and there were more different kinds of hominids, very different in some cases, late into our history. Word on the street is that we can expect to learn about even more diversity in coming years.


Paul HGM Dirks, Eric M Roberts, Hannah Hilbert-Wolf, Jan D Kramers, John Hawks, Anthony Dosseto, Mathieu Duval, Marina Elliott, Mary Evans, Rainer Grün, John Hellstrom, Andy IR Herries, Renaud Joannes-Boyau, Tebogo V Makhubela, Christa J Placzek, Jessie Robbins, Carl Spandler, Jelle Wiersma, Jon Woodhead, Lee R Berger. 2017. The age of Homo naledi and associated sediments in the Rising Star Cave, South Africa. eLife, May 2017.

Related books:

Almost Human: The Astonishing Tale of Homo naledi and the Discovery That Changed Our Human Story

Field Guide to the Cradle of Humankind: Sterkfontein, Swartkrans, Kromdraai & Environs World Heritage Site

From Apes to Angels: Essays in Anthropology in Honor of Phillip V. Tobias

Why I ate a Pangolin

The Lese people practice swidden horticulture in the Ituri Forest, Congo (formerly Zaire). Living in the same area are the Efe people, sometimes known as Pygmies (but that may be an inappropriate term). The Efe and Lese share a culture, in a sense, but are distinct entities within that culture, as distinct as any peoples living integrated, side by side, ever are. The Efe are hunter-gatherers, but the wild-food-gathering part of that life is largely supplanted by a traditional system of tacit exchange between Efe women and Lese farmers, whereby the Efe provide labor and the farmers provide food. The Efe men also work on the farms sometimes, but their contribution to the family’s diet is more typically from foraged goods, including plants but mostly animals, and during a particular season of the year, the products of honey bee nests.

For several years, in the 1980s and early 90s, I lived in Zaire (now Congo) for several months out of each year (generally between May and January), and for much of that time I was in the Ituri with the Lese and Efe. During that time, I spent much of my time in the forest with the Efe (very few of the researchers on that long-term multidisciplinary project did that; most spent their time with the Lese for various reasons).

To go from our study site to the grocery store (not really a grocery store, since those did not exist in that part of Zaire, but a city with markets) was about a week’s trip or more. Only a few days of that was driving; the rest was fixing the broken truck, doing the shopping, etc. So, one did this infrequently. There was no local market during my time there, though one opened up 10 clicks away for a while, at which you might or might not be able to buy a chicken or a yam, if you showed up early.

I (and this pertains to most of my colleagues as well; only a few of us would be at the site at a time) would buy sacks of rice and beans and other long-term food items in the city, and carefully curate them at the base camp, a small village constructed of wattle-and-daub leaf-roofed huts and outhouses. When I went to the forest just to live with or observe the Efe, I would bring the exact amount of food I would need to survive if all I did was feed myself. This way my presence would not affect the Efe’s food budget. But, this is a sharing culture and it would have been very bad for me to just eat that food. I freely shared my food with my fellow camp members, and they shared their food, and my food was almost exactly the same as their local food (rice was grown there) except that I would have beans, which are not local. Otherwise, the same.

This meant that I ate what they ate.

Other times, I would hire Efe and maybe one Lese to go with me to the forest to carry out research. I’d be careful to hire them for limited amounts of time to not disrupt their lives too much, but there was very little difference between them working for me and, say, getting honey during honey season. I would only ask them to work with me for a few hours a day and they would otherwise forage. On these trips, I brought more food, for them, because our geographic location and the work we were doing interfered with their normal food getting activities, so I made up for that. But still, during these times we ate plenty of forest foods.

So, what do the Efe (and their Lese compatriot) eat?

Locally, the plant diet is insufficient nutritionally, and often, children are undernourished. There is a hunger season during which the plants from the forest and gardens are rare or absent at the same time, and this is often the death season. No one really dies from starving (though that apparently can happen), but when someone has another dangerous disease, the lack of food may put that ill individual over the top. During one bad hunger season, a small family attempted mass suicide, and mostly succeeded.

Locally, there is no beef, nor, as is the case a couple of hundred clicks away in most directions, commercially harvested fish. They have goats, but they are ceremonial and seem never to be eaten. The Lese have chickens, a few, and they are eaten now and then. The wild animal foods they eat are incredibly important. Without them, they would be in very bad shape.

The most common animals they eat, as in day-to-day and mundane, are a form of antelope called the Blue Duiker, and monkeys, usually Mangabeys. During a certain season they eat a fair number of another animal, like but not exactly a duiker, called a Water Chevrotain. But since the food supply is so unpredictable, they are always on the lookout, and they eat everything. A song bird or bat that flies too close may be batted down with a machete, a Honey Badger that stumbles upon a group of resting Efe may be chased down, an Elephant Shrew that happens upon a camp will be dispatched by an archer and cooked up. The only time I ever saw the Efe not go after an animal that happened to show up is when a small herd of elephants came along, and the Efe made a lot of noise to chase them off, while at the same time making plans to hide in the nearby hide-from-the-elephant trees (yes, they have them). And snakes. Something odd is going on there with snakes (see below).

One of the focal points of my research was to look at how animals reacted to the Efe’s presence, and it is striking. Since the Efe will kill and eat almost anything they encounter, most of the animals are very careful to avoid the Efe, and even the Efe’s habitually used trails.

There is a certain amount of elephant hunting. Pygmies, generally, are the African elephant hunters, and apparently have been so for a very long time. The importance of elephant is very under-appreciated by most experts. The data show that most of the food the Efe eat is plant food, and animal food makes up a percentage of their diet typical for tropical or subtropical African hunter-gatherers. But those data never include elephant. I’ve estimated that the total amount of elephant meat they eat over medium periods of time, left to their own devices, is about the same as all the other meat combined. This happens because when someone does kill an elephant (a rare event compared to the daily killing of a duiker or other more common mammal), everyone from everywhere shows up and gorges on that meat for a few weeks.

So, even though most researchers would classify elephant as uncommon in their diet and therefore not a major contributor to it, they’ve simply got that wrong. It is a big deal.
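To see how a rare windfall can rival the daily kills, here is a toy version of the arithmetic. Every number in it is a made-up placeholder, chosen only to show the shape of the argument, not an estimate from my data or anyone else’s.

```python
# Toy arithmetic: rare huge carcasses vs. frequent small ones.
# All figures are hypothetical placeholders for illustration only.

duiker_kg_per_kill = 5        # hypothetical usable meat from one small antelope
duiker_kills_per_year = 300   # hypothetical near-daily kills, camp-wide

elephant_kg_per_kill = 1500   # hypothetical usable meat reaching the group
elephant_kills_per_year = 1   # hypothetical rare event

everyday_kg = duiker_kg_per_kill * duiker_kills_per_year        # 1500 kg/yr
windfall_kg = elephant_kg_per_kill * elephant_kills_per_year    # 1500 kg/yr

print(f"everyday kills: {everyday_kg} kg/yr; elephant windfalls: {windfall_kg} kg/yr")

# A dietary survey that samples ordinary weeks records the first number and
# almost never catches the second, which is exactly the bias described above.
```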

Beyond that, the range of animals is huge, because the number of species native to the area is huge. Oddly, the Efe I was with (and these were more than one distinct group) didn’t seem to eat snakes, though I know that others do. These Efe also often have a particular species of snake as their totem animal, and you don’t eat your totem animal. So, maybe that is the reason.

Because Efe live the life they live, one without the privilege of access to unlimited supplies of cattle flesh, swine meat, domestic birds, and commercially caught or raised fish, they have a wide dietary niche. Because they live in a remote part of the African rain forest, this list includes a lot of animals many may have never even heard of, or that most regard as exotic, though they are very common there. They live a life where the plant foods often fail them, and collectively do not provide a sufficiently nutritious diet, so they do not have the privilege of eschewing meat, and in fact, perhaps with the knowledge that meat is the real hunger-killer in their environment, they prefer to spend as much time as they can chewing meat.

And I spent a lot of time sharing their culture and ecology with them, and in so doing, had the privilege of getting much closer to truly experiencing another culture than most ever get. Close enough, in fact, to know that I wasn’t even close, and knowing that is a privilege the dilettante missionary or subscriber to National Geo cannot have.

The problem with the White Power symbol

Update (6/21/2021):
The OK symbol is now a white power symbol or, when it is not, the person making it should know better, especially if the other fingers are flapping around in any manner whatsoever. -gtl

Added:

Feinstein and Bash dueling symbols.

You all know about this: It is being said that the OK sign is used to indicate “White Power,” and this use has been spotted among politicians and celebrities everywhere. Is this real? I don’t know. Is it a valid symbol for “White Power”? Certainly not.

The problem with the white power symbol is that it is not a symbol. Or, if it is a symbol, it is a baby symbol that doesn’t know how to be a symbol yet, so don’t expect much from it.

Try this.

Move your hands in front of you as though you were grasping a steering wheel, and pump your right foot while you say, somewhat loudly and using a touch of Vocal Fry if you can manage it, the words “Vroom Vrooom.”

Maybe snap your head back on the second “Vroom.”

You have signified rapid acceleration, but you did not really do it using full blown language. Well, you did, because you have full blown language, and so do the other people in the room wondering what the heck you are doing (I’m hoping you are reading this in a busy coffee shop). But the fact that they get that you are talking about rapid acceleration is because you made sounds like a car and play-tended that you were sitting in a car and reacting to forward rapid acceleration. That’s not really language. From a semiotic point of view, you signified the sound of an accelerating engine by imitating it, and you signified other aspects of rapid acceleration by imitating them. This is not symbolic. You were not doing a symbolic representation of rapid acceleration. You may be thinking, “yes, I was, or what the heck was that if I wasn’t?” Just trust me, you weren’t.

(Except that since your intentional communication is essentially linguistic even when not and everyone around you is a human, you were, but that’s another matter for another time. Functionally, you were not, pragmatically you were.)

Now, do the following. Wipe that puzzled or snarky expression off your face and speak the following words, enunciating clearly.

nopea kiihtyvyys

Unless you are in a Finnish coffee shop, when you said those words out loud you were uttering a symbol, but unfortunately, a symbol with no meaning, because no one in the room, including yourself, speaks that language (if you are a Finn or among Finns, substitute some other language, please).

Now, say, with no body movements or other fanfare:

rapid acceleration

In an English-speaking coffee shop, that was a symbolic act. There is no onomatopoeia. There is no imitation. There is no clue to the meaning of those words built into their utterance or the framework in which they are uttered (like an accompanying gesture or facial expression). However, you have made and conveyed meaning, and done so symbolically.

The very fact that these words mean what they mean in an utterly arbitrary way, a way unembellished with direct reflection of reality, is what makes them symbolic, and the fact that language works this way is what makes language very powerful.

There are many reasons for this. For example, if your words were strictly tied to imitation or direct representation, it would be harder to extend or shift meanings. It would be harder for there to be a rapid acceleration of a political policy, or a state of war, or a child’s understanding of subtraction and addition, as well as of a vehicle with a steering wheel. Also, you made this meaning using two words, each of which can be used as countless meaning making tools. There is an infinity of meanings that can be generated with the word “rapid” and a few other words, in various combinations uttered in a variety of contexts, and there is an infinity of meanings that can be generated with the word “acceleration” and a few other words, in various combinations uttered in a variety of contexts, and the two infinities are potentially non-overlapping.

A google image search for “triangle sign” shows that the triangle, on a sign, could mean a lot of things but almost always refers to something ahead that you need to be cautious of. Some of these signs are icons (a little train for a train), some are verging on indexes (maybe the exclamation point?) but they are not very symbolic. If I take a triangle out of the road sign panoply and put it on another road sign, it might be indexical to something. The widespread use of the triangle for this context may render the triangle as un-symbolizable, because it will always be iconic of the indexical reference to danger, until civilization ends, everyone forgets this, and different signs, indices, and icons emerge.

The warning sign above is like a lot of other signs (using the term “sign” like one might say “placard”). It has a triangle which, in this case, signifies semiotics. Why does a triangle signify semiotics? Because in one of the dominant theories of semiotics, which is the study of meaning making, symbolism, and sign making (the other kind of sign), meaning making has three parts (the meaning maker, the meaning receiver, and the other thing). But the triangle is not really a semiotic triangle because there are no labels. This could be a triangle of some other kind, linked to some other meaning. Indeed, the triangular shape is linked to warning signs generally, while the rhombus is for “stuff ahead” so this could be a sign signifying, by looking like something else (a danger sign), danger ahead, or pedestrian crossing ahead, or some other thing.

Cleverly, the warning sign above is both an index to semiotics and a reference to danger, placed on a sign shape usually used to warn of danger ahead (like a deer crossing).

Briefly, a thing that looks like a thing is an icon. Like the thing on your computer screen that looks like a floppy disk, indicating that this is where you click to put the document on the floppy disk. A thing that has a physical feature linked to a thing or meaning, but not exactly looking like it, is an index. We can arbitrarily link a representation to an index (like an index card in a library to a book, linked by the call number which appears on each item) or a representation can evolve from icon to index because of change. For example, the thing on your computer screen that looks like a floppy disk, indicating that this is where you click to put the document in the cloud, in a world with no floppy disks where most computer users don’t have a clue what a floppy disk is or was, but they do know that that particular representation will save their document.

(See: Peirce on Signs: Writings on Semiotic by Charles Sanders Peirce)

A symbol can evolve from the index when the physicality of the link is utterly broken. The vast majority of words do not look, sound, in any way resemble, what they mean. Words are understood because the speakers and hearers already know what they mean. New meaning is not generated in the speaker and then decoded in the listener. Rather, new meaning is generated in the listener when the speaker makes sounds that cause the listener’s brain to interact with that third thing I mentioned above, which is shared by both.

And, of course, meaning can be generated in someone’s mind when all that happens inside your head. It is advised that, when doing so, try to not move your lips.

The point of all this: having a representation of something linked by the way it looks to some kind of meaning is asking for trouble. A totally arbitrary association between intended meaning and how something looks (or sounds, like a word) is impossible to understand for anyone not in on the symbolic system. But, such an arbitrary association allows, if the meaning making is done thoughtfully and there is no deficit in the process, for an unambiguous meaning making event. At the same time, the arbitrary nature of the symbol allows for subsequent “linguistic” (as in “symbolizing) manipulation of the arbitrary thing itself. And, the fact that the symbolizing requires that third thing, the common understanding of meaning, is what allows us to avoid meaning making that is spurious, as happens when a sign is not a pure symbol, but instead, iconic or indexical of something. And this is where the White Power symbol everyone is talking about, made up of the common “OK” sign, falls into the abyss.

Do this and show it to all the people in the coffee shop:

If you are in the US you may have just told everyone that all is “OK” (or is it “Okay”?).

Among SCUBA divers it specifically means “no problem” which is subtly different than just “OK” because the problems being discussed are on a specific list of important issues to SCUBA divers, like “my air is good” etc.

In the above cases, the gesture means what it means because it is making an “O” for the beginning of OK/Okay. The gesture is an icon of the term “OK.” It is not a full blown proper symbol.

If you are in Argentina or several other South American areas, and possibly parts of Europe, you may have just called everyone in the room an asshole. In this case, the gesture refers to that anatomy, and the anatomy is metaphorical for a state of mind or behavioral syndrome. The symbol itself is an icon or index to the sphincter region.

In other contexts (mainly in Europe), the symbol is also an insult in a different way, in that the “0” part of the gesture implies “you are nothing, a zero.”

In Arabic-speaking cultures, the symbol sometimes refers to the evil eye, because it looks like an eye. So it is used, along with a mix of phrases, as a curse.

If you put the ring formed by the gesture over the nose, you are telling someone they are drunk, in Europe. Or, you may place the “O” near your mouth to indicate drinking.

In Japan, if the hand is facing down, that “o” shape is a coin, so it can mean money or something related.

In parts of China, while the symbol can mean “three,” the zero part tends not to. To say “zero” one simply makes a closed fist.

In basketball, the “o” part of the gesture is just there to get the index finger out of the way. The key part of it is the three fingers sticking up, which means that the player who just threw the ball into the hoop got three points.

Maybe this is the Illuminati sign. Maybe it is not.

Meanwhile, among some Buddhists, the three-fingers part is not the point. The circle part is where the meaning is, but not as the letter “o” but rather the number “0”. Moving across the religious spectrum a ways, in another South Asian religion, the three fingers symbolize the three “gunas,” which you want to have in harmony, while the “o” part represents union of consciousness. But again, all of these meanings have to do with the actual physical configuration of the fingers.

Rarely, the symbol means “666” and, increasingly, is linked to the Illuminati. To the extent that the Illuminati exists, and I’m not going to confirm or deny. The symbol is also found in western Christian allegoric art. I don’t know what it means there.

There are places in this world where there are both negative and positive meanings implied by the iconic nature of the symbol, which can lead to both confusion and intended ambiguity. I worked on a crew with people who were either Argentinian or who lived in Argentina for a long time, and others who had never been to Argentina. It was always great fun to watch the boss give kudos to a worker at the same time as calling him an asshole. We need more gestures like that.

The Anti-Defamation League identifies a version of the White Power symbol, where you use one hand to make a W (start with a “live long and prosper,” then move the two middle fingers together) and an upside-down OK to make the P. It is not clear that the ADL is convinced this is real; they may just suspect it. But generally, the symbol is found in a small cluster of mainly twitterati, who have produced a few pictures of possible or certain white supremacists or racists using the symbol. But in all cases, they may just be saying “OK” in the usual benign sense. The best case I’ve seen for the one-handed WP=White Power OK symbol is its apparent use on a sign being held at a white supremacist group march, but that could be a singular case, or fake.

Since I originally wrote this post, in 2017 (this is a 2021 edit you are reading right here) I’ve noticed that actual white supremacists who want to make it clear they are using the OK White Power symbol do so vigorously or obviously in some way to reduce ambiguity. That does not make it more of a symbol, but it does make it easy to spot the assholes. Which is not what the OK sign is being used to represent, except in an ironic way it really is. But I digress….

Of course, now that the cat is out of the bag, the OK symbol IS a sign for “White Power” or could be, or at least is an ambiguous one, so anything can happen from here on out. I’m just not sure this use was there before a few days ago when Twitter invented it.

Tommie Smith and John Carlos.
But that is not the point I wish to make here. The point is that the OK gesture sucks as a symbol in the modern globalized world because it has so many existing meanings, yet is not an arbitrary symbol. It isn’t fully linguistic. It has a hard time doing the job a symbol should do, which is to be both fully agreed on, with respect to meaning, and adaptable into novel meaning contexts without easily losing its primary symbolic, historically determined, references.

And, the reason for this is that the OK hand gesture looks like something, or more importantly, looks like a lot of things. A bottle coming to the mouth, a bottle on the nose because you are so drunk, an eye (evil or otherwise), a zero, a three, an “O” or a “P”. A coin or an asshole. Probably more.

So, yes, a “black power” gesture looks to someone in Hong Kong like a declaration of “Zero!” That sign isn’t in as much trouble as “OK” because the meaning “black power” is regional, and the use of the fist is regional. But it is another example of something indexical (a fist meaning power is very indexical, maybe even partly iconic) and thus, not truly symbolic, and thus, limited as a fully powered linguistic thing.

Don’t get me started on this one:

America is part Mexican

The Dogs Still Bark in Dutch

I grew up in the old Dutch colony of New Amsterdam, now known to you as the State of New York. There, I carried out extensive archaeological and historic research, and along the way, came across that phrase, “the dogs still bark in Dutch.”

It is an idea that might occur to a denizen of Harlem, his kids off to kindergarten, sitting on his stoop eating a cruller, or perhaps some cole slaw with a gherkin, and pondering the Dutch revival architecture down on Wall Street.

There was a war between the Dutch and the English in the 17th century, and as a result of that war, colonial lands were passed back and forth multiple times. In the case of the colony of New Amsterdam, the passing to the English involved the arrival of a warship on the Hudson, but they used their cannon only to wake up the governor so he could receive a letter telling him about the change. Actually, the colony went back and forth a couple of times.

But when the English took over that Dutch colony, they did not remove the Dutch, or really, do much at all. There were some old Dutch customs, such as the Pinksterfest, a bit of a happy-go-lucky free-for-all dance party with vague religious overtones, that were made illegal, because the English versions of Christians at the time didn’t like dancing. But mostly nothing happened to affect day-to-day life for most people. The Dutch parts of the collection of English colonies and the early United States retained their Dutchness long enough for someone to remark of the time that even with all the political change, the new form of money, the change in monarch, all of that, the dogs still barked in Dutch.

(I oversimplify two centuries of history slightly.)

We know we are eating burritos, yet we call them tacos

I think this is a Minnesota custom but it could be more widespread. This is what you do. You get a large flour tortilla, some kind of meat or beans, tomatoes, lettuce, salsa or hot sauce, grated cheese and sour cream, and you put all that stuff inside the tortilla, roll the tortilla up, eat it, and then say, “That was a good taco, you betcha.”

The part about the tortilla, lettuce, cheese, etc. is not Minnesotan. That is widespread. But calling a burrito a taco may be more local. And, we know it is a burrito. Nobody in Minnesota ever gets confused about what they are ordering at a Mexican restaurant. In fact we’re pretty good at that. Indeed, of all the upper Midwest cities, I’ll bet you that Minneapolis has one of the oldest Mexican restaurants, and there has always been a Mexican community here, though it has grown in recent decades.

But never mind the taco-burrito distinction. We Minnesotans also mix up “yet” and “still” and do things “on” accident instead of “by” accident. Don’t get me started on soda vs pop vs sodapop.

What I really want to talk about here is “Mexican food.”

Go find some hipsters and tell them, “Imma go get Mexican food, wanna come?” and you’ll find out that there is no such thing as “Mexican food,” that what you really mean is “Tex-Mex” and that if you want some authentic “Mexican food” there’s this great taco truck down the street that has authentic tacos.

So you got to get the authentic tacos. I did that the other day. Hipsters everywhere. All the tacos, though, were various meat or bean substances, some kind of lettuce, tomato, etc. with some sort of sauce, on a flour tortilla. The only difference between our homemade “tacos” and these legitimate “tacos” was that our burritos are chimichanga size, and those burritos were hand size.

Don’t get me started on chimichangas.

Anyway, here’s what I want to say about Mexican food. It is Mexican, and it is not Tex-Mex. Why is it not Tex-Mex? Because Tex-Mex is a made-up word, a made-up category of food. It was made up because people thought this stuff we call “Mexican food” was fake, an American, non-Mexican version of what they eat in Real Mexico. It was not understood that America did not invite Mexico over on the condition that they bring the tacos; things Mexican in America are often not immigrated, but rather, indigenous. Even though many Mexicans actually do go back and forth across the US-Mexico border, the truth is, the geographical and cultural entity that gave rise to the Country of Mexico also gave rise to the Country of the United States, in part. In part for both. The Yucatan is no more Hispanic Mexican than El Paso is Anglo-American.

Both modern countries have histories that involve big areas of land, country size areas of land by European standards, that had this or that national, ethnic, or cultural thing going on, and all of that stuff contributes to the present. Native American zones were everywhere, of course, and for the region of which we speak here, that included hundreds of languages, many language groups, and numerous entirely different but often overlapping or intermingled lifeways (such as foraging, bigly civilization, and all the arrangements to be found on the small-group-forager to pyramid-building-nation spectrum).

America did not become a first-Native then Anglo-European country that then had Mexicans show up to fix our roofs and run Tex-Mex style taco trucks. Mexican culture, or more broadly speaking New World Hispanic culture (or some other word, you pick) was in place, across a huge area, long before the United States took its current form, and a whopping big chunk of the eventual United States was part of that. And no, I’m not talking about Texas, or even New Mexico, or the Southwest, or the land ceded to the US in the Treaty of Guadalupe Hidalgo. I’m talking about a big blobby thing that includes regions from Atlantic Florida to California, from the Rio Grande to the Great Lakes, overlapping with other big and small blobby things that were French, English, Dutch, Creole, Acadian, Russian, and so on.

So don’t call what we eat “Tex-Mex” because that implies that we are America sans Mexico. We are Mexico. Even up here in Minnesota, the Cowboys sometimes spoke Spanish. A cowboy IS a Spanish-American thing. And out east, the dogs still barked in Dutch. And our northern beginnings are as French as anything else.

America is part Mexican, but not because they came to us. Rather, we come from them.

Should I eat my placenta?

Well, not my placenta exactly, but … well, someone’s?

Did you know that the placenta that is born out of a female primate’s body is an organ of the infant also being born? It is the first body part you lose. I use the term “primate” here because, even though all the “placental mammals,” as we are called, share some basic reproductive gestational anatomy, there are major categories across the mammals in this area, and primates are distinct from, for example, carnivores. These differences are of course very important when one is considering placentophagy. I mean, you wouldn’t confuse a walnut with an orange when picking a snack, so why would you confuse a dog placenta with a monkey placenta?

In humans and mice, and presumably therefore in all mammals, the placenta and the rest of the embryo/fetus have growth patterns that are controlled at some basic level by two distinct developmental genes, each of which is regulated by methylation. This is an epigenetic phenomenon, for those who like to see that word in use. Here’s what happens. The gene that engenders growth of the placenta is turned on by dad’s allele, turned off by mom’s. The gene that engenders growth of the rest of the embryo is turned on by mom, off by dad.

The idea here is that mom and dad have different interests in the outcome. Mom wants to have an optimal (not maximal) number of offspring, so she parses out energy appropriately. Dad wants to have more offspring than mom, using a number of different moms if possible. Thus, he wants the growing embryo and fetus to suck as much energy out of each mom as it can.

The placenta is the energy-sucking organ. It insinuates itself greedily into the blood supply of the mother, like an alien internal parasite. The mother’s body resists the introduction of placental tissues into her blood supply, the placenta fights back, and the result is a compromise which usually works out. Part of that compromising system, over long-term evolutionary time, has been the mother’s systematic turning off of the gene that she provides instantiating the growth of the placenta. Dad counters by turning off the fetus/embryo gene. And so on.
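If it helps to see that tug-of-war as bookkeeping, here is a toy sketch of the parent-of-origin logic in Python. The gene names are generic placeholders, not real loci, and this models only the on/off pattern described above, not the actual molecular biology.

```python
# Toy model of genomic imprinting as described above: an imprinted gene is
# expressed only when the allele comes from one particular parent; the copy
# from the other parent is methylated (silenced). Gene names are placeholders.

ACTIVE_PARENT = {
    "placental_growth_gene": "dad",  # mom's copy silenced
    "embryo_growth_gene": "mom",     # dad's copy silenced
}

def is_expressed(gene: str, inherited_from: str) -> bool:
    """True if the copy inherited from this parent is transcribed."""
    return inherited_from == ACTIVE_PARENT[gene]

for gene in ACTIVE_PARENT:
    for parent in ("mom", "dad"):
        state = "expressed" if is_expressed(gene, parent) else "methylated (silent)"
        print(f"{gene}, copy from {parent}: {state}")
```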

Anyway, should I eat my placenta or not?

Across cultures, there are many different practices associated with child birth that have to do with the placenta. Among one group I worked with in the Congo, the Placenta is buried under the threshold of the hut in which the birth happens. This is done by the father. That, and having a sharpened arrow handy to cut the cord, are his only jobs during child birth. But nobody eats the placenta.

I normally don’t pay a lot of attention to the “complementary and alternative medicine” literature, though I am sent regular notices of various publications. Today, though, something came across my desk that I thought you’d be interested in. I’ll give some of the basic results; you can draw your own conclusions. Feel free to comment below. The topic is, of course, placentophagy.

The Paper:

Schuette, Stephanie A., Kara M. Brown, Danielle A. Cuthbert, Cynthia W. Coyle, Katherine L. Wisner, M. Camille Hoffman, Amy Yang, Jody D. Ciolino, Rebecca L. Newmark, and Crystal T. Clark. The Journal of Alternative and Complementary Medicine. January 2017, 23(1): 60-67. doi:10.1089/acm.2016.0147.

Methods:

Two cross-sectional surveys with questions regarding placentophagy practice were distributed to healthcare providers and patients. The provider survey was distributed via email listservers to international perinatal professional organizations and to obstetrics and gynecology, nurse midwifery, family medicine, and psychiatry departments at three urban hospitals. Patient surveys were administered in person at an urban hospital in Chicago, Illinois.

Key results that jumped out at me:

Higher income, higher education, and whiteness seem to be associated with a higher likelihood of engaging in placentophagy, with various degrees of effect.

The most likely kind of provider to suggest considering this practice is the midwife, with all the other kinds of providers (physicians and nurses, mainly) being in the main unlikely to suggest it. Sample sizes are small, but 100% of the 66 OB/GYNs asked said no, they would not suggest this. For nurses, with only 16 in the sample, two thirds said no, they would not, and one third were neutral. None said they would suggest it. Among midwives, only 17.6% said they were unlikely, and 29.4% said likely, the rest being neutral.

The survey looked at multiple locations, but with enough respondents in Denver and Chicago to identify a vague pattern: a provider in Denver is slightly more likely to think this is a good idea.

The study looked at history of mental health diagnosis. 7.4% of those with no such history said they would consider placentophagy. 24.3% of those with such a history said yes. Across the board, asking about what form they would consider eating the placenta in, or if they thought there was this or that benefit, those with a history of mental health diagnosis generally thought it was good, low risk, and they would try a variety of methods.

There is no evidence that placentophagy has a benefit.

The Norms of Society and Presidential Executive Orders UPDATE

A brief update: This morning, Senate Republicans set aside the rules that say that both parties must be present, with at least one member, for a committee vote to advance a Presidential nominee for a cabinet appointment.

In other words, as outlined below, our system is based not only on enforceable laws but also on rules that only work if everyone involved agrees not to be the bully on the playground who ignores the rules. The Republicans are the bully on the playground.

The system requires honest actors playing by agreed-on rules. So, without the honest actors, you get this. This fits perfectly with Trump’s overall approach.

Democracy is not threatened by this sort of thing. Democracy was tossed out the window a while back when this sort of thing became possible, and normal. Whatever we see now that looks like democracy is vestigial.

Original Post:

The title of this post is based closely on the title of a statement posted by my friend Stephan Lewandowsky, representing the Psychonomic Society.

The post is the official statement by this scientific society responding to President Trump’s recent activities, and it begins,

Last Friday was Holocaust Memorial Day, which falls on the day of the liberation of the Auschwitz Death Camp by Soviet troops in 1945. U.S. President Trump marked the occasion with a statement, although it omitted any specific mention of the 6 million Jews who perished in the Holocaust.

On the same day, Trump also signed an executive order that banned citizens of 7 mainly Islamic countries from entering the United States.

This order—at least initially—also applied to legal permanent residents of the U.S. (“Green card” holders), thus barring them from re-entry to their country of residence after a visit abroad, as well as to dual nationals if one of their citizenships is from one of those 7 countries.

I’m going to use this as a starting point to discuss the most important thing you need to know about the situation in the United States right now.

You know most resources are limited. We can cook along ignoring this for long periods of time, ignoring a particular resource’s limitations, until one day something goes awry and that particular resource suddenly matters more and of it, we have less. So a competitive framework develops and then things happen.

It is the business of the rich and powerful to manipulate the world around them in such a way that when such a limitation occurs, they profit. Candidate Trump mentioned this a while back. A housing crisis is a good thing for a real estate developer. This is not because it is inherently good; a housing crisis can put a real estate developer out of business. But the developer who is positioned to exploit such a crisis, or any kind of economic or resource crisis, is in a good position when things go badly for everyone else.

One of the long term goals of many powerful entities is to maintain working classes, or other lower classes of servitude, in order to have cheap labor and a market. This has been done in many ways, in many places, at many times. Much of our social history is about this. Many wars have been fought over this, and many social, cultural, and economic revolutions have occurred because of this.

And every now and then, a holocaust happens because of this. This is, in part, because of what I’ll term as Mischa’s Law. Mischa Penn is a friend and colleague who has studied race and racism across all its manifestations as represented in literature, but focusing on the Nazi Holocaust and the holocaust of Native Americans. Mischa’s Law is hard to understand, difficult to believe, enrages many when they hear it, and is often set aside as lunatic raving. Unless, of course, you take Mischa’s class on race and racism, get a few weeks into it, know enough about it. Then, he gives you the thing, the thing I call “Mischa’s Law” (he doesn’t call it that) and you go, “Oh, wait, of course, that’s totally true.” And then you get really depressed for a while, hate Mischa for a while, hate his class. Then, later, ten years later, a life time after you’ve taken the class, and you’ve graduated and moved on to other things, Mischa’s Law is the only thing you remember from all the classes you took at the U, and you still know it is true.

The fundamentals are always in place for Mischa’s Law to take effect. Competition, limited resources, different social classes or groups, a limited number of individuals in power, etc. But we, in America, have lived in a society where checks and balances kept one ideology (including, sadly, my own!) from taking over for very long, and there is a certain amount of redistribution of wealth and power.

But over recent years, the rich and powerful have convinced the working class that the main way we distribute wealth, through taxes, is a bad thing, so that’s mostly over. Social welfare has become a dirty word. The rich are richer, the powerful more powerful, and those with little power now have almost no power at all. But we still had a governmental system of checks and balances, so that was good.

But then the system of checks and balances got broken. In fact, the entire system of government got broken. Did you notice this? What happened is, about half the elected officials in government stopped doing the number one thing they were supposed to do, and this ruined everything.

What was that one thing? This: play by the rules.

Playing by the rules requires both knowing the rules and then making an honest attempt to respect them. Not knowing the rules is widespread in our society. I’m sure the elected officials know the rules they are breaking, but increasingly, I think, the average person who votes for them has no clue what the rules are or how important it is that they be observed.

Imagine the following situation. You go to baseball games regularly, to see your team play. Let’s make this slightly more realistic and assume this is a Little League team.

One day a big scary kid who is a bully gets up to bat. The pitcher winds up, throws the ball. Strike one. It happens again. Strike two. One more time. Strike three.

But instead of leaving the batter’s box, the big bully kid says, “I’m not out, pitch it again.” The following several moments involve a bit of embarrassment, the coaches come out, some kids are yelling at the bully, one parent hits another parent, and finally, it settles down, but the game is ruined and everyone goes home.

Next game, same thing happens, but this time nobody wants a scene, so they let the pitcher pitch the ball until the bully hits a single. Then the game continues. But the next game, there are a few bullies, not just one, demanding that the rules be ignored for them, and some other players decide to ignore other rules as well, and pretty soon, there is nothing like baseball happening.

You see what happened here? I’m going to guess that you don’t quite see the key point yet. The reason you leave the plate and go back to the dugout when you get three strikes is NOT because of the properties of matter, gravity, magnetic attraction, the unstoppable flow of water or a strong wind. You are not blown, washed, pulled, pushed, or dropped by any force back into the dugout when you get three strikes. You go back into the dugout because you got three strikes, the rules say you are out, right?

No. Still not right. You go back into the dugout because you got three strikes, the rules say you are out, AND THEN YOU FOLLOW THE RULES.

The Republican Party, comprising about half the elected officials, has unilaterally decided, in state houses across the country and in the Federal government, to stop following the rules.

A few years ago, in the Minnesota State House, a Republican representative made the clear and bold statement that he represented only the voters in his district who voted for him, and not the other citizens. He was resoundingly condemned for doing this, and he backed off and stopped talking like that. But over time, in state houses across the country, and in congressional districts, this increasingly became the norm, for Republicans. The rule is, of course, that once elected you represent all the people of your district. But more and more Republicans decided that this rule did not apply to them. They only represent those who voted for them. Now, this is normal in the Republican Party, and the first Republican President to be elected after this change said during his first news conference after his election, prior to his inaugural, that blue states would suffer and red states would benefit from his presidency.

I’ll give you another quick example. In one of Minnesota’s legislative chambers, the chair, who is from the leading party, has the right to silence any legislator who gets up to speak if the topic being discussed is not related to the matter at hand on the floor. So, the legislature is debating a proposed law about bicycles. The Democrats are in charge. A Republican gets up and insists on talking about his horoscope. The Democratic chair of the chamber says something like, “Your remarks are not relevant to the matter at hand, sit down and be quiet.” Good rule.

Last time the Republicans were in charge in that Minnesota chamber, they did this to every single Democrat who stood to say anything about anything, including and especially the matter at hand. The Republicans disregarded the actual rule (that the chair can silence a member who is off topic) and misused the power (that the chair can silence any member) to their benefit.

Trump is not following the rules, the Republicans in Congress are not acting like a “check” on Trump, and we have seen government officials in the Executive branch, apparently, ignoring court orders.

Trump’s executive orders over the last few days have been an overreach of power. For example, in its initial and badly executed form, his “extreme vetting” plan removed the rights of green card holders. Two different court orders neutered at least parts of this executive order temporarily, but it is reported that some officials, working for the Executive branch, ignored the court order. Since these are basically cops ignoring an order from a judge, and judges don’t have a police force, there isn’t much that can be done about that. Cops are supposed to follow the orders of judges. That’s the rule. The only way the rule works is if the rule is followed. There is no other force that makes the rule work.

Trump’s apparent abrogation of previous decisions on major pipeline projects was done without reference of any kind to the regulatory process that had already been completed. Regulations are acted on by the Executive branch, but they come from laws passed by Congress, and the whole judiciary is involved whenever someone has a case that there is something amiss. Trump’s executive orders and memoranda related to the pipeline ignore all the different branches of government, departments, process, and rules of governing.

It would appear that Trump has brought together the two major changes in rule observation that have developed over the last 20 years in this country. First, like the average citizen (of all political stripes), he is ignorant of how anything works. Second, like the bully standing by the batter’s box, he will not observe any rule that he does happen to find out about.

You see, for a United States President to become a dictator, he has to do only one thing: Stop following the rules. The US Court System, the Congress, and the Executive exist in a system of checks and balances, and that is supposed to keep everybody, well, in check. And balanced. But the Executive is the branch of government with multiple police and security forces, an Army, a Navy, an Air Force, Marines, and a Coast Guard. There is a rule that only the Coast Guard can carry out military-esque activities on US soil. But there is a mechanism for putting that rule aside. The President puts the rule aside. That’s it.

We live in a world of limited resources, and a pre-existing system of inequity, class, and ethnic categorization that allows the powerful to exploit and control most everyone else. We live in a country in which a single individual can take over the government by getting elected president and then ignoring the rules, whether or not he formally declares himself in charge of everything. There is no mechanism to stop this from happening. There are all sorts of rules in place to stop it, such as the political parties putting up qualified candidates, the electors making sure they elect a qualified candidate, the Congress certifying the election of qualified candidates. But those things did not happen, and we now have a man who by all indications intends to dictate, not lead; dictate, not rule; dictate, not represent. There is no indication of any kind whatsoever that we do NOT have an incipient dictatorship as our form of government right now, and there are strong indications that this is where Trump is going.

And this is where Mischa’s Law becomes a thing.

“Racism, left unchecked, will eventually lead to holocaust.”

The checks, they have been neutralized.

Palpable History: Dictator’s Voice, Dictator’s Words

It is a good idea to occasionally experience history. This helps us understand ourselves, and our possible futures, better. Much of this is done through reading excellent texts. For example, I’m currently reading Team of Rivals: The Political Genius of Abraham Lincoln by Doris Kearns Goodwin. Goodwin’s objective is to contextualize Lincoln by looking at him in the broader context of the individuals who ran against him for the Republican nomination, and whom he later added to his cabinet. Goodwin succeeds, at several points, in placing the reader in a time or place of great import. Watching the very young Abraham Lincoln lower himself onto a log (he was out cutting firewood), his face buried in his hands and tears streaming from between his fingers, not leaving that spot or position for hours after learning of the death of his mother. Or the layout and use patterns of Lincoln’s office in the White House, where he occupied a corner desk, and various members of his cabinet and military came and went with urgent messages, and made vitally important decisions, until the end of the day when Lincoln would sit down for a long read. That sort of thing.

So here, I’m going to invite you to do something a little strange. I’ve got here an audio recording of Adolf Hitler having a normal conversation (about extraordinary things) with a fancy dude by the name of Mannerheim, during a visit to Mannerheim at the time of his birthday. Wikipedia has the story on the audio recording. Here, it is presented as a YouTube video so you can follow who is speaking, and what is being said.

The reason to listen to this for a few minutes (no need to listen to the whole thing, though if you know anything about WW II, it may become captivating after a while) is that Hitler almost always screamed at his audience, and this is him speaking in a normal voice. I want to pair this audio experience with a reading experience. After listening to the audio recording with Mannerheim, read through the transcript of Hitler’s only other known “conversational” bit of significance.

There is a recording of that as well. It is a speech but one in which he speaks normally for much of the time. The point here, though, is not to listen to it to get the voice experience (but that is interesting) but to read his words. To hear how he formulates his statements, how he describes his situation. How he aggrandizes himself in the face of failure, how he belittles his enemy. How he schizophrenically moves between the gigantic and the modest, how he moves around his own goal posts as needed to make himself look big league smart.

Below you’ll find the two videos and the text. If either video vanishes (they do sometimes) you can easily relocate one on YouTube.

The Mannerheim Recording:

The text of Hitler’s Stalingrad Speech:

If we follow our enemies’ propaganda, then I must say that is to be compared with “Rejoicing towards Heaven, depressed until Death.” The slightest success anywhere, and they literally turn somersaults in joy. They have already destroyed us, and then the page turns and again they are cast down and depressed. I did not want to attack in the center, not only because Stalin knew I would. I provide one such example. If you read the Russian telegrams every day since June 22nd, they say the following each day: “Fighting of unimportant character.” Or maybe of important character. “We have shot down three times as many German planes. The amount of sunken tonnage is already greater than the entire naval tonnage, of all the German tonnage from before.” They have so many of us missing that this amounts to more divisions than we can ever muster. But, above all, they are always fighting in the same place. “Here and there,” they say modestly, “after fourteen days we have evacuated the city.” But, in general, since June 22nd they have been fighting in the same place. Always successful, we are constantly being beaten back. And in this continued retreat we have slowly come to the Caucasus.

I should say, for our enemies and not for your soldiers, that the speed at which our soldiers have now traversed territory is gigantic. And what has transpired this past year is vast and historically unique. Now, I do not always do things just as others want them done. I consider what the others probably believe and then do the opposite on principle. So I did not want to attack in the center, not only because Mr. Stalin probably believed I would, but because I didn’t care about it at all. But I wanted to come to the Volga, to a specific place and a specific city. It happened to have Stalin’s name, but that’s not why I went there. It could have had another name.

But, now this is a very important point. Because from here comes 30 million tons of traffic, including about nine million tons of oil shipments. From there the wheat pours in from these enormous territories of the Ukraine and from the Kuban region, then to be transported north. From here comes magnesium ore. A gigantic terminal is there and I wanted to take it. But, as you know, we are modest. That is to say that we have it now. Only a few small pockets of resistance are left. Some would say “Why not fight onwards?” Because I don’t want a second Verdun! I would rather hold this with small combat patrols! Time does not matter, no ships are coming up the Volga! That is the important point.

Hitler’s Speech, 8 November, 1942:

Everybody Always Gets This Wrong, Even Smart People

This is a great cartoon by Randall Munroe that makes a very important point very effectively. Spread it around, love it, learn from it.

Here is an excellent video walkthrough of the cartoon, discussing its value as a communication tool.

But do ignore the details of the prehistory, because the cartoonist has fallen into the same trap so many others have: well meaning in intention, but simply a) not an expert on key things and b) unaware of the real consequences of getting certain things wrong.

When we represent prehistory, we represent humanity both past and present. It is not difficult to do so in a way that leads to serious and meaningful, even impactful, misconceptions.

So, here, I’m going to complain not just about this cartoon, but about the general phenomenon of people who are not paleontologists or archaeologists (or some other appropriate expert) using human prehistory to make a point, but at the same time, throwing accuracy about that prehistory under the bus.

Right away, I want to point out the consequences: Westerners, for sure, but this is more widespread than that, tend to have a view of humans that involves concepts of civilized and primitive, and hierarchical concepts mixed with evolutionary ones. And there are other problems in the conceptualization of prehistory and the diversity of humanity. These problems make it very easy to maintain a racist perspective despite overwhelming evidence against the validity of biological race. These problems make it very easy to discount the pain and suffering of certain people, pain and suffering which, often, we are busy causing in our own self interest. These problems in conceptualizing the nature of humanity across time and space lead to all sorts of misunderstandings with all sorts of consequences including, but not limited to, simply getting it all wrong.

XKCD is a comic written in, and fully appreciated within, the context of modern skepticism and science cheerleading. Let us please not throw the important social and natural sciences of archaeology, prehistory, paleoanthropology, etc. under the bus in service of making a point in some other area of study. A smart man whom I respect quipped, “but this comic is not about archaeology.” My answer to that: This comic makes one point about climate change and dozens of points about archaeology. It is about archaeology.

Why this is a great cartoon.

Look at the cartoon. Go from top to bottom.

It tells us that over a very long period of time, as humans did all sorts of different things, and conditions on the earth changed dramatically, the global surface temperature a) remained within a fairly narrow range and b) didn’t vary that quickly even when it did vary.

Then, all of a sudden, temperatures shoot way up and are expected to shoot way up even more. Holy crap. Point well made.

Missed opportunities

The climate change science is not bad, but it is a bit off. The baseline of temperatures (pre-industrial) vs. now should be somewhat different in relation to the current temperature. If you take the last few thousand years as the baseline, which is the proper thing to do, we are closer to 1.5, not 1.0, degrees C above it. That may be a nitpick, since the time scale of this cartoon is larger. But once you get past that level of time scale, the question of baseline becomes untethered from pragmatics and you can justify anything.
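To make the baseline arithmetic concrete, here is a minimal Python sketch. The numbers are invented for illustration (not real reconstruction data); the only point is that the same present-day temperature reads as a different amount of warming depending on which reference window you average over.

```python
import statistics

# Invented stand-in values (degrees C, arbitrary reference), illustration only.
long_baseline = [0.1, 0.0, -0.1, 0.05, -0.05, 0.0]  # "last few thousand years"
preindustrial = [0.4, 0.45, 0.5]                    # a "pre-industrial" window
current = 1.5                                       # today's value

for label, window in [("last few thousand years", long_baseline),
                      ("pre-industrial window", preindustrial)]:
    baseline = statistics.mean(window)
    print(f"Warming relative to {label}: {current - baseline:+.2f} C")
```

With these made-up numbers you get +1.50 C against the long baseline and +1.05 C against the pre-industrial window, which is the shape of the discrepancy described above.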

Also, there are probably times in the past, within the time range of this cartoon, where more abrupt and dramatic climate change did indeed occur. And, at those times, major effects on humans happened.

This is where not getting the archaeology right causes the cartoon to both miss some key points and become inadvertently somewhat less than straightforward.

Here’s the thing. Climate change can have a very negative effect on humans. How do we know? Because it apparently has happened over and over again. For example (and there are many examples), within the time range of this graphic, climate change caused a significant increase in aridity in a huge area of southern Africa. The place was pretty well populated by hunter-gatherers before that; afterwards, for thousands of years, no one could live there. Climate change had made the region uninhabitable for humans.

Similarly, climate change probably caused depopulations, evacuations, and migrations in many other parts of the world at several points in time represented here.

Critics from the denier side of things would point out that climate change has always caused problems, so this new change is no big deal, and XKCD ignores this. But the cartoon, had it mentioned more of these earlier changes, would instead represent a different fact: natural variation in climate can be catastrophic to humans. The level of change happening now, and expected in the near future, still caused by humans, is much larger and faster than what happened during this time period. So look out!!!

But that’s not the point I want to make.

Simple facts and big concepts

NOTE: Since I wrote this post, at least one change was made in the original cartoon, pertaining to the flooding of the Scablands in Washington State. Perhaps other changes will be made over time!

There are a number of simple facts that the time line either gets wrong or represents in a way that we would not like for a basic intro class in archaeology or paleontology. Some of these facts were pointed out to me by John McKay or Helga Ingeborg Vierich. This is not comprehensive, but gets the point across:

  • Impressive prehistoric art appears on the cartoon at 15K. Art and adornment appear well before the time line begins, and not just in Europe. The super impressive cave wall art dates well before the time line, and somewhat less impressive works occur very much earlier. This is a decades-old conception overturned many years ago.
  • The Clovis First model of the peopling of the Americas is on its last legs and should not be used as assumed knowledge.
  • The Missoula mega-floods affected eastern Washington, not Oregon.
  • The glaciers weren’t just in New York and Boston, they covered many other places. If the idea is to connect glacial geography to people’s lives, references to other areas might be helpful.
  • Wrangel Island is not tiny. He may have confused Wrangel Island with St. Paul in the Pribilofs.
  • Abu Hureyra is one of several sites with early year round settlement. More important may be the more southerly Natufian, where foraging peoples, for a very long time, took up permanent settlement, and the first commensal organisms (which would become very important to humans, like plague carrying rats and domestic dogs, etc.) came on the scene.
  • Agriculture has multiple origins, but a single origin is implied here.
  • The origin of copper metal working happens in multiple places (two with smelting in the old world, plus it was worked in the new world).
  • Similarly, other metal working has multiple origins.
  • People will fight about the date for “Proto-Indo-European,” or even whether such a proto-language existed or could be dated. The majority of historical linguists don’t accept this at all. But whether that is right or not, again, Indo-European languages are not particularly important in overall human history. The cartoon centralizes a relatively rare language group and ignores thousands of other language groups, as though the mostly post hoc Western lineage of human civilization is assumed to be the most important.
  • “Permanent settlement in the fertile crescent” is out of the blue, and contradicted earlier on the time line. Permanent settlements in the region predate this by 6,000 years.
  • All of the early steps in civilization rising are focused on a very limited area, represent only a small (and very Western oriented) portion of civilization, ignore most of human prehistory, and privilege “civilization” over what the vast majority of people were doing at the time.
  • Same problem with writing. Writing was invented many times over many areas, but it looks here like it may have a single origin, the origin that is part of the Western Civilization story.
  • Missed opportunity: “Invasion of the sea peoples” may very well have been an example of climate change messing up a population and causing a mass migration.
  • For later civilizations, I appreciate the reference to the New World. But again, it is only part of the story, mentioning a small part of the record. Isn’t it a much more interesting story to note that between 10K and 2K (or so) dozens of independent highly organized hierarchical societies, often referred to as “this or that civilization” arose all over the world, while at the same time, the vast majority of people lived off the land as foragers?
  • The Industrial Revolution starts in the 18th, not 19th, century, in Europe.

Larger scale things you might learn from this graphic that are wrong.

The idea that agriculture was invented once, as part of Western Civilization, and likewise for metal working, marginalizes the new world, many regions of Asia, and many regions of Africa. These are misconceptions that those of us who teach intro to world prehistory or similar courses have to spend a large amount of our time refuting.

The idea seems to be represented that humans made the transition from hunting and gathering at one point in time and thereafter were mostly agriculturalists. The opposite is true. Most humans were living in small groups as hunter-gatherers for the entire time represented by this cartoon, except at the end. Half the humans or more at the time of Christ, for example. It is likely that in many regions, at various points in time, an early stab at horticulture was abandoned, and people returned to foraging.

Is this important?

Well, getting facts right is important. In the case of prehistory, this mainly means not overstating the facts. One might argue that in a simplified version of reality (like in a cartoon) it is OK to overstate things as facts where we really don’t know. No, it isn’t. There are ways to speak briefly, and in an interesting way, of a past that we understand only vaguely, rather than with the false certainty of some DK book for five year olds. So let’s do that.

The oversimplification of prehistory contributes to the co-opting of intelligence, innovation, and rights over various things like landscapes and cultural phenomena, by the dominant cultures who have condensed the relevant prehistories to centralize and privilege themselves. The prehistory presented here mostly privileges what we sometimes refer to as “Western Civilization,” with its middle eastern roots and its simple, linear, one way, always improving, progressive history. A very inaccurate history.

As Helga Vierich wrote on my Facebook timeline, “In short, this reflects a preoccupation with ‘progress’ whereas what it really shows is a progressive ecosystem and social clusterfuck that brings us to the present situation – characterized by continuing destruction of the last ecologically sustainable (‘indigenous’) economies… and also characterized by deforestation, massive climate change, pollution, ecosystem destruction, soil erosion, and species extinction.”

So, in making a point about self destruction by the human species, due to anthropogenic climate change, the oversimplification misses key points in that actual process.

But it is still a good cartoon.

I would like people who pass this cartoon around to make a brief statement, like, “I hear the prehistory is oversimplified a bit, but this makes a great point about climate change” or words to that effect. Many will argue that this statement is not enough. But I’m not a big fan of sacrificing the really cool for the sake of the perfectly pedantical. Usually.

Jihad Engineers

A disproportionate percentage of Islamist radical actors, including suicide bombers, come from an engineering background. Why?

Right wing and Islamist extremism seem to share this and other traits, while left wing extremism is more commonly associated with individuals from the humanities and social sciences.

This is what we learn from “Engineers of Jihad: The Curious Connection between Violent Extremism and Education,” by Diego Gambetta and Steffen Hertog.

An obvious reason that engineers may be more often associated with groups that carry out bombings is that such groups recruit engineers because they would be the ideal bomb makers. This, however, is not the case. Indeed, many of the famous goofed up bombing attempts of recent years were carried out by those with engineering backgrounds, while many of the more competent bombers did not come from an engineering background.

Also note: We keep seeing the term “engineering background” because many of these individuals are not engineers. Many are students who studied, or even got degrees in, engineering, though they may not have ever worked as such. And many are civil engineers, or other kinds of engineers, or studied these professions, rather than some sort of bomb-oriented engineering (though civil engineering might be helpful in designing a bomb-based attack on something).

The basic explanation works something like this. In the Arab/Islamic/Middle Eastern world, there are two professions that men often aspire to for status. Medicine and engineering. Getting a BA or BS is a status symbol, but if one gets a BA or a BS in engineering, that is a better status symbol. Men get this degree, disproportionately, even if they are from a background, and embedded in a family or subculture, where they are not likely to ever work in that profession.

Meanwhile, there seems to be an association between something we might broadly describe as failure to meet one’s own expectations and getting all cranky and jihadi. You think you are cool. You are cool. And smart. And going up in status. You get your degree. You try for an engineering degree, and maybe you barely get past the hurdles and achieve it. But, you are entering a world where more than just an engineering degree, or your own massive coolness, is needed to succeed. The global Bush-Cheney Recession is upon us, and everyone is suffering.

But the thing is, you are not supposed to be suffering. You are cool. You are from a good background. You have a degree. In engineering!

So, you experience what psychologists of yore called “Relative Deprivation.” It is kind of a first world problem. You should be farther along, higher up, better situated, than you are. But the system, the economy, the government, the godless infidels of the West, have kept you down.

So you get all cranky and jihadi and blow them up.

I don’t mean to make light of this idea or its consequences. Rather, my snark leads to another point. The people who set bombs and kill innocent bystanders in airports and such are not “cowards,” as is often said by Secretaries of State and Presidents and such. They are spoiled brats. Not that this matters a lot, but one needs to get these things right.

Anyway, Engineers of Jihad: The Curious Connection between Violent Extremism and Education is a very interesting academic treatment of the question of the link between engineering and jihad. Since it is rooted firmly in data, the book also serves as an interesting historical account of much of the terrorism of the last several decades. More importantly, it is one of the rare full treatments of the nature and psychology of this sort of behavior.

I noted that “relative deprivation” is a concept of yore, and it is. The authors have, dangerously perhaps (because this sort of thing is dangerous in Academia), pulled out and dusted off an old concept that was found wanting in its earlier incarnations. But they have modified it and applied it well, so calling it this is more of an homage to earlier workers. And the name is appropriate. Relative to your lifelong expectations, you are screwed. So you react, act out, victimize someone else. And you happen to be male and Muslim (both traits of the patriarchal fundamentalist Islamic world), and maybe you know somebody who knows somebody, and the next thing you know you are in a training camp in war-torn Syria.

The set of jihadists examined in this study is not everybody, but rather, a subset with common defining characteristics. So, for example, this study does not pertain to ISIL.

And, other extremists may have a similar pattern of association between certain areas of study and their radical decisions, but come from different backgrounds. There is a vague association between being a Nazi in the early days and being in law, history, or economics. Indeed, the pattern of extremist behavior, historical context, and educational or work background is very complicated, not very well understood, and there is no way I can do it justice here. Must read Chapter 5.

Education (of one type or another) does not cause extremism. The situation is not nearly so simple. But the link between academic orientation, educational effort, a few other things, and extremist views and action is not random, and does make sense, in the context of the revised and updated theory of relative deprivation. Have a look, I think you’ll be convinced.

Turns Out Dick Is Really Interesting.

Have you ever wondered how “Dick” became short for “Rick”?

Probably not. But it turns out that the reason, if the following video is accurate, is interesting.

I have two questions for the historical linguists in the room. First, is there a name for this rhymification effect? Is it common? Is it confined to certain regions or cultures? Is it linked to Cockney in some way?

OK, that was a lot of questions, but really, all the same question. My second one is simpler: Where does the phrase “Swinging dick” come in? It is a Britishism for, I think, Square Mile money managers and investors. According to something I saw on TV once.

16 common grammatical mistakes or problems

Certain things that come across one’s desktop, on the internet, are hard to turn away from. Train wrecks, for example. For me, this list includes commentary about grammatical errors and proper language use.

I find this sort of discussion interesting because I’m an anthropologist, and probably also because I’ve spent a lot of time 100% immersed in a language or two other than my native English. This training and this experience each make me think about how we make meaning linguistically. Also, as a parent, I have observed how a child goes through the process of first, and quickly, learning how to use language properly, then spends the next several years learning how to use it wrong by following our arcane rules. And, as a writer – well, you can imagine.

Today I was inspired to write my own version of one of those posts on grammatical errors and quirks. I came across Bill Murphy Jr.’s post “17 Grammar Mistakes You Really Need to Stop Correcting, Like Now” via StumbleUpon. Bill’s main point is to cool off the conversation a bit and tell people to lighten up on the grammar correcting.

I’m not too concerned about that. Excessive grammar correcting certainly is annoying, but my main interest in this topic is not the nature of language policing so much as it is the nature of language, as well as simply knowing what is considered righter vs. wronger. As it were.

So, I took Bill’s list of grammar issues, deleted a few, and created my own commentary on them. And resorted them. And here goes:

Further versus farther

Further is a word’s word. It works with concepts, or as a marker for where the thing you are saying is going. Farther is about physical distance. This is easy to remember. “Farther” has “far” in it. “Those who go farther have indeed gone far.” Not, “Those who have gone further have indeed gone fur.” Meanwhile, we use the word “furthermore,” derived from “further,” but there is no such thing as “farthermore.” Not yet, anyway.

(Actually, “farthermore” was a word at one time, but our language has moved further along and it no longer is.)

dot dot dot vs em-dash

Don’t use “…” to break up sentences. Use a long dash (an em-dash). An ellipsis marks a place where part of a quoted text has been left out. The same word, ellipsis, is also used to refer to the three dots themselves. So, if you type dot-dot-dot, make sure that something is truly missing there.

Double negatives

It is not uncommon for people to use double negatives when they are trying to look like they are not uneducated. Outside of certain contexts, this is always bad. If a logic algorithm has to be applied to your sentence to understand what it means, you messed up. Don’t do that.

That is the “proper” double negative I’m recommending against: the hoity-toity, classist double negative. The other kind is the kind that just makes things wrong, but in a way, it is more linguistically acceptable even if grammatically the equivalent of crushing baby kittens.

I ain’t never going to do that. Or, even, a term like “irregardless,” where affixes or words conflict with each other in a way that seems to cancel out. In language, we often add bits to a word or phrase to add emphasis or, perhaps absurdly, underscore something by negating it. Irregard, if it were a word, would be without regard. Regardless is without regard. So, if we really want to make the point that there is very little regard, we say it both ways at the same time: irregardless of grammatical proscription! This is a sort of double negative you should avoid in proper and clear writing, and keep in your toolkit for dialog or ironic phrasing.

i.e. versus e.g.

i.e. stands for the Latin id est.

e.g. stands for the Latin exemplī grātiā.

Id est means “that is.” Use i.e. to prefix a restatement that elaborates a term or phrase. The Doctor’s time travel machine, i.e., the Tardis.

Exemplī grātiā means “for example.” Just like it sounds.

Time machines, e.g., The Doctor’s Tardis, or Dr. Emmett Brown’s DeLorean.

See the difference? Not much of a difference. But there is a difference.

E.g. is usually followed by a comma, just as you might say, “I would like dessert, for example, ice cream” = “I would like dessert, e.g., ice cream.”

I like to think of e.g. as plural, in a sense. Examples.

I.e. can be thought of as “in other words.” So, I might say, “I don’t like desserts like flan, i.e. slimy icky stuff.”

In writing, if you find yourself saying “in other words” a lot, you should revise and perhaps use the “other words” that were your afterthought as your actual words. So, perhaps, if you find yourself using “i.e.” you should revise as well. Either way, if someone complains to you about your use of i.e. vs e.g. you could probably make a case that your word choice was correct no matter what you did.

Incomplete comparisons

Incomplete comparisons are less annoying.

Than what??? Less annoying than what????

A sentence that is an incomplete comparison may not be incomplete at all if the larger context keys the reader in to what is being compared. The Prius and the Smart Car get great gas mileage. The Chevy Volt gets better gas mileage. This is less of a grammatical problem than a marketing problem. Out of context incomplete comparisons reflect incomplete thinking.

(By the way, we’re not talking about semicolons here, but that would have been a great place to use one: “The Prius and the Smart Car get great gas mileage; the Chevy Volt gets better gas mileage.”)

Into versus “in to”

This one can be tricky. “Into” is a preposition. Note that the word “position” is in “preposition.” “Into” pretty much only means that something is moving from and to particular positions. The words “to” and “in” do a lot more work than the preposition “into” does. Generally, if “in to” and “into” both seem right, you want “into.”

There are some odd exceptions. “He walked into the room” is correct. But if he is a burglar and he gets there by force, he broke in. So, you would not say “He broke into the room,” but rather, “he broke in to the room.” He did, however, burgle his way into the room.

Also, the “to” can belong to a verb that follows, demanding “in to” instead of “into.” He went into the room where he left his wallet. He opened the door of the room and went in to get his wallet.

Prepositions are not always about space, in the usual sense, so of course, “into” is also used for other kinds of transitions. If life gives you lemons, make them into lemonade.

Irregardless

Regardless of what people tell you, irregardless is a word. But, it is a word that even the dictionary says should be avoided. Instead of sneaking quietly into speech and becoming a normal word that means the same thing as “regardless” it annoyed grammar experts early on (as far back as the 1920s) and was stigmatized. So, now, “irregardless” is a signal that you don’t care about the quality of your spoken or written word. In good writing, “irregardless” should be confined to dialog spoken by characters that you want to look a little careless or poorly educated.

Leaving off the “ly” ending for adverbs

If you want to use an adverb, a word that modifies a verb, you generally need the “ly”. But if you are using a lot of adverbs in your writing, you probably want to delete some of them. A well chosen verb hardly needs such help in eloquently written verbiage. After you’ve written something, go on a ly-hunt. Search for the string “ly_” (note the space) and revise as appropriately. I mean, appropriate.
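Here is a minimal sketch of such a ly-hunt in Python (my own illustration, nothing from Bill’s post). The pattern just flags words ending in -ly followed by a space, so it catches non-adverbs like “family” too; treat the hits as candidates for review, not verdicts.

```python
import re

draft = "She ran quickly and spoke softly to her family while clearly happy."

# Words ending in -ly followed by whitespace: the "ly " search, as a regex.
# Review each hit by hand; "family" shows why this can't be automatic.
candidates = re.findall(r"\b\w+ly\b(?=\s)", draft)
print(candidates)  # ['quickly', 'softly', 'family', 'clearly']
```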

In the old days you could leave off the -ly to make more impactful text. Bill gives the example of an Apple marketing campaign that used “Think different” instead of “Think differently.”

This method of catching our attention was overused and that ship has sailed.

Me versus I

This is one of those important distinctions that is very easy in certain circumstances and very hard in other circumstances. So, the way to get it right is to restate a sentence in such a way as to make the distinction unambiguous, then revise if necessary.

For example, you can see that “I wrote a blog post” is correct and “Me wrote a blog post” is Tarzan-talk.

The confusion comes when the simple “I/me” part of the sentence is joined with others.

Jose and I/me went to the movies.

Jose took Jasper and I/me to the movies.

Simply picking the “I” over the “me” in these sentences might sound to some to be “better” because culturally we have come to expect to be corrected more often when misusing “me.” In other words, always opting for “I” is a way to sound like you are not uneducated.

In most cases, the way to figure this out is to remove the second person, the one that is a name and not a pronoun, and see how it sounds.

“Jose and me went to the movies” does not sound a lot different than “Jose and I went to the movies” but the difference becomes clear when we ask Jose to leave the sentence. Compare “Me went to the movies” with “I went to the movies.” I am the subject of the sentence, so I get to be I, not me.

“Jose took Jasper and I to the movies” and “Jose took Jasper and me to the movies” also don’t sound all that different, but compare “Jose took I to the movies” with “Jose took me to the movies.” I am the object of the sentence, and so “me” is correct, and when we parse it out this way, “me” sounds correct.

Me can forgive Tarzan for getting this wrong.

One or two spaces after a period

In the old days, you put two pieces of lead after the period in order to make sentences look normal. This practice continued with non-proportional typefaces on typewriters and other machines.

People will tell you that modern fonts don’t require this, so you should not do it. However, there is a missing part of the story often conveniently ignored.

In the less old days, people who used computing technology to manipulate text could use a .__ (a period and two spaces) as distinct from ._ (period and one space) to tell the difference between the end of a sentence (with a full stop period) and an abbreviation.

Had we continued, as a society, to use period-space-space, this convenience could have been preserved. But we didn’t. So that was ruined.
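As a small illustration of the convenience we gave up, here is a Python sketch that assumes text still following the old convention (two spaces after a sentence-ending period, one after an abbreviation). A dumb split then finds sentence boundaries without needing a dictionary of abbreviations.

```python
# Hypothetical text written under the old period-space-space convention.
text = "Dr. Smith arrived at 9 a.m. sharp.  He was early.  Nobody else was."

# Split only on period + two spaces; "Dr." and "a.m." survive intact.
for sentence in text.split(".  "):
    print(sentence.strip())
```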

Now, of course, when you are fingering your smart device and hit the space twice, the app automatically puts in a period.

Checkmate!

You can tell me again and again to use only one space after a period. But my thumb will ignore you.

Split infinitives

An infinitive is a form of a verb that has the “to” attached. In some languages the “to” is so attached to the word that you can’t fit any other words in there. E.g., in upcountry Swahili, “ku” is “to” and “do” is “fanya,” so “to do” is kufanya. One word. I imagine that the fact that many languages have infinitives that are pre-stuck together has led to the convention that one does not split them by adding extra words between the “to” and the verb.

(There is actually quite a bit of ink spilt over the history of this rule.)

In my view, the ability to split infinitives is a really cool feature of English, and there should be no rule against it. However, since we often split our infinitives with adverbs, and adverbs are overused, hunt for split infinitives not so much to unsplit them as to identify adverb overuse.
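In that spirit, here is a naive split-infinitive hunt in Python, again my own sketch, and again producing candidates to eyeball rather than things to automatically fix.

```python
import re

draft = "We want to boldly go, to really try, and to finish on time."

# "to" + an -ly word + another word is a candidate split infinitive.
# "to finish" passes untouched, since "finish" does not end in -ly.
hits = re.findall(r"\bto\s+(\w+ly)\s+\w+", draft)
print(hits)  # ['boldly', 'really']
```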

That versus which

After you’ve written your text, go on a which hunt and change the whiches to thats. But, you can leave the whiches that start independent clauses. In other words, if the part of the sentence that starts with which could more or less be a separate sentence, and/or if you can remove it from the sentence and still have a sentence, it is probably OK.

I think that for a time the word “that” sounded more pedestrian than the word “which,” which is a guess on my part, I’m not sure, so people who wanted to write good seeded their sentences with random whiches. Never trust a random which.

The Oxford comma

Also known as the Harvard comma or, perhaps most correctly, the serial comma. In fact, I’m rather shocked that which of these terms to use is not itself a major battle among language mavens.

The Oxford comma is the last comma in a list, before the last item and before the “and” that separates out the last item. Always use this comma. Often, it is not necessary, but when it is necessary, it is sometimes really necessary. So just use it all the time and avoid certain embarrassing, though often hilarious, mistakes.

From here:

I love my parents, Lady Gaga, and Humpty Dumpty.

vs

I love my parents, Lady Gaga and Humpty Dumpty.

They or Their as a gender neutral term, instead of the singular Him, her, his, hers.

English lacks a gender-neutral singular possessive term. Also, English lacks (in common use) a term that is not so strictly gender binary.

Using the plural as a gender neutral is natural, since there is a kind of plurality in the ambiguity (the referent could be his, hers, or neither’s).

New terms and new uses tend to grate, but a new term is less likely to be accepted and more likely to bother people than a re-use of an existing term. What needs to happen here, probably, is that the purveyors of proper language (elementary school teachers and the like) need to not correct students who use the plural form as a gender non-specific one.

Who versus that

This is simple. “Who” is about people; “that” is about things. Even more obviously incorrect, and underscoring the point that who is for people, is the substitution of “The people who do that” with “The people what do that.”

So when it comes to referring to people as that or what, who would do that?

Less versus fewer

Less and more refer to changing amounts of something you don’t count in whole numbers. More or less rain, love, or apple cider. Fewer and more refer to things counted in whole numbers.

The fact that “more” is in both of these sets may be the cause of confusion between “Fewer” and “Less.”

Fewer trains pass by my house these days, so we have less noise around here. Not, less trains pass by my house these days, so we have fewer noise around here. But, we do have less train traffic these days, so we have fewer instances of annoying noise events.