The Complete Dinosaur, an edited volume* (editors: Thomas Holtz, James Farlow, Bob Walters, and Michael Brett-Surman), is currently on sale in Kindle form, and it looks like a great value. I don’t know the book, but I looked through the sample and bought it.
Over the last few years, I’ve read a lot of 18th and 19th century North American history. In the very old days, I was a career historic archaeologist, so I have some professional background in history, but an archaeologist is not an historian by training or experience. As I went about reading this American history, I learned something that most non-historian Americans find unbelievable. So unbelievable that I won’t tell you now, other than that it has to do with Donald Trump and his followers. Maybe we can discuss it another time.
I’ve always liked historical fiction as well as history, and I’m starting to work on a project that puts the two together: a list of accessible histories (books written by historians who are good writers) and parallel (maybe even matched-up) novels that may be reasonable representations of the past. The novels are a challenge in this project. A book can be a good novel but a lousy history. Also, what do we do with historical science fiction or fantasy that might involve a good description of some bygone era or culture but includes aliens or ghosts? (Time machines probably don’t present this problem, in and of themselves.)
By and large, I expect that most novels are not good representations of our past. I believe culture can vary dramatically across time and space. A 20th century account of the 17th century (anywhere) or a contemporary account of a very different region of the world (or neighborhood) is likely to be written to be understandable and relatable. That may require significant shifts in nuance and context, expectations and norms. By sticking with work covering time periods that are not too far in the past, and on the North American continent, this problem is somewhat reduced. Or, made worse, because our own history, as quasi-scholarly work or as fiction, is bound to be biased in ways that get around our own BS filters. One way to pretend to avoid that is to include more work by women, non-white people, and stories about someone other than white men. That does not really remove all biases, but it makes us feel better, and that is what is important, right?
The following is a first draft of a list (with links*) of some of the fiction items in this project.
Caleb’s Crossing: A Novel by Geraldine Brooks (author of one of my favorite novels, People of the Book). Bethia Mayfield is a restless and curious young woman growing up in Martha’s Vineyard in the 1660s amid a small band of pioneering English Puritans. At age twelve, she meets Caleb, the young son of a chieftain, and the two forge a secret bond that draws each into the alien world of the other. Bethia’s father is a Calvinist minister who seeks to convert the native Wampanoag, and Caleb becomes a prize in the contest between old ways and new, eventually becoming the first Native American graduate of Harvard College. Inspired by a true story and narrated by the irresistible Bethia, Caleb’s Crossing brilliantly captures the triumphs and turmoil of two brave, openhearted spirits who risk everything in a search for knowledge at a time of superstition and ignorance.
Colonial Era and beyond
These novels start in the Colonial era, then continue, epic fashion:
Someone Knows My Name: A Novel, originally published as The Book of Negroes by Lawrence Hill, is the story of an African woman who is abducted as a girl in her native village and sold into American slavery. Her subsequent story is complex and fascinating. I think this book is underappreciated in the United States because Americans can’t handle the name. The author, who is Black and Canadian, explains the title: “I used The Book of Negroes as the title for my novel, in Canada, because it derives from a historical document of the same name kept by British naval officers at the tail end of the American Revolutionary War. It documents the 3,000 blacks who had served the King in the war and were fleeing Manhattan for Canada in 1783. Unless you were in The Book of Negroes, you couldn’t escape to Canada. My character, an African woman named Aminata Diallo whose story is based on this history, has to get into the book before she gets out.”
I am putting these two novels I’ve not read (but plan to) here because they belong here and maybe you will tell ME about them.
I, Eliza Hamilton by Susan Holloway Scott “In this beautifully written novel of historical fiction, bestselling author Susan Holloway Scott tells the story of Alexander Hamilton’s wife, Eliza—a fascinating, strong-willed heroine in her own right and a key figure in one of the most gripping periods in American history.”
My Dear Hamilton: A Novel of Eliza Schuyler Hamilton by Stephanie Dray and Laura Kamoie. From the New York Times bestselling authors of America’s First Daughter comes the epic story of Eliza Schuyler Hamilton–a revolutionary woman who, like her new nation, struggled to define herself in the wake of war, betrayal, and tragedy. In this haunting, moving, and beautifully written novel, Dray and Kamoie used thousands of letters and original sources to tell Eliza’s story as it’s never been told before–not just as the wronged wife at the center of a political sex scandal–but also as a founding mother who shaped an American legacy in her own right.
Civil War, Mid-19th Century
There are approximately a gazillion novels set in the US that have something to do with the Civil War, so this is a very much narrowed down list. I won’t make it bigger until some of the other time periods are better covered. Ultimately, there are probably two or three dozen excellent novels in this era, which perhaps can be divided into categories like “the Civil War is actually in the novel” vs. “The Civil War just ended but the smoke still rises from the ashes,” and also, along gender or ethnic lines.
The March: A Novel by E. L. Doctorow. In 1864, Union general William Tecumseh Sherman marched his sixty thousand troops through Georgia to the sea, and then up into the Carolinas. The army fought off Confederate forces, demolished cities, and accumulated a borne-along population of freed blacks and white refugees until all that remained was the dangerous transient life of the dispossessed and the triumphant. In E. L. Doctorow’s hands the great march becomes a floating world, a nomadic consciousness, and an unforgettable reading experience with awesome relevance to our own times.
Late 19th Century, Turn of the Century
Little Big Man: A Novel by Thomas Berger is said by some to be one of the most underappreciated American novels. One reason may be that the literati saw no need to appreciate a Western. Another may be that Berger eschewed the establishment in the publishing world. It is, of course, the story that is told by a very old man who may or may not be an unreliable narrator of his life wafting back and forth between being a white settler/cowboy/gambler/gunslinger/guide vs. a Native warrior, husband, and student of a great shaman. This book was made into what may be one of the great movies of the 20th century. It is also, sadly, the only contribution I can find that involves Native Americans that I’d recommend. Still looking.
Beloved by Toni Morrison. Sethe, its protagonist, was born a slave and escaped to Ohio, but eighteen years later she is still not free. She has too many memories of Sweet Home, the beautiful farm where so many hideous things happened. And Sethe’s new home is haunted by the ghost of her baby, who died nameless and whose tombstone is engraved with a single word: Beloved. Filled with bitter poetry and suspense as taut as a rope, Beloved is a towering achievement.
Ultimately I want this list to go up to and include World War II. I am not short of entries for that period, but I’ll get to that later.
How well we communicate determines success or failure in every aspect of life. The ability to effectively get a message across is learned, even if the person learning is unaware of that learning. We are not born as linguistic beings, but acquire that ability after birth, during early childhood. We hone that ability subconsciously as we engage in our social interactions, our inner dialogue typically running ahead of our overt patter by about a mile. Every now and then the message that the message is important gets out. Lately that has been in the form of memish** aphorisms, like “don’t repeat the falsehood” or “stop using their talking points” or “get a better frame!”
These bits of advice often do more damage than good. They are potentially sharp knives, or meaty mallets, or highly useful duct tape, in the tool kit of novices, but just as likely to cut or pound a finger or gum something up as to help. These bits of advice are like the tricks surgeons use to close off a bleeder or work around a key nerve without harming it. They are nice to know if you are a trained surgeon, but really not that useful if you are not. They serve mainly to make people think they are suddenly good communicators.
My advice is to either let other people do it, or to ramp it up. By ramp it up I mean don’t attend one seminar on how to communicate, but ten. Not three or four, but ten. Don’t read the first four paragraphs of a commentary on communication in The Atlantic, but read five books. Not one or two books, but five books. Or seven, even.
You need to do enough study of the matter to go through the phase when you realize you know way less than you thought.
Pursuant to this effort, I hereby recommend a few items. These are not new, but they are current. Newness is not the key to success. One of the best references on how we communicate with words is well over 2,000 years old.
Made to Stick: Why Some Ideas Survive and Others Die by Chip Heath and Dan Heath*. Mark Twain once observed, “A lie can get halfway around the world before the truth can even get its boots on.” His observation rings true: Urban legends, conspiracy theories, and bogus news stories circulate effortlessly. Meanwhile, people with important ideas—entrepreneurs, teachers, politicians, and journalists—struggle to make them “stick.”
In Made to Stick, Chip and Dan Heath reveal the anatomy of ideas that stick and explain ways to make ideas stickier, such as applying the human scale principle, using the Velcro Theory of Memory, and creating curiosity gaps. Along the way, we discover that sticky messages of all kinds—from the infamous “kidney theft ring” hoax to a coach’s lessons on sportsmanship to a vision for a new product at Sony—draw their power from the same six traits.
Called the “father of framing” by The New York Times, Lakoff explains how framing is about ideas: ideas that come before policy, ideas that make sense of facts, ideas that are proactive not reactive, positive not negative, ideas that need to be communicated out loud every day in public.
The ALL NEW Don’t Think of an Elephant! picks up where the original book left off, delving deeper into how framing works, how framing has evolved in the past decade, how to speak to people who harbor elements of both progressive and conservative worldviews, how to counter propaganda and slogans, and more.
In this updated and expanded edition, Lakoff urges progressives to go beyond the typical laundry list of facts, policies, and programs and present a clear moral vision to the country: one that is traditionally American and can become a guidepost for developing compassionate, effective policy that upholds citizens’ well-being and freedom. (NB: “All New” here does not mean all new now. It was all new a few years ago.)
Mouse is a small rodent with a cigarette-shaped, elongated nose that actually kind of looks like a kitchen match. Mouse is either very clever, and knows how to gaslight a predacious bird, or is the most clueless rodent in the forest. Either way, this dark tale in a picture book is ideal to help 3- to 6-year-olds understand some of the key realities of life … and near death.
My Best Friend,* a new, fresh, amusingly and skillfully illustrated book by Rob Hodgson, author of The Cave, could be your toddler’s first relationship book, or first nature book, depending on what the child takes from it. Let me know how it goes. The three kids I tried it out on loved it.
This list does not include children’s books (usually STEM) that I reviewed, books Huxley and I have read (which is mainly Lord of the Rings this year, frankly… taking forever), computer or technology books, or video game references or books. It also does not include manuscripts or books that pre-date 1900 and you probably can’t find anywhere, unpublished material, and racist volumes that I read because I hate them but won’t link to them (and similar). I list one John Sandford book to represent several that I read (I decided to read them all over, in order … yes, I had a reason for that, and it was great fun. But that was like 20 books.)
I’ve read 100% of 80% of these books, substantial parts of others. I have not included books I only read a few chapters of, such as a biography of FDR and one of Washington, and some books on Minnesota history.
They are not in any particular order. I probably missed a few. One I finished during the current year but I started it in 2019.
This is a great resource for understanding the diverse strategies available to decarbonize. There is a flaw, and I think it is a fairly significant one. Drawdown ranks the different strategies, so you can see what (seemingly) should be done first. But the ranking is highly susceptible to how the data are organized. For example, onshore and offshore wind, if combined, would probably rise to the top of the heap, but separately they are merely in the top several. Also, these things change quickly over time in part because we do some of these things, inevitably moving them lower in ranking. So don’t take the ranking too seriously.
I mention this book because I hope it can help the free market do what it never actually does. The energy business is not, never was, and can’t really be a free market, so expecting market forces to do much useful is roughly the same as expecting the actual second coming of the messiah. Won’t happen. This book is not an ode to those market forces, though, but rather, a third stab (I think), and a thoughtful one, at a complex problem.
Related, of interest: Windfall: The Booming Business of Global Warming by McKenzie Funk. “Funk visits the front lines of the melt, the drought, and the deluge to make a human accounting of the booming business of global warming. By letting climate change continue unchecked, we are choosing to adapt to a warming world. Containing the resulting surge will be big business; some will benefit, but much of the planet will suffer. McKenzie Funk has investigated both sides, and what he has found will shock us all. ”
Designing Climate Solutions: A Policy Guide for Low-Carbon Energy by Hal Harvey, Robbie Orvis and Jeffrey Rissman. “A small set of energy policies, designed and implemented well, can put us on the path to a low carbon future. Energy systems are large and complex, so energy policy must be focused and cost-effective. One-size-fits-all approaches simply won’t get the job done. Policymakers need a clear, comprehensive resource that outlines the energy policies that will have the biggest impact on our climate future, and describes how to design these policies well.”
I have been slowly and steadily working on a project that involves an old topic of interest: the dynamic changes in society, economy, and settlement pattern as Euro-Americans ensnared the middle and western parts of the continent in their material and political net of civilization, sometimes known as the Westward Expansion. And for this reason, I came across a book, a NYT Book Review “Best Ten” for 2017, of interest, that happens also to be on sale cheap in Kindle format.
As pointed out in a review by Patricia Nelson Limerick, the exploitation and eastward shipping, for profit, of bison hides and precious metals (and everything in between) was not the only gig in the west. The story itself, the stories of pioneering, gun fighting, Indian, er, relations, and everything else, collected in situ and refined through the myth-mills of the publishing industry, amounted to a significant and valuable commodity. One of the most productive ore lodes of daring narrative in the plains and midwest was the one tapped by Laura Ingalls Wilder via the Little House series, and other tales. Also, her daughter was in on it.
Prairie Fires pulls back the switch-grass curtain. To quote from PNL’s review:
Rendering this biography as effective at racking nerves as it is at provoking thought, the story of Wilder’s emergence as a major sculptor of American identity pushes far past the usual boundaries of probability and plausibility. For anyone who has drifted into thinking of Wilder’s “Little House” books as relics of a distant and irrelevant past, reading “Prairie Fires” will provide a lasting cure. Just as effectively, for readers with a pre-existing condition of enthusiasm for western American history and literature, this book will refresh and revitalize interpretations that may be ready for some rattling. Meanwhile, “Little House” devotees will appreciate the extraordinary care and energy Fraser brings to uncovering the details of a life that has been expertly veiled by myth. Perhaps most valuable, “Prairie Fires” demonstrates a style of exploration and deliberation that offers a welcome point of orientation for all Americans dismayed by the embattled state of truth in these days of polarization.
-Patricia Nelson Limerick, review, The New York Times
OpenSource science means, among other things, using OpenSource software to do the science. For some aspects of software this is not important. It does not matter too much if a science lab uses Microsoft Word or if they use LibreOffice Write.
However, it does matter which spreadsheet you use. Since you should be eschewing proprietary spreadsheets anyway, you might as well adopt the OpenSource office package LibreOffice (or equivalent) as a whole, and use its presentation software, word processor, and spreadsheet together.
OpenSource programs like Calc, R (a stats package), and OpenSource friendly software development tools like Python and the GPL C Compilers, etc. do matter. Why? Because your science involves calculating things, and software is a magic calculating box. You might be doing actual calculations, or production of graphics, or management of data, or whatever. All of the software that does this stuff is on the surface a black box, and just using it does not give you access to what is happening under the hood.
But, if you use OpenSource software, you have both direct and indirect access to the actual technologies that are key to your science project. You can see exactly how the numbers are calculated or the graphic created, if you want to. It might not be easy, but at least you don’t have to worry about the first hurdle in looking under the hood that happens with commercial software: they won’t let you do it.
Direct access to the inner workings of the software you use comes in the form of actually getting involved in the software development and maintenance. For most people, this is not something you are going to do in your scientific endeavor, but you could get involved with some help from a friend or colleague. For example, if you are at a University, there is a good chance that somewhere in your university system there is a computer department that has an involvement in OpenSource software development. See what they are up to, find out what they know about the software you are using. Who knows, maybe you can get a special feature included in your favorite graphics package by helping your newfound computer friends cop an internal University grant! You might be surprised as to what is out there, as well as what is in there.
In any event, it is deliberately easy to get involved in OpenSource software projects because they are designed that way. Or, usually are and always should be.
The indirect benefit comes from the simple fact that these projects are OpenSource. Let me give you an example from the non-scientific world. (It is a made-up example, but it could reflect reality and is highly instructive.)
Say there is an operating system or major piece of software competing in a field of other similar products. Say there is a widely used benchmark standard that compares the applications and ranks them. Some of the different products load up faster than others, and use less RAM. That leaves both time (for you) and RAM (for other applications) that you might value a great deal. All else being equal, pick the software that loads faster in less space, right?
Now imagine a group of trollish deviants meeting in a smoky back room of the evil corporation that makes one of these products. They have discovered that if they leave a dozen key features that all the competitors use out of the loading process, so they load later, they can get a better benchmark. Without those standard components running, the software will load fast and be relatively small. It happens to be the case, however, that once all the features are loaded, this particular product is the slowest of them all, and takes up the most RAM. Also, the process of holding back functionality until it is needed is annoying to the user and sometimes causes memory conflicts, causing crashes.
In one version of this scenario, the concept of selling more of the product by using this performance tilting trick is considered a good idea, and someone might even get a promotion for thinking of it. That would be something that could potentially happen in the world of proprietary software.
In a different version of this scenario the idea gets about as far as the water cooler before it is taken down by a heavy tape dispenser to the head and kicked to death. That would be what would certainly happen in the OpenSource world.
You collect and manage data. You write code to process or analyze data. You use statistical tools to turn data into analytically meaningful numbers. You make graphs and charts. You write stuff and integrate the writing with the pretty pictures, and produce a final product.
The first thing you need to understand if you are developing or enhancing the computer side of your scientific endeavor is that you need the basic GNU tools and command line access that comes automatically if you use Linux. You can get the same stuff with a few extra steps if you use Windows. The Apple Mac system is in between, with the command line tools already built in but not quite as in-your-face available.
You may need to have an understanding of Regular Expressions, and how to use them on the command line (using sed or awk, perhaps) and in programming, perhaps in python.
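If regular expressions are new to you, here is a minimal sketch of the kind of pattern matching they buy you, shown with Python’s re module (the specimen IDs and field notes are invented for illustration):

```python
import re

# Hypothetical example: pull specimen IDs like "MN-2021-0042" out of messy
# field notes. The ID format and the notes themselves are invented.
notes = "Collected MN-2021-0042 near the bog; MN-2021-0107 was damaged."
pattern = re.compile(r"[A-Z]{2}-\d{4}-\d{4}")  # two letters, a year, a serial

ids = pattern.findall(notes)
print(ids)  # -> ['MN-2021-0042', 'MN-2021-0107']
```

The same pattern works, with minor dialect differences, in sed, awk, and grep on the command line.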
You will likely want to master the R environment because a) it is cool and powerful and b) a lot of your colleagues use R so you will want to have enough under your belt to share code and data now and then. You will likely want to master Python, which is becoming the default scientific programming language. It is probably true that anything you can do in R you can do in Python using the available tools, but it is also true that the most basic statistical stuff you might be doing is easier in R than Python since R is set up for it. The two systems are relatively easy to use and very powerful, so there is no reason to not have both in your toolbox. If you don’t choose the Python route, you may want to supplement R with gnu plotting tools.
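To see why R is handy for the basic statistical stuff, here is what a simple two-sample comparison looks like when rolled by hand in plain Python (a sketch with invented numbers; in R, a single call to t.test() does all of this and more):

```python
from statistics import mean, stdev

# Hypothetical data: a trait measured at two field sites (numbers invented).
site_a = [4.1, 3.9, 4.4, 4.0, 4.2]
site_b = [3.5, 3.7, 3.6, 3.8, 3.4]

# Welch's t statistic, computed by hand from means and sample variances.
na, nb = len(site_a), len(site_b)
va, vb = stdev(site_a) ** 2, stdev(site_b) ** 2
t = (mean(site_a) - mean(site_b)) / ((va / na + vb / nb) ** 0.5)
print(round(t, 2))
```

Nothing here is hard, but in R it is one line, with degrees of freedom and a p-value thrown in for free.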
You will need some sort of relational database setup in your lab, some kind of OpenSource SQL-based system.
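SQLite is a zero-setup way to start: it ships with Python and speaks standard SQL, and you can graduate to PostgreSQL or MariaDB when the lab outgrows it. A minimal sketch, with invented table and column names:

```python
import sqlite3

# An in-memory database standing in for a lab data store. The samples
# table and its columns are invented for illustration.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE samples (id TEXT PRIMARY KEY, site TEXT, mass_g REAL)")
con.executemany(
    "INSERT INTO samples VALUES (?, ?, ?)",
    [("S1", "bog", 12.4), ("S2", "bog", 11.9), ("S3", "ridge", 9.7)],
)
# Standard SQL: mean mass per site, e.g. for a quick sanity check.
rows = con.execute(
    "SELECT site, AVG(mass_g) FROM samples GROUP BY site ORDER BY site"
).fetchall()
print(rows)
```

The same queries will run, essentially unchanged, against any of the bigger OpenSource SQL systems.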
You will have to decide on your own if you are into LaTeX. If you have no idea what I’m talking about, don’t worry, you don’t need to know. If you do know what I’m talking about, you probably have the need to typeset math inside your publications.
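For the record, this is the sort of thing LaTeX is for. An illustrative snippet (the equation is a standard haploid selection recursion, chosen arbitrarily):

```latex
% Typeset a numbered equation in a LaTeX document:
\begin{equation}
  p_{t+1} = \frac{p_t (1 + s)}{1 + s\, p_t}
\end{equation}
```

If your publications never need anything fancier than a superscript, a word processor will do fine and you can skip the whole thing.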
Finally, and of utmost importance, you should be willing to spend the upfront effort making your scientific work flow into scripts. Say you have a machine (or a place on the internet or an email stream if you are working collaboratively) where some raw data spits out. These data need some preliminary messing around with to discard what you don’t want, convert numbers to a proper form, etc. etc. Then, this fixed-up data goes through a series of analyses, possibly several parallel streams of analysis, to produce a set of statistical outputs, tables, graphics, or a new highly transformed data set you send on to someone else.
If this is something you do on a regular basis, and it likely is because your lab or field project is set up to get certain data certain ways, then do certain things to it, then ideally you would set up a script, likely in bash but calling gnu tools like sed or awk, or running Python programs or R programs, and making various intermediate files and final products and stuff. You will want to bother with making the first run of these operations take three times longer to set up, so that all the subsequent runs take one one-hundredth of the time to carry out, or can be run unattended.
Nothing, of course, is so simple as I just suggested … you will be changing the scripts and Python programs (and LaTeX specs) frequently, perhaps. Or you might have one big giant complex operation that you only need to run once, but you KNOW it is going to screw up somehow … a value that is entered incorrectly or whatever … so the entire thing you need to do once is actually something you have to do 18 times. So make the whole process a script.
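Here is a toy sketch of what such a script can look like, written in Python for portability (all file contents and names are invented; a real version would read the instrument’s output file rather than a hard-coded string):

```python
import csv
import io

# Simulated raw instrument output. In a real pipeline this would be read
# from a file or a download; the data and column names are invented.
RAW = """specimen,mass_g
S1,12.4
S2,
S3,9.7
S4,-1
"""

def clean(raw_text):
    """Drop rows with missing or impossible values, convert types."""
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_text)):
        if not rec["mass_g"]:
            continue  # missing measurement
        mass = float(rec["mass_g"])
        if mass <= 0:
            continue  # impossible value, likely a data-entry error
        rows.append({"specimen": rec["specimen"], "mass_g": mass})
    return rows

def summarize(rows):
    """One downstream analysis step: mean mass of the cleaned data."""
    return sum(r["mass_g"] for r in rows) / len(rows)

cleaned = clean(RAW)
print(len(cleaned), round(summarize(cleaned), 2))
```

The point is the shape, not the content: a cleaning step, one or more analysis steps, and a driver you can rerun unattended, 18 times if need be, and hand to a colleague as documentation of exactly what was done.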
Aside from convenience and efficiency, a script does something else that is vitally important. It documents the process, both for you and others. This alone is probably more important than the convenience part of scripting your science, in many cases.
Being small in a world of largeness
Here is a piece of advice you won’t get from anyone else. As you develop your computer working environment, the set of software tools and stuff that you use to run R or Python and all that, you will run into opportunities to install some pretty fancy and sophisticated development systems that have many cool bells and whistles, but that are really designed for team development of large software projects, and continual maintenance over time of versions of that software as it evolves as a distributed project.
Don’t do that unless you need to. Scientific computing is often not that complex or team-oriented. Sure, you are working with a team, but probably not a team of a dozen people working on the same set of Python programs. Chances are, much of the code you write is going to be tweaked to be what you need it to be then never change. There are no marketing gurus coming along and asking you to make a different menu system to attract millennials. You are not competing with other products in a market of any sort. You will change your software when your machine breaks and you get a new one, and the new one produces output in a more convenient style than the old one. Or whatever.
In other words, if you are running an enterprise level operation, look into systems like Anaconda. If you are a handful of scientists making and controlling your own workflow, stick with the simple scripts and avoid the snake. The setup and maintenance of an enterprise level system for using R and Python is probably more work before you get your first t-test or histogram than it is worth. This is especially true if you are more or less working on your own.
Another piece of advice. Some software decisions are based on deeply rooted cultural norms or fetishes that make no sense. I’m an emacs user. This is the most annoying, but also, most powerful, of all text editors. Here is an example of what is annoying about emacs. In the late 70s, computer keyboards had a “meta” key (it was actually called that), which is now the alt key. Emacs made use of the meta key. No person has seen or used a meta key since about 1979, but emacs refuses to change its documentation to use the word “alt” for this key. Rather, the documentation says something like “here, use the meta key, which on some keyboards is the alt key.” That is a cultural fetish.
Using LaTeX might be a fetish as well. Obviously. It is possible that for some people, using R is a fetish and they should rethink and switch to using Python for what they are doing. The most dangerous fetish, of course, is using proprietary scientific software because you think only if you pay hundreds of dollars a year to use SPSS or BMDP for stats, as opposed to zero dollars a year for R, will your numbers be acceptable. In fact, the reverse is true. Only with an OpenSource stats package can you really be sure how the stats or other values are calculated.
This book focuses on Python and not R, and covers LaTeX, which, frankly, will not be useful for many. This also means that the regular expression work in the book is not as useful for all applications, as might be the case with a volume like Mastering Regular Expressions. But overall, this volume does a great job of mapping out the landscape of scripting-oriented scientific computing, using excellent examples from biology.
This book can and should be used as a textbook for an advanced high school level course to prep young and upcoming investigators for when they go off and apprentice in labs at the start of their career. It can be used as a textbook in a short seminar in any advanced program to get everyone in a lab on the same page. I suppose it would be a treat if Princeton came out with a version for math and physical sciences, or geosciences, but really, this volume can be generalized beyond biology.
Stefano Allesina is a professor in the Department of Ecology and Evolution at the University of Chicago and a deputy editor of PLoS Computational Biology. Madlen Wilmes is a data scientist and web developer.
Many of the key revolutions, or at least, overhauls, in biological thinking have come as a result of the broad realization that a theretofore identified variable is not simply background, but central and causative.
I’m sure everyone always thought, since first recognized, that if genes are important, then good genes would be good. Great, even. But it took a while for Amotz Zahavi and some others to insert good genes into Darwin’s sexual selection as the cause of sometimes wild elaboration of traits, not a female aesthetic or mere runaway selection.
There is probably a rule in the chambers of the United States Congress that you can’t punch a guy. Living rules are clues to past behavior. For instance, where I live now, there is a rule: You can’t leave your hockey goals or giant plastic basketball nets out overnight. There are no children of the appropriate age for that rule to affect. All the old people who live on my street have to drag those things into the garage at the end of every day, after their long sessions of pickup ball. More likely, years ago, there were kids everywhere and the “Get off my lawn” contingent took over the local board and made all these rules.
So, today, in Congress, you can’t hit a guy, but in the old days, that wasn’t so uncommon.
You have heard about the caning of Charles Sumner. Southern slavery supporter Preston Brooks beat the piss out of Charles Sumner, an anti-slavery Senator from Massachusetts. They weren’t even in the same chamber. Brooks was in the House, Sumner was in the Senate. Sumner almost didn’t survive the ruthless and violent beating, which came after a long period of bullying and ridicule by a bunch of southern bullies. Witnesses describe a scene in which Brooks was clearly trying to murder Sumner, and seems to have failed only because the cane he was using broke into too many pieces, depriving the assailant of the necessary leverage. Parts of that cane, by the way, were fashioned into pendants worn by Brooks’s allies to celebrate his attempted murder of a Yankee anti-slavery member of Congress.
Here’s the thing. You’ve probably heard that story, or some version of it, because it was a major example of violence in the US Congress. But in truth, there were many other acts of verbal and physical violence carried out among our elected representatives, some even worse, often in the chambers, during the decades leading up to the Civil War. Even a cursory examination of this series of events reveals how fisticuffs, sometimes quite serious, can be a prelude to a bloody fight in which perhaps as many as a million people all told were killed. Indeed, the number of violent events, almost always southerner against northerner, may have been large enough that the two sides (conservative, southern, and right wing on one hand vs. progressive, liberal, and not as southern on the other) could never equalize in their total level of violence against each other. Perhaps there are good people on both sides, but the preponderance of thugs reside on one side only.
I strongly urge you to have a look at Freeman’s book, in which she brings to light a vast amount of information about utter asshatitude among our elected representatives, based on previously unexplored documents. I also strongly urge you to listen to the podcast. The most recent episode as of this writing is on video games and American history. The previous episode covers the hosts’ book picks for the year.
Might as well admit it. America has been ruined. Oh, it is fixable, not “totaled” like your car after you roll it down a hill during an ice storm. More like you failed to set the parking brake and it got loose and crashed into a brick wall, then some hoodlum broke through the window and ripped out your radio, then there was a hail storm…
Anyway, here is a carefully selected list of books related to Trump and the Trump fake Presidency, integrated with a list of books that are NOT about that but rather about leadership in history. The former are to get you steamed up; the latter are the control rods. A few are just about attacks on democracy from the elite and powerful.