I don’t know but you better watch this: Continue reading Which of your browser extensions are selling your data?
My father told me exactly three things about his time in the war (aka World War II).
One. He had made a date with a nice English lady, they were to meet under Big Ben at noon on some day, but the Victory in Europe happened and he was hastily sent back to the US where he was put on a train to San Francisco to help invade Japan, but then they dropped the bomb. As a result, there is to this day a nice lady in England waiting under Big Ben, and the Japanese Army waiting in Japan, and my dad ditched both of them.
Two. On one, two, or three occasions (I don’t remember) he was at a location in London (like a store or something) and then left, or was just about to arrive at some location in London (a store or something) when a German missile blew the place up. Close call.
Three. His contribution to D-Day. He was in the Army Air Corps, though he may have spent more time on a horse (which he presumably knew how to ride before enlisting) than in a plane. He volunteered for the glider corps, willing to be a pilot or navigator, or anything. He cheated on the eye test (he was nearsighted even at that age). He had memorized the eye chart, so when asked to read the letters, he read them all off perfectly.
Unfortunately, he had memorized an older eye chart, and the new eye chart had a different order of letters except the big E on top. The guy giving the test, another Staff Sergeant, was his friend, so he did not get in trouble for cheating, but he was not allowed into the glider corps.
Meanwhile, he was assigned to one of those numerous, typically secret, air bases where they were preparing for the big invasion. His job was to supervise the arrival of airplanes, which were unassembled, and to oversee the storage and transfer of the plane parts to buildings where technicians would assemble them and get them ready to invade Europe. Lots of planes were simply flown to England from the US, but these were built in the US and sent as non-completed planes to England via large transport planes such as the C-47 Skytrain.
But here’s the thing. The process of delivering these airplanes was rough and rugged. The various partly assembled parts of the planes often came damaged. I believe he said that they were often literally dropped off, pushed out of a transport plane as it landed and taxied, only to take off seconds later. This meant that if five or six planes were delivered over a short period of time, the technicians would have to borrow one part from this plane, and another part from that plane, in order to make perhaps four whole planes, with some spare bits left over.
My father changed the way they managed this, sending a suggestion up the line back to the US, where the planes originated. “Just pack the plane parts in the transport the best way they fit, don’t worry about sending a whole but disassembled plane all together.” So they did that. A transport plane would come in with mostly tails, another with mostly engines, another with mostly whatever. My father set up a method of inventorying and keeping track of the parts, and of supplying the technicians with working, undamaged sections as they needed them to assemble working aircraft. The process of building planes at this airstrip sped up, and when it came time to teach Hitler what for, more planes were ready than otherwise possible.
In other words, my father, Staff Sergeant Joseph F. Laden, invaded Normandy with his mind.
He got a medal from each of the US Government and the UK Government for this.
You may already know that a large percentage of the glider-borne soldiers who took part in the Normandy invasion were killed or wounded during the “landing” of the aircraft, or soon after, while under heavy fire. The glider pilots suffered much higher casualty rates than the others. So, I’m thinking that my father contributed a more important thing to the war effort with his reorganization of the aircraft building process than he would have as a glider pilot or crew member, and he got to live.
But he never did get to meet that girl under Big Ben.
First, what is “pixel art?”
Is that just art that is rendered in raster? Not exactly. Pixel art is the sort of art you draw for digital cartoons or similar things. The skills and tools of making pixel art would apply to designing icons or logos used in electronic products as well.
To demonstrate what pixel art is, I’m including a few examples from the newly published Make Your Own Pixel Art: Create Graphics for Games, Animations, and More! by Jennifer Dawe and Matthew Humphries.
This book will give you an introduction to the tricks of the trade of making technologically simple but artistically potent drawings, including ways to animate them.
The non-OpenSource (boo) software that is used throughout the book is not expensive and is easy to use, and yes, OpenSource alternatives are suggested and briefly discussed. The book relies on Aseprite and Pro Motion, with GraphicsGale (Windows only, boo) being a free alternative.
Techniques covered include shading, texture, proper use of color, motion and animation, and making things look sentient. Apparently, you can make money doing this sort of thing! This book is probably a good investment, at the very least to see if you have the talent and interest.
Author Jennifer Dawe is an animator and character designer who has been a professional pixel artist for the past 15 years. Author Matthew Humphries is Senior Editor at PCMag.com and a professional game designer.
All keyboards, or at least most of them, are “mechanical” in some sense, in that something moves. But what we call a “mechanical keyboard” is one that has individual switches under each key cap, instead of some sort of silly squishy membrane. This gives the keys a different tactile sense, and often, a sound.
This post — Mechanical Keyboards What Are They And Which One Do You Want — is a little, but not too much, out of date. The basic information is correct. There are one or two more kinds of keys than described, there are emerging manufacturers that may or may not be making good switches, and there are many more offerings of el-cheapo keyboards. And the DasKeyboard is still one of the better (and more expensive) options.
My Avant Stellar keyboard finally broke in enough places (I’m tough on keyboards) to require major repair or replacement. I looked briefly at really old Northgates (20-30 years old?) on eBay, bid on a few, but was outbid and decided not to spend an exorbitant amount of money on a decades-old untested machine. I also realized that I have two computers sitting next to each other, and when I switch between them, it feels wrong, because they have two entirely different kinds of keyboards. But, I realized, if I get a new DasKeyboard for my Linux machine, since I have a Mac DasKeyboard on the Mac, then I would quickly become accustomed to switching back and forth and all would be well. So, I got the Das Keyboard Model S Professional Cherry MX Blue for the Linux to match the Das Keyboard Model S Pro for Mac, and now everything is good.
Except possibly one thing. You may recall that I had earlier complained about the font used on the key caps on the DasKeyboards. At the time, I used stick on labels to upgrade the DasKeyboard to how I like it. For some reason, as I sit here typing on the new DasKeyboard with the small typeface that I don’t like much, I’m not bothered by it, so I may not make that change. We’ll see.
So now all is well in keyboard land, and my pile of no longer in use keyboards available for spare parts grows.
Boeing 737 Max:
-First commercial flight May 2017
-First fatal crash October 2018
-Second fatal crash March 2019
Both fatal crashes involved a nose-down response by the autopilot right after takeoff.
The same nose-down response by the autopilot was reported many times (dozens?) by pilots for months.
Boeing has a software fix that changes the way the autopilot handles these incorrect nose-down responses. The software fix (delayed by the Trump-McConnell shutdown) is still not ready.
This is the highest rate of crashes or fatalities of any single model of commercial airliner. Maybe. I haven’t checked, but it’s gotta be, or close, anyway.
Response by Hyper-skeptical skeptics: “Air travel is safe, so this is not actually a problem, don’t ground the flights.”
Response by the idiots who run the Trump administration: “Everything is fine, don’t worry about it.”
Response by every other country in the world, practically: “Out of an abundance of caution, ground the planes until this is figured out.”
Response by the air flight industry and regulators when the B787 Dreamliners had unexpected, unexplained problems: Ground all the planes and fix the problem.
Response when SW planes were discovered to have inadequate inspections: Ground the planes and fix the problem.
Response when BQ400s had landing gear problems a couple of times (2007): Ground the plane until it is addressed.
In other words, the NORMAL response to this sort of situation is to ground the model. This is THE REASON AIR TRAVEL IS SAFE. Air travel is not inherently and automatically safe. It is safe because when an aircraft model is considered POTENTIALLY FAULTY in some serious way it is grounded until that problem is addressed.
Finally, after everybody else grounded the plane, the Trump administration FOLLOWS.
I’m sure the hyper-skeptics are still saying it is irrational to ground the 737Max planes.
If you have a PDF file and need to extract a subset of pages, creating a new PDF file with those pages in it, you can do that.
I like PDF Labs’ PDFtk aka PDF toolkit. This is not OpenSource, and there are both a non-free pro version and a free version of it. I’ve tried the free version (example below) and was impressed. Next time I need to do a lot of PDF work I’ll probably fork out the 399 for the pro version. (That’s 399 pennies, quite cheap.) It is developed by Sid Steward, the author of PDF Hacks: 100 Industrial-Strength Tips & Tools.
So, for example, I can get pages 11-20 of a larger file called big.pdf extracted into a smaller file called extracted.pdf like this:
pdftk A=big.pdf cat A11-20 output extracted.pdf
That line of code makes almost no sense to me, but it works.
I learned about this tip at a Linux Journal Tech Tip page on extracting pages from a PDF, where you will find several other approaches.
I don’t know about you, but I’m getting an increasing number of robocalls. Most of the calls I get are robocalls. I have stopped answering my phone unless it is from my wife, daughter, or other relative or person whose phone is IDed.
I was speaking with someone last week who works in a business where phone calls are critically important, and the office has several phone lines. Each phone line gets a continuous stream of robocalls. There are times, frequently, when this business that relies on phone contact with clients turns off its entire phone system for an hour or two. That seems to temporarily reduce the number of robocalls, allowing for a brief period when customers can get through.
This madness must end!
I have a proposal to end robocalls.
The US Congress passes a law that eliminates robocalls entirely. You simply can’t ever do them.
The robocall lobby will object, and fight, and make it impossible for such a bill to be passed. So, I have an additional set of provisions to help get a bill like this through.
1) The ban on robocalls cannot be lifted in any way for five years. That should give time for all the equipment to get old and all the people in the business to drift off.
2) If an exception is allowed, say for emergency calling systems, it can only be allowed on a state by state basis and only for a maximum of six months, but extendable. This way, any lifting of the ban will require re-evaluation and thus, it is possible that it will be less abused than similar laws have been in the past.
In addition to banning all robocalling, it will likely be necessary to ban phone communication to or from countries that send out illegal robocalls.
I have two excellent things on my desk, a Linux Journal article by Andy Wills, and a newly published book by Stefano Allesina and Madlen Wilmes.
Computing Skills for Biologists: A Toolbox by Stefano Allesina and Madlen Wilmes, Princeton University Press.
Open Science, Open Source, and R, by Andy Wills, Linux Journal
OpenSource science means, among other things, using OpenSource software to do the science. For some aspects of software this is not important. It does not matter too much if a science lab uses Microsoft Word or if they use LibreOffice Writer.
However, it does matter if you use LibreOffice Calc as your spreadsheet, and as long as you are eschewing proprietary spreadsheets, you might as well use the OpenSource office package LibreOffice or equivalent, and then use the OpenSource presentation software, word processor, and spreadsheet.
OpenSource programs like Calc, R (a stats package), and OpenSource friendly software development tools like Python and the GPL C Compilers, etc. do matter. Why? Because your science involves calculating things, and software is a magic calculating box. You might be doing actual calculations, or production of graphics, or management of data, or whatever. All of the software that does this stuff is on the surface a black box, and just using it does not give you access to what is happening under the hood.
But, if you use OpenSource software, you have both direct and indirect access to the actual technologies that are key to your science project. You can see exactly how the numbers are calculated or the graphic created, if you want to. It might not be easy, but at least you don’t have to worry about the first hurdle in looking under the hood that happens with commercial software: they won’t let you do it.
Direct access to the inner workings of the software you use comes in the form of actually getting involved in the software development and maintenance. For most people, this is not something you are going to do in your scientific endeavor, but you could get involved with some help from a friend or colleague. For example, if you are at a University, there is a good chance that somewhere in your university system there is a computer department that has an involvement in OpenSource software development. See what they are up to, find out what they know about the software you are using. Who knows, maybe you can get a special feature included in your favorite graphics package by helping your newfound computer friends cop an internal University grant! You might be surprised as to what is out there, as well as what is in there.
In any event, it is intentionally easy to get involved in OpenSource software projects because they are designed that way. Or, usually are and always should be.
The indirect benefit comes from the simple fact that these projects are OpenSource. Let me give you an example from the non-scientific world. (It is a made-up example, but it could reflect reality and is highly instructive.)
Say there is an operating system or major piece of software competing in a field of other similar products. Say there is a widely used benchmark standard that compares the applications and ranks them. Some of the different products load up faster than others, and use less RAM. That leaves both time (for you) and RAM (for other applications) that you might value a great deal. All else being equal, pick the software that loads faster in less space, right?
Now imagine a group of trollish deviants meeting in a smoky back room of the evile corporation that makes one of these products. They have discovered that if they leave a dozen key features that all the competitors use out of the loading process, so they load later, they can get a better benchmark. Without those standard components running, the software will load fast and be relatively small. It happens to be the case, however, that once all the features are loaded, this particular product is the slowest of them all, and takes up the most RAM. Also, the process of holding back functionality until it is needed is annoying to the user and sometimes causes memory conflicts, causing crashes.
In one version of this scenario, the concept of selling more of the product by using this performance tilting trick is considered a good idea, and someone might even get a promotion for thinking of it. That would be something that could potentially happen in the world of proprietary software.
In a different version of this scenario the idea gets about as far as the water cooler before it is taken down by a heavy tape dispenser to the head and kicked to death. That would be what would certainly happen in the OpenSource world.
So, go OpenSource! And, read the article on this topic from Linux Journal, which by the way has been producing some great articles lately.
The Scientist’s Workflow and Software
You collect and manage data. You write code to process or analyze data. You use statistical tools to turn data into analytically meaningful numbers. You make graphs and charts. You write stuff and integrate the writing with the pretty pictures, and produce a final product.
The first thing you need to understand if you are developing or enhancing the computer side of your scientific endeavor is that you need the basic GNU tools and command line access that comes automatically if you use Linux. You can get the same stuff with a few extra steps if you use Windows. The Apple Mac system is in between, with the command line tools already built in, but not quite as in-your-face available.
You may need to have an understanding of Regular Expressions, and how to use them on the command line (using sed or awk, perhaps) and in programming, perhaps in python.
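As a tiny illustration of the kind of clean-up regular expressions buy you, here is a sketch using Python’s standard re module (the sample line and the field names are invented for this example):

```python
import re

# Collapse runs of whitespace and pull a numeric field out of a messy line,
# the kind of clean-up you might otherwise do with sed or awk.
line = "sample_07   \t  weight:  42.5   site: A"

normalized = re.sub(r"\s+", " ", line)          # squeeze whitespace runs
match = re.search(r"weight:\s*([\d.]+)", normalized)
weight = float(match.group(1)) if match else None

print(normalized)  # sample_07 weight: 42.5 site: A
print(weight)      # 42.5
```

The same patterns work almost unchanged in sed or awk on the command line, which is why learning regular expressions pays off twice.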
You will likely want to master the R environment because a) it is cool and powerful and b) a lot of your colleagues use R so you will want to have enough under your belt to share code and data now and then. You will likely want to master Python, which is becoming the default scientific programming language. It is probably true that anything you can do in R you can do in Python using the available tools, but it is also true that the most basic statistical stuff you might be doing is easier in R than Python since R is set up for it. The two systems are relatively easy to use and very powerful, so there is no reason to not have both in your toolbox. If you don’t choose the Python route, you may want to supplement R with GNU plotting tools.
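To make the R-versus-Python point concrete: what R hands you in a single t.test() call takes a few lines of standard-library Python if you do it by hand. A sketch of Welch’s two-sample t statistic, with made-up data:

```python
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Two-sample t statistic (Welch's version, unequal variances assumed)."""
    m1, m2 = mean(a), mean(b)
    v1, v2 = variance(a), variance(b)   # sample variances
    return (m1 - m2) / sqrt(v1 / len(a) + v2 / len(b))

control = [1.0, 2.0, 3.0, 4.0, 5.0]
treated = [2.0, 3.0, 4.0, 5.0, 6.0]
print(welch_t(control, treated))  # -1.0
```

In R the equivalent one-liner, t.test(control, treated), also gives you degrees of freedom and a p value for free, which is exactly the convenience being described here.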
You will need some sort of relational database setup in your lab, some kind of OpenSource SQL language based system.
You will have to decide on your own if you are into LaTeX. If you have no idea what I’m talking about, don’t worry, you don’t need to know. If you do know what I’m talking about, you probably have the need to typeset math inside your publications.
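For those who do know: the appeal is that a few characters of markup produce properly typeset math. A small made-up example, as it would appear in a LaTeX source file:

```latex
% A display equation; the formula itself is arbitrary, chosen for flavor.
\[
  \hat{\beta} = (X^{\top} X)^{-1} X^{\top} y
\]
```

In the final PDF that comes out as a cleanly typeset equation, which is the main reason people put up with the rest of LaTeX.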
Finally, and of utmost importance, you should be willing to spend the upfront effort making your scientific work flow into scripts. Say you have a machine (or a place on the internet or an email stream if you are working collaboratively) where some raw data spits out. These data need some preliminary messing around with to discard what you don’t want, convert numbers to a proper form, etc. etc. Then, this fixed-up data goes through a series of analyses, possibly several parallel streams of analysis, to produce a set of statistical outputs, tables, graphics, or a new highly transformed data set you send on to someone else.
If this is something you do on a regular basis, and it likely is because your lab or field project is set up to get certain data certain ways, then do certain things to it, then ideally you would set up a script, likely in bash but calling GNU tools like sed or awk, or running Python programs or R programs, and making various intermediate files and final products and stuff. You will want to bother with making the first run of these operations take three times longer to set up, so that all the subsequent runs take one one-hundredth of the time to carry out, or can be run unattended.
Nothing, of course, is so simple as I just suggested … you will be changing the scripts and Python programs (and LaTeX specs) frequently, perhaps. Or you might have one big giant complex operation that you only need to run once, but you KNOW it is going to screw up somehow … a value that is entered incorrectly or whatever … so the entire thing you need to do once is actually something you have to do 18 times. So make the whole process a script.
Aside from convenience and efficiency, a script does something else that is vitally important. It documents the process, both for you and others. This alone is probably more important than the convenience part of scripting your science, in many cases.
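A minimal sketch of the kind of clean-up step described above, in Python (the row layout, the field meanings, and the positive-measurement rule are all invented for illustration):

```python
from statistics import mean

def clean_rows(raw_rows):
    """Drop malformed rows and convert numeric fields, keeping the rejects."""
    good, rejected = [], []
    for row in raw_rows:
        try:
            # Hypothetical layout: a sample id, then a measurement in grams.
            sample_id, grams = row[0], float(row[1])
        except (IndexError, ValueError):
            rejected.append(row)      # non-numeric or short row
            continue
        if grams <= 0:                # invented validity rule
            rejected.append(row)
            continue
        good.append((sample_id, grams))
    return good, rejected

raw = [["s1", "10.5"], ["s2", "oops"], ["s3", "-3"], ["s4", "12.1"]]
good, bad = clean_rows(raw)
print(len(good), len(bad))           # 2 2
print(mean(g for _, g in good))      # 11.3
```

The point is not this particular rule but that the whole decision (what counts as bad data, what got thrown away) is written down in one re-runnable place.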
Being small in a world of largeness
Here is a piece of advice you won’t get from anyone else. As you develop your computer working environment, the set of software tools and stuff that you use to run R or Python and all that, you will run into opportunities to install some pretty fancy and sophisticated development systems that have many cool bells and whistles, but that are really designed for team development of large software projects, and continual maintenance over time of versions of that software as it evolves as a distributed project.
Don’t do that unless you need to. Scientific computing is often not that complex or team oriented. Sure, you are working with a team, but probably not a team of a dozen people working on the same set of Python programs. Chances are, much of the code you write is going to be tweaked to be what you need it to be then never change. There are no marketing gurus coming along and asking you to make a different menu system to attract millennials. You are not competing with other products in a market of any sort. You will change your software when your machine breaks and you get a new one, and the new one produces output in a more convenient style than the old one. Or whatever.
In other words, if you are running an enterprise level operation, look into systems like Anaconda. If you are a handful of scientists making and controlling your own workflow, stick with the simple scripts and avoid the snake. The setup and maintenance of an enterprise level system for using R and Python is probably more work before you get your first t-test or histogram than it is worth. This is especially true if you are more or less working on your own.
Another piece of advice. Some software decisions are based on deeply rooted cultural norms or fetishes that make no sense. I’m an emacs user. This is the most annoying, but also most powerful, of all text editors. Here is an example of what is annoying about emacs. In the late 70s, computer keyboards had a “meta” key (it was actually called that), which is now the alt key. Emacs made use of the meta key. No person has seen or used a meta key since about 1979, but emacs refuses to change its documentation to use the word “alt” for this key. Rather, the documentation says something like “here, use the meta key, which on some keyboards is the alt key.” That is a cultural fetish.
Using LaTeX might be a fetish as well. Obviously. It is possible that for some people, using R is a fetish and they should rethink and switch to using Python for what they are doing. The most dangerous fetish, of course, is using proprietary scientific software because you think only if you pay hundreds of dollars a year to use SPSS or BMD for stats, as opposed to zero dollars a year for R, will your numbers be acceptable. In fact, the reverse is true. Only with an OpenSource stats package can you really be sure how the stats or other values are calculated.
And my final piece of advice is to get and use this book: Computing Skills for Biologists: A Toolbox by Allesina and Wilmes.
This book focuses on Python and not R, and covers LaTeX which, frankly, will not be useful for many. This also means that the regular expression work in the book is not as useful for all applications, as might be the case with a volume like Mastering Regular Expressions. But overall, this volume does a great job of mapping out the landscape of scripting-oriented scientific computing, using excellent examples from biology.
Computing Skills for Biologists can and should be used as a textbook for an advanced high school level course to prep young and upcoming investigators for when they go off and apprentice in labs at the start of their career. It can be used as a textbook in a short seminar in any advanced program to get everyone in a lab on the same page. I suppose it would be great if Princeton came out with a version for math and physical sciences, or geosciences, but really, this volume can be generalized beyond biology.
Stefano Allesina is a professor in the Department of Ecology and Evolution at the University of Chicago and a deputy editor of PLoS Computational Biology. Madlen Wilmes is a data scientist and web developer.
Go ahead and choose “chromium” as the default browser in the “settings” application, and hit apply. That setting will likely stick, but Chrome will not be the default browser anyway. A bug in KDE Plasma prevents this, but you can drill down deeper into the configuration information and make it work: Continue reading Chrome as default browser in KDE Plasma: Getting it to stick
There are a lot of books out there to help you learn command line tools, and of course, they mostly cover the same things because there is a fixed number of things you need to learn to get started down this interesting and powerful path.
Small, Sharp, Software Tools: Harness the Combinatoric Power of Command-Line Tools and Utilities by Brian P. Hogan is the latest iteration (not quite in press yet but any second now) of one such book.
I really like Hogan’s book. Here’s what you need to know about it.
First, and this will only matter to some but is important, the book does cover using CLI tools across platforms (Linux, Mac, Windows) in the sense that it helps get you set up to use the bash command line system on all three.
Second, this book does a much better job than most other books I’ve seen as a tutorial, rather than just as a reference manual. You can work from start to finish, with zero knowledge at the start, follow the examples (using the provided files that you are guided to download using command line tools!) and become proficient very comfortably and reasonably quickly. The topics are organized in such a way that you can probably skip chapters that interest you less (but don’t skip the first few).
Third, the book does give interesting esoteric details here and there, but the author seems not compelled to obsessively fill your brain with entirely useless knowledge such as how many arguments the POSIX standard hypothetically allows on a command line (is it 512 or 640? No one seems to remember) as some other books do.
I found Small, Sharp, Software Tools a very comfortable, straightforward, well organized, accurate read from Pragmatic.
If you write shell scripts, you should check out Dave Taylor’s latest article in Linux Journal.
He gives key examples of what can go wrong if you don’t pay attention to certain things.
For example, if you have a dot in (especially at the start of) your PATH variable, you risk running a Trojan horse that snuck sneakily into your /tmp directory. If you want the dot, put it last.
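The risk is easy to demonstrate. Here is a Python sketch (Unix-style paths assumed; the throwaway directory stands in for “.” and the fake command is made up on the fly) showing that a search path with an untrusted directory first resolves ls to the impostor:

```python
import os
import shutil
import stat
import tempfile

# Make a directory standing in for "." and drop a fake `ls` into it.
trap_dir = tempfile.mkdtemp()
fake_ls = os.path.join(trap_dir, "ls")
with open(fake_ls, "w") as f:
    f.write("#!/bin/sh\necho 'I am not really ls'\n")
os.chmod(fake_ls, os.stat(fake_ls).st_mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)

# Untrusted directory first in the search path: the impostor wins.
dot_first = shutil.which("ls", path=trap_dir + os.pathsep + "/bin:/usr/bin")

# Untrusted directory last: the real ls is found first.
dot_last = shutil.which("ls", path="/bin:/usr/bin" + os.pathsep + trap_dir)

print(dot_first)  # the fake ls inside trap_dir
print(dot_last)   # the system ls
```

Substitute /tmp, or any directory an attacker can write to, for the temp directory and you have the scenario the article warns about.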
Anyway, a simple straightforward article with a few pieces of good advice: Writing Secure Shell Scripts
Complex numbers, working with oscillations (trigonometry), using Turtles to draw, some basic algebra, my favorite, Cellular Automata, and more, are covered in Math Adventures with Python: An Illustrated Guide to Exploring Math with Code by Peter Farrell. Farrell is a math and computer science teacher who is interested in math education and using technology in learning. Continue reading Math Adventures with Python
Julien Danjou’s Serious Python: Black-Belt Advice on Deployment, Scalability, Testing, and More is serious.
This book takes Python programming well beyond casual programming, and beyond the use of Python as a glorified scripting language to access statistical or graphics tools, etc. This is level one or even level two material. If you are writing software to distribute to others, handling time zones, want to optimize code, or experiment with different programming paradigms (i.e. functional programming, generating code, etc.) then you will find Serious Python informative and interesting. Multi-threading, optimization, scaling, methods and decorators, and integration with relational databases are also covered. (A decorator is a function that “decorates,” or changes or expands, a function without modifying it.) The material is carefully and richly explored, and the writing is clear and concise. Continue reading Serious Python Programming
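As a quick taste of the decorator idea (my own toy example, not one from the book), here is a decorator that counts calls to a function without touching the function’s own code:

```python
import functools

def counted(func):
    """Decorator: adds a .calls attribute that counts invocations."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        wrapper.calls += 1
        return func(*args, **kwargs)
    wrapper.calls = 0
    return wrapper

@counted
def square(x):
    return x * x

square(3)
square(4)
print(square.calls)  # 2
```

The @counted line is just shorthand for square = counted(square); the original function is wrapped, not modified.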
Minecraft is probably the most creative video game out there, not in the sense that its creators are creative, but rather, that it is all about creating things, and this is done by constructing novelty out of a relatively simple set of primitives. But to do so, the player needs to know about the building blocks of Minecraft, such as Lava, Fencing, Redstone, Levers, various chests and chest-related things, and so on.
Yes, you (or your child) can learn as you go playing the game, watch a few YouTube videos, etc. But if we want to fully enjoy and integrate the Minecraft experience, and help that child (or you?) get in some more reading time, there must be books. For example, the Minecraft: Blockopedia by Alex Wiltshire. Continue reading Minecraft Blockopedia
And, welcome to Uncanny Valley: