Category Archives: Technology

No, it is NOT rational to keep flying the Boeing 737 Max


Boeing 737 Max:

-First commercial flight May 2017

-First fatal crash October 2018

-Second fatal crash March 2019

Both fatal crashes involved an automated nose-down response by the plane’s flight control software right after takeoff.

The same automated nose-down behavior had been reported by pilots many times (dozens?) over the preceding months.

Boeing has a software fix that changes the way the system handles these spurious nose-down responses. The fix (delayed by the Trump-McConnell shutdown) is still not ready.

This may be the highest rate of crashes or fatalities for any single model of commercial airliner. I haven’t checked, but it has to be at or near the top.

Response by the hyper-skeptics: “Air travel is safe, so this is not actually a problem, don’t ground the flights.”

Response by the idiots who run the Trump administration: “Everything is fine, don’t worry about it.”

Response by practically every other country in the world: “Out of an abundance of caution, ground the planes until this is figured out.”

Response by the airline industry and regulators when the Boeing 787 Dreamliner had unexpected, unexplained problems: Ground all the planes and fix the problem.

Response when Southwest planes were discovered to have had inadequate inspections: Ground the planes and fix the problem.

Response when the Bombardier Q400 had landing gear problems a couple of times (2007): Ground the planes until the problem is addressed.

In other words, the NORMAL response to this sort of situation is to ground the model. This is THE REASON AIR TRAVEL IS SAFE. Air travel is not inherently and automatically safe. It is safe because when an aircraft model is considered POTENTIALLY FAULTY in some serious way, it is grounded until that problem is addressed.

Finally, after everybody else grounded the plane, the Trump administration FOLLOWED.

I’m sure the hyper-skeptics are still saying it is irrational to ground the 737 Max planes.



How to extract pages from a PDF file


If you have a PDF file and need to extract a subset of pages, creating a new PDF file with those pages in it, you can do that.

I like PDF Labs’ PDFtk, aka the PDF toolkit. It is not OpenSource, and there are both a non-free Pro version and a free version. I’ve tried the free version (example below) and was impressed. Next time I need to do a lot of PDF work I’ll probably fork out the 399 for the Pro version. (That’s 399 pennies, quite cheap.) It is developed by Sid Steward, the author of PDF Hacks: 100 Industrial-Strength Tips & Tools.

So, for example, I can get pages 11-20 of a larger file called big.pdf extracted into a smaller file called extracted.pdf like this:

pdftk A=big.pdf cat A11-20 output extracted.pdf

That line of code makes almost no sense to me, but it works.
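For what it’s worth, the pieces do make sense once decoded: A=big.pdf assigns the input file to the handle A, cat A11-20 pulls pages 11 through 20 from that handle, and output names the new file. The same handle syntax lets you combine pages from more than one file. The file names below are made up, but the form of the command is standard pdftk usage:

# pages 1-5 of one.pdf followed by all of two.pdf (file names are just examples)
pdftk A=one.pdf B=two.pdf cat A1-5 B output combined.pdf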

I learned about this tip at a Linux Journal Tech Tip page on extracting pages from a PDF, where you will find several other approaches.



End Robocalls


I don’t know about you, but I’m getting an increasing number of robocalls. Most of the calls I get are robocalls. I have stopped answering my phone unless it is from my wife, my daughter, or some other relative or person whose number shows up on caller ID.

I was speaking with someone last week who works in a business where phone calls are critically important, and the office has several phone lines. Each line gets a continuous stream of robocalls. Frequently, this business that relies on phone contact with clients turns off its entire phone system for an hour or two. That seems to temporarily reduce the number of robocalls, allowing a brief period when customers can get through.

This madness must end!

I have a proposal to end robocalls.

The US Congress passes a law that eliminates robocalls entirely. You simply can’t ever do them.

The robocall lobby will object, and fight, and make it impossible for such a bill to be passed. So, I have an additional set of provisions to help get a bill like this through.

1) The ban on robocalls cannot be lifted in any way for five years. That should give time for all the equipment to get old and all the people in the business to drift off.

2) If an exception is allowed, say for emergency calling systems, it can only be granted on a state-by-state basis and only for a maximum of six months at a time, though extendable. This way, any lifting of the ban requires re-evaluation, so it may be less abused than similar laws have been in the past.

In addition to banning all robocalling, it will likely be necessary to block phone traffic to or from countries that originate illegal robocalls.



How to do science with a computer: workflow tools and OpenSource philosophy


I have two excellent things on my desk, a Linux Journal article by Andy Wills, and a newly published book by Stefano Allesina and Madlen Wilmes.

They are:

Computing Skills for Biologists: A Toolbox by Stefano Allesina and Madlen Wilmes, Princeton University Press.

Open Science, Open Source, and R, by Andy Wills, Linux Journal

Why OpenSource?

OpenSource science means, among other things, using OpenSource software to do the science. For some kinds of software this is not important. It does not matter too much if a science lab uses Microsoft Word or LibreOffice Writer.

However, it does matter whether you use LibreOffice Calc as your spreadsheet, and as long as you are eschewing proprietary spreadsheets, you might as well use the whole OpenSource office package (LibreOffice or equivalent) and get the OpenSource presentation software and word processor along with the spreadsheet.

OpenSource programs like Calc and R (a stats package), and OpenSource-friendly development tools like Python and the GNU compilers, do matter. Why? Because your science involves calculating things, and software is a magic calculating box. You might be doing actual calculations, or producing graphics, or managing data, or whatever. All of the software that does this stuff is, on the surface, a black box, and just using it does not give you access to what is happening under the hood.

But if you use OpenSource software, you have both direct and indirect access to the actual technologies that are key to your science project. You can see exactly how the numbers are calculated or the graphic created, if you want to. It might not be easy, but at least you don’t have to face the first hurdle in looking under the hood that comes with commercial software: they won’t let you do it.

Direct access to the inner workings of the software you use comes in the form of actually getting involved in the software development and maintenance. For most people, this is not something you are going to do in your scientific endeavor, but you could get involved with some help from a friend or colleague. For example, if you are at a university, there is a good chance that somewhere in your university system there is a computer department that has an involvement in OpenSource software development. See what they are up to, and find out what they know about the software you are using. Who knows, maybe you can get a special feature included in your favorite graphics package by helping your newfound computer friends cop an internal university grant! You might be surprised as to what is out there, as well as what is in there.

In any event, it is relatively easy to get involved in OpenSource software projects, because they are designed that way. Or usually are, and always should be.

The indirect benefit comes from the simple fact that these projects are OpenSource. Let me give you an example from the non-scientific world. (It is a made-up example, but it could reflect reality and is highly instructive.)

Say there is an operating system or major piece of software competing in a field of other similar products. Say there is a widely used benchmark standard that compares the applications and ranks them. Some of the different products load up faster than others, and use less RAM. That leaves both time (for you) and RAM (for other applications) that you might value a great deal. All else being equal, pick the software that loads faster in less space, right?

Now imagine a group of trollish deviants meeting in a smoky back room of the evil corporation that makes one of these products. They have discovered that if they leave out of the loading process a dozen key features that all the competitors load up front, deferring them until later, they can get a better benchmark score. Without those standard components running, the software loads fast and looks relatively small. It happens to be the case, however, that once all the features are loaded, this particular product is the slowest of them all and takes up the most RAM. Also, holding back functionality until it is needed is annoying to the user and sometimes causes memory conflicts, which cause crashes.

In one version of this scenario, the concept of selling more of the product by using this performance tilting trick is considered a good idea, and someone might even get a promotion for thinking of it. That would be something that could potentially happen in the world of proprietary software.

In a different version of this scenario the idea gets about as far as the water cooler before it is taken down by a heavy tape dispenser to the head and kicked to death. That would be what would certainly happen in the OpenSource world.

So, go OpenSource! And read the article on this topic from Linux Journal, which, by the way, has been producing some great articles lately.

The Scientist’s Workflow and Software

You collect and manage data. You write code to process or analyze data. You use statistical tools to turn data into analytically meaningful numbers. You make graphs and charts. You write stuff and integrate the writing with the pretty pictures, and produce a final product.

The first thing you need to understand, if you are developing or enhancing the computer side of your scientific endeavor, is that you need the basic GNU tools and the command line access that come automatically if you use Linux. You can get the same stuff with a few extra steps if you use Windows. The Apple Mac is in between, with the command line tools already built in but not quite as in-your-face available.
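On Windows, for example, one common route these days (this is an assumption about your setup; Git Bash, Cygwin, and others also work) is to enable the Windows Subsystem for Linux and then make sure the GNU basics are installed inside it:

# in an administrator PowerShell: install WSL with a default Linux distribution
wsl --install

# then, inside the resulting Linux environment
sudo apt update && sudo apt install coreutils gawk sed grep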

You may need an understanding of regular expressions, and of how to use them on the command line (using sed or awk, perhaps) and in programming, perhaps in Python.
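As a minimal illustration (the file names and column choices here are invented), this is the sort of thing regular expressions do for you on the command line:

# keep only lines that start with an ISO-style date, then print the third comma-separated field
grep -E '^[0-9]{4}-[0-9]{2}-[0-9]{2}' samples.csv | awk -F, '{print $3}'

# collapse runs of spaces and tabs into single tabs
sed -E 's/[[:space:]]+/\t/g' raw_notes.txt > tidy_notes.tsv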

You will likely want to master the R environment because a) it is cool and powerful and b) a lot of your colleagues use R, so you will want to have enough under your belt to share code and data now and then. You will likely want to master Python, which is becoming the default scientific programming language. It is probably true that anything you can do in R you can do in Python using the available tools, but it is also true that the most basic statistical stuff you might be doing is easier in R than in Python, since R is set up for it. The two systems are relatively easy to use and very powerful, so there is no reason not to have both in your toolbox. If you don’t choose the Python route, you may want to supplement R with the GNU plotting tools.
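For a sense of why the basic statistics are so easy in R, here is a one-liner you can run straight from the shell (the CSV file and its column names are hypothetical):

# two-sample t-test on a file with columns named "group" and "value"
Rscript -e 'd <- read.csv("measurements.csv"); print(t.test(value ~ group, data = d))'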

You will need some sort of relational database setup in your lab, some kind of OpenSource, SQL-language-based system.
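SQLite is one perfectly serviceable OpenSource, SQL-based option for a small lab, and it needs no server at all. The database, table, and columns here are invented for the sake of the example:

# create a table, then query it; lab.db is created on first use
sqlite3 lab.db "CREATE TABLE samples (id INTEGER PRIMARY KEY, site TEXT, mass_g REAL);"
sqlite3 lab.db "SELECT site, AVG(mass_g) FROM samples GROUP BY site;"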

You will have to decide on your own if you are into LaTeX. If you have no idea what I’m talking about, don’t worry, you don’t need to know. If you do know what I’m talking about, you probably need to typeset math inside your publications.
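For those in the second group, the appeal is that a complete, compilable document with properly typeset math is only a few lines long. A minimal example (the equation is just an illustration):

\documentclass{article}
\begin{document}
The sample mean is $\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$.
\end{document}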

Finally, and of utmost importance, you should be willing to spend the upfront effort to turn your scientific workflow into scripts. Say you have a machine (or a place on the internet, or an email stream if you are working collaboratively) where some raw data spits out. These data need some preliminary messing around with to discard what you don’t want, convert numbers to a proper form, and so on. Then this fixed-up data goes through a series of analyses, possibly several parallel streams of analysis, to produce a set of statistical outputs, tables, graphics, or a new, highly transformed data set you send on to someone else.

If this is something you do on a regular basis, and it likely is because your lab or field project is set up to get certain data in certain ways and then do certain things with it, then ideally you would set up a script, likely in bash, calling GNU tools like sed or awk, or running Python or R programs, and producing various intermediate files and final products. You will want to bother making the first run of these operations take three times longer to set up, so that all the subsequent runs take one one-hundredth of the time to carry out, or can be run unattended.
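Here is a minimal sketch of what such a driver script might look like. Every file name, program name, and processing step below is invented for illustration; the point is the shape of the thing:

#!/bin/bash
# clean the raw download, run the analyses, build a figure
set -euo pipefail

RAW=raw_download.txt
CLEAN=clean.csv

# drop comment lines and blank lines, squeeze whitespace down to single commas
grep -v -e '^#' -e '^[[:space:]]*$' "$RAW" | sed -E 's/[[:space:]]+/,/g' > "$CLEAN"

# hand the cleaned data to R and Python scripts of your own making
Rscript summary_stats.R "$CLEAN" > stats_table.txt
python3 make_figure.py "$CLEAN" figure1.png

echo "Done: stats_table.txt and figure1.png are ready."

Run it once to make sure it behaves, and every subsequent run is a single command, or a cron job, or something a labmate can run without asking you how.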

Nothing, of course, is so simple as I just suggested … you will be changing the scripts and Python programs (and LaTeX specs) frequently, perhaps. Or you might have one big giant complex operation that you only need to run once, but you KNOW it is going to screw up somehow … a value that is entered incorrectly or whatever … so the entire thing you need to do once is actually something you have to do 18 times. So make the whole process a script.

Aside from convenience and efficiency, a script does something else that is vitally important: it documents the process, both for you and for others. This alone is probably more important than the convenience of scripting your science, in many cases.

Being small in a world of largeness

Here is a piece of advice you won’t get from anyone else. As you develop your computer working environment, the set of software tools and stuff that you use to run R or Python and all that, you will run into opportunities to install some pretty fancy and sophisticated development systems that have many cool bells and whistles, but that are really designed for team development of large software projects and for continual maintenance, over time, of versions of that software as it evolves as a distributed project.

Don’t do that unless you need to. Scientific computing is often not that complex or team-oriented. Sure, you are working with a team, but probably not a team of a dozen people working on the same set of Python programs. Chances are, much of the code you write is going to be tweaked until it is what you need it to be, and then never change. There are no marketing gurus coming along and asking you to make a different menu system to attract millennials. You are not competing with other products in a market of any sort. You will change your software when your machine breaks and you get a new one, and the new one produces output in a more convenient style than the old one. Or whatever.

In other words, if you are running an enterprise level operation, look into systems like Anaconda. If you are a handful of scientists making and controlling your own workflow, stick with the simple scripts and avoid the snake. The setup and maintenance of an enterprise level system for using R and Python is probably more work before you get your first t-test or histogram than it is worth. This is especially true if you are more or less working on your own.

Culture

Another piece of advice. Some software decisions are based on deeply rooted cultural norms, or fetishes, that make no sense. I’m an emacs user. This is the most annoying, but also the most powerful, of all text editors. Here is an example of what is annoying about emacs. In the late 70s, computer keyboards had a “meta” key (it was actually called that), which is now the alt key. Emacs made use of the meta key. No person has seen or used a meta key since about 1979, but emacs refuses to change its documentation to use the word “alt” for this key. Rather, the documentation says something like “here, use the meta key, which on some keyboards is the alt key.” That is a cultural fetish.

Using LaTeX might be a fetish as well. Obviously. It is possible that for some people, using R is a fetish and they should rethink and switch to using Python for what they are doing. The most dangerous fetish, of course, is using proprietary scientific software because you think that only if you pay hundreds of dollars a year to use SPSS or BMDP for stats, as opposed to zero dollars a year for R, will your numbers be acceptable. In fact, the reverse is true. Only with an OpenSource stats package can you really be sure how the stats or other values are calculated.

And finally…

And my final piece of advice is to get and use this book: Computing Skills for Biologists: A Toolbox by Allesina and Wilmes.

This book focuses on Python rather than R, and covers LaTeX, which, frankly, will not be useful for many. It also means the regular expression coverage is not as broadly applicable as what you would get from a dedicated volume like Mastering Regular Expressions. But overall, this volume does a great job of mapping out the landscape of scripting-oriented scientific computing, using excellent examples from biology.

Computing Skills for Biologists can and should be used as a textbook for an advanced high-school-level course to prep young and upcoming investigators for when they go off and apprentice in labs at the start of their careers. It can be used as a textbook in a short seminar in any advanced program to get everyone in a lab on the same page. I suppose it would be great if Princeton came out with a version for the mathematical and physical sciences, or the geosciences, but really, this volume can be generalized beyond biology.

Stefano Allesina is a professor in the Department of Ecology and Evolution at the University of Chicago and a deputy editor of PLoS Computational Biology. Madlen Wilmes is a data scientist and web developer.



Chrome as default browser in KDE Plasma: Getting it to stick


Go ahead and choose “chromium” as the default browser in the Settings application, and hit Apply. The setting will appear to stick, but Chrome will not actually become the default browser. A bug in KDE Plasma prevents this, but you can drill down deeper into the configuration and make it work.
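If you would rather not dig through configuration files by hand, one common workaround (not necessarily the fix described in the full post) is to set the default with the xdg-utils tools from a terminal; the exact .desktop name varies by distribution and by whether you run Chrome or Chromium:

# tell the desktop environment which browser to use, then confirm the setting
xdg-settings set default-web-browser google-chrome.desktop
xdg-settings get default-web-browser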



A Guide To Using Command Line Tools


There are a lot of books out there to help you learn command line tools, and of course, they mostly cover the same things because there is a fixed number of things you need to learn to get started down this interesting and powerful path.

Small, Sharp, Software Tools: Harness the Combinatoric Power of Command-Line Tools and Utilities by Brian P. Hogan is the latest iteration (not quite in press yet but any second now) of one such book.

I really like Hogan’s book. Here’s what you need to know about it.

First, and this will only matter to some but is important, the book does cover using CLI tools across platforms (Linux, Mac, Windows) in the sense that it helps get you set up to use the bash command line system on all three.

Second, this book does a much better job as a tutorial, rather than just a reference manual, than most other books I’ve seen. You can work from start to finish, with zero knowledge at the start, follow the examples (using the provided files that you are guided to download using command line tools!), and become proficient very comfortably and reasonably quickly. The topics are organized in such a way that you can probably skip chapters that interest you less (but don’t skip the first few).

Third, the book does give interesting esoteric details here and there, but the author seems not compelled to obsessively fill your brain with entirely useless knowledge such as how many arguments the POSIX standard hypothetically allows on a command line (is it 512 or 640? No one seems to remember) as some other books do.

I found Small, Sharp, Software Tools a very comfortable, straightforward, well organized, accurate read from Pragmatic.



Writing Secure Shell Scripts


If you write shell scripts, you should check out Dave Taylor’s latest article in Linux Journal.

He gives key examples of what can go wrong if you don’t pay attention to certain things.

For example, if you have a dot in your PATH variable (especially at the start of it), you risk running a Trojan horse that someone snuck into, say, your /tmp directory. If you want the dot, put it last.
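To make the point concrete, the difference is just where (or whether) the dot appears; the directory list here is only an example:

# risky: the current directory is searched first, so a malicious ./ls left in /tmp wins
export PATH=.:/usr/local/bin:/usr/bin:/bin

# safer: leave the dot out entirely, or put it last so real system commands are found first
export PATH=/usr/local/bin:/usr/bin:/bin:.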

Anyway, it is a simple, straightforward article with a few pieces of good advice: Writing Secure Shell Scripts



Math Adventures with Python


Complex numbers, working with oscillations (trigonometry), using turtles to draw, some basic algebra, cellular automata (my favorite), and more are covered in Math Adventures with Python: An Illustrated Guide to Exploring Math with Code by Peter Farrell. Farrell is a math and computer science teacher who is interested in math education and in using technology in learning.



Serious Python Programming


Julien Danjou’s Serious Python: Black-Belt Advice on Deployment, Scalability, Testing, and More is serious.

This book takes Python programming well beyond casual programming, and beyond the use of Python as a glorified scripting language to access statistical or graphics tools, etc. This is not entry-level material. If you are writing software to distribute to others, handling time zones, trying to optimize code, or experimenting with different programming paradigms (functional programming, generating code, etc.), then you will find Serious Python informative and interesting. Multi-threading, optimization, scaling, methods and decorators, and integration with relational databases are also covered. (A decorator is a function that “decorates,” or changes or expands, a function without modifying it.) The material is carefully and richly explored, and the writing is clear and concise.



Minecraft Blockopedia


Minecraft is probably the most creative video game out there, not in the sense that its creators are creative, but rather that it is all about creating things, and this is done by constructing novelty out of a relatively simple set of primitives. But to do so, the player needs to know about the building blocks of Minecraft, such as Lava, Fencing, Redstone, Levers, various chests and chest-related things, and so on.

The Blockopedia in use.
Yes, you (or your child) can learn as you go playing the game, watch a few YouTube videos, etc. But if we want to fully enjoy and integrate the Minecraft experience, and help that child (or you?) get in some more reading time, there must be books. For example, the Minecraft: Blockopedia by Alex Wiltshire.



Instead of Evernote, Try Raindrop


I’m not going to try to talk you out of Evernote. If you use the venerable application productively, good for you. I used it for a long time and it was fine. But, recent changes in the application caused me to look elsewhere for the satisfaction I was seeking. And I found it. I found Raindrop.io.



Making Raspberry Pi Robots


At the core of this post is a review of a new book, Learn Robotics with Raspberry Pi: Build and Code Your Own Moving, Sensing, Thinking Robots. I recommend it as a great above-basic level introduction to building a standard robot, learning a bit about the Linux operating system, learning to program in Python, and learning some basic electronics. However, I want to frame this review in a bit more context which I think will chase some readers away from this book while at the same time making others drool. But don’t drool on the electronics.



Clean Energy: Good News Bad News


First some good news:

Corporate clean energy buying surged to new record in 2018

Corporations purchased 13.4 gigawatts of clean power through long-term contracts, more than doubling 2017’s total, helped by demand from new industries and previously untrodden markets

Scenery conflict (I’ll just add that solar panels replacing some nice vistas is better than post-apocalyptic landscapes replacing some nice vistas): Rhode Island town grapples with how to promote solar and protect rural views

Similarly, Massive Wisconsin solar proposal splits farmers and clean energy fans

And … Oregon adopts strict rules for solar panel farms on high-value farm soil

And for those who want to pay more but perhaps have something cool: RGS Energy Revives Dow’s Solar Roof, Claiming Better Efficiency and Lower Costs

