
Installing Ubuntu 16.04 LTS

This is one of four related posts:

Should You Install Ubuntu Linux?
Installing Ubuntu 16.04 LTS
How to use Ubuntu Unity
Things To Do After Installing Ubuntu 16.04 LTS

Some Linux/Ubuntu related books:
Ubuntu Unleashed 2016 Edition: Covering 15.10 and 16.04 (11th Edition)
Ubuntu 16.04 LTS Desktop: Applications and Administration
The Linux Command Line: A Complete Introduction

Linux isn't for everyone, so I'm not going to try to talk you into using this superior operating system if you have some reason not to. But if you have a computer that runs Windows, it isn't that hard to install Ubuntu. The main advantages of doing so are 1) you get to have a Linux computer and 2) you get to not have a Windows computer.

Here, I have some advice on installing Ubuntu (this is general advice and applies across many versions).

How to install Ubuntu

If you are going to try Linux, I recommend installing Ubuntu’s latest version, which is Ubuntu 16.04 LTS Xenial Xerus.

A Linux distro (the specific version of Linux you install) includes a specific "desktop," which is your user interface and a bunch of tools and stuff. The default Ubuntu desktop is called Unity. If you've never used Linux before, you'll find the Unity desktop to be very good, especially if you tweak it a bit. If you have used Linux before, you may prefer a different style of desktop. Personally, I prefer the older "Gnome 2.0" style desktop. The differences are cosmetic, but I happen to like the cosmeticology of the Gnome style better.


Check out:

UBUNTU AND LINUX BOOKS: http://gregladen.com/blog/2017/03/ubuntu-linux-books/

BOOKS ON COMPUTER PROGRAMMING AND COMPUTERS: http://gregladen.com/blog/2017/03/books-computer-programming-computers/

___________________

What I liked most about the older style desktop is the presence of a menu with submenus that organized all the applications (software, apps) installed on the system. I also prefer the Synaptic package manager for installing new software over the Ubuntu "Software Center." But there is a menu that can be installed in Unity that serves this purpose, and it is easy to install Synaptic as well. So, even as an old-time Gnome 2.0 guy, I have decided to go with Unity.
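
If you decide you want Synaptic, it is a single command away once the system is up; a minimal sketch from a terminal (synaptic is the standard package name in the Ubuntu repositories):

sudo apt-get install synaptic    # installs the Synaptic package manager

After that it shows up in the Dash like any other application.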

There are many forms of Linux out there, and one of the best maintained and most polished is called Debian. Ubuntu bases its distribution on Debian, but modifies it in useful ways. The most current version of Ubuntu is therefore a version of Linux that is up to date yet stable, and the best supported. This situation has developed to the extent that people now often use, incorrectly but harmlessly, the term "Ubuntu" to mean "Linux," with the assumption that the Unity desktop is the primary desktop for Linux.

So, how do you install Linux, in the form of Ubuntu, on your computer?

Should you install Linux alongside Windows (dual boot)?

If you just want to install Linux on a computer, where Linux will be the only operating system, skip this section.

The first thing you need to decide is whether you want a dual boot system or not. Say you have Windows installed on your computer. If you make this a dual boot computer, you install Linux alongside Windows. Then, when you fire up your computer, you choose which operating system you want to use.

This may sound like a good idea, but I strongly recommend against it. It adds a significant layer of complexity to the process of installing the system. Also, things can go wrong. A normal single-boot installation of Linux will usually give you no problems, and it will be more stable as an operating system than any other operating system out there. But things can go wrong with dual booting which could drive you crazy and, depending on your hardware and a few other things, may cause you to unexpectedly lose the ability to use your computer.

Dual booting and partitioning are related operations, because in order to dual boot you will have to mess around with partitioning. How you do this will depend on whether or not Windows is already installed on your computer.

There are people who will tell you differently, that dual booting is harmless and fun and good. Those individuals are unique, special individuals with the ability to solve complex problems on their computers. They may have good reasons to have dual boot systems. In fact, many of them may have several different operating systems installed on one computer. This is because, as a hobby or for professional reasons, they need to have a lot of different operating systems. Good for them.

I recommend that if you are not sure whether you want to use Linux, don't install it alongside Windows. Instead, find an extra computer (or buy a cheap used one somewhere), install Linux on it, and if you find yourself liking Linux more than you like Windows, go ahead and install Linux on your main, higher-end computer and be done with it.

Using two partitions is a good idea for some

As with dual booting, I recommend that the first time Linux user skip this idea entirely, but here are some thoughts on it in case you are interested.

One of the great things about Linux is that it uses the concept of a home directory. The home directory is a directory associated with a particular user, one for each user of the system. In most cases, a desktop or laptop computer has just one user, you. But you still get the home directory. (Apple’s OSX uses this system as well.)

This means that your data, configuration files for software, and all that stuff, ends up in one single directory. So, in theory, if you decide to install a whole new version of Linux, all you have to do is copy all of the contents of your home directory somewhere, install an entirely new system, then copy all that stuff into the new home directory and it is like you never left.

This also means that you only have to back up your home directory. Installing software on Linux is so easy that you really don’t have to back any of that up. By backing up your home directory, you are also backing up your settings and preferences for most of that software, so if you reinstall it, the software will figure out how to behave properly.
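
For example, a simple way to back up (or migrate) a home directory to an external drive is rsync; this is just a sketch, and the paths are placeholders you would swap for your own username and drive:

rsync -a /home/yourname/ /media/yourname/backupdrive/home-backup/    # -a preserves permissions, timestamps, and hidden files

The hidden configuration files (the ones whose names start with a dot) are what carry your settings and preferences, so a copy made this way really does bring your old setup along.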

One method people use is to make a partition for their home directory and a separate partition for the system. You can think of a partition as roughly equivalent to a hard drive. On a simple system, the hard drive has one partition (that you need to know about … there are other specialized partitions that you don't interact with). But you can divide (partition) the hard drive into multiple parts, put your operating system on one, and your data (home) on the other. The operating system, if you are running Linux, can be fairly small, while your data directory, in order to hold all those videos you take with your smart phone and your collection of cat pictures, needs to be larger.

There is also a third partition you can make, called the swap partition. This is a separate dedicated part of your hard drive that the operating system uses to put stuff that won't fit in RAM (memory). If you don't have a dedicated swap partition, Linux will use a swap file on another partition for this purpose. It is probably slightly more efficient to have a dedicated swap partition, but with a reasonably fast computer and a good amount of RAM, you probably won't notice the difference.

You can totally skip the separate partition thing and have Ubuntu put everything on one partition. The advantages of separate partitions are not worth the effort if you are not comfortable playing around with partitions. But if you do, 10 gigabytes will comfortably hold the operating system, and the swap partition should be something like 5 or 6 gigabytes. The rest should be your home directory.

Simply installing Ubuntu on a computer.

There are two major divisions of operating systems for regular computers: 32 bit and 64 bit. If your computer can run a 64 bit operating system, and most made any time recently can, then you should install the 64 bit version of Ubuntu. You need to know that 32 bit operating systems are becoming a thing of the past, so, in fact, some software is no longer developed to run on such systems.

Due to an historical quirk, the 32 bit version of Linux often has the word "Intel" in it, while the 64 bit version generally has the word "AMD" (a competitor of Intel) in it. This does not mean that you have to have an AMD processor in your computer to run the 64 bit system.

There are other forms of Linux that run on other processors. I’m assuming you have a typical run of the mill desktop that normally would run Windows, so it is probably an Intel or AMD 64 bit machine.
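
If you want to confirm the 64 bit part before downloading anything, and you have any Linux environment handy (the "Try Ubuntu" live session described below counts), a couple of quick checks are:

uname -m                      # prints x86_64 if you are currently running a 64 bit kernel
lscpu | grep -i "op-mode"     # shows "32-bit, 64-bit" if the processor itself can do both

On Windows, the System control panel reports the same information.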

You should have three things handy in order to install Ubuntu on a computer where it will be the only operating system.

1) The computer

2) Installation media that will fit in your computer, such as a CD, DVD, or a thumb drive

3) An internet connection that works

You can get the installation media by going to the Ubuntu site and downloading a file from Ubuntu and putting it on a medium of some sort.

When you are looking for the file, look for "Ubuntu Desktop." There are other versions of Ubuntu; don't use those. The current version is Ubuntu 16.04 LTS.

This page has the download materials and provides good guidance.

To do a clean install with a DVD, download the operating system from that link. The file will be called something like “ubuntu-16.04-desktop-amd64.iso”. The “iso” part means that it is a disk image that you will want to burn onto a DVD using a working computer.

This download took me about five minutes on a medium-fast Internet connection.
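
Before burning anything, it is worth checking that the download is not corrupted. Ubuntu publishes checksums for every release; a quick check looks something like this (the exact filename depends on the version you grabbed):

sha256sum ubuntu-16.04-desktop-amd64.iso        # on Linux
shasum -a 256 ubuntu-16.04-desktop-amd64.iso    # on a Mac

Compare the result to the SHA256SUMS file on the Ubuntu release page; if it matches, the file is good.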

You then burn the iso image onto a DVD or USB stick.

Using a DVD

In Windows, you right click on the iso file and pick “burn disk image” and follow the instructions.

On a Mac, use “Disk Utility.” Insert the blank DVD and drag/drop the .iso file onto the left pane of the Disk Utility. Select it, and click the “Burn” button. Follow the instructions.

In Linux, insert the DVD or CD into your computer and if you are lucky a window will pop up asking if you want to burn the disk. Otherwise, run Brasero and follow the instructions to put the iso image on the disk.

Now, here comes the slightly tricky part. You want your computer to boot off of the DVD/CD drive. (Which, by the way, can be an external USB device if that is what you have.)

So, first, put the CD or DVD into the computer, turn the computer off properly, then turn it on. It might just boot off that disk and you are good to go.

But, just in case, watch your computer screen as the computer is booting up. Note any message that gives the name of a function key and tells you what it does. It may say something like “F2 = boot order” or “F9 = configure bla bla bla” or words to that effect.

If the computer did not actually boot off of the disk you inserted, turn it off again, and start it again, but as it is booting up press the function key that should get you to either boot sequence or configuration.

Using the arrow keys (the screen will give you info on what keys to use), find the part that shows the boot order. If your computer ignored the boot disk you inserted, you probably have "Internal Hard Drive" as the first place to boot from. But you should see other options down below. Use the arrow keys and other keys, as suggested by the instructions on the screen, to move the DVD drive, or whatever device you want to boot from, to the top of the list. Then save the configuration (e.g., with F10, or some other method … it will tell you on the screen) and exit out of the configuration thingie.

You may then have to restart one more time, but your computer will boot from the DVD and will actually start to run a mini version of Linux set up to help you install the operating system on this computer.

Using a USB

You can also boot from a USB thumb drive. You may have to make your computer boot first from the USB drive instead of an internal hard drive (see above) and, of course, you will have to make a bootable USB stick.

You need a USB stick with at least 2 gigs of space and that doesn’t hold anything important.

Then, if you are using Windows, install the Rufus USB installer. Run that program and follow the instructions to make a bootable USB drive. You’ll be using the same iso image you previously downloaded.

If you are using a Mac, install the UNetbootin utility and use that to make the bootable USB stick.

Insert the USB drive before you run UNetbootin, or UNetbootin may not recognize the USB you insert later.

Since this will be "unconfirmed software," open it by finding it in the Finder (the actual Finder, not a Finder replacement or any other method), control-click on the icon for the software, and select "open." You will then be asked to confirm that you want to open it. Say yes. You will likely be asked for your password. Enter it.

Also, no matter what system you are using, make sure to “eject” the USB stick properly.
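
And if you already have a working Linux machine, you can skip the third-party utilities and write the image straight from the command line. This is only a sketch, and /dev/sdX is a placeholder: run lsblk first and be absolutely certain you have the right device, because dd will overwrite whatever you point it at.

lsblk                                                          # identify the USB stick, e.g. /dev/sdb
sudo dd if=ubuntu-16.04-desktop-amd64.iso of=/dev/sdX bs=4M    # write the image to the stick
sync                                                           # flush everything to the stick before unplugging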

The installation process.

Note: It is possible that your computer will give you some sort of text-based menu when you boot off the USB or DVD drive. Just go with the default, and let the process continue until you get what looks like a normal graphical user interface that is, actually, a temporary Linux operating system running on your computer.

Also note, that during the install process, if you need to enter any numbers, there is a good chance the “numlock” button is turned off. You can, of course, turn it on.

If, as recommended, you are going to make your computer a Linux computer and not bother with separate partitions for home, swap, etc., then the rest is simple. I assume you have no data on this computer that needs to be backed up or saved. Indeed, you may have installed a new, clean hard drive, which is empty anyway.

You have inserted the boot medium, you taught your computer to boot off of it, you’ve restarted your computer, and now you are looking at a welcome screen that gives you two options: Try Ubuntu and Install Ubuntu.

If you pick “Try” then you now have a Linux operating system temporarily running on your computer and you can play with it. I’m not sure why you would bother with this.

If you pick “Install Ubuntu” then you have a series of easy tasks to perform, mostly picking the defaults.

Make sure to pick “Download updates while installing” and “Install third-party software for graphics and Wi-Fi hardware, etc. etc.”

If your computer is not currently logged into a network, you will have the option of doing so. Do so. You want to be hooked up to the network in order to download updates and stuff during installation.

Logging into the network may not be obvious. It isn’t an option on the install screen, but rather on the desktop that is currently running on your computer. Click on the blank triangular thingie on the top menu bar — this is the network applet. Pick your network, enter your password, etc.

Then you get to allocate drive space. For the simple, recommended, install, pick “Erase disk and install Ubuntu.” Pick “encrypt the new Ubuntu install” if you like. If you don’t know what LVM is, don’t bother with it.

Or, pick “something else” if you want to define different partitions for home directory, dual boot, swap files, etc. Then, good luck with that. For your first Linux install on a fresh machine, you don’t need to go down that rabbit hole. While such endeavors are not that difficult to do, you should probably make that the project for your next Linux install.

How to set up separate Root, Home and Swap partitions

Skip this part if you are doing the recommended default install. This procedure will destroy everything on your hard drive. If it does not go well, you can always do a new fresh install and pick the default.

At this point you have booted off the DVD or USB and you have clicked the icon to “install Ubuntu.” You are now looking at several options, including “something else.”

Select “something else”

Accept (hit Continue) with the scary message “You have selected an entire device to partition. If you proceed with creating a new partition table on the device, then all current partitions will be removed…”

(But do note that you are about to blotto your computer, so there better not be anything you want to keep on it!)

Make the first partition for the Ubuntu install as a primary partition.

Put it at the beginning of this space

Use the Ext4 journaling file system (unless you have some reason to use some other file system) … this will be the default already chosen.

Set the “mount point” as /

This is the partition in which your operating system will be placed, and is known as the root partition.

How big should it be? Ubuntu needs a minimum of 20 gb. I would make it larger. I used 50 gigs when I did this.

Hit OK

Make a swap partition

This is the partition your computer will use as “extra memory.”

Now select "free space" and set up a "logical" partition (at the beginning of this space) that is twice the size of your installed RAM.

If you don't know how much RAM your computer has, open a terminal right there on the computer you are working with (click the upper left button with the Ubuntu symbol on it, type in "terminal," and hit enter). In the terminal, type in

free -h

That will give you a total number (and other numbers). Round up to the nearest gigabyte and multiply by two.
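
If you would rather let the shell do the arithmetic, something like this works in that same terminal; it just doubles the total RAM reported in megabytes:

free -m | awk '/^Mem:/{print $2*2 " MB of swap"}'    # total RAM in MB, times two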

Enter that number into "size" (if you want 16 gigs, enter about 16000 MB).

Select "swap area" from the dropdown list to make this a swap partition. Hit OK

Set up the home directory (where all your stuff goes)

Select the free space again.

For the size, just take whatever is left on the drive (hopefully a lot). Pick logical, beginning of this space, and the Ext4 journaling file system again.

For mount point, enter

/home

This will be your home directory. Hit OK

Now, you'll see a nice table with a graphic bar on the top showing you what you've set up. If it all looks OK to you, a small but not too small root partition, a smaller swap partition (probably), and the rest one huge home partition, hit "Install Now"

You'll get another warning, but we don't care about no stinking warnings.

Later, if you need to change any of these size requirements, run a live USB/DVD (like you did to make this install) and run "gparted" to change the partition sizes. (You can't change the partition sizes from within the running operating system. That would be like changing the fundamental fabric of the universe while you are actually in the universe. Not even The Doctor can do that.)

And now, go back to the normal install.

Continuing with the default normal install…

If you have Windows installed, you may then get the option to install Linux alongside Windows. Pick that if you want, and choose how much hard drive space to allocate to each system.

After that are a few screens that are simple and self explanatory. Give the installation system your location, and choose the kind of keyboard you want to use.

Then you get to choose your login and password details. Here you have to decide how simple vs. secure you want your system to be. Make sure you never forget your user name and password, or you will be locked out of your system.

So, enter your name, then pick a name for this computer for identification on networks, etc., then enter your user name, which should contain no spaces or strange symbols and should be all lower case.

Then enter your password twice. The system will complain if your password is weak; feel free to ignore this if you don't care, or pay attention to it if you want a more secure system. You are going to be using this password a lot. Just sayin'.

Check "require my password to log in" for most installations. You can also choose at this point to encrypt your home folder to limit access to your data if someone gets physical access to your computer.

Then, the system will install while you get to see some info about Ubuntu.

Then you are “done” in that you have a Linux computer. You may or may not have been prompted to remove the DVD or USB. If you restart the computer and fail to do so, you’ll be back in the installation system. Just remove the DVD/USB and restart the computer.

Once the computer is restarted, you’ll have to re-establish your network connection one more time. (This is your new system, it doesn’t know about your networks or network password yet.)

At this point, go right to this post and start tweaking your computer. If you don’t do that now, at least do the things noted below.

But there is something else you should do right away. Open a terminal (hit the super key, aka windows key, and start typing “terminal” and the terminal option will come up. Click it).

Then, type in:

sudo apt-get update
sudo apt-get upgrade

You will be asked for your password, which hopefully you will remember. The first command refreshes the list of available software; the second then goes on the internet and installs updates for the stuff that was put on during installation. Even though you told it to do something like this during installation, it probably didn't do it for all the software that is now on your computer.

Following all this, you do now have a Linux computer. There are several things you can do after installing Ubuntu 16.04. First, go to this post to find out how to navigate around on your new Unity desktop, and then, see this post for how to tweak and refine your Linux installation in useful and important ways.

Have a good time using your Linux computer!

Should you install Ubuntu Linux?

This is one of four related posts:

Should You Install Ubuntu Linux?
Installing Ubuntu 16.04 LTS
How to use Ubuntu Unity
Things To Do After Installing Ubuntu 16.04 LTS

Some Linux/Ubuntu related books:
Ubuntu Unleashed 2016 Edition: Covering 15.10 and 16.04 (11th Edition)
Ubuntu 16.04 LTS Desktop: Applications and Administration
The Linux Command Line: A Complete Introduction

Why you should install Linux

Linux is an operating system, as are Windows and Apple’s OSX. It is the operating system that is used on the majority of computing devices. Linux is the basis for the Android operating system, so if you have a smart phone that is not an iPhone, then you are probably already using Linux. The majority of servers, such as computers that run cloud services and internet nodes, etc. run on Linux. Your wireless router probably runs on Linux. Many devices in the “internet of things” use Linux. Supercomputers often run on Linux.

There are a few reasons you might want your desktop (or laptop) to run on Linux. Perhaps you are annoyed with your current operating system. If, for example, you find yourself frequently having to reinstall Windows, you might want to switch to Linux, because reinstalling is almost never a solution to something being broken on a Linux machine. Perhaps you have an older computer and the newer version of Windows runs really slowly for you. Linux runs better on older hardware.

For the most part, for most things most people do, it really doesn't matter much which operating system you choose among the main players (Windows, Linux, and Apple's OSX). What matters most is what software you use (applications, apps). If a particular application that you need to use runs only on one operating system, pick that operating system. But first, check to see if you are right about what software runs on what systems.

Indeed, most of the commonly used applications have a version or equivalent that runs on each of the three main operating systems, or at least two of them. Microsoft Office runs on Windows and, somewhat less smoothly, on a Mac. You can also run MS Office software on Linux, but I don't recommend it for the non-expert user. But all of the applications that make up Office have equivalent software that is designed just for the Mac (and won't run on anything else) as well as equivalent (and some would say superior) software that is not only designed to run on Linux, but that will run on any computer.

Indeed, from a user's point of view, getting used to some software on a particular operating system and then being forced by circumstances to switch to another operating system is hardest for Mac users (if you are using, say, Pages, Numbers, and Keynote … you are stuck with the Mac), less hard for Windows users (as noted, Office runs on a Mac but not really on Linux), and trivially easy for a Linux user, since the LibreOffice suite, with a word processor, spreadsheet, presentation software, drawing software, etc., runs the same on Linux, Windows, or a Mac.

In the old days it was said that you needed to use Microsoft Office on Windows or you would not be compatible with other people. That was never actually true. The compatibility issue was much more complex. There were instances where two people using Windows, and running Office, could not reliably exchange documents because they were using different versions of Office that did not play well with each other, but one of those individuals could easily exchange documents with someone using Open Office Writer (a word processor) on Linux.

But now that falsehood is even less true than it ever was; document formats are much more sensible and interchangeable today than they were in the past. So the compatibility issue, which was only ever semi-true and mostly misunderstood, is largely gone.

There are pieces of software that people love that require a particular operating system. The big fancy expensive Adobe products don’t run on Linux. Scrivener, a great writing application, is mainly Mac but will run on Windows and not really (but sort of) on Linux.

But it is also true that most software that most people use has an equivalent version on each of the other operating systems. So, it really depends on what you want and need to do with your computer, and how much money you want to spend on hardware and software.

If you have an “extra” computer in your possession, and would like to have it usable for basic functions, such as using Google Chrome or Chromium to access the internet, basic text editing, advanced word processing, advanced spreadsheet and presentation work, etc. then installing Linux on that computer is a really good idea. It is probably an older computer, and as such will run much better with Linux than any other operating system, because Linux is so much more efficient. It will be free to do so. It will be relatively easy. Then, you’ll have a Linux computer that you will use now and then, and over time, you may decide that you like the operating system itself so much that you’ll totally switch to it. Or not. There is no problem with having more than one OS in your life.

Linux History and Background

This section is not really necessary if you are trying to decide whether to try out Linux, and you can skip it, though the background and history of Linux are interesting and may help you understand Linux a bit better. Also, this is a very brief version and there is much left out. Readers are welcome to identify important parts I’ve ignored and put them in the comments! In any event, be warned: I oversimplify here.

Many years ago, an operating system known as “Unix” was developed to address certain growing needs, especially the requirement that many users could hook into one machine and treat that machine as their own, keeping their stuff separate from that of all the other users.

Unix was also designed to run on several different machines. Previously, most operating systems were designed for a specific piece of hardware.

In 1983, Richard Stallman of MIT began a project called "GNU." GNU stands for "GNU's Not Unix." GNU was in fact not Unix, but it was meant to work just like Unix.

At the same time, Stallman and his associates created a licensing system for software called the GNU General Public License, or GPL. This license was designed to guarantee that work carried out on the GNU project would be available for anyone else to use, as long as they followed the rules of the license, which essentially required that any new work based on GPL licensed work also had the GPL license attached to it.

This was the origin of the Open Source movement. The idea of this movement is simple. Instead of creating and selling proprietary operating systems and software, sold for a profit and protected by a user agreement, people would create software that users could obtain and use for free (free as in FREE beer) and this software would be free to modify and apply by anyone anywhere (free as in FREEdom).

The GNU project involved developing a large number of tools that could be used while operating a computer. For example, when using a computer with a command prompt (there were very few computers that used graphical user interfaces at the time) one might enter a command to list the files in the current directory, create a file, create a directory, move a file, search through a file for certain contents, etc. The GNU collection of tools eventually grew to include equivalents of a large percentage of the tools ever developed for proprietary computers, and much more, all under the GPL license. Tools created back in the 80s and developed through the 90s are still at the heart of many computer operations today.

In 1991, the Finnish computer science graduate student Linus Torvalds started to develop a "kernel" that would interface between certain hardware and the larger operating system. The details are a bit complex, but eventually, Torvalds' kernel and Stallman's GNU tools were combined into a single operating system that would be called "GNU/Linux." Today we often say just "Linux," but the longer name better reflects what the operating system contains and how it came to be.

All of that history is interesting, but probably more relevant to you is how Linux is maintained and deployed today.

Linux is maintained and developed by a community of thousands of programmers and other experts, globally. Each part of the project is maintained by a “maintainer” and different programmers send their work to that maintainer for approval. Most of these developers work for a company that is involved in computing somehow, committing part of their time to Linux development. All the work is done in the open and subject to comment and critique by everyone else.

This project makes sense, and is valuable to companies involved in computing, because the operating system itself is free for them (and everyone) to use, and direct involvement (by the larger community) means that the functions the operating system serves, and how this is done, are determined by a large-scale, very open conversation, rather than by the more limited ideas of a smaller group of designers within a corporate and proprietary system.

Even more important is the simple fact that for most of the uses required by these companies, Linux is a superior operating system.

Also, and I’ll expand on this a bit below, the process does not involve marketing. Nobody is directing the work on the basis of perceived value on the market, or in relation to any profit motive. It is all about getting computers to do things effectively and efficiently, with security and usability firmly in mind.

When the first version of this operating system was released in 1991, it included 10,000 lines of computer code. Linux currently has just under 20,000,000 lines of code.

But, here is the important part. The developers of Linux are focused on certain principles. I can characterize those principles from my own observation, but much has been written and spoken about this, and you should seek more expert sources if you want the richest and most detailed story. The operating system needs to be small and efficient, and every aspect of it has to work as flawlessly as possible. The kernel, the deep inside part of the operating system, needs to be very stable and to only contain what is necessary for the kernel to run. There is a great deal of discussion as to what functions should be moved into the kernel vs. left out and treated more as a tool that may or may not be included in a particular installation. This is why the Linux kernel can sit comfortably inside your cell phone and be amazing, and at the same time, manage a complex supercomputer with hundreds of processors.

One of the outcomes of this sort of careful curation of Linux is that sometimes changes made in the guts of the operating system require that related changes be propagated outward into other software. Since changes at this level can affect a lot of other things, they are carefully considered and avoided until necessary. But, once they occur, maintainers of the various parts of the operating system, as well as some of the software that runs on it, have to make the appropriate changes. The result is a bit more work than other methods might produce, but continuation of efficiency and stability of the operating system and its parts.

There was a lot of competition and bad feelings between the dominant desktop operating system’s developer, Microsoft, and Linux, several years ago. In my view, Linux has long been a superior operating system, measured in terms of how well it works, how adaptable it is, how quickly development responds to security threats, new application requirements, and so on. Also, by and large, Linux has run on a larger range of hardware. Most importantly, perhaps, as Microsoft’s Windows developed over time, it required more and more advanced hardware to operate. So, keeping your computer updated required not only adopting the newer versions of the operating system (and often paying for that) but also replacing old hardware with newer hardware now and then. Linux, by contrast, runs on most (nearly all) of the older computers.

Eventually the competition and fighting between the Windows and Linux camps died down, and now Microsoft not only uses Linux in many settings, but has become one of the top contributors of code to the Linux project, and supports open source in many ways. For its part, Apple adopted a Unix-like operating system, quite similar to Linux under the hood, as the basis of OSX.

I suspect that eventually Microsoft will also adopt a Linux like underpinning for its own operating system.

The most important differences between Windows and Linux

There are a lot of differences between these operating systems that are internal and often esoteric. As implied above, Windows tends to be bloated (lots of code) and relatively inefficient, requiring fancier and more powerful hardware to run, while Linux is leaner and will run on nearly anything. This also means that installing Linux is usually easier and faster. Linux takes up less space on your hard drive, as do most of the different software applications that run on Linux.

The Windows operating system appears to the average user as a single entity, a whole thing, that runs your computer and peripherals and at the same time interfaces with the user using a fancy and occasionally redesigned graphical user interface. Everybody gets the same user interface.

Linux is distinctly different from Windows in that it can be thought of as having two parts (from the average desktop user’s perspective). The basic system, that runs everything, is the Linux kernel and the GNU tools and a few other things, down under the hood. If just this stuff is installed on your computer, you interface with it using a command line. Computers that are used as web servers or to do certain other work operate this way, since the computer’s work is being done without much direct human involvement except by experts who are comfortable using the command line. This saves resources and makes the computer run very efficiently. Indeed, many of these computers are “headless,” meaning they don’t even have a monitor. Since only the command line is being used, an expert can sign into the computer over a network and mess around with it easily in a terminal program that gives them access to the command line. Such computers, most of the time, run themselves and nobody has to look at what they are doing.

The second part of a desktop Linux system, one that you as a regular person would use, is called the “Desktop.” The word “desktop” is confusing and messy in the computer world. Here, we mean the set of tools and stuff that give you a graphical user interface for the entire system, and that runs your software in windows, like Microsoft Windows and Apple’s operating systems do.

Unlike Microsoft Windows or Apple’s OSX, there are many different desktops that function on Linux. Any individual or group of experts can design a new desktop and make it available for Linux users to use. When you sign on to a Linux computer, there is a moment when you have a choice to pick among the various desktops that are installed on that computer (if there is more than one). Most people have a preferred desktop and use that as the default. Some people like to collect and play around with different desktops.

Then, there is the concept of the "Distribution." When you "get" Linux, you are actually getting a particular distribution (aka "distro"). The structure, history, and dynamics of distributions are actually very complex, but you don't need to know any of it. All you really need to know is a simple definition of what a distribution is, and which distribution you should choose to install.

A distribution is the package of stuff that is needed for Linux to run on your computer. You can think of it simply as a system: just as Microsoft gives you Windows and Apple gives you OSX, Linux gives you a particular distribution, and that is the thing you install.

A typical Linux distribution of the kind you might install is in some ways similar to, and in some very important ways different from, a product from Microsoft or Apple.

Like the other systems, a Linux Distro (short for distribution) includes the installer. So you get a CD, DVD, or thumb drive, boot from it, and the installer takes over, asks you some questions, and installs the Linux operating system on your computer.

A Linux distro has a single desktop that is automatically installed. To the user, this is the most important difference between distros. You pick a distro in part on the basis of what desktop you like. You can install other desktops later, of course. But most likely, you will simply pick a distro with a certain desktop and that will be your desktop.

A Linux distro includes a whole pile of other software and installs it at the same time that it installs your operating system. Typically, you get a web browser; an email program; an office suite with a word processor, spreadsheet, presentation software, etc.; a text editor; a calculator; a music player; some graphics software; etc. This sounds like it might be annoying because it would take forever to install all that software, but software that runs on Linux is very efficient and takes up less space than for, say, Windows, so it does not really take all that long.

A Linux distro has a particular way to install, update, and maintain software. If you use a Mac, you are familiar with the paradigm, because Apple copied the Linux method. There will be one or more user interfaces that you can use to search for, pick out, and install software. Every now and then you can issue an update command and all of the installed software will be inspected and updated. Typically, your distro will install and set up, at installation time, a program that initiates this automatically and gives you a message saying that you should update your software. You can opt to have this done in the background automatically, or you can do it yourself. Any of this can be done from the command line if you like.
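
On Ubuntu, for instance, that whole search/install/update cycle can also be driven from the command line with a handful of commands (the package here is just an example):

apt-cache search photo editor     # find candidate packages
sudo apt-get install gimp         # install one of them
sudo apt-get update               # refresh the list of available versions
sudo apt-get upgrade              # bring everything that is installed up to date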

A modern Mac does this as well. That is because down deep a Mac is running a Linux like operating system. Traditionally, Windows did not do this, though maybe Microsoft has learned to follow the Linux pattern of updating software. I’ve not had to update a Windows computer in a while.

Here’s a key point that distinguishes Linux from both Apple OSX and Windows. The Linux operating system itself is updated a little bit pretty much every week. And, it is updated more or less flawlessly. Your distro will probably be conservative. The basic Linux OS is updated, and that update is tested out and incorporated into a distro. That distro may have two versions, a bleeding edge version and a more stable version. So the change goes into the bleeding edge version then later into the stable version. Chances are your distro will actually be based on one of those distros, and there is yet another level of checking out the changes. Then it comes to your desktop.

When Microsoft updates its system, it tests the update internally (maybe using beta testers). When Linux updates the system, it is tested by all those thousands of Linux experts who are involved in the project. This means that Microsoft has to do its updates differently, because an update is a major and costly project just to test. So a typical Microsoft update (and Apple is similar) has more changes, more fixes, more tweaking, and this is why the first version of those updates is almost always at least a little broken or problematic for a good number of users.

Since Linux updates in smaller increments, and the increments are very widely tested, there is virtually never a problem with such an update. They just happen, everything works, and nothing goes wrong. And it is lightning fast. An update takes just several seconds to a few minutes.

So, you want to install Linux?

The best way for a noob (a first timer) to try out Linux is to get a hold of that extra computer, make sure there is nothing on the hard drive that you need to save, and do a fresh install of the Ubuntu operating system on that hard drive. Ubuntu is one of many different distros, but it is the most user friendly and the best supported. There are actually several versions of Ubuntu, but you will want to install the standard mainstream version, which uses the “Unity Desktop.”

Click through to THIS POST to see an overview of how you might do this.

What is the best mouse for a Mac, Linux, or Windows?

One mouse to rule them all

I had previously reviewed the Logitech Ultrathin Touch Mouse, suggesting it as a replacement for the Apple Magic Mouse. Now, I’ve tried it on my Linux machine (don’t know why that took so long). It turns out to work very well, better than most, possibly all, mice I’ve used.

One’s mouse is a very personal thing, and everyone is going to have a potentially different opinion about what the best mouse is. The Ultrathin is designed to work with laptops/notebooks because it is small, and it is assumed that everything you use with such a portable device must be small. The truth is, you can carry around a whopping big mouse in your notebook bag and not even notice, so this is a bit of a fallacy. Anyway, it obviously works with any computer with a bluetooth connection, desktop or laptop.

Also, some people want their mouse to be big, some want it to be small. And most people can probably grow to like whichever mouse they are using, and thus develop their preference longer term. I personally like a very large mouse or a very small mouse. I can not explain that.

A touchy mouse

There are, these days, two fundamentally different kinds of mouse. One is the kind with buttons and scroll wheels and such; the other is the kind with a swipe-able surface. The Logitech Ultrathin Touch Mouse is one of the latter. It vaguely resembles the standard Apple mouse that comes with modern Apple computers, but is trapezoidal in shape rather than ovaloid. It is also smaller.

As I noted in my earlier review, my Apple mouse was starting to act strange, so I decided to replace it, and instead of getting an Apple mouse, I got the cheaper Logitech touchy mouse to try it out, and I’ve not looked back.

Designed for Windows/Mac but Works on Linux

There are two versions of this mouse, the T631 for the Mac and the T630 for Windows. As far as I can tell, they are the same but look different, with the Mac version being white and the Windows version being black. Makes sense at several levels.

I have read on the Internet, which is never wrong, that the Windows version works fine on Linux, and I can attest to the Mac version working fine on Linux as well. I doubt that Linux currently uses all the various swipy capabilities of the mouse, but it moves the cursor, has left and right click and swipe-scrolling, and it may also emulate a middle mouse button. Two-fingered swiping back and forth triggers Linux buttons 8 and 9. And so on.
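
If you are curious what button numbers your system actually sees for those gestures, you can watch the raw events with the standard xev tool; a minimal sketch (click and swipe inside the small window it opens, then Ctrl-C when done):

xev | grep -i button    # prints a line for each button press, with its number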

Obviously, I’ve not tried this mouse on Windows. Why would I ever do that?

Two hook ups and Great Battery Life

This is a bluetooth mouse (and that is how you get it to work with your Linux machine). The mouse has a selector switch, A and B, so you can pair it with two different computers (such as your desktop or your laptop).

Unlike the Apple Mouse or many other existing mice, this device does not use batteries that you replace. (Indeed, the Apple Mouse is even pretty picky about the kind of battery you use.) You plug it in to a micro USB cord hooked to something with power, every now and then. It charges really fast, and the charge lasts a long time.

I recommend the T630 or T631.

Command Line Science

A worthy Kickstarter science related project is afoot.

Face it. Most science is done on the command line. When it is not, we call it “science by spreadsheet” or name it by some other epithet.

Much of that is done on Linux or Linux like computers, but that actually includes Macs, and if you must, it can be done on Windows.

Bioinformatics, climate simulations, basic statistics using the R language, fancy math things using the appropriate Python library, making graphs with gnuplot, and even producing nice looking results for dissemination to our geeky peers using LaTeX. Science-related engineering uses the command line too, if it involves any programming of controllers or sensor equipment.

This is not to say that all science is done this way. Quite a bit isn’t. But there are many tools used in science that are best handled with the command line or something like the command line.
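
For a quick taste of what this looks like in practice, here are two one-liners, assuming R and gnuplot happen to be installed (neither comes with a stock desktop, but both are an apt-get away):

Rscript -e 'summary(rnorm(100))'               # instant summary statistics, no IDE required
gnuplot -e "set terminal dumb; plot sin(x)"    # an ASCII plot right in the terminal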

Brian Hall, a computer science guy, is developing an online training class to teach the methods of command line science. He is developing the class using Kickstarter, which is a fairly novel approach as far as I know. He isn't even asking for that much money, and he is over halfway to his goal. Visit the Kickstarter site to see what you get if you donate. He has a nice video explaining the project.

This video course is designed for scientists with little or no programming experience. It’s okay if you’ve never even touched the command line (or if you did once but it felt icky).

You’ll have fun learning a new, powerful way of communicating with your computer. Along the way, you’ll acquire access to a whole world of amazing open source data and software. Who knows what you’ll do next?

The project home will be at Udemy, here. You can go there and see a draft of the course, which will give you a very good idea of what it entails.

The class will probably cost $199, but Brian is considering discount rates for teachers.

Here’s the press release for Brian’s project:

Crowdfunded Video Course to Boost Scientists' Computational Skills

“Learn the Command Line … for Science!”

Nearly every field of science has a significant computational component -- but few working scientists have been trained as programmers. Universities are adapting, but not nearly as fast as the sciences are exploding with new applications. Simulation, data mining, bioinformatics -- these are the fields that are driving innovation in physics, astronomy, biology, and medicine. New tools and techniques are being developed every day, but we need more scientists with the interdisciplinary skills necessary to harness them.

A new video course called “Learn the Command Line … for Science!” is calling for backers on the crowd funding site Kickstarter.com. This class will walk trained scientists through the basics of using the command line interface, an absolute requirement to run scientific applications and take advantage of high performance computing resources. It’s also great preparation for learning to code, and eventually contribute new and novel tools to computational science.

The class is being developed by Brian David Hall, a Computer Science instructor with experience doing bioinformatics for the USDA. The course is upbeat, fast-paced and targeted at the needs of working scientists. It goes into detail where necessary -- for example, covering how to install software and download datasets from the command line -- but it skips topics which are less relevant to scientists, such as the system administration tasks emphasized by other command line courses.

Kickstarter campaigns operate under an "all-or-nothing" funding model, so if "Learn the Command Line … for Science!" doesn't reach its funding goal of $1,500 after 30 days, then Brian gets no funding, and nobody gets to take the course! Be sure to follow him on Twitter (@_bruab_) to stay up to date on the project's progress, and help spread the word to your social media networks. Just $5 is enough to become a backer of this project. For Science!

Setting up a Digital Ocean remotely hosted WordPress blog

Mike Haubrich and I are developing a science oriented podcasting effort. It will be called “Ikonokast” (all the good names, like “The New York Times” and “Apple” were taken). We decided to enhance the podcast with a WordPress based blog site, perhaps with each page representing one podcast, and containing backup and supplementary information.

Here is the site, set up and running.

After considering our options, we decided to try using a Digital Ocean “Droplet” to host a WordPress blog. Here, I want to tell you how that went, and give a few pointers. This might be a good idea for some of you. And, I’ll explain what the heck Digital Ocean is in case you don’t know.

What is Digital Ocean?

Digital Ocean is one of the many available hosting sites, but different. You've heard of hosting sites such as the infamous [name of company deleted because having the name of that company in a blog post draws spam to the blog] (the "Hooters" of hosting sites), where you pay them to provide a server you access remotely, then using tools like cPanel (cringe) you install WordPress blogs or other stuff. Digital Ocean is different because, among other things, it does not set you up with cPanel (though you can install it). Also, Digital Ocean is not really designed to be used as a full-on hosting solution for ALL of your needs, but rather, to set up a smaller but highly capable host for a specific need. This is great for developers who are always working on entirely separate projects. So, for example, a developer might create a "droplet" (a Digital Ocean server) and install stuff, setting up a specific application like a web site or content management system or whatever, and then hand that entire project over to the client who thereafter owns it. There are numerous other differences, including pricing, that I'll cover below. Some of these differences made us choose Digital Ocean, others are not important to us (and still others are beyond our understanding because we are not hackers or professional IT experts).

The developer oriented philosophy is not of much relevance to the average non-developer, but it is likely very compatible with the user who wants to set up a web site or similar application for their own use. For us, setting up a simple WordPress blog, it seemed to be a good option. We could have gone the free route by getting a WordPress.com or similar free site, but by having our own fully functional Linux server, we would not be limited by any of the technology that those sites use, allowing us to use the server for other purposes should such a need arise, and allowing us to configure the installation any way we want. For example, if you set up a typical host with a WordPress install, or use a general free blogging platform of some kind, there may be a limitation on the size of the file you can upload. You can probably get your host to change that for you (it is a PHP value, a single line of code in the PHP configuration file, usually). But that involves interacting with the host's help people. Also, there may be configuration changes you want but that they won't do. A Digital Ocean droplet can be regarded as a computer you own (but is not in your house) and that you can do whatever you want with, as long as it can be done with any Linux computer with those specifications. So, for this case, you would just log on and change the maximum file upload setting in the PHP configuration file.
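
To make that file upload example concrete: on a typical one-click WordPress droplet the change is two directives in php.ini plus a web server restart. Treat the path and values below as guesses to verify on your own droplet, since the php.ini location varies with the PHP version and the image may use a different web server:

sudo nano /etc/php/7.0/apache2/php.ini    # or /etc/php5/apache2/php.ini on older images
upload_max_filesize = 64M                 # the two lines to raise, inside php.ini
post_max_size = 64M
sudo service apache2 restart              # restart the web server so the change takes effect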

Another use of something like Digital Ocean (again, this can be done with any host, but it may be easier with Digital Ocean) is to set up your own cloud server, using something like Own Cloud. (See below for more uses.)

Another feature of Digital Ocean is that the servers appear to be fast and efficient. As a user, you have a server with an SSD drive, for example.

Even though you can access your Digital Ocean droplet (your server) via the command line using SSH, Digital Ocean also provides an interface that helps automate or make simpler many of the tasks you would normally do. In addition to this, for the more tech savvy, Digital Ocean has an API that allows you to set up a way to interface with and control the server that matches your own needs. This feature is way above my pay grade, so I can’t really comment on it, but it is there.

Why we decided to try Digital Ocean

Now, here is the part of Digital Ocean that makes it most interesting and potentially useful for the average user who wants to play around with serious technology but is not a hacker. Like Mike and me. This is the set of different distributions and applications that can be “automatically” installed and set up with a “one click” system. I want to say right away that there is nothing “one click” about this, as far as I can tell. Nothing takes one click. I have no idea why Digital Ocean uses that term. To me, “one click” means you click once, then you are done. Having said that, the various options are highly simplified approaches to doing some stuff that is fairly complicated if done from scratch.

Apparently unique to Digital Ocean is that you can choose among a range of Linux distributions. This means you are likely to find a distribution you are comfortable with. Other hosts have a distribution they use, and that is the one you get. Digital Ocean has Ubuntu, CentOS, Debian, Fedora, CoreOS, and FreeBSD. When you set up a simple droplet, you pick one of these distros, and that’s it. (I’ve not done that, so I don’t know if that is truly one click. Could be.) What you get, of course, is a server version of that distro. If you want a graphical user interface, that is a different thing (see below).

In addition to being able to choose among these distros, you can "one click install" a number of major applications. Most of those listed on the Digital Ocean site are Things Unknown To Me, but I do recognize some of them. Joomla, MediaWiki, Docker, Drupal, LAMP, ownCloud, etc. are available.

And, of course, WordPress.

When setting up one of these applications, you start (I think in all cases, but I’m not sure) with no droplet. The droplet and the underlying distribution are created at the same time the application is installed. Also, the “one click” installs of these applications seem to be associated with a specific underlying distro. To mix and match distros and apps, you would install the distro, then manually install the app. The One Click WordPress install is on Ubuntu.

How much does Digital Ocean cost and how big and fast is it?

Pricing is, as far as I can tell, one of the major differences between Digital Ocean and other servers.

When you choose a distribution or an application, you then choose a droplet it will go on. This is where pricing and power come in. The smallest droplet costs $5 a month or $0.007 an hour. If you calculate that out ($0.007 × 24 hours × 31 days ≈ $5.21), the hourly cost comes to just over the monthly cost during 31 day months, but the cost is capped at the monthly rate. More importantly, it is pro-rated at that hourly rate. So, as long as the droplet exists, you are being billed for it, but not when it does not exist.

As far as I can tell, and they are pretty straightforward in their description of pricing, so I think I have this right, if you create a droplet, run it for several hours, and then destroy it, you are charged only for those hours. By the way, you are charged while your droplet exists but is powered off, because the resources are sitting there reserved for you. But if you create a droplet to try something out, then destroy it, that limits the charge. So creating a droplet, installing stuff, trying it out, yada yada, if that is all done over a couple of hours, you might be billed something like 20 cents. If you have no droplets but have an account, nothing is being charged to that account.

Having said that, the five dollar a month droplet is usually not going to do what you need (though I have thought of a few uses for such a thing). The minimum droplet for a WordPress install using their “One Click” method is the $10 droplet. Technically, you can install a WordPress setup on a $5 droplet, but the “One Click” method takes up more resources than the $5 droplet has, so you would need to install it manually.

The $10 droplet has 1 GB of RAM and 30GB on the SSD disk. The monthly transfer allowance is 2TB, and you get one processor core. There are $5, $10, $20, $40, and $80 options, ranging up to 8GB of memory, 80GB of SSD space, 5TB of transfer, and 4 cores at the $80 per month rate. There are also massive higher volume plans running up to the unspeakable sum of $640 a month, but we need not discuss this here because it is scary.

Another difference between Digital Ocean and most other hosts is that you can easily change the specs, or at least some of them. You can increase the RAM by simply changing the specs and rebooting. Changing the SSD size takes longer but it can be done on the fly.

About that One Click thing, and installing WordPress

The WordPress install has nothing to do with one click. There are many clicks.

We managed the WordPress install with no problem at all with respect to the server, except one bit of confusion on my part. Maybe two bits.

I just clicked on the one click button. Then I did a whole bunch of other stuff, as specified in the Digital Ocean instructions. It is worth noting that Digital Ocean has many tutorials, and I think they have some sort of incentive system to get tutorials written and updated by users.

I ran into three problems that an expert would not likely have had, and I’ll tell you about them so you’ll know.

First, early on in the process, you need to get a secure connection to the server. You can do this by setting up a key on your computer and syncing that with the key on the Digital Ocean droplet. Do you know what I’m talking about? If yes, never mind. If no, good luck with that; it is a bit esoteric. There seems to be another way, which involves Digital Ocean resetting your root password and emailing it to you. Now the NSA has your password, so you may want to change that. In any event, the whole secure connection thing is one of those areas that hackers already know all about but someone like me doesn’t, so I was confused and that took a bit of work. The tutorial is written with the assumption that you are not an idiot, but you may be an idiot, like me. Just carefully follow the instructions. You’ll be fine.
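
If you want to go the key route, here is a minimal sketch, assuming OpenSSH on your own machine and a droplet at the made-up address 203.0.113.10 (substitute your droplet’s IP):

ssh-keygen -t rsa
ssh-copy-id root@203.0.113.10
ssh root@203.0.113.10

The ssh-copy-id step only works if password logins are enabled on the droplet; the alternative is to paste the contents of ~/.ssh/id_rsa.pub into the SSH key field when you create the droplet, which is roughly what the Digital Ocean instructions describe.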

Second, and this is totally stupid (of me). (Digital Ocean really needs to re-write a version of their tutorial just for idiots.) When I finally tried to log on to the server, having made a secure connection, I was utterly confounded. I knew what my password was, but I did not know what my user name was. I couldn’t remember specifying or being given a user name. I just didn’t have a user name. Digital Ocean help files were no help. I had no idea what to do. Then, I randomly ran into something that reminded me that I am an idiot.

When you set up a basic Linux server, your username is root. That is obvious, everybody knows that, right? I had forgotten that, because most of the Linux setups I’ve installed (and there have been many) were done with a hand-holding install script on Debian, Fedora, Ubuntu, or something, which sets you up as a special user who is not root, but whose password can be used to su or sudo.

So just remember that, your name is root.

The third problem has nothing to do with Digital Ocean, but somehow I seem to have missed these instructions in the guidelines. This had to do with getting the DNS set up so the domain (yadayada.com or whatever), which Mike had already bought, would point to the server. There are a few things you need to know. First, the domain registrar has to be told what name servers to point to (Digital Ocean provides this info on their web page). Second, you need to do an esoteric thing on the Digital Ocean interface under the “networks” section to enter your domain name. Third, you need to get into the WordPress installation and enter the domain name in the settings on wp-admin (in two locations). Oh, and fourth, you have to wait a while for this to propagate, which for us was a very short period of time.
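
For the record, here is roughly what that looked like for us; the domain and IP address below are made up, and the nameserver names are the ones Digital Ocean listed at the time (check their page rather than trusting me):

At the domain registrar, point the domain at Digital Ocean’s nameservers:
ns1.digitalocean.com
ns2.digitalocean.com
ns3.digitalocean.com

In the Digital Ocean control panel, under the networking section, add the domain and an A record pointing at the droplet:
yadayada.com  A  203.0.113.10

In WordPress, under Settings, set both the WordPress Address (URL) and the Site Address (URL) to http://yadayada.com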

Digital Ocean and Security

Recently, a few colleagues/friends have had their WordPress sites hacked by their own back end. The hosting service got hacked, and then the clients of that hosting service got hacked.

That particular failure mode can’t really happen on Digital Ocean, for technical reasons. Unlike a typical shared host, where you only THINK you “own” a computer on which you are root but there is really a sort of Over Root that can root around in your root, Digital Ocean droplets behave more like separate servers, given the way they are set up. So, for example, Digital Ocean can’t go into your server to fix something for you. But this also means that malicious code (or whatever) at DO (or elsewhere) can’t go into your server and break something for you. There is a way to recover a totally crashed droplet that involves DO, but it is you, the droplet owner, who does the fix, while someone at Digital Ocean kicks the side of the server or something.

According to Ryan Quinn at Digital Ocean (I asked him to clarify this aspect of security):

In DO there is no such thing as a “super-root” user on a DigitalOcean droplet. When you create your droplet a couple things happen.

1.) If you do not use an ssh key the create process generates a temporary password and emails it to you. This password is not stored anywhere else in DO’s systems and you are prompted on the first login to immediately change the temporary password.

2.) If you do use an ssh key stored on DigitalOcean, DO admins and support personnel do not have access to these keys through their admin interface.

So while DO has access to the hypervisor (physical machine) that your droplet is running on we have no access to the operating system within your droplet so this would not be a viable attack vector.

So for example, if you were to find yourself locked out of your droplet, our support team could recommend a password reset from the control panel but the only way they could directly assist you in accessing the contents of your droplet would be to power it off, mount a recovery ISO that includes its own operating system, and boot your droplet with that image. From that image (which has networking disabled by default) it is possible for you to mount your disk image and access your files.

Overall, a user would have more ready access to your droplet if they were to gain access to your ssh key, root password, or an API key you generated from the control panel than they would if they gained admin access in our backend systems (which are well protected behind firewalls and two-factor authentication, and not accessible from the public Internet).

Deciding if you should use Digital Ocean

Digital Ocean is not for everybody. You need to be at least a little savvy with Linux, probably the command line, etc, and you need to be willing to mess around a little. But it is probably the best solution for getting a fully functional server that you have full control over. Best in terms of pricing, flexibility, and power. As far as the cost goes, that is pretty easy to justify. Adding a monthly bill to your mix of expenses is something you should be careful about doing, but if you set up a $10 a month server with Digital Ocean, and decide you don’t want to do it, just go to your account and destroy the server and you’ve probably spent less than $10. Also, if you click any of the links to Digital Ocean on this page (such as THIS ONE) you will get a $10 credit, so you won’t have to spend a dime. (I set up our server with such a referral, so we are so far cost free!). After that, $10 a month for another month or two is not a big deal, and by then, you should know if the server and all that is working for you and worth the expense.

What about a graphical user interface desktop thingie on Digital Ocean?

You can do that. Digital Ocean used to have “one click” installs for various distros with desktops, but does not seem to do this any more. What you can do is get a droplet with enough power (probably the $20 version with 2 GB of memory), create a non-root user with sudo privileges, install a desktop, and use VNC to access it. I’ve not tried this or looked into it beyond a bit of poking around.
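
If you want to experiment with that, here is a minimal sketch of what I have in mind (untested by me, so treat it as a starting point, and the username sammy is made up); run these as root on an Ubuntu droplet:

adduser sammy
usermod -aG sudo sammy
apt-get update
apt-get install xfce4 xfce4-goodies tightvncserver

Then start the VNC server on the droplet as the new user, and from your own machine tunnel the VNC port over SSH so you never expose VNC to the open Internet:

vncserver :1
ssh -L 5901:localhost:5901 sammy@your_droplet_ip

Point a VNC viewer at localhost:5901 and you should get an XFCE desktop running on the droplet.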

Ubuntu Linux 15.04 Vivid Vervet Beta Mate Flavor

Ubuntu Linux 15.04 will be released in April.

There is not a lot new for the average desktop user in the new release, as far as I can tell. One good “change” is a feature called “locally integrated menus.” This is where the menus are, by default, where they are supposed to be, instead of, well, invisible until you stab at the menu bar that must reside at the top of your screen in Ubuntu with Unity; then the menu appears and maybe you can use it. That was a bad idea, and over the last few revisions of Ubuntu with Unity, the top-menu-bar menus have slowly gone away: first as something you could make go away by tweaking around, then as an option you could turn off, and finally they went away by default (though you can have the annoying disappearing menus if you want).

Several of the various “flavors” of Ubuntu are getting upgrades to the newer version of the pertaining desktop. There will be a newer version of Gnome, a newer version of KDE, etc. in each of those flavors.

I downloaded the Mate Beta and tried it out on my test computer, and liked it. It seemed to work OK, so I simply installed it, and the installation went fine. It is now running and I’ve got no problems. There may be some bugs out there, but I’ve not hit any.

Mate is a desktop that forked from Gnome 2.0. Gnome 2.0 was the best desktop of its day for many users. There is an old saying in software development: once you’ve perfected your software, further development simply breaks it. This happened to Windows years ago, somewhere around XP or before. And it happened in Linux, in my opinion, when Gnome dropped the Gnome 2.0 paradigm and went all Unity on us, and of course, Unity is a broken desktop as everyone knows. Expect it to evolve back towards a Gnome 2.0-like framework.

Anyway, Mate is Gnome 2.0 forked and improved, but that improvement is mainly under the hood and not in the overall look and feel, which is the point. I did not like earlier versions of Mate because they were a mess of older Gnome tools and newer, somewhat updated Mate tools, and some key stuff was simply missing or broken (like the ability to mess around with screensavers). At that time I took my “production machine” out of play for the evolving Ubuntu environment and installed plain old Debian stable. For what I use that machine for, it is great. But I wanted to have my laptop do more snazzy stuff, so I’ve been experimenting with Mate Ubuntu. And that is why I installed the Beta.

There are a handful of cool new items. Mate now has a much better interface and a somewhat improved set of tools for configuring things. Among those you will find an easy way to turn Compiz off and on, on the fly. The menus are better organized. The theme, icons, and other visual stuff are unruined and mainly improved. I’m not going to provide details here because if you are going to mess with the Beta version, 1) you probably know more than you need to know about Mate, and 2) things may be changing somewhat. But when the final release comes out I’ll post on the details and what you may want to do after installing it.

I looked at the new Gnome Ubuntu flavor as well. Although I don’t like Unity, I can appreciate Gnome 3; I have used it and I kinda like it. I think the Gnome flavor will be even better. KDE users will also probably be happy with their new flavor, from what I hear, but I’m not much of an expert on KDE.

One final thing. Going from the current version of the Mate Ubuntu flavor to the Beta was easy, an in-place upgrade that preserved most stuff. It did, however, wipe out some of my previously installed software, but not the configuration files. So, I had to reinstall emacs, but my .emacs file was still there. I also had to reinstall R and RStudio, and the Chrome browser, as well as Synaptic and Deb, and a few other things.

These installations were pretty painless, but one would prefer not to have to redo them. But then, I was INSTALLING the Beta version, not upgrading to it. I assume that if you are using the current version of Mate Ubuntu, you will be able to simply upgrade to it, after doing the usual backup, with no problem.

10 or 20 things to do after installing Ubuntu Mate (14.10)

See here to see why you might want to install the Mate flavor of Ubuntu 14.10.

Then, install it and consider doing these things. Get your system up to date. Yes, yes, you just installed it but that install image was old(ish). Update and upgrade now:

First, you probably want to open the Software Center, go to Software & Updates, and enable all the Ubuntu software sources (other than the source code and CD-ROM options). Then:

sudo apt-get update
sudo apt-get dist-upgrade

Go to Preferences/Additional Drivers and then allow additional drivers, and pick a proprietary driver for your graphics card if you like.

Install the Synaptic package manager and, if you like, use it for some of the following installs. I like Synaptic better than the Ubuntu Software Center.

sudo apt-get install synaptic

You might not need to install gdebi but make sure it is there. This is an application that installs .deb files.
sudo apt-get install gdebi

So now you have a better set of installation tools.

Go to the Google website and install Chrome. Not Chromium; Chrome. Chrome will run Netflix for you. Later, when you run it, it will ask if you want it to be your default browser. Your choice (I use Chrome as my default browser).

Using Synaptic Package Manager (if you like) you may want to install vlc media player, and your favorite audio software.

I like emacs, you probably don’t, but if you do, this is a good time to install it, and consider updating your .emacs file.

Open up the control center and fiddle with stuff.

You then might want to head on over here and see if you want any of the suggested software for power management or other functionality.





Should you install Ubuntu Mate?

With Ubuntu’s release a few weeks ago of Ubuntu 14.10, Mate has now become an official flavor of Ubuntu.

There are two pieces of bad news that relate to this that we’ll get out of the way. First, Ubuntu’s default distribution, which uses the Unity desktop, broke a key Linux feature: if you install Ubuntu with Unity, you can’t easily change your desktop, or if you try, you’ll probably break your system. Ubuntu seems to want you to use Unity no matter what. Second, while at one time all flavors of Ubuntu were treated more or less alike (though the default was Gnome), now the non-Unity flavors are filed under “Older and other” and you have to dig around to find them. Apparently, Ubuntu wants you to use Unity no matter what. Where have I heard that before?

So, long term, don’t expect Mate, or KDE, or any of the other non-Unity distributions to remain as Ubuntu flavors. I strongly suspect Ubuntu will eventually boot out all the non-Unity flavors. This will happen about the time Ubuntu gets past a certain percentage of the portable device market (which, at this time, it is not really part of) and it becomes in the interest of Ubuntu’s backers to unify the look and feel, with Unity-ish Ubuntu being the operating system for the next generation of smart phones, of which they will sell many. I assume. Or maybe not, we’ll see.

So, why should you install mate? Consider the following two reasons:

1) It isn’t Unity; it works better if you like the traditional Gnome 2.0 style of desktop. This is really the only way to get that style of desktop.

2) It isn’t Unity, and at this point as many of us as possible have to be using something other than Unity (unless of course you happen to like Unity in which case you’ve probably stormed off by now so good bye) in order to send the message that no, we won’t have the Linux Desktop broken by a big gorilla that first takes over the whole Linux thing by being so good at it then tells us what we have to eat for dinner every day thereafter. Thank you very much.

Beyond that, the reason to install the Ubuntu flavor of Mate instead of Mate on some other distro is that, like it or not, Ubuntu has the best distro if you don’t want to totally roll your own or fiddle a lot. You still have to fiddle (see here for example) but most will get their computer off the ground a lot faster and less painfully with Ubuntu.

I was not really happy with some of the earlier incarnations of Mate, partly because this Gnome fork seemed to have broken a lot of nice Gnome features, rather than just forking them. Now, however, either they have stopped doing that or I’ve forgotten what features Gnome had that I liked and don’t care any more. But seriously, Mate as implemented (version 17) on Ubuntu Linux (14.10) is a clean and nice installation.

To install, go here, download the appropriate file, then make a bootable DVD or USB. The USB is easier. You can use the ddrescue command indicated here to make a bootable USB. Don’t make the mistake I did: I forgot that not all USB ports on your computer are created equal. Even if your BIOS is configured to allow you to boot from USB, that may apply to only some of the USB ports. Your computer might even be labeled to indicate this (mine was, but the labeling was tiny and cryptic, so I was unaware of it!). If you think you’ve got a working boot USB and it does not work, move it to a different port.
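
For reference, the ddrescue route looks roughly like this (the ISO name is whatever you downloaded, and /dev/sdX must be replaced with your USB stick’s device; double-check with lsblk first, because writing to the wrong device will wipe it):

sudo apt-get install gddrescue
lsblk
sudo ddrescue -D --force ubuntu-mate-14.10-desktop-amd64.iso /dev/sdX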

Then, after you have installed Mate, you may want to mess around with it to make it work better.

10+ Things To Do After Installing Ubuntu 14.10 Utopic Unicorn

NEW: Very first look at Ubuntu Linux 15.04 Vivid Vervet Beta Mate Flavor

See: Ubuntu Unleashed

Here is a list of things to do after you have installed Ubuntu 14.10 Utopic Unicorn.

There is some discussion of whether or not you should upgrade to 14.10 here, but the short version is: for most people an upgrade from 14.04 is not necessary but not a bad idea, and an upgrade from any earlier version is a very good idea. Mostly, though, you should just upgrade.

One could ask whether you should be installing Ubuntu with Unity at all. For that to make sense, you have to like Unity. I personally like to have a wider range of desktop options than Ubuntu with Unity allows, but for a notebook or laptop where you are going to be using one application at a time, usually GUI apps, and like to have your computer integrated fairly seamlessly with social networking services, etc., it is a good option.

But, as is always the case with any operating system, you can either use it out of the box or change a few things. Because of OpenSource-related licensing issues, a few things need to be done by you that would normally be done by the provider of the OS (but this is a free OS, so you don’t get that); most of the other changes are just to make the OS work more the way you like it. So pick and choose.

First, before you do anything…

Run these commands to bring your system up to date, even if you just installed Ubuntu 14.10.


sudo apt-get update
sudo apt-get dist-upgrade

Reminder: When you start a command with “sudo” you will be asked to enter your password. If you use “sudo” again soon after, the system figures a bad guy did not konk you on the head to take over your computer, and it is probably you issuing the command so it does not ask for your password again. After a while, the system figures you probably did get konked on the head and will attempt to verify your identity by asking for your password.
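
If you want the system to forget right away, say because you are about to walk away from the keyboard, you can throw away the cached credentials yourself:

sudo -k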

Also, for the various commands being suggested here (and I should say you are totally on your own and I take no responsibility if you muck up your system, good luck and have a nice day) you may have to enter a “y” (for yes) or do some other things, so keep an eye on your computer.

Install Better or More Appropriate Graphics Card Drivers

Using Software & Updates ~ “Additional Drivers” tab ~ Do what it says there

Install Ubuntu Restricted Extras

This includes some fonts, java, the flash plugin, DVD playback ability, and so on. You need some of this stuff. Use this command:


sudo apt-get install ubuntu-restricted-extras

Install additional extras for multimedia

To install DVD playback ability:

sudo /usr/share/doc/libdvdread4/install-css.sh

Some, many, users will want additional codecs:

sudo apt-get install gstreamer0.10-plugins-ugly gxine libdvdread4 totem-mozilla icedax tagtool easytag id3tool lame nautilus-script-audio-convert libmad0 mpg321 libavcodec-extra

Adjust the degree to which the Ubuntu Unity Dashboard annoys you and violates your privacy

Method 1

System Settings ~ Privacy and Security ~ Turn stuff off, especially the online items.

Unity now ships with the settings people usually turn off already disabled by default, so you may not need to do anything here.

Method 2

If you do need to turn off all the settings, check out Fix Ubuntu, which has a nice script that will maximally crack down on Unity. You can get the script and run it right away, if you are trusting (it looks trustworthy to me), with this nifty one-liner:


wget -q -O - https://fixubuntu.com/fixubuntu.sh | bash

While you are addressing privacy, you may or may not want to disable system crash reports. Sending system crash reports to Ubuntu is probably the polite thing to do, but you may not want to. You will need to edit a file to do this.

Open the file with sudo because it is a file you can only modify and save as a quasi-super-user:


sudo gedit /etc/default/apport

Then find the line that says

enabled=1


and change it to


enabled=0

Save the file, close the text editor, and now at the terminal enter:


sudo service apport stop

Put your name back on the top menu bar panel

You might like to have the name of the current user showing on the panel, especially if more than one entity uses your machine.

Name on:

gsettings set com.canonical.indicator.session show-real-name-on-panel true

Name off:

gsettings set com.canonical.indicator.session show-real-name-on-panel false

Put the damn menus where they are supposed to be

Ubuntu Macified their Unity experience a while back by moving the menus that go with applications to the menu bar on the top of the screen. This breaks the Linux Philosophy by requiring a menu bar in a particular place. Then, they made it even more useless by making the menus disappear until you run at them with the mouse. With 14.04 and now 14.10 you can undo this travesty.

System Settings ~ Appearance ~ Behavior ~ Show the menus for a window ~ In the window’s title bar

Resize menu bars and panels

Linux users apparently would not stand for having panels and menu bars unscalable. Another feature taken away by Ubuntu Unity, but now with 14.10, you can make this adjustment.

System Settings ~ Displays ~ Scale for menu and title bars ~ Use the slider thingies

Install TweakTools or Unity Tweak Tools

This will allow you to tweak things. TweakTools is a Gnome tool; Unity Tweak Tools is specifically for the Unity desktop (that you just installed). They are not the same, and you may want both. They merely give you access to things that are already there that you can tweak.


sudo apt-get install unity-tweak-tool gnome-tweak-tool

Turn off the most annoying scrollbars ever invented

Some say you let designers design your operating system, and users will later catch up. I say to them, Baaaaa.

The odd, weird-looking, essentially useless scrollbars that plague Ubuntu Unity can be gotten rid of by typing this command:


gsettings set com.canonical.desktop.interface scrollbar-mode normal

If you realize you like these newfangled scrollbars later, you can put them back like this:


gsettings reset com.canonical.desktop.interface scrollbar-mode

Turn Nautilus Recursive vs Typeahead Search Off and On

After you play around with the newest version of the file manager Nautilus, you may find that you prefer one or the other behavior in the search bar. I’ve not decided. Switch recursive search on:


Terminal Command:
gsettings set org.gnome.nautilus.preferences enable-interactive-search false

Switch to typeahead search:


gsettings set org.gnome.nautilus.preferences enable-interactive-search true

Set up your online accounts (Facebook, Twitter, etc.)

Settings ~ Online Accounts ~ Then do the obvious stuff there

Laptop users: Power Management

There are things one did with 14.04 to enable power management and related features, or to improve them. I am not certain what the best course of action is for 14.10, so I’m not going to suggest anything here. I’ll update this section at a later time. (Feel free to make suggestions below.)

Meanwhile, you may have a look at this, which covers 14.04 and other distributions.

Install a bunch of stuff

Ubuntu is a bit light on file archiving software. You may want to install more:


sudo apt-get install p7zip-rar p7zip-full unace unrar zip unzip sharutils rar uudeview mpack arj cabextract file-roller

Adobe Flash Plugin


sudo apt-get install flashplugin-installer

Install the latest version of Google Chrome. I don’t think it will be found in the software center, so check here, or go here and press the right buttons. Then you can Watch Netflix on Linux!

Install Dropbox

and/or

Install Copy, which is similar to Dropbox. Slower, but you get more storage for free. I’ve been using it for a while and I like it. (I actually use both.)

Cleanup

Some people like to clean up after themselves. I tend not to, but I know I should. These commands will get rid of some of the chaff you may have created while messing around with your system.


echo "Cleaning Up" &&
sudo apt-get -f install &&
sudo apt-get autoremove &&
sudo apt-get -y autoclean &&
sudo apt-get -y clean

So, you totally screwed up your installation, what do you do now?

Not everything you broke above can be undone easily, but you can reset some of it. Use the following commands. Then see what happens. Good luck. Did I mention that you are totally on your own here and I take no responsibility for anything that goes wrong?


sudo apt-get install dconf-tools
dconf reset -f /org/compiz/
setsid unity
unity --reset-icons


Other posts of interest:

Also of interest: In Search of Sungudogo: A novel of adventure and mystery, set in the Congo.

The Ubuntu 14.10 Upgrade: What to do

The Ubuntu 14.10 Release October 23, 2014

Ubuntu 14.10 will be released shortly and I know you are chomping at the bit and want to know all about it.

There is some important news, for some, and there is some exciting news for others, and there is some boring news, and frankly, some bad news.

Before diving into the shallow pool of Ubuntu 14.10 (shallow in a good way) I want to go over some other ground first. I want to address this question:

“I have installed Linux and I don’t like the default desktop. How do I change that without ruining stuff?”

If you are a long time Linux user you know the answer has two parts. First, “Oh, hey, don’t worry, this is why Linux is so great!” and second, something like “sudo apt-get install yadayada, then log out and then log back in again with your new desktop,” where “yadayada” is the new desktop. Easy peasy.
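
For instance (my example, not a recommendation for any particular desktop), on a traditional distro it might look like this:

sudo apt-get install xfce4

Then log out, pick the new session from the little menu on the login screen, and log back in.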

Now, let us rephrase the question, and in so doing reveal the bad news.

“I have installed Ubuntu 14.04 and I don’t like the default desktop. How do I change that to gnome?”

The answer to the question is actually pretty simple, but it has a very different form, one that I find deeply disturbing. Again, there are two parts. First, “Well, Ubuntu comes by default with Unity, and Ubuntu with Unity (and some other stuff under the hood) does not actually allow you to just swap desktops around like you could in the old days without messing around a lot; depending on exactly how good the information you get on this is, and which desktop you replace Unity and all that with, you will probably break something.” Putting this another way, Ubuntu has broken one of the most important features of Linux, one of the features that makes Linux cool, and in so doing, Ubuntu has made Linux more like Windows. Ubuntu/Unity/etc. as a “distribution” is now vertically integrated across the usual layers to the extent that it is either take it or leave it (I oversimplify, but not by much).

And of course, you can leave it. That is the second part of the answer. “You will need to essentially replace your current distro with another distro.”

How to replace Unity with Gnome on Ubuntu

There is a tool to do this, available from Ubuntu. This is actually a pretty amazing tool. It allows you to take a current distribution of Ubuntu and convert it to a different flavor. Ubuntu comes in many flavors. The default is with Unity and it is a desktop environment designed for the average user. Then there are alternatives that have either different desktops or that serve very different purposes, and mixing and matching is allowed to some extent. For example, Ubuntu can be a basic server, or a web server (called a LAMP server), or a mail server (or all three) perhaps without any desktop at all. Or, you can pick any of several distinct desktops like Kubuntu (uses KDE, which a lot of people like) or XFCE, which is what Linus Torvalds and I use, or Gnome 3, and so on.

The tool is called tasksel.

You install and run tasksel (sudo apt update; sudo apt upgrade; sudo apt install tasksel; sudo tasksel) and you get a thingie that lets you pick a “Package Configuration,” which looks like this:

[Screenshot: the tasksel “Package Configuration” menu]

You then very carefully follow the instructions or you will ruin everything! But if you do it right, it should very cleanly remove Ubuntu’s default desktop and install Gnome 3 or whatever. HERE are the instructions and HERE is an excellent episode of the Linux Action Show that goes into detail.

Important additional information: First, this information is current in early October 2014. If you are reading this much later than that, re-research because things may change. Second, it is not perfectly true that Ubuntu does not let you install new desktops and use them. It is true, however, that this is not seamless, harmless, or even recommended. A clue to the seriousness of this is that if you use tasksel to remove Unity and install Gnome 3, you can’t then install Unity because Unity will not cohabit with the version of Gnome you’ve installed. There is too much stuff in the middle that does not work right.

I have installed multiple desktops on top of Ubuntu 14.04, including Mate, Gnome 3, and Gnome Panel. It was the first time for me that playing with desktops broke my system, and I’ve been using Linux (and Ubuntu) for a long time, and I mess around with desktops a lot. This is the new normal (for Ubuntu). You will see instructions on what you need to do to switch around desktops on Ubuntu, but frankly, that boat may have sailed, short of extreme measures such as tasksel.

I will give you a recommendation below if you are confused or uncertain about what form of Linux you might want to install, based on my own experiences.

Now, back to what you need to know about Ubuntu 14.10.

The first thing you need to know is that Ubuntu 14.10 is almost exactly like 14.04. There are virtually no visible meaningful differences as far as I can tell. So if you are using Ubuntu and are sticking with Ubuntu, don’t expect pretty fireworks. This will not be an exciting upgrade.

Second, 14.10 has an updated version of the kernel, the deep guts of the operating system, and this is important. It is good to have a current kernel. Also, this kernel has some important new hardware support. Some Dell laptops have the ability to park your hard drive if the machine senses it is falling, so the drive is not running when your laptop hits the ground. The new kernel actually supports this feature, so if you have a newer Dell laptop, you might want that. There is some improvement in the handling of Dell touchpads as well. The point is, you should absolutely upgrade to 14.10 for a number of unexciting but still potentially important reasons.

Want a better desktop, mate?

No, we are not in Australia. The third item is the big exciting news. If you think Unity sucks, and you liked the old fashioned Gnome desktop (back in the days of Gnome 2.0) you will find this cool. Gnome 2.0 was the best Linux desktop for most purposes, in my opinion. With the new approaches taken by both Unity and Gnome 3, and since forever with KDE, I get the sense that the purpose of the computer is to have a cool desktop. For me, the purpose of my computer is to run certain software and manage files. The purpose of the desktop is to facilitate that, ideally in a way that allows me some customization, but that stays consistent over time so an upgrade does not break my workflow or force me to relearn how to use the hardware, and often, that means just staying out of the way. For me, Gnome 2.0 was the sweet spot in meeting those requirements.

But Gnome has moved on. The current thing that looks and acts like Gnome 2 is called Gnome Panel. It kinda works but it has problems, especially (in my experience) on a laptop. It is not being kept up like it would need to be to remain a current, usable desktop. So, sadly, Gnome is no longer recommended for those who liked traditional Gnome. This is not to say that Gnome 3 (or for that matter Unity) aren’t great. But they aren’t. Just sayin’.

But then there is mate.

Mate is a fork of Gnome that intends to maintain Gnome 2 coolness. It has been around for a while now. It has been updated regularly, and the tradition seems to be to come up with the newest version of the mate desktop in sync with Ubuntu’s release schedule. I’ve tried mate a few times, and I’ve had mixed experiences with it, but in the end it is probably the desktop you want to install if you want Gnome 2-osity on any form of Linux.

This is a bit confusing unless you are already used to concepts like the difference between the terms “desktop,” “desktop,” “desktop,” and “desktop.” Mate is a desktop. Most desktops come along with software that is not strictly desktop but works with the desktop. There are two ways to get many (but not all) desktops. One is to install a “distribution” that uses that desktop, like installing Kubuntu to get the KDE desktop. The other is to have some normal form of Linux on your computer, install the desktop onto that, and later choose to log into the newly installed desktop, or some other desktop that happens to be on your system.

Mate was available as an Unofficial Ubuntu Desktop. This means that the mate people would take the guts of a current Ubuntu distribution, and replace various parts with other parts so when you download and install the unofficial Ubuntu mate desktop you get Ubuntu with mate as your desktop.

Now, after a period of regular development, mate is an official flavor of Ubuntu. This means that you can do exactly what you could do before: install Ubuntu with mate instead of Unity or KDE or whatever. But it probably has other implications. I assume that being an official flavor enhances the degree to which an Ubuntu Mate distribution will install cleanly and function well.

It does not exist yet. I understand Ubuntu Mate as such will be released on October 23rd, the same day as Ubuntu. And it comes at a time when Ubuntu continues the process of seriously downplaying the non-Unity desktops. If you go to the Ubuntu site, see what is there, and download and install it, you can be forgiven for not ever knowing that you could have installed Edubuntu, Kubuntu, Lubuntu, Mythbuntu, Ubuntu GNOME, Ubuntu Kylin, Ubuntu Studio, or Xubuntu. You have to dig through a couple of layers of the site and then you get to a scary page that most people will think is just for techies. In the old days, Ubuntu highlighted the diverse alternatives. Now, they bury them. That concerns me.

What you should do instead of automatically installing Ubuntu

There are a lot of Linux distributions out there, and you are of course free to mess around with them. But I’m happy to give you my current advice (subject to change frequently!) about what you might consider doing.

A given Linux distribution, which includes its own distribution materials, may or may not work fully and easily on a given piece of hardware. Considering that when you are looking at or working in a browser or your favorite text editor, the system you are using isn’t that important most of the time, the ease and seamlessness of the installation is really one of the most important features of a distribution. It is my belief based on recent experience messing around with installing several different distributions on five different computers (four laptops, one desktop) that Ubuntu, in one form or another, will generally install the easiest. This includes getting the install medium, doing the installation, and getting help when something goes wrong.

Having said that, installing debian, a traditional well developed form of Linux, on which Ubuntu is based (as are many other distros and most installations worldwide, I think) is pretty easy. Having said that, I quickly add that you probably really want to install one of the “extras” versions of debian, which includes “non free” material and is stored in a scary place and not so well documented.

So, my first piece of advice is this. Get two sets of installation media (this is not hard). One for Ubuntu, the other for debian. Try to install debian. If you run into trouble, switch to Ubuntu. You’ll get the job done. The installation process is not too time consuming or difficult, so this is not a big deal.

My second piece of advice is to figure out what desktop you like. If you actually like Unity, then by all means go over to the dark side and install default Ubuntu. Have a nice time communing with the devil. See you on Halloween!

But if you prefer a different desktop, like Gnome 3 or whatever, then follow my first piece of advice, trying debian, then Ubuntu. If debian installs well, then go to town installing your preferred desktop if it wasn’t the default during your install. If debian does not work, then pick the flavor of Ubuntu that has your preferred desktop.

My third piece of advice I’m giving with an important caveat. The caveat is that I’ve not tried this yet so I have no business telling you to do it. But I am going to try this and I think it might be cool. If a Gnome 2 style desktop is your preference, then either install debian and then install mate on top of that, or install Ubuntu Mate 14.10 when it comes out. Just for fun. It might work great.

My fourth piece of advice is this. If you like the Gnome 2.0 desktop and you want to use a well tested and tried interface, consider using XFCE instead. XFCE is quite like Gnome 2 in many ways, but even less in your face. You could install Xubuntu, the Ubuntu flavor with XFCE as the default (or, if you have Ubuntu Unity, maybe you can use tasksel to switch, depending on things I don’t want to advise you on). Or, and this is probably the ultimate solution, you can install debian with XFCE. Which, tellingly, is the default desktop for the canonical Linux distribution that is not Canonical. (See what I did there?@?)

And remember, there are only two things you need to keep your eye on. First, you need a computer that will run your software, and pretty much all of these solutions should do that equally well; the only difficulty here is the match between the distro and the hardware, and for a desktop computer, any Linux flavor with any desktop will probably work so you won’t be pounding your desktop in frustration. For laptops you may want to be more conservative and go with the herd (Ubuntu). Second, whatever you do, have fun. And there is nothing in the world more fun than repeatedly reinstalling your operating system, right????

Can't boot from a DVD/CD drive

In order to install a new operating system on a computer, you can make a bootable DVD that includes the software to install the new system, put it in the DVD/CD reader, and reboot your computer. If all goes well your computer will boot off the DVD/CD reader and then you follow the install process and there you go.

But sometimes this doesn’t work. The most common reason is that your computer is not configured to boot from the DVD/CD reader first (if it has a bootable disk in it). You have to go into the BIOS and change the “boot order” so “boot from DVD” is above “hard drive” in the list. A less common problem is that your motherboard is old and not configured properly at a deeper level, which may require installing a new BIOS. Another, probably common, reason is that the DVD you made is messed up somehow: the “.iso” image you downloaded is corrupted, or something went wrong during the burning process. One way to check that is to use a checksum hash to verify the image. Never heard of that? Just look it up. It involves obtaining and comparing two numbers which are constructed from the image. One number is provided by the maker of the image, using specific software that does this; the other number is computed from your final (or downloaded) version of the image using similar software. If the numbers are different, the data are corrupted. This might be more likely if your computer was doing wonky things while downloading or burning, or if the drive you did the burning with is messed up. (Yet another possibility is that the disk is dirty or damaged, but if you just made it, that seems unlikely.) A typical rundown of the problems, in the context of installing Ubuntu Linux, is here.
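
A minimal sketch of that check, assuming the image maker publishes SHA-256 sums (Ubuntu and Debian both do) and using a made-up ISO name:

sha256sum ubuntu-14.10-desktop-amd64.iso

Compare the number it prints against the one published next to the download (often in a file called SHA256SUMS). If they differ, the image is bad; re-download or re-burn it.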

But I think there is yet another explanation that occasionally applies. It is possible that your DVD/CD reader will only participate in the boot under certain circumstances. This would be the case either with older DVD/CD readers, or possibly with a broken or dirty DVD/CD reader. I’m pretty sure this is the case because I have an old computer with a DVD/CD drive into which I can put known functioning bootable CDs and get results, but that will not boot off a known functioning bootable DVD.

I could have cleaned the DVD drive, or I could have replaced it (they are cheap). I did not do the former for no particularly good reason, and I did not do the latter because a long time ago I learned it was better, when buying a new DVD/CD drive, to get a nice external drive so it can be moved between computers.

What I did do in this case was to burn a regular CD, rather than a DVD, with a system on it. This is a problem if you want to install Ubuntu because, apparently, there are no longer CD-sized images available for current versions of the operating system. But Debian still has an image that fits. Since I was giving serious thought to installing Debian rather than Ubuntu (which is based on Debian but with a lot of changes that I don’t like), this was a good move. Someday I’ll clean the DVD drive.

I am not certain that I’ve isolated an actual problem, but when I search around for explanations for what I observe, I tend to find the same thing over and over; the usual explanations are repeated and the user with the problem is left wondering. So, I’m putting this on the Internet for people to run into while searching for answers.

Scrivener on Linux: Oh Well…

UPDATE (January 2, 2016): The makers of Scrivener have decided to abandon their Linux project. Kudos to them for giving it a try. The Scrivener-on-Linux users were not many, almost nobody donated to the project, and, as far as I can tell, the project was not OpenSource and thus could not have attracted much interest among a community of mostly OpenSourceHeads.

So, I’m no longer recommending that you mess around with Scrivener on Linux, as it is no longer maintained. Back to emacs, everybody!

Scrivener is a program used by authors to write and manage complex documents, with numerous parts, chapters, and scenes. It allows the text to be easily reorganized, and it has numerous ways in which the smallest portion of the text, the “scene,” and larger collections of text can be associated with notes and various kinds of meta-data. It is mainly a Mac program but a somewhat stripped down beta version is available for Linux.

In some ways, Scrivener is the very embodiment of anti-Linux, philosophically. In Linux, one strings together well developed and intensely tested tools on data streams to produce a result. So, to author a complex project, create files and edit them in a simple text editor, using some markdown. Keep the files organized in the file system, with file names carefully chosen to keep them in order in their respective directories. When it comes time to make project-wide modifications, use grep and sed to process all of the files at once, or just selected files. Eventually, run the files through LaTeX to produce beautiful output. Then, put the final product in a directory where people can find it on Gopher.

Gopher? Anyway …
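
To make that traditional workflow concrete, here is a minimal sketch (the file names and character name are invented):

grep -l "Old Character Name" chapters/*.tex
sed -i 's/Old Character Name/New Character Name/g' chapters/*.tex
pdflatex book.tex

One well-worn tool per job, plain text files all the way down, and the “project” is just a directory.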

On the other hand, emacs is the ultimate linux program. Emacs is a text editor that is so powerful and has so many community-contributed “modes” (like add-ins) that it can be used as a word processor, an email client, a calendar, a PIM, a web browser, an operating system, to make coffee, or to stop that table with the short leg from rocking back and forth. So, in this sense, a piece of software that does everything is also linux, philosophically.

And so, Scrivener, despite what I said above, is in a way the very embodiment of Linux, philosophically.

I’ve been using Scrivener on a Mac for some time now, and a while back I tried it on Linux. Scrivener for the Mac is a commercial product you must pay money for, though it is not expensive, but the Linux version, being highly experimental and probably unsafe, is free. But then again, this is Linux. We eat unsafe experimental free software for breakfast. So much that we usually skip lunch. Because we’re still fixing breakfast. As it were.

When you create a Scrivener project, you can choose among a number of templates. The Scrivener community has created a modest number of alternatives, and you can create your own. The templates produce binders with specific helpful layouts.

Anyway, here’s what Scrivener does. It does everything. The full blown Mac version has more features than the Linux version, but both are feature rich. To me, the most important things are:

A document is organised in “scenes” which can be willy nilly moved around in relation to each other in a linear or hierarchical system. The documents are recursive, so a document can hold other documents, and the default is to have only the text in the lower level document as part of the final product (though this is entirely optional). A document can be defined as a “folder” which is really just a document that has a file folder icon representing it to make you feel like it is a folder.

The main scrivener work area with text editor (center), binder and inspector.
Associated with the project, and with each separate document, is a note taking area. So, you can jot notes project-wide as you work, like “Don’t forget to write the chapter where everyone dies at the end,” or you can write notes on a given document like “Is this where I should use the joke about the slushy in the bathroom at Target?”

Each scene also has a number of attributes such as a “label” and a “status” and keywords. I think keywords may not be implemented in the Linux version yet.

Typically a project has one major folder that has all the actual writing distributed among scenes in it, and one or more additional folders in which you put stuff that is not in the product you are working on, but could be, or was but you pulled it out, or that includes research material.

You can work on one scene at a time. Scenes have meta-data and document notes.
The scenes, folders, and everything else are all held together with a binder, typically displayed on the left side of the Scrivener application window, showing the hierarchy. A number of templates come with the program to create pre-organized binder paradigms, or you can just create one from scratch. You can change the icons on the folders/scenes to remind you of what they are. When a scene is active in the central editing window, you can display an “inspector” on the right side, showing the card (I’ll get to that later) on top, then the meta-data, and the document or project notes. In the Mac version you can create additional meta-data categories.

Scrivenings Mode
An individual scene can be displayed in the editing window. Or, scenes can be shown as a collection of scenes in what is known as “Scrivenings mode.” Scrivenings mode is more or less standard word processing mode where all the text is simply there to scroll through, though scene titles may or may not be shown (optional).

A lot of people love the corkboard option. I remember when PZ Myers discovered Scrivener he raved about it. The corkboard is a corkboard (as you may have guessed) with 3 x 5 inch virtual index cards, one per scene, that you can move around and organize as though that was going to help you get your thoughts together. Each card has the scene title and some notes on what the scene is, which is yet another form of meta-data. I like the corkboard mode, but really, I don’t think it is the most useful feature. Come for the corkboard, stay for the binder and the document and project notes!

Corkboard Mode
When you are ready to do something outside of scrivener with your project, you compile it. You can compile it into an ebook, a file compatible with most word processors, a PDF file, a number of different predefined manuscript or script formats, etc. Scrivener does all sorts of magic for writing scripts, though I know nothing about that. There is also an outline mode which, in the Mac version, is very complex and powerful. In the Linux Version it is not. So I won’t mention it.

The compile process is cumbersome, esoteric, complicated, and requires training, so it is PERFECT for the average Linux user! But seriously, yes, you can compile your document into a pre-defined format in one or two clicks, but why would you ever do something so simple? Instead, change every possible option affecting formatting and layout to get it just the way you want it, then save that particular layout for later use as “My layout in February” or “This one worked mostly.”

The Powerful Compile Dialog Box.
One might say that one writes in Scrivener but then eventually uses a word processor for putting the final touches on a document. But it is also possible that you can compile directly to a final format with adequate or even excellent results and, while you may end up with a .docx file or a .pdf file, you are keeping all the work flow in Scrivener.

This fantastic and amazing book was compiled in Scrivener directly into ebook format.

You have to go HERE to find the unsupported and dangerous Linux version of Scrivener. Then, after you’ve installed it, install libaspell-dev so the in-line spell checking works.
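
On Ubuntu and its relatives that is a one-liner:

sudo apt-get install libaspell-dev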

A Scrivener project file is a folder with a lot of files inside it. On the Mac, this is a special kind of folder that is treated as a file, so that is what you see there, but in Linux you see a folder, inside of which is a file with the .scriv extension; that’s the file you run to open the project directly from a directory.

Do not mess with the contents of this folder. But if you want to mess with it you can find that inside a folder inside the folder are files that are the scenes you were working on. If you mess with these when Scrivener is using the project folder you may ruin the project, but if Scrivener is not looking you can probably mess around with the contents of the scene files. In fact, the Mac version gives you the option of “syncing” projects in such a way that you work on these scenes with an external editor of some kind while you are away from your Scrivener base station, i.e., on your hand held device.

Since this data storage system is complicated and delicate, it is potentially vulnerable to alteration while being used by the software, with potentially bad results. This puts your data at risk with cloud syncing services. Dropbox apparently plays nice with Scrivener. I’ve been trying to figure out if Copy does, and I’ve been in touch with both Scrivener developers and Copy developers, but I’m not sure yet. I use Copy for the masses of data on my computer because it is cheaper, and I use a free version of Dropbox for Scrivener files, just in case.

I would love to see more people who use Linux try out Scrivener, and maybe some day there will be a full Linux version of it. As I understand it, the Linux version is a compiled subset of the Windows version code base (yes, there is a Windows version) and the Windows version is a derivative of the Mac version.

I should also add that there are numerous books and web sites on how to use Scrivener, and Literature and Latte, the company that produces it, has developed an excellent and useful manual and a number of useful tutorials. Literature and Latte also has an excellent user community forum which is remarkably helpful and respectful. So be nice if you go over there.

NSA Claims That Linux Journal Is A Forum for Radical Extremists? THIS MAY BE FAKE (Updated)

When I first became a regular user of Linux, several years ago, I tried out different text editors and quickly discovered that emacs was my best choice. By coincidence, about that time I ran into an old emacs manual written by Richard Stallman in the dollar section of a used bookstore. In that edition, near the end of the book, was a section on “Mail Amusements.” This documented the command “M-x spook” which adds “a line of randomly chosen keywords to an outgoing mail message. The keywords are chosen from a list of words that suggest you are discussing something subversive.” (I note that the term “spook” in those days meant “spy.”) Stallman notes in the current edition of the manual,

The idea behind this feature is the suspicion that the NSA and other intelligence agencies snoop on all electronic mail messages that contain keywords suggesting they might find them interesting. (The agencies say that they don’t, but that’s what they would say.) The idea is that if lots of people add suspicious words to their messages, the agencies will get so busy with spurious input that they will have to give up reading it all. Whether or not this is true, it at least amuses some people.

It is amazing to see how things change over time. But this, unfortunately, is not a good example of change over time. As I’m sure every Linux user knows by now, the National Security Agency has included “Linux Journal” (the journal and the site, apparently) as an indicator for potential extremist activity. If you subscribe to the journal, visit the site, mention it in an email, or anything like that, your internet traffic will be subject to additional special attention.

Apparently the NSA captures all, or very nearly all, of the Internet traffic for just long enough to sort through it for key indicators, which they use to pull out a subset of traffic for longer term storage and possible investigation. If you visit Linux Journal’s web site, your internet traffic, apparently, is subject to this treatment.

Why?

Well, this should be obvious. Linux users are extreme. Linux is extreme. If I was the NSA I’d be keeping a close eye on the Linux community because that is where a major national intelligence agency is most likely to find useful, and extremely good, security related ideas. GNU/Linux, FOSS, OpenSource – these are all keywords I’d be watching because this is where the cutting edge is. LAMP systems are the most secure servers used on the Internet, by and large. Linux-like operating systems are the preferred systems for devices that need both reliability and security. I’m sure the NSA itself uses Linux as its primary operating system because it is the most adaptable and secure one they can get. If not, they probably use a cousin or hybrid of some sort.

Also, penguins. Penguins are known to be extreme. They wear tuxedos, who does that anymore? They live on the Antarctic Continent. I can’t think of anything more extreme than this. The adoption of Tux the Penguin as the symbolic mascot of GNU/Linux is a huge red flag for the entire intelligence community.

I do find it amusing that people are a bit up in arms over this. Did anyone ever seriously consider the idea that the Linux community and their Penguin friends would not be the subject of special NSA attention? It would be rather disappointing were it not. Stallman added M-x spook to emacs decades ago. We’ve known for years that the NSA snoops on everything and everyone. Linux is a widely used, extremely important operating system. Linux Journal is a key publication used by a wide range of Linux extremists, er, users and developers. Of course the NSA is watching.

Kyle Rankin at Linux Journal, who is a known Linux user, notes that there is a more specific reason the NSA would view the Linux community as a hotbed of potential extremism. This is where projects like Tor and Tails live and where they are mostly used. These are, of course, technologies for being more anonymous on the internet. Tor comes from a project originally funded by the US Naval Research Laboratory and DARPA, with early work on it supported by the radical Electronic Frontier Foundation. It has also been funded by the US State Department and the National Science Foundation. The original idea was to allow communications over the internet to be untraceable, so sailors (or others) could write home and keep their lips tight (loose lips sink ships and all that). With subversive beginnings and evil intent such as this, naturally the NSA would want to keep an eye on it.

I’m sorry to tell you that if you’ve been reading this blog post, you are probably on the NSA’s list of extremists. I use the terms “Linux Journal,” “Linux,” and “Penguin” several times in this blog post. And you are looking at this blog post in your browser. You are so screwed.

I would like to challenge the OpenSource/FOSS/GNU/Linux community to take up Stallman’s initiative and bring it to the next level. Let us M-x spook the spooks. Apps, browser add-ins, cron scripts, and other small-scale technologies could be used to add subversive terms such as “Linux Journal” and “Penguin” to all of our Internet traffic, all the time. The NSA would quickly run out of disk space, and someone would tell them to get back to work and do something useful. Real extremists just set up a radical extremist Caliphate in the Middle East, for chrissakes. I would think the NSA would be more focused on such things than on Linux Journal, or Linux. I can see keeping an eye on the Penguins, though.
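To make the challenge concrete, here is a toy sketch of the cron-script end of it. Everything in it is hypothetical: the keyword list, the file locations, and the whole idea of planting the keywords in a mail signature are mine, not part of any existing project.

    #!/bin/bash
    # Hypothetical "spook the spooks" cron job: rewrite the last line of your
    # mail signature with a few randomly chosen "subversive" keywords, so that
    # every outgoing message carries a fresh set of flags.
    KEYWORDS=("Linux Journal" "Penguin" "Tux" "GNU" "Tor" "Tails" "FOSS")
    SIG="$HOME/.signature"

    # Pick three keywords at random and join them on a single line.
    LINE=$(printf '%s\n' "${KEYWORDS[@]}" | shuf -n 3 | paste -sd ' ' -)

    # Keep the existing signature minus any previous spook line, then append.
    grep -v '^SPOOK: ' "$SIG" 2>/dev/null > "$SIG.tmp" || true
    echo "SPOOK: $LINE" >> "$SIG.tmp"
    mv "$SIG.tmp" "$SIG"

Run something like that hourly from cron and every message you send carries a new line of chum for the trawlers.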

UPDATE: Charles Johnson sent me THIS and THIS. This whole thing could be fake. Go have a look and tell me what you think.

Ubuntu One Is Closing Shop

Hey, wait! Ubuntu One was the next big thing. It was better than Dropbox and iTunes and everything! I never got it to work for me, though I did sign up for it. Just now, I got an email from the Ubuntu One team telling me the file services would be gone effective June 1, 2014.

Ubuntu has this blog post about it. This news is a few days old, so you probably already knew about it, but just in case, have a look: Shutting down Ubuntu One file services.

Today we are announcing plans to shut down the Ubuntu One file services. This is a tough decision, particularly when our users rely so heavily on the functionality that Ubuntu One provides. However, like any company, we want to focus our efforts on our most important strategic initiatives and ensure we are not spread too thin.

Our strategic priority for Ubuntu is making the best converged operating system for phones, tablets, desktops and more. In fact, our user experience, developer tools for apps and scopes, and commercial relationships have been constructed specifically to highlight third party content and services (as opposed to our own); this is one of our many differentiators from our competitors. Additionally, the free storage wars aren’t a sustainable place for us to be, particularly with other services now regularly offering 25GB-50GB free storage. If we offer a service, we want it to compete on a global scale, and for Ubuntu One to continue to do that would require more investment than we are willing to make. We choose instead to invest in making the absolute best, open platform and to highlight the best of our partners’ services and content.

Etc. Etc.

Interesting. Seems to me a convergent system like they want to build would have a kick-butt cloud. On the other hand, having an open source operating system not married to a particular cloud may be a good way to go. Less Microsofty.

Linux Shell Scripting

I just finished Linux Shell Scripting Cookbook – Third Edition by Shantanu Tushar and Sarath Lakshman. This is a beginner’s guide to shell scripting (bash) on Linux.

Usually, a “cookbook” is set up more like a series of projects organized around a set of themes, and is less introductory than this book. “Linux Shell Scripting Cookbook” might be better titled “Introduction to Linux Shell Scripting,” because it reads more like a tutorial and how-to book than like a cookbook. Nonetheless, it is an excellent tutorial that includes over 100 “recipes” addressing a diversity of applications; they are simply organized in tutorial order. What this means is that a beginner can use only the resources in this book and get results. The recipes are arranged in an order that brings the reader through the basics (how to use the terminal, how to mess with environment variables, and so on), then on to more complex topics such as regular expressions, manipulating text, accessing web pages, and archiving. One very nice set of scripts that is not often found in intro books addresses networking. The book also covers MySQL database use.
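To give a sense of the territory (this is my own example, not one of the book’s recipes), a typical recipe at this level strings a few of those topics together, something like fetching a web page, pulling text out of it, and archiving the result:

    #!/bin/bash
    # Not from the book: a made-up example in the spirit of its recipes.
    # Fetch a page, extract the links, and archive the list with a date stamp.
    URL="https://www.example.com"    # placeholder URL
    curl -s "$URL" | grep -oE 'href="[^"]+"' | sort -u > links.txt
    tar -czf "links-$(date +%F).tar.gz" links.txt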

All of the scripts are available from the publisher in a well-organized zip archive.

I read the e-version of the book, in iBooks, but the PDF version is very nice as well. I don’t know how this would translate as a Kindle book. But, importantly (and this may be more common now than not), the ebook uses all text, unlike some earlier ebooks that used photographs of key text snippets as graphics, which essentially renders them useless. Of course, copy and paste from an ebook is difficult, and that is where the zip file of scripts comes in. You can open the PDF file, get the zip archive, and, as you read through examples, simply open up (or copy and paste) the scripts from the zip archive and modify or run them. Also, the ebook is cheaper than a paper edition and clearly takes up way less space!
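The mechanics of that workflow are about what you would expect; the file and directory names below are made up, since they depend on what the publisher actually ships:

    # Hypothetical names -- substitute whatever the publisher's download is called.
    unzip cookbook-code.zip -d cookbook-scripts
    cd cookbook-scripts
    chmod +x chapter01/some_recipe.sh    # chapter layout is a guess
    ./chapter01/some_recipe.sh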

If I were going to recommend a getting-started guide to shell scripting, this is the book I’d recommend right now. It is well organized and well executed.

I do have a small rant that applies to virtually ALL tech-related books I’ve seen. There is an old tradition in *nix-style documentation of putting certain information in the front matter. Books always have front matter, of course, but computer documents tend to have more of it than usual. A typical example is this reference resource for Debian.

Notice all that stuff in the beginning. Like anybody reads any of that, especially the “conventions” section. Proper typography in a code-rich book does not have to be explained in detail. You can see what is code, what are comments, etc. etc. Most of this information should be added as an appendix at the end of the book where it is out of the way and can be ignored.

On a web page like the one shown here, all you have to do is scan down, but in a book you have to leaf (virtually or in meatspace) past all that stuff to get to the actual book contents. The Linux Shell Scripting book discussed here has its first real text on roughly the 25th physical page (though it is numbered page 8). I recommend moving as much of this front matter as possible to the back.

But that is a general rant about all books of this sort, which I happened to think of while reviewing this book.
