RSync gets a pretty decent front end GUI

I’ve used RSync for years as my primary backup tool. It’s excellent for mirroring directories and ensuring that backups are made headache-free. I often attach the rsync commands to cron jobs and have them execute on a regular basis. I sometimes have it email me the results, though I usually just pipe the output to a file.
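As a sketch, here’s the kind of crontab entry I mean — the paths, schedule, and log file name are all hypothetical examples; adjust them to taste:

```bash
# Hypothetical crontab entry (edit with `crontab -e`): mirror a documents
# directory to a backup drive every night at 2:15am, appending the output
# to a log file instead of emailing it.
15 2 * * * rsync -avh --delete /home/user/documents/ /mnt/backup/documents/ >> /home/user/rsync-backup.log 2>&1
```

The -a (archive) flag covers the usual mirroring behavior (recursion, permissions, timestamps), and --delete keeps the mirror exact by removing files from the destination that no longer exist in the source.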

Like many great tools in Linux, RSync has a myriad of command line options, some of which can be quite confusing.

I’ve recently discovered a very pretty front-end GUI for those uninitiated in RSync who wish to begin trying it: GRSync (I suppose the G is for ‘graphical’).

Here’s an article on the topic and here’s the product page.

A good reddit discussion on why the Unity interface for Ubuntu is so bad

Sorry for being away for about 6 months. I do enjoy blogging, though I mainly do it to retain articles I find interesting (or that some people close to me might find interesting). I suppose I use it as my own version of Pinterest, except that I have complete control over the presentation.

Anyway, I plan to return to blogging more regularly (so long as work/life permits).

Having said that, here’s a great discussion on why so many people dislike the Ubuntu Unity desktop interface.

I am still using the Long Term Support version of Linux Mint 13 (Maya) with the MATE interface (which is as close to Gnome 2.x as we can get while still getting maintained packages). It is most definitely my distro of choice and likely will remain so for some time.

In addition, there’s some fiery talk about how Ubuntu’s default Unity interface sends your local searches (made on your own system, looking for files or applications) to a 3rd party. The feature is also opt-out rather than opt-in, which means that after a standard installation any searches you make are sent to those 3rd parties by default, apparently in some desperate attempt to monetize the userbase. The article argues that if this is indeed the case, it could very well be the beginning of the end for Ubuntu.

In which case, I expect distros like Mint and others to base their spinoffs on plain Debian instead of Ubuntu proper.

Richard Stallman has browbeaten Canonical for this and a flame war has erupted. It's a good read (but don’t attempt it without some popcorn or your favorite finger-food of choice). The source article links out to the various posts by Stallman and Canonical.

Use BaGoMa to back up your GMail account to your PC

BaGoMa is a Python script that allows you to back up your GMail email account.

Because of some limitations Google puts on IMAP transfers, it may take a few days to successfully back up an entire GMail account if it’s a large one. I had nearly 200,000 messages in my GMail account, and it took almost a week of running the script daily (each run took a few hours) to get all the messages, attachments and labels properly downloaded (in daily chunks) and indexed. Once that was done, though, the average daily backup update takes about 10 minutes.

It’s easy to throw the script into a cron job and have it run nightly. BaGoMa will also restore the backed up emails to ANY IMAP server, be it another GMail account or any email server on any domain that supports IMAP.
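A crontab entry along these lines does the trick. To avoid guessing at BaGoMa's exact command-line options, the sketch below calls a hypothetical wrapper script into which you'd put whatever BaGoMa command you normally run by hand:

```bash
# Hypothetical crontab entry: run a small wrapper script nightly at 3:30am
# and keep a log of each run. The wrapper path is an example; it would
# contain your usual BaGoMa invocation.
30 3 * * * /home/user/bin/backup-gmail.sh >> /home/user/bagoma.log 2>&1
```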

It works really well. I recommend you check it out.

 

Linux Mint 13 – MATE Edition – is a supreme success!

Those who read my blog may recall that in February of this year, I wrote regarding the state of Linux desktop environments:

One of the burdens we all bear as Linux users is that from time to time, Linux (or one of its pillars) goes through an identity crisis. This time it’s the Gnome desktop. While working through any of these crises, users sometimes have to flounder around for a distribution to call home. For now, for me, that’s Linux Mint. On my “main box” though, I am still running Ubuntu 10.04 which has support until April 2013 and is still running true Gnome 2.x. That box runs like butter and for now I don’t plan to touch it.

Linux Mint 13 was released about a week ago, in two distinct editions: Cinnamon and MATE. I’ve covered this topic at length in the past, explaining that Cinnamon is a project of the Linux Mint developers to approximate the Gnome 2 interface using Gnome 3 code, whereas MATE is a true fork of the Gnome 2 platform. Ultimately, the goal of the MATE project is to maintain the feel of Gnome 2 using GTK2 code. I know there is a significant portion of users who want to retain that experience, myself included.

Mint 13 – MATE edition offers some very appealing features.

  1. It’s based on Ubuntu 12.04, which is an LTS (Long Term Support) edition, meaning it will continue to be supported with updates and packages for (5) years, until April 2017. And since Mint is based on Ubuntu 12.04, all those juicy forum posts answering Ubuntu questions & issues will apply just fine under Mint as well.
  2. The Gnome 2.x feel is nearly entirely intact — bravo!

These features were critically important for me, since my main box at home was “stuck” running Gnome 2 under Ubuntu 10.04, which would only be supported until April 2013. The clock was ticking and I knew that soon I’d be unable to get updated packages or bug fixes.

Mint 13 offered me a way out, a way to keep my Gnome 2 style interface in a fully integrated & supported distribution of Linux that would continue to get bug fixes, updates and support well into the future.

I tried Cinnamon and Gnome 3; neither is my cup of tea (never mind Unity or KDE). I’ve heard of people using all sorts of extensions or addons under Gnome 3 to approximate the Gnome 2 look, and I did try some of them, but all in all it was a poor approximation.

In preparation for this installation of Mint 13 MATE Edition, I purchased a new 2TB hard disk (my old disk was only 500 GB). My Ubuntu 10.04 hard disk had seen a lot of usage over the past few years, and I do a lot of work on my main box. I felt it was time not only to move to a new distro given the state of desktop environments, but to install it on a recently manufactured hard disk to ensure some longevity. Even though I had already tested Mint 13 on my testing box, I planned to leave my original Ubuntu 10.04 disk intact temporarily, just in case. It would ride shotgun to the newly installed 2TB disk, which would be the destination of my new Mint 13 install.

Prior to doing anything of course I had (2) backups of the old 10.04 hard disk, including my entire /home partition. Everything was backed up to external drives.

The installation process was smooth and quick. It even saw my 10.04 install on the 2nd hard disk and populated Grub with boot options, so I could boot into 10.04 at will! At some point in the future, when I’m thoroughly comfortable with the new install, I’ll format that old 10.04 hard disk and remove the 10.04 boot options Mint had added to Grub.

I’ve been using Mint 13 on my main box for about 3 days now and it’s been a supremely smooth transition. I’ve had no major issues, and configuration was quick and easy. I brought over some dot-folders, such as .filezilla and .xchat2, so I wouldn’t have to reconfigure those heavily-used programs from scratch. I had made notes of all the applications installed on my 10.04 box, command line as well as graphical. It took me all of 15 minutes to apt-get install and configure all of them.

I spent some time doing some customizations. Primarily, I still enjoy my main menu up at the top-left, and my open-window-tabs at the bottom, along with a desktop switcher in the lower right. I also like my time of day at the top right — like a classic Gnome 2 interface. Mint sets up the MATE interface to look a little bit like a classic Windows Start Menu and I didn’t care much for it. Within about 3 minutes I had taken the default MATE interface and changed it to my preferences. Adding new panels and making them transparent was easy and the same process as it would be under Gnome 2.x classic.

This shows the default Mint Desktop on the left and my customizations on the right. Click the image to see it in full fidelity.

My system monitor applet was available as well (the 4 squares visible above, in the upper right); it allows me to monitor CPU usage, memory usage, network usage and swap file usage, all in real time. As far as I know, this is not yet available for a panel in Gnome 3!

I have my classic cascading menus back, within a new distro that has 5 years of support — and now all is well with the world.

Many of the developers of MATE hang out on IRC, and since finding them I’ve struck up some nice conversations. I’ve been reassured that MATE will get tender loving care for some time to come, and I’m glad to hear it. Like anyone else, I could have adapted to Gnome 3 or Unity, but I truly felt these were inferior desktop environments.

That’s a great asset of community-developed software: if enough people decide to change something, they can. Linux offers change with choice, and my choice is Linux Mint MATE Edition.

I personally want to thank the developers of MATE in this blog post: Perberos, Stefano Karapetsas (stefano-k), Clement Lefebvre (clem), Steve Zesch (amanas).

If you use MATE, please consider donating to the project as I have. It’s a small project and could use the funds for hosting the site as well as supporting the developers.

Leveraging Google’s two-factor authentication to secure SSH logins

Recently, Google began to offer two-factor authentication as an option for securely logging in to Google accounts (Gmail, Google Docs, etc). In so doing, they also released an open source PAM module that lets you do one excellent thing: force 2FA (two-factor authentication) for SSH (secure shell) logins (or local logins) on Linux computers. The module is fully open source and implements the open standard RFC 4226.

What this means is that when you SSH into your Linux box, you’ll not only be asked for a password, but also for a code that you’ll need your mobile phone or tablet to generate. This greatly enhances security on your Linux box, because attackers can no longer simply brute-force your publicly accessible SSH server; they’ll also need a way to generate that unique, time-sensitive sequence of numbers that changes every 30 seconds on your Android, iPhone or Blackberry device. It essentially turns your cell phone or tablet into a token that you will almost always have on your person, but an attacker wouldn’t. Even if your SSH password were somehow compromised, your system would still be protected, since two factors are now needed to log in to your Linux computer.
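For the curious, that 30-second code isn’t magic; it can be reproduced with a few standard tools. Below is a sketch of the time-based scheme (TOTP, RFC 6238, built on the RFC 4226 HOTP algorithm) that the authenticator app computes. It assumes bash with coreutils and openssl, and the secret shown is the throwaway sample key that appears later in this post, not a real credential:

```bash
#!/bin/bash
# Sketch: compute a 6-digit TOTP code from a base32 secret, the same
# scheme the Google Authenticator app uses. Assumes coreutils (base32,
# od) and openssl are installed.
totp() {
    local secret=$1
    local now=${2:-$(date +%s)}   # allow a fixed time, handy for testing

    # Base32-decode the shared secret and hex-encode it for openssl.
    local hexkey
    hexkey=$(printf '%s' "$secret" | base32 -d | od -v -An -tx1 | tr -d ' \n')

    # Moving factor: number of 30-second steps since the Unix epoch,
    # packed as an 8-byte big-endian counter.
    local counter
    counter=$(printf '%016X' $(( now / 30 )))

    # HMAC-SHA1 of the binary counter under the key.
    local hmac
    hmac=$(printf "$(sed 's/../\\x&/g' <<<"$counter")" |
           openssl dgst -sha1 -mac HMAC -macopt hexkey:"$hexkey" |
           awk '{print $NF}')

    # Dynamic truncation: the low nibble of the last byte picks a 4-byte
    # window; mask the sign bit and keep six decimal digits.
    local offset=$(( 16#${hmac: -1} ))
    printf '%06d\n' $(( (16#${hmac:offset * 2:8} & 0x7fffffff) % 1000000 ))
}

totp "DAEP55X5AZEVCAFB"
```

Run at two different 30-second intervals, the function prints two different codes, which is exactly why a stolen password alone is no longer enough.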

To help this concept be more fully understood, Google created a video, which I’ll post below. While that video explains how two-factor authentication can be applied to access your Google account more securely, in this blog post I am going to show how to use the same Google technology generically (without having to activate Google’s 2-step verification on your Google account!) as an extension of your own security tools to better secure your Linux computer(s). This tool is independent of any Google account; you don’t even need a Google account to use it.

The underlying technology used by Google is made freely available as open source software. You can use this software to further secure your SSH logins (or local desktop logins) on your Linux boxes.

While the video explains that Google can text you your numeric code (which only lasts for 30 seconds), this PAM module cannot send SMS messages; instead it requires that you use the Google Authenticator App, available for Android, Blackberry and iPhone/iPad, to generate your 30-second code.

My explanation below is Debian (Ubuntu, Mint, etc.) specific. There is a page that explains this for Fedora users. This process is exceedingly simple and easy to implement. It is also very easy to reverse, should you want to stop using 2FA on your Linux box. I’ll explain all this in detail.

First, you’ll want to install the ‘libpam-google-authenticator’ package, which is a standard Debian package:

sudo apt-get install libpam-google-authenticator

Next, you’ll need to run the google-authenticator program from command line to generate your keys and provide a QR Code for you to scan using your phone/tablet. The purpose of the QR Code is just to save you time entering the code into your Google Authenticator App manually. You’ll need an application installed on your phone that can scan QR Codes. I use QR Droid on my Android phone. The QR Code is generated in ASCII and will scroll up on your command shell screen; you may need to widen your terminal window to allow you to see the entire QR Code.

To run the Google-Authenticator program, just type from command line:

google-authenticator

This command will generate the large ASCII QR Code as well as some other information you’ll need. Here’s a sample of what you’ll see below the large ASCII QR Code in your terminal window:

Your new secret key is: DAEP55X5AZEVCAFB

Your verification code is 866046
Your emergency scratch codes are:
67868555
26247221
12215527
55436461
21077916

Do you want me to update your "~/.google_authenticator" file (y/n)

At this point, you’ll want to hit ‘y’ to allow this information to be entered into your /home/user/.google_authenticator file on your Linux box. Simply put, the information above is your secret key and verification code which is also the contents of the QR Code should you choose to scan it instead of typing it manually into your Google Authenticator App.

The scratch codes are one-time-use codes that you can use to log in in an emergency if your phone/tablet is ever lost or stolen. At that point, you can choose to delete the .google_authenticator file and issue a new one, or temporarily stop using 2FA until you get your mobile device or tablet back. Be sure to write those scratch codes down in a safe place where they can be accessed if needed (like when you don’t have your mobile device with you!)

After you entered ‘y’ to the question above, the next question asked will be:

Do you want to disallow multiple uses of the same authentication token? This restricts you to one login about every 30s, but it increases your chances to notice or even prevent man-in-the-middle attacks (y/n)

This is basically asking if you are OK with allowing a generated code to be used more than once in a 30 second period assuming multiple, concurrent SSH logins. This is a matter of preference.

Next, you’ll be asked:

By default, tokens are good for 30 seconds and in order to compensate for possible time-skew between the client and the server, we allow an extra token before and after the current time. If you experience problems with poor time synchronization, you can increase the window from its default size of 1:30min to about 4min. Do you want to do so (y/n)

What this really means is that a generated code is actually good for 90 seconds instead of 30: even after a new code is generated at the 31st second, the old code remains good for another 60 seconds, and the window is configurable up to 4 minutes. I chose ‘n’ here, to keep the default 90-second time frame.

It is very important, however, that you set up NTP on your system so that you do not rely on your computer’s clock, which will likely drift over time. Since the second-factor code generated by the Google Authenticator App is time sensitive, it is critical that your PC and your mobile device agree on the accurate time of day.

In Debian-based distros of Linux (Debian, Ubuntu, Mint, etc), here’s how to install the NTP client on your computer and make sure it’s working:

Install NTP: sudo apt-get install ntp

That’s it!!! Now, to confirm that the client is working, use the ‘ntpq’ (NTP query) program:

ntpq -np

The output of this command will show you a small table. Look for the list of IP addresses under the ‘remote’ column. One of them should have an asterisk * next to it; that means you’re successfully syncing time with the server that owns that IP address. If no IP address has a star next to it, the time servers are unreachable (or a sync source hasn’t been selected yet).

Also, to ensure correct synchronization, make sure the delay and offset values are non-zero and the jitter value is under 100. Here are a couple of links for more info: One, two.
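That asterisk check is easy to script, too. This sketch assumes the usual ntpq table layout; the sample output is canned here so the logic is visible, but on a live system you’d pipe `ntpq -np` into the function instead:

```bash
# Report whether any peer line starts with '*' -- ntpq's marker for the
# currently selected sync source.
check_sync() {
    awk '/^\*/ { found = 1 } END { print (found ? "synchronized" : "not synchronized") }'
}

# Canned sample of `ntpq -np` output (example IP addresses only).
check_sync <<'EOF'
     remote           refid      st t when poll reach   delay   offset  jitter
==============================================================================
*198.51.100.10   .GPS.            1 u   33   64  377   12.345    0.123   0.456
+203.0.113.5     198.51.100.10    2 u   35   64  377   20.111    1.002   0.789
EOF
```

With the sample above it prints “synchronized”, because the first peer line carries the asterisk.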

Anyway, back to our setup process. You’ll be asked one final question before you’re returned to the command prompt:

If the computer that you are logging into isn't hardened against brute-force login attempts, you can enable rate-limiting for the authentication module. By default, this limits attackers to no more than 3 login attempts every 30s. Do you want to enable rate-limiting (y/n)

Again, this is a matter of preference, but I did enable rate limiting by entering ‘y’.

Once this is done, your system now has the correct codes in the “.google_authenticator” file in your home directory. You must now configure your Google Authenticator App and Linux system to use it.

Run your Google Authenticator app and add an account. You can scan that ASCII generated QR Code, or do it manually. I did mine manually, so I entered “Linux Box One” as my account name, and I entered “DAEP55X5AZEVCAFB” as my key (from the example above). Next you have an option for Time Based or Counter Based — I went with the default, Time Based.

Once this is done, the app will begin rotating your codes. You can add as many accounts as you like and you can remove them as well, if you choose to regenerate your codes.

You now need to configure your Linux box to use the Google codes for SSH logins.

On your Linux box, you will have to modify your /etc/ssh/sshd_config file:

sudo nano /etc/ssh/sshd_config

. . . and enable ChallengeResponseAuthentication. Search for that line and, if it’s set to ‘no’, replace the ‘no’ with a ‘yes’. Save your changes.
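If you’d rather script the change than open an editor, a single sed line can flip it. This is a sketch: the `^#\?` part also uncovers a commented-out ChallengeResponseAuthentication line, which some distros ship, and you should eyeball the file afterward:

```bash
# Set ChallengeResponseAuthentication to yes, uncommenting it if needed.
sudo sed -i 's/^#\?ChallengeResponseAuthentication.*/ChallengeResponseAuthentication yes/' /etc/ssh/sshd_config
```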

Restart your ssh service. When I tried this using ‘sudo’ I got some odd error, so I had to forcibly switch to root using the ‘su’ command, then restart ssh from there. Either way, you bounce ssh in Debian with the command:

sudo service ssh restart

. . . if this errors for you try this as full root:

su
(enter your root password)
service ssh restart
exit

Next, you’ll need to modify the /etc/pam.d/sshd file to let the Linux PAM SSH module know about the google authenticator module:

sudo nano /etc/pam.d/sshd

. . . and add the following line to the bottom of the file:

auth required pam_google_authenticator.so

Save your changes.

Once this is done, you’re good to go.

Start a fresh SSH session into your Linux box and you’ll be asked for your password AND YOUR CODE. You’ll need to get that code from your Google Authenticator App.

To undo all this, simply change the ChallengeResponseAuthentication line in your /etc/ssh/sshd_config file from ‘yes’ to ‘no’. Then edit your /etc/pam.d/sshd file and delete that line at the bottom (auth required pam_google_authenticator.so). Restart your SSH service (sudo service ssh restart) and on your next login, you’ll just be prompted for your password.

You can delete the “.google_authenticator” file if you like, or leave it there. So long as you reversed your changes to sshd_config and PAM’s sshd file, you will not be asked for the Google code.

A very cool thing about this, is the contents of the “.google_authenticator” file is user specific. This means you can repeat this process on the same Linux box as various non-root users, creating unique .google_authenticator files in each user’s home directory. Each user will have different (but valid) codes being generated on their respective mobile devices to login to the same Linux box. Each user will also get their own sets of scratch codes.

If you want this to also work for LOCAL LOGINS on your Linux box, you need to add the ‘auth required pam_google_authenticator.so’ line to your /etc/pam.d/common-auth file.

sudo nano /etc/pam.d/common-auth

Paste this line at the bottom of the file:

auth required pam_google_authenticator.so

Save your changes. On your next login to your Linux box locally (as opposed to using SSH), your Linux box will ask for your password and your Google code.

Here are some sources I used for this blog post: One, two, three.

Linux Noobs – stop reading here!!

1. One word of warning to advanced users: if you don’t use passwords for your SSH sessions but instead use keys (PubkeyAuthentication), remember that OpenSSH skips PAM authentication when a key is accepted! With key-based logins the PAM auth stack is never consulted, therefore this will not work for you!

2. Yes, there is a Java-based Google Authenticator app out in the wild (for running the Google Authenticator app on your laptop or desktop instead of your mobile device) — but I have not tested it — here’s the link.

3. Also, if your home partition is encrypted, you may hit a catch-22 where the .google_authenticator file is inaccessible (because it’s encrypted inside your home partition) and therefore your Linux box cannot verify the code you’ve entered. Check the comments under this web page, where this issue is briefly discussed.

Enjoy!

How to almost restore the “GMail Classic” Interface

Well, Google has forced us to use their “new and improved” GMail interface … What’s with everyone changing interfaces? Ubuntu brings in Unity, Gnome drops 2.x for 3.x, Microsoft’s Windows 8 looks like a “tonka toy” interface, and now Google forces this “improved” GMail user interface on us.

Well for those who thought that the original was just fine, thank you very much … here’s how to pretty closely approximate “GMail Classic”.

First, you need to be using Google Chrome or Firefox, as you’ll have to install a browser extension. The extension has the same name for both browsers, but I’ll be discussing the Chrome method here. This “fix” will have to be repeated on every computer where you use GMail, since extensions do not propagate across computers the way bookmarks & preferences do in Google Chrome.

First, the hieroglyphic icons are very confusing to me, so if they are for you as well, change the icon-buttons back to <TEXT> style buttons: Go to SETTINGS (click the GEAR in the upper right of your GMail page to find SETTINGS in the drop down box) and on the GENERAL Tab go down to BUTTON LABELS and select “Text”. Apply the settings.

Second, the default text density is too low to effectively read e-mail previews: Click the GEAR in the upper right of your GMail page and select “compact” so that more text is on the screen. This text density matches that used in “GMail Classic”.

Ok, now for the big one … first you have to install a Chrome extension.

Install this extension (Make sure it’s written by Jason Barnabe): https://chrome.google.com/webstore/search/stylish

Stylish is just a framework that lets you install custom styles for specific webpages, like YouTube or Gmail. You now have to install the “Gmail Classic” custom style for the GMail webpage.

Visit this URL: http://userstyles.org/styles/62621/gmail-messages-rounded-and-compressed. This is a page that allows you to install a custom style into the extension you just installed.

If you visit the webpage above before installing the Stylish Extension, it’ll show you the page but you won’t see an INSTALL icon on the top right of the page. However, once the Stylish Extension is installed and activated, when you visit the link above you should see an INSTALL icon.

Once you install that custom style, just refresh your GMAIL page and enjoy a close approximation of GMail Classic!

You can disable this extension easily by just clicking on the wrench in the upper right of your Chrome browser, and then click on SETTINGS. From there, go down to EXTENSIONS and you’ll see the Stylish extension. You can easily disable it right there and all you’ll have to do is refresh your GMail page and it’ll revert back to the “new” GMail interface.

If at the EXTENSIONS menu, you instead click the OPTIONS link, you’ll open a browser tab to the “Old GMail” custom style. There, you can check for an update to the style (if one exists) or surgically disable the custom style, while leaving the Stylish extension activated.

This custom style is by no means perfect and needs polishing. But it’s close enough for me to keep my sanity!

Oh, and if you want to browse other custom styles for GMail that work with the Stylish Extension, visit this page: http://userstyles.org/styles/browse/all/gmail

Enjoy!

How to update your system to the latest/greatest Libreoffice directly

I like to be running the latest version of LibreOffice, since the repositories are often behind the times and bug fixes come very frequently. You could do it through a PPA, but I prefer the direct .deb files.

Make sure you’ve downloaded the right version of LibreOffice; in my case, it was the 32-bit Debian version (all .deb files). I downloaded the torrent file and opened it with the torrent client Transmission to download the main program (about 150 MB). I didn’t bother with the separate help file, since I google most of my LibreOffice questions. Extract the archive to a temporary directory. I already had LibreOffice installed, but it was an older version. The archive extracts to a main DEBS directory and, underneath that, a desktop-integration directory.

sudo apt-get remove --purge libreoffice*.* (purges all aspects of the current install of LibreOffice. If you had OpenOffice installed, you could instead type: sudo apt-get remove --purge openoffice*.*)

sudo apt-get autoremove (removes unwanted, associated libraries)

Change directory to your extracted files, where all the .deb files are located (the path of the extracted archive will look something like LibreOffice_4.1.0.4_Linux_x86_deb/DEBS).

sudo dpkg -i ./*.deb (this will install all the numerous .deb files in this directory)

Change directory to the desktop-integration directory. New versions of LibreOffice (4.1 and up) seem to have removed the separate desktop-integration directory and folded it into the core DEBS directory, so the initial dpkg install command already covers desktop integration. Running the next command below may therefore not be needed in newer versions of LibreOffice.

sudo dpkg -i ./*.deb (just an easy way to install the single .deb file here; this will set up desktop integration for your current desktop).

Once this is done, you’re all good to go!
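For reference, here are the steps above collected into one sketch. The extracted path and version number are examples from my install; adjust them to match the archive you actually downloaded:

```bash
sudo apt-get remove --purge 'libreoffice*'    # purge the old install
sudo apt-get autoremove                       # clear orphaned libraries
cd ~/Downloads/LibreOffice_4.1.0.4_Linux_x86_deb/DEBS
sudo dpkg -i ./*.deb                          # install all the new packages
# Pre-4.1 releases only: desktop integration ships as a separate .deb.
cd desktop-integration && sudo dpkg -i ./*.deb
```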

When running LibreOffice Calc, you may see the formula bar missing (a little quirk); just go up to the View menu and check “Formula Bar”.

An easy way to name files with year, date and time

Whether you’re creating log files for a crontab job or “scheduled task” (to use Windows parlance), keeping a journal, or for any other reason, you may want to name a file with today’s current year, date & time.

Assuming you were just doing a simple command like ‘ls’, an easy way to dump the output to a file with the full date and time would be:

ls >>`date +"%Y%m%d%H%M%S"`

The double arrow >> simply says to append the output to the file, instead of overwriting it (in case it already exists). Note that this filename goes all the way down to the second, so it’s highly unlikely the file would ever be overwritten by another job in crontab, even if you only used a single > arrow.
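One caveat if you drop a command like this straight into a crontab: cron treats a bare % as a newline, so each format character must be escaped with a backslash there. A sketch, with hypothetical paths:

```bash
# Hypothetical crontab entry: nightly listing with a timestamped log name.
# Note each % is written as \% -- unescaped, cron would cut the line at
# the first % and treat the rest as stdin for the command.
30 1 * * * ls /home/user >> /home/user/listing.`date +\%Y\%m\%d\%H\%M\%S` 2>&1
```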

To rename an existing file and append the date, just type:

mv ./filename ./filename.`date +"%Y%m%d%H%M%S"`

This will add a dot and then the year, month day and time.

If you were keeping a simple text journal, just run your favorite command line text editor (like nano) and simply save the file with this filename:

`date +"%Y%m%d%H%M%S"`

This will give you a filename that looks like: 20120223073844, which is literally the year, month, day, hour, minute and second that the file was saved. If you want some breaks in there for readability, save the file as:

`date +"%Y_%m_%d__%H_%M_%S"`

Which would look like: 2012_02_23__07_38_44. The hours will be in 24-hour mode; you could use %r for AM/PM mode, but it auto-adds colons to the time, which can make dealing with filenames rather difficult.

Note that if you try this in a graphical editor, it will actually save the file literally as `date +"%Y_%m_%d__%H_%M_%S"` instead of expanding the command, so this will only work in a terminal-based editor such as nano or vi; it wouldn’t work with gedit.

Here’s a reference for date and time variables.

Linux Mint is overtaking Ubuntu

…while it’s based on Ubuntu oddly enough, the people at Linux Mint actually listen to their user base.

If you read this article, you’ll see that the founder of Linux Mint agrees that the slaying of Gnome 2 in favor of Gnome 3 was:

. . . idiotic. I agree with what Linus said about it and the thing that upset me the most was the fact that nobody cared about what people wanted. GNOME 3 could have used a different name, or at least been packaged with new libs and in a way that allowed people to continue to run GNOME 2. The way it was done, you could only replace GNOME 2 with GNOME 3, not run both.

GNOME 2 is the most popular Linux desktop out there and a few people decided we were no longer going to use it. Of course, people always get what they want – MATE will bring back GNOME 2, but it will take time to get right.

Just a few links showing how many others feel the same way: One, two, three.

I have tinkered with Linux Mint, and while it looks and acts very much like Ubuntu 10.x (which is good), I still think there’s some polishing needed. I tried Linux Mint Debian Edition (also known as LMDE), and I found it to be a very nice iteration of the latest classic Debian, though it did have a variety of bugs, especially when updating. I wouldn’t recommend Linux Mint Debian Edition to a beginning Linux user. Also worthy of note is that LMDE is running Gnome 2.32 … real Gnome 2.x.

I would encourage you to read up on the MATE (independent fork of Gnome 2) and Cinnamon (fork of Gnome 2 written by the developers of Linux Mint) projects, as they both endeavor to bring back that Gnome 2 flavor we all want. (I wonder if the Gnome developers are listening … )

Linux Mint 12 standard 32 or 64 bit edition runs very nicely. The Gnome Classic mode which you can select from the login menu, looks and acts very much like Gnome 2, or at least enough like it that one can actually get some work done. I am now running Linux Mint 12 on my laptop, and Linux Mint Debian Edition on a testing PC. Linux Mint 12 is running smoother than the Debian Edition, however the downside to Mint 12 is that to move to Mint 13 or any future major version, it seems you should completely uninstall and reinstall from scratch. While they support upgrading, I’m not sure how cleanly it works (I’m still researching this). LMDE allows for rolling distributions, so you can install any version, and just apt-get dist-upgrade yourself to the latest kernel and software.

For anyone experiencing frustration at the direction of Ubuntu and Gnome who doesn’t find solace in Red Hat or SUSE (as I did not), Linux Mint provides a port in the storm, and has a more polished desktop than XFCE under Xubuntu. It also has the “just works” factor that classic Debian lacks. And since Mint is based entirely on Ubuntu, many of the forum posts supporting Ubuntu will work for Mint as well.

One of the burdens we all bear as Linux users is that from time to time, Linux (or one of its pillars) goes through an identity crisis. This time it’s the Gnome desktop. While working through any of these crises, users sometimes have to flounder around for a distribution to call home. For now, for me, that’s Linux Mint. On my “main box” though, I am still running Ubuntu 10.04 which has support until April 2013 and is still running true Gnome 2.x. That box runs like butter and for now I don’t plan to touch it.

However, I think that once 2013 rolls around, Mint (and by extension, MATE & Cinnamon) will have polished itself enough that I may just consider moving my “main box” over to it — for now, it’s a port in the storm.

Ubuntu has jumped the shark and may soon be bitten by it.

Since my original post in May of this year and my follow up post in August of this year, a significant number of Linux users have lifted their voices and poured them onto clacking keyboards pushing back against the tide that is the child-like interface of Unity and Gnome 3.x. (For those who may find my choice of the phrase “jump the shark” odd, please see this link.)

Apparently there’s a pretty serious backlash to the whole Unity desktop as well as the Gnome 3.x interface.

Just a few posts to illustrate this point:

One, two, three, four, five, six (#6 is complete with some video comparisons). Also, a recent post on Technorati shows that classic Debian is indeed beckoning Ubuntu refugees back home. Here’s another recent article on why Gnome Refugees Love XFCE.

At this point many are about ready to write off Ubuntu and switch back to Debian proper, or to an entirely different distribution, such as Fedora 16. Some have chosen to stay with Ubuntu for now, but run a variation of it, Xubuntu (which has been my choice).

At this point I am not sure what Ubuntu brings to the table anymore except excellent forum support and perhaps a good set of drivers for hardware detection on various laptops and desktops. Linux Mint is already working on their own Gnome 3 fork called MGSE, which will look a lot closer to Gnome 2.

It also looks as though a serious OS-independent fork of the Gnome 2.x interface has already begun; it’s called MATE, though I’m not sure how ready it is for use yet.

Ubuntu has reached the pinnacle of its exposure and I think the bubble in which the developers and Mark Shuttleworth live has entirely squelched the outcry, probably to their detriment.

I do hope the Gnome developers also see the error of their ways. Gnome 3.x is a “Windows Vista” of sorts — a real shark-jumping moment, but could be corrected with a mea culpa and at least an offering of a fully functioning “classic mode” for everyone not interested in Fisher Price style graphical interfaces.

I will continue to stay with Xubuntu for now, but I can readily say that I am now officially shoppin’ around.

Batch rename files in Linux with GPRename

This is a great utility if you have many JPEGs or MP3s that are haphazardly named and want to rename them sequentially, or with a specific pattern.

It has a very friendly GUI front end and allows many types of character substitutions, as well as prefixes and suffixes to be added. It also supports substitutions on matching patterns.

Simply sudo apt-get install gprename.
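If you’d rather script the same kind of sequential rename than use the GUI, here’s a minimal sketch in Python (the file names below are made up for the demo):

```python
import os
import tempfile

def batch_rename(directory, prefix, ext):
    """Rename every file with the given extension to prefix001.ext,
    prefix002.ext, ... following the sorted order of the old names."""
    files = sorted(f for f in os.listdir(directory) if f.endswith(ext))
    for i, name in enumerate(files, start=1):
        new_name = "%s%03d%s" % (prefix, i, ext)
        os.rename(os.path.join(directory, name),
                  os.path.join(directory, new_name))

# Demo in a throwaway directory with hypothetical, haphazard names:
d = tempfile.mkdtemp()
for n in ("IMG_9821.jpg", "photo copy.jpg", "dsc0001.jpg"):
    open(os.path.join(d, n), "w").close()
batch_rename(d, "vacation_", ".jpg")
print(sorted(os.listdir(d)))
# → ['vacation_001.jpg', 'vacation_002.jpg', 'vacation_003.jpg']
```

GPRename does this sort of thing (and much more) without any typing.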

Check out this link for a full write-up on the tool.

You know you’re getting old in the world of IT if …

A really great post from The Branyard at InformationWeek. The comments section is very much worth reading, so click the link above to see it. Some of the comments go even further back in tech nostalgia.

Fortunately (or unfortunately) I vividly remember and identify with a great many of these “If’s…”


You know you were part of the tech industry in the ’90s if …

1. you remember when Bill Gates did that blue-screened Win9x release onstage at Chicago Comdex.

2. you remember that there was a Chicago Comdex.

3. you were jealous of your friend’s NeXT.

4. you used a cool device that you held in your palm that made you learn how to write each letter a different way, and it changed the world.

5. you remember when people bothered to say “digital” before “camera” and “cellular” before “phone”–and only the uber-geeks and/or the really rich had either, even though both were barely usable or useful.

6. you had a pager.

7. you ever used a Macintosh clone.

8. you remember when Apple launched an unsuccessful tablet device called the Newton.

9. you defined a portable computer using terms such as clamshell, laptop, and lunchbox, instead of notebook, tablet, and smartphone.

10. you can identify the serial port and accurately discuss what it was used for.

11. you know anything at all about “the Pentium bug.” Extra credit if you know the name of the problematic instruction resulting in Intel offering replacement chips.

12. you could identify the speed a modem was connecting by the sound of the tones.

13. you went “online” with CompuServe or Prodigy.

14. your phone system and your data network used different wires.

15. you cared deeply about the 56K modem battle: spread spectrum vs. direct sequence.

16. you saw the first broadband cable modem and knew it would change the way we would think about being always online.

17. to you, Archie is not just a character in a comic and Gopher is not a small rodent.

18. you had to spell out acronyms like LAN and WAN.

19. you have a box of Zip disks.

20. you could be a network administrator and not ever use IP.

21. you remember when Ethernet was connected with hubs.

22. hearing the words “token ring” and “beacon” in the same sentence still gives you chills.

23. you saw token ring get killed when Ethernet switches were born.

24. you needed a memory manager–not for yourself but for your PC.

25. you loved that it finally was possible to attach a printer to the network and not the server.

26. you could watch flying toasters for hours on end.

27. you remember Novell had the dominant NOS and Microsoft had something called DOS.

28. you remember the OS/2 vs. Windows debate.

29. you were excited by the launch of Windows 3.0.

30. you remember when trying Linux involved downloading 27 floppy disk images, and installation carried the real risk of hardware damage if you used incorrect X Windows settings.

31. you remember the first time you used the NCSA Mosaic browser (shortly after feeding 27 floppy disks into a spare 80386 PC).

32. you could develop commercial software without fear of patent litigation.

33. you knew where Scott/Tiger came from and what software package used it as the default user name/password.

34. you thought installing software over the network instead of using floppy disks was a major leap forward.

35. you did comparative reviews of Vines, NetWare, and Windows NT.

36. you remember when IBM bought Lotus (and then everyone else).

37. you remember the Microsoft Bob operating system.

38. for you, “Chicago” means Windows 95 and “Memphis” means Windows 98.

39. you’ve actually used Windows for Workgroups or Windows Me.

40. you remember TV announcers struggling with “double u, double u, double u, dot …” and the brief period when it was considered necessary to preface that with “h, tee, tee, pee…”

41. you used the term “information superhighway” more than once, with a straight face.

42. you struggled to understand the difference between Internet and intranet.

43. you debated whether anyone would actually read the news online.

44. you remember Netscape–not just the browser but the company that put the fear of God and the Web-based operating system into Microsoft.

45. you remember publishing on the Web without cascading style sheets.

46. you ever wrote a weekly print tech-rumors column under a pseudonym.

Source: InformationWeek.

Top 10 things to call a Linux distro, if it were released by Microsoft:

I thoroughly enjoyed this post at FossForce.com.

The Top 10 things a Linux distribution might be called, if it were released by Microsoft:

10. Seattle’s Best.

9. Breakable Linux.

8. The best thing Microsoft ever came up with.

7. Open Windows.

6. Something that will never happen.

5. Closed open source.

4. Steve Ballmer’s nightmare.

3. Blue Screen Linux.

2. A distro we’ll never use.

1. SUSE

Personally, I favor #2, 3, 5, 7 and 9. I think my favorite among those would have to be #7: Open Windows. Number 10 (Seattle’s Best) sounds like a brand of coffee . . .

The comments page over at FossForce.com has a pretty healthy number of further recommendations for such a distribution . . .

Linux’s 20th Anniversary Gala:

Linus with Maddog

The Vancouver LinuxCon celebrated 20 years of Linux, with one very special guest, in a tux.

Some more pics here, and here, and here, also a short write-up here.

Thunderbolt (Codename: Light Peak)

Thunderbolt is a new over-the-wire technology. Expect to start seeing it more and more in 2012, and I think it may eclipse USB 3.0 before USB 3.0 really gets off the ground.

It’s essentially PCIe over a cable, combined with the DisplayPort standard, so the same wire can also handle monitor connections. Connections can be daisychained, and the link is bidirectional, supporting up to 10Gbps in each direction (so 20Gbps bidirectionally).

For a sense of scale, in order:

– USB 2.0 runs at 480Mbit/sec (total speed in any direction)
– eSATA runs at 3Gbps (total speed in any direction)
– USB 3.0 supports 5Gbps (total speed in any direction)
– Thunderbolt can support up to 20Gbps, 10Gbps in each direction.

In real life, this wire will transfer about three-quarters of a gigabyte (768 MB) to 1 gigabyte in ONE second.
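The arithmetic behind those figures is just bits-to-bytes division (8 bits per byte); a quick sketch:

```python
def gbps_to_gbytes_per_sec(gbps):
    """Convert a line rate in gigabits/sec to gigabytes/sec."""
    return gbps / 8.0

for name, gbps in [("USB 2.0", 0.48), ("eSATA", 3.0),
                   ("USB 3.0", 5.0), ("Thunderbolt, per direction", 10.0)]:
    print("%-28s %.3f GB/s theoretical max" % (name, gbps_to_gbytes_per_sec(gbps)))

# Thunderbolt's 10 Gbps works out to 1.25 GB/s theoretical per direction,
# so ~0.75-1 GB/s observed in practice is consistent with protocol overhead.
```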

This is an Intel technology, but the only vendor using it right now is Apple. Apparently a really good reason to use it is to make devices (like tablets & laptops) lighter, because it’s only one port; you don’t need multiple ports to get things done. So a Thunderbolt wire could connect to a projector, monitor, PC, external hard disk, etc., and they can all be daisychained…

Video demo from Intel: Video Part 1, video part 2.

A few articles on the topic: One, two.

Intel’s page on the subject.

Apparently, Linus Torvalds also doesn’t like the Gnome 3 interface.

You may remember some months ago I had posted about the state of graphical interfaces for Linux, and my thoughts on Ubuntu 11.x and Gnome 3.

I had commented that I did not like the way Ubuntu was going with its Unity interface, nor was I happy at all with Gnome 3’s interface. In that post I had explained that I had ultimately decided to go with Xubuntu, which is the latest Ubuntu mixed with the XFCE desktop.

It would seem that Linus Torvalds has also decided to adopt XFCE, and has urged Gnome to fork itself to bring back the Gnome 2 interface, just as I had said in my previous post.

Apparently there’s a vigorous discussion going on about it, in which Linus is participating. Below are some excerpts from the original Google+ thread (linked here):

From one message Linus says: “While you are at it, could you also fork gnome, and support a gnome-2 environment? I want my sane interfaces back. I have yet to meet anybody who likes the unholy mess that is gnome-3.”

From another post Linus says: “it’s not that I have rendering problems with gnome3 (although I do have those too), it’s that the user experience of Gnome3 even without rendering problems is unacceptable.

Why can’t I have shortcuts on my desktop? Why can’t I have the expose functionality? Wobbly windows? Why does anybody sane think that it’s a good idea to have that “go to the crazy ‘activities’” menu mode?

I used to be upset when gnome developers decided it was “too complicated” for the user to remap some mouse buttons. In gnome3, the developers have apparently decided that it’s “too complicated” to actually do real work on your desktop, and have decided to make it really annoying to do.

Here’s an example of “the crazy”: you want a new terminal window. So you go to “activities” and press the “terminal” thing that you’ve made part of your normal desktop thing (but why can’t I just have it on the desktop, instead of in that insane “activities” mode?). What happens? Nothing. It brings your existing terminal to the forefront.

That’s just crazy crap. Now I need to use Shift-Control-N in an old terminal to bring up a new one. Yeah, that’s a real user experience improvement. Sure.

I’m sure there are other ways, but that’s just an example of the kind of “head up the arse” behavior of gnome3. Seriously. I have been asking other developers about gnome3, they all think it’s crazy.

I’m using Xfce. I think it’s a step down from gnome2, but it’s a huge step up from gnome3. Really.

In my previous post about graphical interfaces I had also thought that XFCE was a step down from Gnome 2, but leaps & bounds better than Gnome 3 or Ubuntu’s Unity interface (which are remarkably similar). Apparently many others in the thread have adopted XFCE as well.

Here’s hoping the folks over at Gnome are listening (indeed, listening to Linus himself!) and give us back the Gnome 2 interface. If they love Gnome 3 so much, let them keep it — but at least offer the rest of us who don’t care for it the option to use the Gnome 2 interface again.

Happy SysAdmin Day!

The last Friday in July is System Administrator Day … so happy SysAdmin Day!

From the SysAdminDay Website:

A sysadmin unpacked the server for this website from its box, installed an operating system, patched it for security, made sure the power and air conditioning was working in the server room, monitored it for stability, set up the software, and kept backups in case anything went wrong. All to serve this webpage.

A sysadmin installed the routers, laid the cables, configured the networks, set up the firewalls, and watched and guided the traffic for each hop of the network that runs over copper, fiber optic glass, and even the air itself to bring the Internet to your computer. All to make sure the webpage found its way from the server to your computer.

Fig. 1: Ted, in wires.

sysadmin makes sure your network connection is safe, secure, open, and working. A sysadmin makes sure your computer is working in a healthy way on a healthy network. A sysadmin takes backups to guard against disaster both human and otherwise, holds the gates against security threats and crackers, and keeps the printers going no matter how many copies of the tax code someone from Accounting prints out.

A sysadmin worries about spam, viruses, spyware, but also power outages, fires and floods.

When the email server goes down at 2 AM on a Sunday, your sysadmin is paged, wakes up, and goes to work.

A sysadmin is a professional, who plans, worries, hacks, fixes, pushes, advocates, protects and creates good computer networks, to get you your data, to help you do work — to bring the potential of computing ever closer to reality.

So if you can read this, thank your sysadmin — and know he or she is only one of dozens or possibly hundreds whose work brings you the email from your aunt on the West Coast, the instant message from your son at college, the free phone call from the friend in Australia, and this webpage.

Show your appreciation! (Via SysAdminDay.)

Also, enjoy their cool photos of System Administrators in their natural habitat: One, Two, Three.

How to convert PDF’s to JPEG’s (and vice versa)

If you don’t have expensive PDF editing software, you can convert your PDF to a JPG and then edit/add text/graphics accordingly, then convert the image back to a PDF.

To convert a PDF to a JPEG, you’ll need ImageMagick (sudo apt-get install imagemagick).

convert -quality 20 -interlace none -density 300 input.pdf output.jpg

The quality setting can be from 0 to 100 (100 being the best, but often a huge file); I have found 20 to be a good balance of size vs. quality. The interlace option helps with readability, and density specifies the dots per inch (for printing). For PDFs with color images, you may find you need to add the option -colorspace RGB.

The default print resolution when using the convert program on PDFs is 72 dots per inch, which is equivalent to one point per pixel. Computer screens are normally 72 or 96 dots per inch, while printers typically support 150, 300, 600, or 1200 dots per inch. To determine the resolution of your display, use a ruler to measure the width of your screen in inches, and divide the number of horizontal pixels (1024 on a 1024×768 display) by that width. Generally, I prefer to maintain enough density to support a possible print job.
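That pixels-per-inch estimate is just pixel count divided by physical width; for instance (the 14.2-inch measurement is a made-up example):

```python
def dots_per_inch(horizontal_pixels, width_inches):
    """Estimate display resolution: horizontal pixel count / physical width."""
    return horizontal_pixels / width_inches

# A 1024x768 panel measuring ~14.2 inches across comes out to ~72 dpi:
print(round(dots_per_inch(1024, 14.2)))  # → 72
```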

This does work the other way around, so the command below will work just fine:

convert -quality 20 -interlace none -density 300 input.jpg output.pdf

When converting PDFs to JPEGs, each page will be its own numbered JPEG. You can then convert multiple JPEGs back into a single PDF (the page order will depend on the filename sort order, so be sure to number your files in the preferred order).

The command below will take a series of JPEGs and convert them into a single PDF:

convert -quality 20 -interlace none -density 300 *.jpg output.pdf
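One caveat on ordering: *.jpg is expanded lexicographically, so unpadded page numbers come out in the wrong order. A quick illustration (hypothetical file names):

```python
unpadded = ["page1.jpg", "page2.jpg", "page10.jpg"]
padded = ["page01.jpg", "page02.jpg", "page10.jpg"]

# Lexicographic sorting puts page10 before page2 when numbers aren't zero-padded:
print(sorted(unpadded))  # → ['page1.jpg', 'page10.jpg', 'page2.jpg']
print(sorted(padded))    # → ['page01.jpg', 'page02.jpg', 'page10.jpg']
```

Zero-padding the page numbers keeps the assembled PDF in the right order.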

The Dropbox Security Saga gets Serious:

This news is a few days old, but I thought I should post an update about it as I’ve covered Dropbox security issues in a previous blog entry. There is an extensive article in Wired magazine about this Dropbox fiasco. Apparently an FTC complaint has been filed.

From the article . . .

The FTC complaint charges Dropbox (.pdf) with telling users that their files were totally encrypted and even Dropbox employees could not see the contents of the file. Ph.D. student Christopher Soghoian published data last month showing that Dropbox could indeed see the contents of files, putting users at risk of government searches, rogue Dropbox employees, and even companies trying to bring mass copyright-infringement suits.

Soghoian, who spent a year working at the FTC, charges that Dropbox “has and continues to make deceptive statements to consumers regarding the extent to which it protects and encrypts their data,” which amounts to a deceptive trade practice that can be investigated by the FTC.

Dropbox dismissed Soghoian’s allegations.

“We believe this complaint is without merit, and raises old issues that were addressed in our blog post on April 21, 2011,” company spokeswoman Julie Supan said in a short e-mail to Wired.com. “Millions of people depend on our service every day and we work hard to keep their data safe, secure, and private.”

Dropbox, which has more than 25 million users, revised its website claims about its data security April 13, from:

All files stored on Dropbox servers are encrypted (AES256) and are inaccessible without your account password.

to:

All files stored on Dropbox servers are encrypted (AES 256).

The difference, Soghoian charges, is very important. (If his name sounds familiar, you might remember him as the one who exposed Facebook’s attempt to place anti-Google stories in the press this week.)

Dropbox saves storage space by analyzing users’ files before they are uploaded, using what’s known as a hash — which is basically a short signature of the file based on its contents. If another Dropbox user has already stored that file, Dropbox doesn’t actually upload the file, and simply “adds” the file to the user’s Dropbox.

The keys used to encrypt and decrypt files also are in the hands of Dropbox, not stored on each user’s machines.

Those architecture choices mean that Dropbox employees can see the contents of a user’s storage, and can turn over the nonencrypted files to the government or outside organizations when presented with a subpoena.

Read the rest of the article here.
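The deduplication scheme the article describes can be sketched conceptually; this is only an illustration of hash-based dedup in general, not Dropbox’s actual implementation:

```python
import hashlib

stored_blobs = {}  # content hash -> bytes, standing in for server-side storage

def upload(data):
    """Return the content's hash; store the bytes only if they're new."""
    digest = hashlib.sha256(data).hexdigest()
    if digest not in stored_blobs:
        stored_blobs[digest] = data   # genuinely new content: keep a copy
    # otherwise the transfer is skipped; the user just gets a reference
    return digest

first = upload(b"the same song file")
second = upload(b"the same song file")  # second user "uploads" the identical file
print(first == second, len(stored_blobs))  # → True 1  (one copy, shared)
```

Note that this only works if the server can read the file contents in the clear, which is exactly the point the complaint raises.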

Boot from USB devices on older computers that don’t support USB booting.

Ran into a great application called Plop Boot Manager. The software allows you to boot off USB devices for those older computers whose BIOS doesn’t support USB booting.

The software is free for personal use, but not free for commercial use. It’s also not open source, but it’s a good tool nonetheless.

Click here for the application’s home site and here for a good write up on using it.

The Great Linux World Map

I thought this was a very witty creation. It’s certainly not all-inclusive but perhaps a live, interactive Google map of the Great Linux World Map would be a good idea!

Where on this map do you live ?

Source: dedoimedo.com

Google Page Speed Analyzer

Google Page Speed Online analyzes the content of a web page, then generates suggestions to make that page faster for both desktop and mobile browsers.

This site got a 71/100 … not bad. One of the biggest suggestions was to recompress the JPEGs on the site much more than they currently are, for an estimated 50% improvement. Though I think the site loads pretty quickly, it’s still a great tool for analyzing your own sites.

The direction of Ubuntu & Gnome with their graphical interfaces . . .

Well, recently we’ve seen some interesting developments. Canonical (makers of Ubuntu Linux) have come out with their latest version of Ubuntu (11.04), featuring their new default desktop interface called Unity. Meanwhile, the folks over at Gnome have also come out with Gnome 3, replacing the amazingly popular Gnome 2.x style interface. Boy, they look pretty similar, don’t they? Both of them are chasing Apple’s OS X interface for some reason.

I’m not a big fan of Apple’s interface and I’m very much not a fan of what Canonical and Gnome are doing with my desktop. While Ubuntu 11.04 offers an “Ubuntu Classic” mode (read Gnome 2.x mode here), this will only be around for 11.04. Once 11.10 comes out in 6 months, the “classic” will be discarded.

<rant> At this point I could go on about how these changes are trying to change my desktop into an oversized smartphone, or how for power users this is an absurd interface reminding me of a Fisher Price laptop, or an interface for children first learning to use a computer, but I won’t. While this perhaps might be a good interface for a tablet or a smartphone, it is not the way a desktop operating system should present itself to the user. There’s a plethora of posts on the internet about forking Gnome 2.x into its own project maintained by a separate community of developers from those over at Gnome. Others are talking about porting the Gnome 2.x shell & set of panels into the GTK 3.x platform: essentially porting the great Gnome 2.x desktop interface over so that they work well in the Gnome 3 code base. I am not sure if either of these will happen or not, but I can tell you that I cannot afford to wait & see</rant>

The people over at Linux Mint (the second most popular Linux after the Ubuntu family) have stated that they will adopt Gnome 3, but without the Gnome 3 shell (which is the interface that is presented to the user, AKA “The Desktop”.) I’m not sure exactly how they will do that, but I am sure it will require a lot of custom coding.

Ubuntu has always offered me the perfect blend of Debian’s rock solid stability with my choice of the latest & greatest apps pre-installed. I’ve enjoyed using it for a few years now; however, forcing the Unity interface on users who’ve enjoyed the classic desktop is too much for me to bear. I know in some way Canonical’s hand was forced here in that Gnome was already abandoning its 2.x desktop, so instead of moving with Gnome’s 3.x shell, they decided to create their own version of an astonishingly similar, child-like desktop. Both interfaces are wrong for the desktop and wrong for the power user.

The developers at Gnome are equally guilty of abandoning the classic, best desktop, forcing a completely different interface on its users without the choice & option of what I suppose has to now be called “Gnome Classic”. It’s a mistake to offer users change without choice.

For those who may think all is lost at this point: DON’T PANIC, I know where my towel is located.

In my effort to find a new home for myself, I started selectively searching through the myriad of Debian-based Linux distributions. I tried Linux Mint and while it’s based on Debian, I don’t think it’s my cup of tea. On a hunch I tried Xubuntu (Ubuntu with the XFCE desktop instead of Gnome). I’ve always regarded XFCE as a Gnome knock off with less polish. Certainly, XFCE has come a long way, but it isn’t as intuitive as Gnome 2.x. Having said that I am reluctantly choosing Xubuntu as my distro of choice. XFCE is close enough that I can bridge the gap with some tweaking. I am mostly tweaking the panels. The default bottom panel that comes with Xubuntu 11.04 is useless in my estimation. I’ve deleted that in favor of a fresh panel with simply a Window Menu, Workspace Switcher & Trashcan. I’ve removed the Window Menu from the top panel and added some quick-click launcher icons at the upper left of the panel to launch often-run apps such as Chrome, Xchat, File Manager, Calculator, etc.

All tweaks aside, Xubuntu allows me to maintain the advantage of the Ubuntu line of repositories (including medibuntu!) while offering me a desktop that is much more familiar. For me, Xubuntu is the safest bet for a new home. One should not confuse a desktop interface with an entire Linux distribution. I could easily run Ubuntu 11.04 and just install the XFCE desktop onto Ubuntu and then choose XFCE as my default desktop under Ubuntu, instead of running Xubuntu. I do think though, that while that approach may work, the folks over at Xubuntu have integrated XFCE a bit better into the Ubuntu base than I could. It would allow me to cleanly run a variant of Ubuntu without the extra Unity baggage and enjoy an overall lighter distribution as well.

The plethora of support offered in the great Ubuntu Linux community was also an important factor in my decision. Fortunately, Xubuntu is so close to Ubuntu that all the forum posts out there supporting Ubuntu will also work for Xubuntu as well. Sure, there may be some Xubuntu-specific issues with which to deal, but the grand majority of forum posts specific to Ubuntu will also address questions for the Xubuntu user.

This is very much a personal decision. Some are sticking it out with Ubuntu and the Unity interface. Others are installing Gnome 3, replacing Unity. Yet others are switching over to KDE . . . mmmkay . . . ya . . . I’m not a KDE fan. I always thought KDE was chasing the Windows “start menu” concept and I generally don’t care for their desktop. Others still are abandoning Ubuntu altogether and switching over to Fedora or OpenSuSE.

I am sticking with Debian, and with Ubuntu — just in the Xubuntu camp. Ultimately I hope Gnome 2.x finds a home with an active bunch of developers (or gets ported to GTK 3.x!!) so that I can re-adopt it. Though I fear that the decade old, rock solid Gnome 2.x desktop may be dead: wow.

Apparently, there’s no desktop for old men. The forums are afire with posts similar to this blog post (though maybe with or without the Xubuntu choice). I hope to find some camaraderie on the IRC with some folks and see where this goes. This may not be the last post on this topic.

One of the great things about Linux is choice. I wish Gnome hadn’t made their choice to abandon their 2.x interface and I wish that Canonical hadn’t forced Unity on their users, but as a member of the Linux community I can exercise my own choice and adopt Xubuntu with the XFCE desktop as my new home. I hope others in the community will not let the Gnome 2.x interface die, either by forking it or porting it over to GTK 3.x. I’d very much like to return to it, but until then, goodbye to the “original” Ubuntu and to Gnome: so long and thanks for the fish.

For those willing to give Xubuntu a try, I thought I’d mention a few issues right off the bat during my Xubuntu setup and some of the fixes:

First you’ll have to install VLC, Audacious, VNC, VINO, Chrome, LibreOffice, Screen, Nautilus, Gconf-editor, NFS server and client, SSH, SSHFS, ECryptFS, Samba, SmbFS, RDesktop, EOG (Eye of Gnome Pic Viewer), Imagemagick and perhaps a few other things I haven’t yet come across.

Below are a couple of the important ones that required some configuring beyond the basic “sudo apt-get install”:

1. XFCE didn’t auto-create a menu item when I installed Chrome, so I had to add it manually.

Add a launcher to the panel (right click the panel, go down to panel –> Add new items), then select Launcher.

Once selected, a blank square will appear on the panel at the far right, and will have a light grey/black box. Right click that new box on the panel, click Properties.

Add an item to the launcher (click the blue + sign) and search for Chrome. If it’s not listed you can add it manually by clicking the “add empty item” icon, which is the white paper with the gold star. Select the icon for the application (when selecting icons it’s easier to select from the ALL ICONS item in the pull-down), and then for the command in the launcher itself type:

/opt/google/chrome/google-chrome %U

2. Default File Manager is Thunar: No thank you. Thunar, while very light, is too light on features. I need tabbed file browsing. Therefore I have manually installed Nautilus which worked very nicely. For this you’ll also want Gconf-editor since that allows some Nautilus-specific customizations. So simply type:

sudo apt-get install nautilus gconf-editor

Then to use Nautilus as your default file manager, go to Settings Manager –> Preferred Applications –> Utilities Tab, then select Nautilus from the File Manager pulldown menu.

If you prefer a /text/path/to/your/files instead of the graphical button style in Nautilus, the quick fix is to run gconf-editor from the command prompt, then in the configuration editor navigate down to: apps –> nautilus –> preferences –> always_use_location_entry and make sure to check that box.

3. I also had problems VNCing into my new Xubuntu install. To fix this, just install Vino.

sudo apt-get install vino

Then run vino-preferences (from command line) and check off your preferences:

vino-preferences

Then you’ll have to set up Vino to start with a reboot: Go to Session & Startup in your Settings Manager, then click on Application Autostart then click ADD. Enter whatever you like for Name & Description, but in the command field, enter:

/usr/lib/vino/vino-server
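If you’d rather do that from a script than from the Settings Manager, here’s a sketch that writes the same autostart entry by hand (the Exec path is the one above; the Name and Comment fields are my own guesses):

```python
import os

# Standard freedesktop autostart location read by XFCE's session manager:
autostart_dir = os.path.expanduser("~/.config/autostart")
os.makedirs(autostart_dir, exist_ok=True)

entry = """[Desktop Entry]
Type=Application
Name=Vino VNC Server
Comment=Start the Vino remote desktop server at login
Exec=/usr/lib/vino/vino-server
"""

path = os.path.join(autostart_dir, "vino-server.desktop")
with open(path, "w") as f:
    f.write(entry)
```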


FTP 40 Years Old Today . . .

This year is certainly the year for birthdays. The File Transfer Protocol, otherwise known as FTP, is 40 years old today. Originally put forth as the RFC 114 specification on April 16, 1971, FTP (and the various iterations inspired by it) is as heavily used today as it was back then by people and companies all over the world.

Originally put forth as RFC 114 and used as such from 1971 to 1980, FTP changed when it was put forth again in 1980 as RFC 765 by Jon Postel of ISI. This standard retired RFC 114 and introduced concepts and conventions that survive to this day, including a formal architecture with separate client/server functions and two separate channels, site-to-site transfers, and passive (a.k.a. “firewall friendly”) transfer mode, among other improvements. RFC 765 was replaced in 1985 by RFC 959, which formalized directory navigation.

The third and current generation of FTP was a reaction to two technologies that RFC 959 did not address: SSL/TLS and IPv6. It was essentially a security upgrade to FTP.

The latest RFCs that handle the FTP protocol are RFC 2228 from 1997 (which added security extensions and is how FTP became FTPS) and RFC 2428, which added IPv6 support in 1998.

While FTP matured into FTPS, it is not to be confused with SFTP.

FTPS is essentially a secured or hardened FTP protocol that uses two channels, one for the data transfer and one for directory listings and other data not associated with the actual transfer. It’s FTP + SSL.

SFTP is a complete departure from FTP. It is part of the Secure Shell project and was built from the ground up as an extension of SSH, adding easy file transfer capabilities to the already secure SSH session. While many confuse SFTP with “an FTP session through SSH,” it isn’t: FTPS is FTP with security extensions (namely SSL), while SFTP is an extension of SSH itself. Also not to be confused with SCP, SFTP allows for many more dynamic commands than simple SCP.

It is interesting to note that many companies still use classic FTP over VPN connections as well.

Anyone lost yet? Just checking . . .

For the record, I prefer SFTP, since I love SSH and do everything I can over SSH, even mapping file systems over it with SSHFS (more info about SSHFS here).

Here’s some write-ups on FTP’s 40th: One, two.

Heritage Site at the Birthplace of the Internet

Related to an earlier post I made about the birth of the Internet, a heritage site is being set up at UCLA, where the very first message was sent over what would become ARPAnet and, later, the Internet. There are a couple of great pictures in the article as well.

Click here for the full article from the Daily Bruin.

A heritage website is also being set up with some great articles and photos (being posted to Flickr).

From the heritage site:

Our heritage site is a restoration of the original 1969 ARPA lab that sent the first Internet message from 3420 Boelter Hall at UCLA. It will be open to the public and feature key artifacts including the very first piece of the Internet infrastructure, namely the Interface Message Processor (IMP). We use teaching tools from the 1960s such as slide projectors and blackboards to tell the story of the Internet’s early history.

As an archive, historical documents from the Internet’s early history are being identified, acquired, and made available to scholars and the general public through social media and scholarly databases.  The physical copies are held permanently, securely, and accessibly in the world-class archive facilities at UCLA.  It is our conviction that the more of this information we make available – with particular attention paid to typically under-represented groups – the more objective, inclusive, and interesting  a history of the early Internet can be written.

Another article talks about this in the Atlantic Wire.
