A Bit of Sense

Here I talk about my experience with computers, software and computer programming. Let me warn you that some of this stuff will be technical. I'll aim to give you fair notice for technical posts.

Location: Massachusetts, United States

Friday, September 10, 2010

Disabling power save button on keyboard in Ubuntu

My home desktop computer has special buttons along the sides for various actions. Ubuntu handles most of these buttons correctly. However, the suspend button shuts down the computer if hit by mistake. I hate that. Here are instructions for fixing this in Ubuntu 10.04. Other versions might require different steps. Other GNU/Linux distros with the same version of GNOME will probably work with these steps.

1. Press Alt+F2 to get the run prompt.
2. Type "gconf-editor" (without the quotes) and hit enter.
3. Go to the key path /apps/gnome-power-manager/buttons.
4. Change the value of hibernate to "interactive".
5. Change the value of suspend to "interactive".
Should be all set.
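If you prefer the command line, the same change can be made with gconftool-2 (assuming the GNOME 2 era gnome-power-manager key names shown in the steps above):

```
gconftool-2 --type string --set /apps/gnome-power-manager/buttons/hibernate "interactive"
gconftool-2 --type string --set /apps/gnome-power-manager/buttons/suspend "interactive"
```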

Saturday, May 22, 2010

Ubuntu: a path to green(er) computing?

This is a draft of an argument (you might call it a marketing piece?) on using Ubuntu to prolong the effective life of a computer. I need to do a little research on how computers are recycled to back up (or refute!) the claims in the first paragraph.

Computers are tremendously valuable tools in our lives. But they contain hazardous materials that can escape into the environment once they are retired. Even recycling does not guarantee that computers will not cause pollution: reclaiming metals can release them, and the extraction process may involve mercury or other chemicals.

From an environmental standpoint it is desirable to get the longest life possible from our machines, whether by reusing them ourselves or by passing them on to others who can reuse them.

However, there are a number of major difficulties in keeping a computer running Windows operating optimally:
* Windows accumulates "cruft" as it ages. There are files and settings that are mostly of no use, but it is hard to know for sure, so you don't dare delete them.
* Anti-virus software detects and removes viruses, but viruses sometimes manage to cause damage before they are removed, and anti-virus software does not always manage to fix that damage.
* New versions of anti-virus software detect more viruses, but also consume more resources in order to keep your system safe.
* New service packs and upgrades increase the resources used by the computer, sometimes dramatically.
* Various other issues gradually slow the computer down to a crawl.

There are things you can do to undo this damage:
* Buy utilities to fix Windows. However, chances are it still won't run like new.
* Hire someone else to fix Windows for you. That gets expensive and may not restore performance completely.
* Reinstall Windows from scratch. That works for a while, but eventually you run into the same issues.
* Get the next version of Windows. This is a good solution if you are able to upgrade your hardware, but there are limits to how far a particular system can be upgraded. There may not be anything you can do to make your system Windows 7 "capable."

None of these solutions can reduce the amount of resources consumed by anti-virus software or upgrades to the OS. Eventually Microsoft will drop support for the OS and most people will toss their old computer and buy a new one with Windows 7 or whatever the latest release is. Then the cycle starts all over.

Some will argue that streaming video or 3D effects require a new computer. But many users just want to read their email, browse the web, write a document or spreadsheet, and chat with friends online. None of these things *should* require a state-of-the-art computer. Windows will eventually bog down these newer machines as well.

For the reasons cited above, Windows is not a good choice if one wishes to keep using an existing computer over the long haul rather than replacing it every few years.

So what features *would* be desirable in an OS designed for sustained use over years?
* A packaging system where the OS handles installing and removing programs, so that defects in individual programs do not leave stray files behind.
* An anti-virus strategy built around preventing infections rather than trying to clean them up afterward, resulting in less damage to the OS and more resources left for other tasks.
* A commitment to not raising hardware requirements in future OS versions.
* A quick and easy installation, so that if problems eventually slow down the computer the user can re-install without a big hassle.

It happens that Ubuntu offers all of these benefits.

Will Ubuntu do what you need?
* Word processor, spreadsheet and database programs are included that can read the files created by their Microsoft equivalents.
* The Firefox web browser is included. Firefox is the second most popular web browser, and its share of users continues to grow as Internet Explorer's share shrinks.
* Ubuntu comes with a good email program, Evolution. However, you can pick from a number of other excellent choices, like the popular Thunderbird, made by the same folks who created Firefox.
* There are literally thousands of software programs available for Ubuntu at zero cost, all easily installable.
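As a quick illustration of the packaging system at work, installing and later removing a program is just a command or two (gimp here is simply one example package among those thousands):

```
# Install the GIMP image editor from the Ubuntu repositories
sudo apt-get install gimp

# Remove it later; the package system tracks every file it installed
sudo apt-get remove gimp

# Clean out dependencies that are no longer needed
sudo apt-get autoremove
```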

Ubuntu is built from thousands of software projects all over the world, written by people who create them to meet their own needs or the needs of their employer, and who realize that the best way to make their software better is to share it and invite others to help improve it.

Sunday, January 24, 2010

Setting up an OpenSSH server

I am writing this post mostly for my own information, but knowing also that others may find these instructions useful.

Warning: This post will be technical in nature. It is more of a reference for those who already understand ssh than a guide for newcomers.

SSH, or Secure SHell, is a way of executing commands on a remote computer, similar to telnet. The difference is that SSH was designed with information security in mind: SSH encrypts information sent in both directions. For more information see the SSH Wikipedia article. OpenSSH is a freely available implementation of the SSH standard.

I am no expert; I am simply combining tips I've found elsewhere. There are many more changes that can be made to limit access to ssh from external computers, but each also restricts your own access as well.

Be sure you understand each change I am making here, and consult the ssh man pages or other documentation as needed. You may also want to consult my own links related to ssh stored on delicious.


Modify the ssh configuration to prevent logging in as root and to prevent X11 forwarding, making the setup more secure. Also change the port to hide the fact that you are running ssh; it makes things a bit harder for the script kiddies.

Edit /etc/ssh/sshd_config and make the following changes.

Change
PermitRootLogin yes
to
PermitRootLogin no

Change
X11Forwarding yes
to
X11Forwarding no

Change
Port 22
replacing 22 with an unused port number of your choice.

Change
PasswordAuthentication yes
to
PasswordAuthentication no
(Uncomment this line if needed. Only make this change after key-based login is working, or you will lock yourself out.)

Add
AllowUsers usernamelist
with usernamelist replaced with a list of users you want to allow to use ssh. You may also use DenyUsers instead to specify the list of users you wish to block.
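Taken together, the relevant lines of /etc/ssh/sshd_config might end up looking like this (the port number and user names here are only placeholders; substitute your own):

```
PermitRootLogin no
X11Forwarding no
Port 2222
PasswordAuthentication no
AllowUsers alice bob
```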

Edit /etc/hosts.allow to restrict which addresses may connect. Keep in mind that your own network may use a different IP address range than any example you follow.
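A typical entry for a home LAN might look like the following (the 192.168.1. range here is an assumption; adjust it to your network):

```
sshd: 192.168.1.
```

or possibly, using netmask notation:

```
sshd: 192.168.1.0/255.255.255.0
```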

From the machine you want to ssh from, execute
ssh-keygen -t dsa
to generate an ssh key pair. Then
ssh-copy-id -i ~/.ssh/id_dsa.pub username@yourservername
to copy the public key to your server.

If you generated the keys on the server, you still want to do the copy in order to place your key in authorized_keys. You could simply execute this instead (appending rather than overwriting, in case authorized_keys already has entries):
cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys

Run
ssh yourservername
to verify that login is now working correctly. Because you created a key to log in with, a password is no longer required.

Once regular log in works restart ssh (on the server) with
sudo /etc/init.d/ssh restart

Finally, edit ~/.ssh/config on the client and add the lines

Host ServerNameHere
Port PortNumberHere

This is so that ssh knows that you changed the port number for your server.
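For example, for a server named yourservername where you picked port 2222 (a placeholder; substitute whatever port you chose in sshd_config), the entry would be:

```
Host yourservername
    Port 2222
```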

Friday, January 30, 2009

apt-cacher - fast upgrades on a network of computers

I am writing up a short 'executive' summary of how to set up apt-cacher-ng to accelerate upgrades by caching packages locally. The title of this post links to the full story where I got all of this information. This should work on any apt/deb based system. This post has been updated to use apt-cacher-ng, so some of the steps may be a bit different from the original guide.

I am publishing this mostly for my own use, but feel free to make use of this information as well. My settings are a bit different from the example.

To install:

sudo apt-get install apt-cacher-ng

Edit /etc/apt-cacher/apt-cacher.conf and add:


Edit /etc/default/apt-cacher

Finally import your debs:

sudo /usr/share/apt-cacher/apt-cacher-import.pl /var/cache/apt/archives
sudo /etc/init.d/apt-cacher restart

sudo cp /etc/apt/sources.list /etc/apt/sources.list.XXX
sudo nano /etc/apt/sources.list

deb http://[servername]:3142/archive.ubuntu.com/ubuntu/ hardy main restricted universe multiverse
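If you have many source lines, you can rewrite them all at once with sed rather than editing by hand (myserver below is a placeholder for your cache machine's hostname):

```shell
# Show what the rewrite does to a sample sources.list line
echo "deb http://archive.ubuntu.com/ubuntu/ hardy main" \
  | sed 's|http://|http://myserver:3142/|'
# prints: deb http://myserver:3142/archive.ubuntu.com/ubuntu/ hardy main

# To apply it to the real file (after making the backup as above):
# sudo sed -i 's|http://|http://myserver:3142/|' /etc/apt/sources.list
```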

Check that it is installed correctly by browsing to http://localhost:3142/
(assuming localhost is set up correctly in /etc/hosts)

EDIT: There is a new version, apt-cacher-ng. It works the same way as apt-cacher, but is more resource-efficient and has fewer bugs.

Sunday, January 25, 2009

HOWTO: Update to static IP Address in Ubuntu 8.10 (Intrepid Ibex)

A bug in Network Manager causes systems with static IPs to revert to DHCP upon reboot. In fact, I haven't gotten Network Manager to set up static IPs at all.

Here is a workaround I found.

I've condensed the solution to the minimum. The 'x' characters mark where you should use a number that is correct for your network. If you are using a typical broadband connection with a router, then your numbers may be something like this:
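For a typical home router setup, the values might look like this (these particular numbers are only an assumption; verify them against your own network):

```
address    192.168.1.100
netmask    255.255.255.0
gateway    192.168.1.1
nameserver 192.168.1.1
```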


Before you begin any of the instructions, check the numbers: use ifconfig to get the IP and netmask, route -n to get the gateway, and cat /etc/resolv.conf to get the name server(s).


Disable network manager at start up:
sudo update-rc.d -f NetworkManager remove

Make a backup of interfaces file:
sudo cp /etc/network/interfaces /etc/network/interfaces.XXX

sudo nano /etc/network/interfaces

Your interfaces file should look something like this. Comment out or remove any existing dhcp lines for eth0.

auto lo eth0
iface lo inet loopback
iface eth0 inet static
address xxx.xxx.xxx.xxx
netmask xxx.xxx.xxx.xxx
gateway xxx.xxx.xxx.xxx

This second part will probably already be ok, but double check to be sure.

sudo nano /etc/resolv.conf

# Generated by NetworkManager
nameserver xxx.xxx.xxx.xxx (enter your dns server ip)
nameserver xxx.xxx.xxx.xxx (enter your alternate dns server ip)
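After saving both files, restart networking so the changes take effect (using the init script from Ubuntu releases of this era):

```
sudo /etc/init.d/networking restart
```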

Tuesday, November 11, 2008

Another image

Continuing my computer-generated image theme from before, I bring you another one:

Sunday, October 26, 2008

Computer Stores with Linux Pre-installed.

Traditionally, installing a new operating system has been an intimidating task. This was especially an issue with (GNU/)Linux, as it was difficult to get all of the hardware working correctly with the software. In recent years this has gotten much easier. However, most computer users would still prefer to get their system pre-installed. Another advantage of getting Linux pre-installed is that companies that offer this ensure that the hardware they include works well with Linux. There are several companies selling systems with pre-installed Linux. I haven't found a complete list, so I'll provide a list of the ones I have found. Note that I haven't had any personal experience with these companies - this isn't an endorsement.
I am sure there are a lot of other companies out there. Feel free to drop a line if you know of some that I missed.

Computer Generated Images

I have been working recently on an evolutionary computation system of my own design. I created it to evolve programs that generate computer images. I have been asked to share the most interesting images, and this seems like an appropriate place to do so. Eventually I hope to use a similar system to help solve more practical problems. I am particularly interested in applying the same idea to optimizing computer code.

This is the initial base image. I wrote the program to generate this image and used that program as the original "ancestor" for all of the images. All of the other images are based on random changes (mutations) to this original image.

The circle is gone in this one and much of the left side has disappeared. The left side represents negative coordinates, so changes often relate to the origin of the image. I've chosen the center to be the origin for these programs.

With random mutations you might expect to see a lot of images with random 'junk' in them. Actually, these types of images are less common because the images are based on logical rules. I find these images combining order and chaos really interesting.

In this one the circle in the middle has become a horizontal line. The circle in these images was created by calculating the distance from the origin, so it appears that X (horizontal distance) has dropped out of the equation in this one; it calculates the distance from the origin only along the vertical y scale.
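In formula terms (my reconstruction of the idea, not the program's actual code), the circle comes from the usual distance function, and the mutation appears to have dropped the x term:

```
d(x, y) = sqrt(x^2 + y^2)    original: level sets are circles around the origin
d(x, y) = sqrt(y^2) = |y|    mutated: level sets are horizontal lines
```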

Of all the ones I've seen so far I think this one is my favorite. You can see the similarity to the original, but it contains some new effects that weren't part of the original at all.

Dramatic, I think, is the word. Great contrast between the inner purple/pink and the outer green.

The images are a bit pixelated because they were done at 64 x 64 pixel resolution. The programs work on a huge virtual resolution, so it is possible to scale them up. However, at the moment this is a very slow process because it takes a long time to calculate the value for each pixel; 64 x 64 images can take over a minute to create.