Monthly Archives: November 2010

Compiling Emacs 23.2 From Source (Part 1)

As part of my goal to learn Linux internals from LFS, I’ve done some reading and experimentation with compiling software. Learning to compile software from source is widely considered an essential *nix skill, as many patches, tweaks, and settings can only be added or modified at compile-time. Also, building an LFS system requires compiling everything from source yourself, so I’ve got to learn sometime.

Emacs, a text editor (in)famous for its extensibility, is the first software package I’ll practice compiling from source. Why start with Emacs? Emacs can be bent into doing pretty much anything you’d expect from a full-featured OS — e-mail, web browsing, music playback, Tetris, psychiatry… even text editing! — so there are obviously plenty of opportunities to learn how to compile special abilities into Emacs. Also, along with vi/vim, Emacs is a standard text editor for *nix systems, so I need to become at least somewhat proficient with it. Even the LFS essential prereading guide recommends Emacs as a starting point for package-building practice, so it’s probably a pretty good place to start… right?

So I began my dance with Emacs by grabbing the source for Emacs 23.2 from the gnu.org FTP server (~45MB) and extracting the .tar.gz to my Desktop folder (this is a good time to read any README files).
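For reference, the whole fetch-and-extract step boils down to something like this (the exact FTP path is from memory, so double-check it against gnu.org):

    wget ftp://ftp.gnu.org/gnu/emacs/emacs-23.2.tar.gz   # the ~45MB source tarball
    tar -xzf emacs-23.2.tar.gz -C ~/Desktop              # unpack to the Desktop
    cd ~/Desktop/emacs-23.2 && less README               # read the README first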

Next, I ran ./configure, a shell script shipped with Emacs that generates Makefiles (detailed instructions for compiling from source) based on a computer’s hardware/software configuration (and any special parameters you give it), and I hit my first snag:

Running ./configure

As you can see, ./configure ran, but complained that several libraries were missing (image libraries in this case). To see if I could bypass the error and get Emacs building at all, I reran ./configure with every image library it had complained about switched off:
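    ./configure --with-xpm=no --with-jpeg=no --with-png=no \
                --with-gif=no --with-tiff=no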

Running A Slightly Different ./configure

Success! Once the ./configure script had run cleanly (i.e., created valid Makefiles for compilation), I ran make, the slightly automagical compile-from-source tool. After a minute or two, make finished with no errors.

NOTE: make compiles the source code in place (by default, inside the source directory itself), which makes it ideal for testing (run make clean to “reset” and try again if something breaks). make install, however, copies the results into the normal program installation directories, so mistakes there have more dire consequences for your time, as you have to hunt down many more loose ends.
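Concretely, the safe experimentation loop looks something like this (Emacs’s -Q flag, which skips loading init files, is handy for a quick sanity check):

    make                  # compile in place, inside the source tree
    src/emacs -Q          # test-drive the fresh binary without installing
    make clean            # wipe the build products and start over
    # sudo make install   # only once you're happy with the result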

This Means It's Finished, Right?

I opened up Dolphin and browsed to the ~/Desktop/emacs-23.2/src/ folder, and ran the freshly-compiled emacs:

Very... 1990s

It looked pretty shabby, but I suspect Emacs would be perfectly serviceable in this state. Compared to the Ubuntu package, you can see that there are definitely a few things missing from my homebrew build, including color image support and (semi-)native theming.

Very... 2000s

It seems I have a way to go before I get something as polished as Ubuntu’s tweaked-out version. However, I did manage to compile a program from source for the first time! Next time, I’ll try to get those image libraries working, along with a few other tricks and tweaks. Until next time, happy hacking!

Alternative To The “200 Lines Kernel Patch That Does Wonders” Which You Can Use Right Away: It Worked For Me!

This article is absolutely priceless! Ever since I first heard about the 200-ish line kernel patch that improves Linux desktop interactivity by, oh, an order of magnitude or better, I’ve been champing at the bit for K/Ubuntu to update their kernels with the patch… but apparently the patch was predated by some bash-y proof-of-concept magic. It turns out a few tweaks in your configuration files can do the same thing as this amazing patch, for those people who don’t immediately custom-roll a kernel when cool patches go out (read: “most people”). I was even more excited to see that there is an Ubuntu-tailored version (a little more involved, thanks to Ubuntu-y quirks), and my desktop experience has gone from a “meh, it’s okay” to a “whoa!”

The smoothness (no more lag!) and snappiness of my desktop are absolutely stunning — things that before I just took for granted as slow (like HD video when I do anything else in the background, file operations, that pause before menus and notifications are actually drawn) are now practically realtime. Applications themselves take just as long to load, but now everything doesn’t… take… forever… to… finally stop lagging so I can get to work. I would definitely recommend at least trying this out (after a thorough backup, of course — just in case).
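For the curious, the trick boils down to giving each terminal session its own scheduler cgroup, so a heavy background task (a big compile, say) can’t starve the rest of the desktop. Below is a rough sketch from memory; mount points and details varied by distro (which is exactly why the Ubuntu-tailored version exists), so follow the linked articles rather than this:

    # At boot (e.g., in /etc/rc.local): mount the cpu cgroup hierarchy
    mkdir -p /dev/cgroup/cpu
    mount -t cgroup -o cpu cgroup /dev/cgroup/cpu
    mkdir -m 0777 /dev/cgroup/cpu/user

    # In ~/.bashrc: drop each new shell (and its children) into its own group
    if [ "$PS1" ]; then
        mkdir -m 0700 /dev/cgroup/cpu/user/$$
        echo $$ > /dev/cgroup/cpu/user/$$/tasks
    fi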

Kernel developers +1!

Grumble…

Readership of BG, I am unfortunately going to have to renege temporarily on my promises that this would be a productive week. As the holidays draw inexorably nearer, my responsibilities as a husband, son(-in-law), friend, and host are each in turn being called upon. The short notice is thanks to my guests also giving short notice of their plans, as well as other cosmic rearrangings of previously well-laid plans.

In short? I won’t have the “extra” hours that I pare away from my schedule for BG on an irregular basis from this week through the end of December. I will necessarily be writing less, and pursuing BG projects less as well, but it is, I emphasize, a temporary measure.

That being said, I look forward to producing all the quality content I can squeeze out of my overworked brain and body, just for you, my readers 😉 . Happy Hacking!

Getting Started With Git (And GitHub)

In my first (and only, so far) ~/random post, I mused:

#3: Am I The Only One?

I hear over and over again how important it is to have a mentor, or at least a partner, when you code so there is always both pressure to do better and a check to make sure you’re considering all the open options. However, since I live in Hannibal, Missouri, the opportunities for doing so are… limited.

Okay, they’re as close to nonexistent as could be.

However, Breck Yunits’ post on how to get yourself a mentor mentioned a way I didn’t think of previously — GitHub. His other ideas, though good, are a bit impractical for me right now, but GitHub seems like it may be a promising avenue for me to pursue.

GitHub: Social Coding

So, after the requisite several weeks’ procrastination, I signed up for a free account on GitHub. Soon, I should be able to do away with the My Code Archive page and just direct you to GitHub for a much snazzier experience when browsing all my code. Woot!

Signing up for GitHub was a completely painless process, to my pleasant surprise. Pick a username (which you can change later, but only once!), password, and e-mail account, and you’re ready to go! Creating a new repo on GitHub is a simple process as well; choose a project name, decide whether your repo will be public or private, and (optionally) add a short description and a project homepage. Most people won’t need a private repo, I’m guessing — they seem to be geared for small companies that want cloud-like dev tools but don’t want to maintain their own git server.

Now, in order to actually use GitHub’s repositories, you need an SSH keypair to communicate securely with GitHub’s servers and to authenticate your machine to them. There are numerous clear instructions available on how to generate and test a keypair, but the best and clearest tutorial was GitHub’s help page on generating SSH keys in Linux (they also have instructions for Mac OS X and Windows).
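The gist of that help page, as I remember it, is just three commands (substitute your own e-mail address, and paste the public key into your GitHub account settings):

    ssh-keygen -t rsa -C "you@example.com"   # generate the keypair
    cat ~/.ssh/id_rsa.pub                    # copy this public key to GitHub
    ssh -T git@github.com                    # test that GitHub recognizes you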

And now for the scary part — learning git.

They're Coming To Git Us!

Despite my worries to the contrary, git actually feels pretty sane to use, even for my tiny projects. A good beginner’s tutorial (for me, anyway) is the online gittutorial(7) Manual Page. For the projects that I’ve churned out so far for BG, I only need about a half-dozen commands to get started — which is a lot less intimidating than the whole library of 150+ git commands!

Basically, you navigate to your project directory and type git init, which readies the directory with a .git subdirectory that will contain all the VCS voodoo. Then you need to tell git which files to keep track of — likely all of the files in that particular directory, so git add . is probably what you want. You can also specify individual files and/or directories, which would look like git add file1 file2 directory1 and so on. Then, to make your first commit, type git commit and write whatever comments about the commit you want (you type your comments in a vi interface, so press i to start typing, hit the Esc key to go back to normal mode, then type :wq to write changes and exit), and the first milestone of your project is now set in stone! Okay, sort of 😉 . After you make changes to your code, just repeat those last two steps — git add . and git commit — and your new changes become the latest revision, with the previous version preserved safely in the project’s history.
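Here’s that whole starter workflow in one place (~/projects/myproject is a made-up stand-in for your own project directory):

    cd ~/projects/myproject   # hypothetical project directory
    git init                  # create the .git subdirectory
    git add .                 # stage every file in the directory
    git commit                # record the first commit (opens the editor)

    # ...hack away on your code, then repeat:
    git add .
    git commit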

There are plenty of other commands to explore, but those are the ones that I’ve experimented with (successfully) — I’ll get to the others as I grow more proficient with git. I have also looked at some graphical frontends to git — namely qgit and gitk — but I want to become acquainted with how git itself works before I start using other tools.

I don’t have my BG projects hosted on GitHub just yet — I’m working on migrating my current hand-crafted “VCS” system to git, and I’m probably going to practice a good bit before I push everything to GitHub. However, when I do push everything to GitHub, it shall be announced with fanfare and page restructuring (yay!).

In the meantime, have a wonderful weekend, and happy hacking!

…Too Quiet

I’ve been fairly quiet the past week or so, but I have been writing several posts behind the scenes. Nothing is quite ready for general consumption, but I am working on a few interesting things, like learning to compile Emacs from source and getting started with git and GitHub, so don’t worry — I’m still here! I’ll do my best to have a post ready tomorrow, and next week will no doubt be more… wordy… than this past week has been. Happy hacking, all!

Still Working On Piggybacker

As my boss well knows, I’m still working on getting Piggybacker up to snuff as far as Windows XP goes. Most of the delay comes from having office responsibilities beyond doing web development for eight hours a day, but part of it is that configuring XP is still horrible and time-consuming. On the upside, Piggybacker’s BIOS is now from late 2005 rather than mid-2003, all her drivers are ready to go, and her battery is being calibrated happily in the background as I write these words. So to fill the silence of the last few days, I’ll let the Interwebs in on what I’ll be using Piggybacker for once I get her up and running again.

Web Development (Windows XP – Work)

Go ahead and look at Scripture Memory Fellowship’s website and tell me that it doesn’t need some work! It is a vast improvement over the old smf.org site, which hadn’t been updated since March 2006 by the time the current setup replaced it in July 2009. Regardless, smf.org needs some major polish, and I want both to rework the old site into a more web-appropriate form and to add blogging features so we can start building a more interactive online community. Eventually, Facebook and other social media will enter the scene, but probably not until next year (we’re working on synchronizing new smf.org versions to annual inventory turnover).

As far as the technical nitty-gritty goes, I think I may use Piggybacker only as a server and do the heavy-lifting development on Limited Edition (which I bring to the office during the week), since a Pentium M 1.6GHz doesn’t do batch image processing and the like particularly quickly. Also, using the 1920×1080 monitors at work is waaaay nicer than Piggybacker’s 1024×768 LCD :mrgreen: .

Generic Office Work (Windows XP/Crunchbang – Work)

I don’t do much office work in terms of traditional document and presentation creation, but when I do, both XP and #! (read: Crunchbang) have the tools for it in one fashion or another. (Yes, I know, it’s boring, but it does help to point out that you really can get office work done in Linux OR Windows equally well in most cases. Seriously, has business writing advanced so much in ten years as to recommend the latest MS Office unless it comes included with the hardware? I think not. Especially since Microsoft itself thinks OpenOffice.org is a worthy contender.)

Sysadmin Experimentation (Crunchbang/Windows XP – Work)

I’m hoping to use Piggybacker to explore Windows/Linux interactions and administration within SMF’s LAN, especially once we get our VPN and new file server up and running. Basically, Piggybacker is the most expendable piece of hardware I’ve got, and I’d really love to start learning some real-world Linux system administration, even on a small scale.

General Linux Experimentation and Tomfoolery (Crunchbang and other Linuces — Home)

Piggybacker is not particularly underpowered, but she appreciates every break she can get. Crunchbang, especially compared to XP, is a major breath of fresh air in terms of raw performance, and is much more appropriate for an underpowered machine in many ways (e.g., #! has a powerful CLI, many small utilities, many customization options, a small OS footprint, etc.). However, I’d like to explore other distros, window managers, utilities, etc. to see what I can really make a fairly humdrum office-standard laptop do.

Piggybacker Lives! Or, The Virtues Of A CF Card Put To Unconventional Use

Preparing the Hardware

I finally got my CF-to-IDE adapter for Piggybacker! It arrived in the mail last Tuesday, and it does work! I did have to bend back one of the adapter’s pins and wrangle the card into place with some nearby pens and a screwdriver, but it’s snugly in place now. Since the original hard drive caddy was basically stuck to the (broken) original hard drive thanks to a too-small Torx screw, the CF card and adapter are basically just floating in the midst of a large gaping hole on the left side of the laptop instead of being protected in the old caddy. I don’t foresee this being a problem during normal use (sitting on a desk all day), but I’m keeping an eye out for a cheap/free Torx set anyway.

Choosing the OS(es)

I use XP for web development at work for compatibility’s sake, so I headed first to nLite, a priceless tool to create customized Windows XP install CDs. I made a slimmed-down version that suited my purposes, but it took some time because nLite kept crashing. Eventually, I resorted to The Pirate Bay for an untouched retail OEM ISO, which let me get past the first stage without crashing. It’s an odd problem, as I’ve used nLite before with Piggybacker’s i386 folder and Limited Edition‘s i386 folder with no problems, and I couldn’t find any evidence of anyone else having problems remotely like mine.

As for Linux, I decided on Crunchbang 9.04.01 (hereafter referred to as #!) as my distro of choice. I contemplated using #! 10 Statler, but it’s still in alpha and I’m perfectly happy with 9.04.01, so I declined to wrangle with Statler for now. I also tried out Linux Mint 9 Fluxbox, Lubuntu 10.10, and a couple other light distros, but I kept coming back to #! for its great assortment of preinstalled programs and its ease of use. However, I reserve the right to change my mind later. Again. (God bless Linux!)

Setting to Work

So with the major building blocks ready, I started down the road of getting Piggybacker ready for prime time as a dual-boot XP Pro SP3 and #! 9.04.01 machine. I started out by divvying up the CF card like so:

  • (/dev/sda1) 8GB ext4 / partition for #!
  • (/dev/sda2) 1.5GB swap partition for #!
  • (/dev/sda3) ~22GB NTFS C:\ partition for XP

XP installed just fine, and booted happily. #! also installed and booted just fine (though #! took about 40 minutes to be usable, versus XP’s 2-3 hours). A VICTORY FOR CF-BASED COMPUTING!

Troubleshooting, Of Course

However, GRUB was unable to boot to XP, giving a mysterious “out of memory” error message. As it turns out, this was because the BIOS couldn’t access the XP partition for the relocated XP bootloader (when XP’s bootloader was in the MBR instead of GRUB, the problem naturally didn’t present itself). So I reworked the partition table like so:

  • (/dev/sda1) 128MB ext4 /boot partition for #!
  • (/dev/sda2) ~22GB NTFS C:\ partition for XP
  • (/dev/sda3) 8GB ext4 / partition for #!
  • (/dev/sda4) 1.5GB swap partition for #!

Several hours later, with XP and #! reinstalled, I tried to boot XP… and it worked! Everything is at least baseline ready for me to configure now, but XP will require significantly more work with proprietary drivers and such — I can’t even get Ethernet to work without a driver from Compaq! Seriously, check out the drivers and software page for the N620c to see how much is really needed to get XP working with the “Designed for Microsoft Windows XP” hardware.

As an aside, how can people say that Windows is “easier” than Linux when even a quite techie-oriented distro installs much faster (using a graphical installer, too!), configures all my hardware automatically (even got the weird keyboard layout, though not the custom media keys), and boots to the desktop in about half the time of XP? Is it confirmation bias? Who knows…

CF vs. Conventional Hard Drives

Piggybacker is now running both Linux and Windows quite happily, and I’ve had some time to reflect on the differences between hard drive and CF-based performance.

#! 9.04.01 on Piggybacker

Most striking at first is that Piggybacker is almost totally silent with a normal workload, the only noises being the occasional fan startup and this weird clicking that comes from who-knows-what in the bowels of the upper left corner of the case. (I’m actually kinda scared to investigate further.)

As far as performance is concerned, boot and program load times are spectacularly faster in both #! and XP. The famed Linux I/O scheduling bug brings my system to a crawl when write-heavy tasks like system updates are going on, but this is not a very noticeable issue during normal desktop use. Things slow down noticeably in XP during intense I/O operations, too — especially during startup, when it can be a minute or so after booting to the desktop before the desktop is really usable — but XP is usually a bit more responsive during this time than #! would be. Honestly, nothing gets done in either OS until the heavy stuff is over, but XP gets points for having a slightly more responsive desktop environment (which are promptly deducted for XP taking forever to actually let you work on it 😛 ).

Using CF, I get about 15-30% extra battery life (translates to roughly 20-40 minutes in my case), depending on use. Not amazing, but certainly a nice side benefit.

Weight-wise, I really can’t tell a difference at all without concentrating hard — but then again, weight isn’t a compelling reason for using CF in this manner unless you’re talking about a really small computer and you’re unwilling to sacrifice a few grams in exchange for true SSD performance.

One thing that surprised me a lot was that Piggybacker now runs significantly cooler with a CF card. Part of this might be the newly gaping hole in the side providing extra ventilation — the old hard drive got very hot during intense read/write sessions, to the point where you couldn’t have Piggybacker touching bare skin for longer than a minute or two — but it might also be that the CF card just doesn’t get as hot. Regardless, I’m quite thankful for it, because although I rarely use Piggybacker on my lap, when I do, it could be a pain (literally) from the excess heat.

The Verdict?

I have an old CF card with pretty abysmal read/write speeds, but it has turned out to be more than adequate for my daily needs (both personal and professional) and the conversion even conferred some unexpected benefits. The cost was comparable to buying a new hard drive from eBay, and was only slightly more technically difficult than replacing a hard drive normally is. I’d say for most people, it’s not really something to jump out and try — but for those who need to squeeze an extra bit of life, performance, and battery life out of their machine on the cheap, a CF card is hard to beat.

I would definitely recommend being right next to another computer with web access while doing this — I encountered many unfamiliar errors during the process, and being able to search for solutions at the same time as I was encountering the problem was invaluable. Also, you can use the longer-than-normal install time (remember, CF cards aren’t speed demons to write to) to look for cool post-install resources.

I would like to credit the ever-resourceful K.Mandla for the original idea behind this endeavor (which has morphed considerably over time), and evidex for being a copycat (so I don’t feel as bad for being one myself 😉 ).