Installing RAID 0 in Ubuntu 10.04 (Lucid Lynx)
In a previous post, I was working toward having a dual-boot Windows XP and Ubuntu 10.04 system, with a RAID 0 array for at least the Ubuntu installation. Here, I describe the process I went through to set up that array. I had previously had a WinXP/Ubuntu dual-boot system on a single Western Digital Velociraptor drive; the main change here was adapting that to the RAID scenario, and seeing whether it would be faster.
I started with a very helpful video by amzertech (actually, two videos). A detailed description of the process appears below. To summarize, I had to boot the alternate Ubuntu CD and use it to create root ("/") and swap partitions on each drive in RAID format, along with a /boot partition on one drive.
Troubleshooting
It all seemed to go pretty smoothly, and certainly other people seemed to have had good luck with that video. But when I was done and tried to boot the system without the CD, I got "error: no such disk" and then a "grub rescue" prompt. And in the FastBuild Utility discussed in the previous post, I was no longer seeing two drives configured into one logical disk set; they were back to being two separate JBOD ("just a bunch of disks") drives. So it seemed that what I had done previously had not been necessary.
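In retrospect, the grub rescue prompt itself could have told me something. It accepts a few diagnostic commands; here is a sketch, where the (hd0,1) partition name is just an example and the numbers vary by machine:

grub rescue> ls            # list the drives and partitions GRUB can see
grub rescue> ls (hd0,1)/   # look inside a partition, hunting for the one holding /boot/grub
grub rescue> set           # show where GRUB currently thinks its files live

I didn't know about those commands at the time, so I went hunting for other fixes.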
After digging around in a search, I found a post saying that the problem was solved by undoing the BIOS edits I described in the previous post. So I rebooted, hit Del, went into BIOS > Integrated Peripherals, and set the OnChip SATA Controller to Disabled. This greyed out the other two items that I had changed and returned them to their previous settings. I saved and rebooted.
This time, I got "DISK BOOT FAILURE, INSERT SYSTEM DISK AND PRESS ENTER." A search led to a Tom's Hardware thread that advised, among other things, adjusting the boot order in the BIOS. So I tried setting the hard drive to boot first, instead of the CD-ROM drive, but I got the same error again on reboot. That thread contained many other suggestions: update the motherboard's BIOS, make sure all cables are firmly connected, disconnect all other drives, use other software (e.g., Disk Boot Manager), check that the jumpers on the hard drives are set correctly (at least for PATA drives, which mine weren't), fiddle with BIOS settings, and so on.
A BIOS update was a possible solution, but I began instead by rearranging cables a couple of times. My goal there was to match up the hard drive on which I had installed the /boot partition (I didn't know which of the two it was) with the lowest-numbered SATA connector on my motherboard (i.e., SATAII0, as distinct from SATAII1, SATAII2, etc.). But that didn't do it either; I still got the DISK BOOT FAILURE message. I also tried going back into the BIOS and setting Hard Drive as the first, second, and third boot options.
Following some other suggestions, I went back into the BIOS and enabled the OnChip SATA Controller as Native IDE type (even though these were SATA drives). This gave me a new error: "MBR Error 1. Press any key to boot from floppy." I tried reversing the cables, in case I had the drives in the wrong order. And that did it. Success! Ubuntu booted. I restarted the computer, set the BIOS back to boot first from USB, second from CD-ROM, and third from hard drive (as it had been previously), and thus verified the solution: set the BIOS to Native IDE type, and make sure the drive with the /boot partition is connected to the first SATA connector on the motherboard.
Assessment
In Ubuntu, I went into Synaptic, installed GParted, and took a look at what had happened. I had used two 640GB drives for my RAID 0 array. In an earlier time, this would have been extravagant; by now, however, prices on such drives were in the basement. But I did still wonder whether I could use the rest of these drives for some other purpose. I had allocated an absurdly large 100GB space (50GB per drive) for my root partition -- leaving a total of more than 1TB unused!
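(For anyone who prefers the command line, the same installation can be done from a terminal instead of Synaptic:

sudo apt-get update
sudo apt-get install gparted

Either route gets the same package.)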
So now, in GParted, I saw three drives: md0, sda, and sdb. md0 was a net 93GB (GParted's binary-units reading of the 100GB array), with no further information. sda essentially had the partitions I had set up in the RAID setup process: the root partition, the /boot partition, the swap partition, and about 540GB left over. sdb had its own matching root and swap partitions, though they were labeled as having an "unknown" file system, plus a matching 540GB unallocated.
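GParted's view of the md device was pretty thin, but the kernel's own view of an array is available from a terminal. A sketch, using the device name as it appeared on my system:

cat /proc/mdstat              # kernel summary of active arrays and their member partitions
sudo mdadm --detail /dev/md0  # RAID level, chunk size, and the state of each member disk
df -h /                       # confirm the root filesystem actually lives on the array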
More RAID Partitions
Could I use that 1TB of unused space? I wouldn't have put my data there -- RAID 0 had twice the risk of data loss, in the sense that either drive's failure would take down the array -- but there were some other possibilities. In particular, I could store my VMware virtual machines (VMs) there, as long as I kept backup copies elsewhere; they would run faster on the RAID array. And I also wanted to set up a separate /home partition. But should I have created these partitions while I was going through the initial RAID setup? And could I store anything on just one hard drive or the other, or did everything on these two drives have to be set up in a RAID format now?
I decided not to research the option of storing things on just one drive or the other at this point. Apparently it was possible, and could even be done after the fact from within Windows XP. Since there would be a lot of empty room left over after this RAID installation anyway, I would just let it sit on the drives as unformatted space for the time being. But for the VMs and anything else (video editing files?) that might call for the performance of RAID 0, I decided that I did want to make use of some of that space. And I wanted it to be in separate partitions, for backup purposes, not part of the Ubuntu RAID installation mentioned above. I figured the contents of these partitions would change more frequently, and would require a different backup schedule, than the root program installation.
RAID 0 Setup: Detailed Description
So now that I had my BIOS and my drives and everything else in order (above), I restarted the process of setting up the RAID 0 array, and this time I took notes. I booted the alternate CD and went into Install Ubuntu. I went through the initial setup options (language, keyboard, etc.). When it got to the partitioning screen, I chose Manual. Now I saw that, actually, I didn't have to undo what I had already done. The 100GB RAID 0 device I had already set up would be just fine. I could just arrow down to the FREE SPACE items and add stuff there.
So I did that. For each of the two drives (sda and sdb, on my system), I selected FREE SPACE and hit Enter > Create a new partition > 50 GB (giving me 100GB total) > Continue > Logical > Beginning > "Use as" > "physical volume for RAID." Then I chose "Done setting up the partition." Back in the Partition Disks screen, I arrowed up and hit Enter at "Configure software RAID" > Yes. Next, Create MD device > select the two items shown as 49999MB (i.e., about 50GB) > Continue > Finish. This put me back at the Partition Disks screen, but this time I had a new RAID 0 device of 100GB. I selected that device, hit Enter > Use as > Ext3 (more reliable than Ext4) > Enter. I set the mount point to /home and the mount options to relatime, labeled it UHOME, and then selected and hit Enter on "Done setting up the partition." I repeated this process, starting by selecting FREE SPACE: I created another pair of 200GB logical partitions, combined them into a second RAID 0 device, labeled it RAIDSPACE, and set it to mount as /media/RAIDSPACE. My VMs would go in a folder on this partition. I still had 700GB left over, but at least I had made a stab at converting some of that unallocated space to a useful form.
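Incidentally, the alternate CD's menus were doing roughly what the mdadm command does on a running system. For the record, here is a sketch of the equivalent commands; the partition numbers (sda6/sdb6, sda7/sdb7) are hypothetical stand-ins for whatever the partitioner actually assigned:

# pair the two new 50GB partitions into a RAID 0 device for /home
# (on my system, the /home array ended up as /dev/md2 -- see the fstab step below)
sudo mdadm --create /dev/md2 --level=0 --raid-devices=2 /dev/sda6 /dev/sdb6
sudo mkfs.ext3 -L UHOME /dev/md2

# pair the two 200GB partitions into another RAID 0 device for RAIDSPACE
sudo mdadm --create /dev/md1 --level=0 --raid-devices=2 /dev/sda7 /dev/sdb7
sudo mkfs.ext3 -L RAIDSPACE /dev/md1

The installer simply wrapped these steps in menus.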
I noticed, in this process, that my previous root partition was no longer set as root, so I configured it again. Now, when I went to Configure software RAID, it appeared that Ubuntu was going to be reinstalled there. I went ahead with that. With those changes made in the Partition Disks screen, I arrowed down, selected "Finish partitioning and write changes to disk," and hit Enter. It gave me an option of formatting the root partition, but since I had already installed Ubuntu there, I didn't want to do that. This put me at an empty blue screen for a while, but then it began installing the base system. So maybe I should have let it format the root partition after all. It went through the installation process, told me that this seemed to be the only operating system on this computer, and asked whether it was OK to install the GRUB boot loader to the master boot record. I said yes.
When it was done, it went into Ubuntu. I reinstalled GParted and took another look. At first it looked like I might have done something wrong: the only md device was md0, the 93GB partition from before. But GParted did show the UHOME and RAIDSPACE partitions on sda, with matching "unknown" partitions on sdb, all with RAID flags next to them, and Nautilus showed RAIDSPACE as a legitimate partition with 348GB free (19GB used already!). So, no, everything seemed OK.
Bringing Stuff Over to the New Installation
I shut down the machine, connected a disk containing files from my previous system drive, booted with a live CD, and copied some things over. Specifically, I installed a third hard drive and, while booted with the live CD, used GParted and Nautilus to prepare a partition on it and then copy over all of the files from my previous machine's Windows XP installation. I also tried to copy the /home partition from my previous installation, to replace the contents of the /home partition in the RAID array, but the array was not accessible from a live CD boot. So I copied the previous /home folder to that newly installed third hard drive instead. (Since the previous system drive was bootable, I made sure not to start the system from a hard drive (i.e., without a live CD) while it was connected, lest it screw up my new installation.) I got an error message for just one file, a Google Earth file: "Can't copy special file." A "special file," according to Andreas Henriksson, is something like a pipe, device node, or socket. I would be reinstalling Google Earth anyway, so this was no problem.
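A note on the copying itself: files copied in Nautilus under sudo can end up owned by root, which matters for a /home folder. A more careful alternative would have been rsync from a terminal on the live CD. A sketch, where /media/oldhome and /media/thirddrive are hypothetical mount points:

# -a preserves permissions, ownership, timestamps, and symlinks; -v lists files as it goes
sudo rsync -av /media/oldhome/ /media/thirddrive/home_backup/

The trailing slashes matter: this copies the contents of oldhome into home_backup, rather than nesting an oldhome folder inside it.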
When those copy operations were done, I disconnected the previous drive and rebooted from the hard drive, this time with an external USB drive connected. The external drive contained my previous fstab and some other materials that I needed now, as I began to work through the Ubuntu post-installation adjustment process described in another post.
The first step of that process required some adjustment for the RAID situation. The /home partition in the RAID array wasn't available via live CD. So I tried the technique of commenting out the regular fstab line for /home and replacing it with a line referring to /dev/md2 (where the home partition was to be) and /home_tmp (instead of /home). I typed "sudo mkdir /home_tmp" and then rebooted. Now, if all went well, the system would think its /home folder was in /home_tmp. On reboot, I typed "sudo nautilus," went into /home (not /home_tmp), deleted its "ray" folder (my username), and replaced it with a copy of the "ray" folder from my previous installation. Then I typed "sudo gedit /etc/fstab," deleted the line referring to /home_tmp, and rebooted. Then, again in sudo nautilus, I deleted /home_tmp and rebooted once more. All was good: my settings from the previous setup were back.
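To illustrate, the relevant part of /etc/fstab looked roughly like this during the swap; the UUID shown is a placeholder for whatever the installer had actually written for /home:

# UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx /home ext3 relatime 0 2   <- original line, commented out
/dev/md2   /home_tmp   ext3   relatime   0   2

The fields are: device, mount point, filesystem type, mount options, dump flag, and fsck pass number.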
I proceeded through the remaining steps to configure my new Ubuntu installation, as described in that other post. The RAIDSPACE partition was not available to me as a normal user, but I wanted it to be, so I typed "sudo nautilus," right-clicked on RAIDSPACE, and changed its permissions. Then I copied my VMs from the external drive to a VMs folder in RAIDSPACE. I was getting an error message on reboot, "Ubuntu is running in low-graphics mode," even though it did not actually seem to be. Also, I noticed that the GRUB menu was no longer remembering the operating system it had last used; it was defaulting to Ubuntu in every case. I was not the only RAID user who had this problem. But otherwise, the previous post pretty much covered the adjustments required to get my system back to normal.
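The permission change could equally be made from a terminal. A sketch, assuming the partition is mounted at /media/RAIDSPACE and the username is ray:

sudo chown -R ray:ray /media/RAIDSPACE   # hand the whole partition to my regular user

As for GRUB forgetting the last-used operating system: the usual GRUB 2 approach is to set GRUB_DEFAULT=saved and GRUB_SAVEDEFAULT=true in /etc/default/grub and then run "sudo update-grub." My understanding, though, is that GRUB cannot write its saved-choice file (grubenv) to a RAID device, which may be why RAID users in particular report this problem.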
As mentioned above, I had copied over the Windows XP files from what used to be my system drive to the third hard drive now installed in this computer. When I ran "sudo update-grub" to consolidate the changes I had made to the GRUB2 menu while making those adjustments, it said, "Found Microsoft Windows XP Professional on /dev/sdc1." GParted said that sdc1 was the right place -- it was the NTFS partition to which I had copied those files. I wondered if just copying WinXP files from one drive to another in Ubuntu was sufficient to create a working WinXP installation in the new location. So now I rebooted and, in the GRUB2 menu, I chose Windows XP. And, what do you know, it worked! Just like that. No GRUB errors or anything. I had to adjust a few settings in WinXP, but for the most part things were in good shape.
The Acid Test
So that seemed to pretty much wrap up the process of converting my dual-boot WinXP/Ubuntu system from a single hard drive to a RAID 0 array. I was sure there would be other changes to come, but it was time for the acid test: I wanted to see how VMware Workstation performed in the RAID 0 environment. It had been dragging, functioning very slowly for a long time, on the computer that this one was going to replace. It had run more quickly on this replacement computer in the single hard drive setup; no doubt the Velociraptor helped. But how did it do on the dual-drive array?
Let me say, first of all, that the general startup process in Ubuntu was darn snappy. I noticed it right away. Boom! My startup programs all came to life pretty smartly. Inside VMware Workstation, likewise, performance seemed faster than it had been in native WinXP on the Velociraptor. I was sure there would be much additional learning, but this had been a real step forward.
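For anyone wanting a number rather than an impression, a crude sequential-read comparison is available with hdparm. A sketch; this measures only buffered reads, not real-world seek behavior:

sudo hdparm -t /dev/sda   # one drive on its own
sudo hdparm -t /dev/md0   # the RAID 0 array; in theory, approaching twice the single-drive figure

I'll leave proper benchmarking for another day.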