Virtualization in Ubuntu: VirtualBox or Wine Replacing VMware?
In previous posts, I have described the process of installing Ubuntu Linux and becoming familiar with a Windows XP running in a virtual machine (VM) guest on an Ubuntu host, and of coming around to the point of favoring several smaller WinXP VMs rather than just one big VM. (Many of the terms and issues taken for granted in this post have been explored at length in those previous posts.) Now it was a new day, and I wanted to try to put together what I had learned into a working system. To review, the basic concept was as follows. I had two computers. On my primary computer, I had installed WinXP and Ubuntu on a dual-boot system. I had found, unfortunately, that the dual-boot approach led to boot complications. It occurred to me that I really didn't need to make the primary computer dual-booting, because the secondary computer was also capable of dual-booting. If I had a need to boot into native Windows XP, I could use the secondary machine. Making the primary machine a monobooter -- a purely Ubuntu machine -- would also mean that I could leave it up and running for days, just hibernating it at night. Hibernation was working very reliably on Ubuntu, so far, and that way I would be almost instantly back at where I left off when I started work again each new day. I had already installed WinXP on the primary machine when I came to this monobooting decision, so I decided to leave that basically functional WinXP installation in place, but I used the GPartEd CD to hide that Windows partition. In a real pinch, I could unhide it and boot it, probably; but I didn't expect to need it very often. In Ubuntu on the primary machine, I would then install VMware and would run virtual machines. I would use several of them, instead of just one, so as to make them smaller and free up more system resources when I did not need a particular WinXP program. So, for example, I might have a VM for Adobe Acrobat and Microsoft Word since I tended to use both of those programs together, but might then have a separate VM for Adobe Premiere Elements, the video editing program; and when I was trying to work with video, I might close down the Acrobat/Word VM. Each VM would come equipped with the basics, including Internet Explorer (preinstalled with Windows), printer support (hopefully; I had not worked through that yet), and other basic tweaks and tools. At most, I would probably have three or possibly four VMs running at once. It remained to be seen whether this would require an upgrade in my processor and/or RAM. On the secondary computer, meanwhile, I would be dual-booting Ubuntu and WinXP. I would install 32-bit rather than 64-bit Ubuntu, mostly for purposes of variety, and at some point hopefully I would run VMware Player, so as to use the virtual machines created on the primary computer. So now I wanted to start over with a new 64-bit Ubuntu installation on the primary computer. I assumed, that is, that I would need to do a new installation, because the boot problems caused by my attempt to dual-boot seemed to have screwed up the GRUB bootloading program to the point where I could not get it to run properly, and also because the Ubuntu installation was not running Firefox properly and I had not been successful in my attempts to uninstall and reinstall it. Reinstalling Ubuntu was simple enough, so I felt I would probably do it in any case; but just out of curiosity, now that I had hidden the WinXP program partition, I tried rebooting the system to see whether Ubuntu would now be able to start. It wasn't. 
Instead, I got a WinXP "autochk.exe not found" error. This seemed to be a common, expected response when the Windows boot partition was hidden. I proceeded to reinstall Ubuntu on the primary computer. I was disappointed to see that GRUB did continue to register the existence of a Windows XP installation. But it defaulted to Ubuntu after a few seconds, so it didn't seem to matter. I didn't try to see if it would actually load WinXP in its hidden partition. Instead, I logged into Ubuntu and easily installed program updates and NVIDIA 3D video drivers. Now that I had gotten this far, I retired a bunch of tabs I had open in Firefox on the secondary computer, having to do with dual-booting and fixing your boot manager, as discussed in the previous post: EasyBCD, Mepis, WinGRUB, and other boot managers, as well as advice on restoring your WinXP boot loader from Linux, fixing GRUB, and so forth. Hopefully I wouldn't be needing them. Following advice on how to uninstall Firefox 3 and install Firefox 2, I entered these commands in Terminal:
sudo apt-get remove --purge firefox
sudo apt-get install firefox-2

That didn't work. Firefox Help > About said I was still running version 3.0.1. Ubuntu's System > Administration > Synaptic Package Manager said I was running both Firefox 2 and 3. Following other advice, I used Synaptic to uninstall everything that was Firefox-related, and then (using Ubuntu's File Browser) went into File System > Home/ray (i.e., the folder for my user name, which was "ray"). There, I selected View > Show Hidden Files. This enabled me to see the hidden .mozilla folder. I deleted that folder. Now the Firefox icons were gone from the bar at the top of the Ubuntu screen, and also from Applications > Internet. Then, using Synaptic Package Manager, I searched for Firefox and installed firefox-2 and firefox-2-gnome-support. It installed. The acid test was that I could run Firefox and it was able to install the very useful Tab Mix Plus. I installed my other preferred Firefox 2 add-ons.

Then I installed VMware Workstation 6, using the Workstation User's Manual and following the steps I had worked out in my first post in this series. Now I wanted to do a backup of my Ubuntu installation so far. I had previously thought I would use PING for this purpose, but I hadn't liked the PING user interface or functionality. There had been some behavior that had not seemed stable. So I wanted to try something else. Acronis True Image had sounded good, but apparently it required a Windows installation, so that was out. Same thing with Paragon Drive Backup. There were server versions of both programs, and they might have worked, but they cost hundreds of dollars. Among the numerous freeware and paid backup solutions cited by many webpages, I ordered the house special: Simple Backup Suite, a/k/a Sbackup, which I could get directly through Applications > Add/Remove and which seemed much, well, *simpler* than a command-line solution like rsync, and disturbingly less complex than Charley Curley's Linux Complete Backup and Recovery HOWTO. I was concerned that I would discover the wisdom behind Mr. Curley's comprehensiveness at the worst possible time and in the worst possible way, but Sbackup really did seem to have been designed for, as they said, ordinary desktop users. So I followed the instructions and made a system backup. Evidently I did it wrong; there was nothing in the folder I designated as the target where the backup should be created. I realized I was not comfortable with the lists of folders to include or exclude, there in Sbackup, because I really had very little idea of what these different folders (/var, /usr, etc.) were supposed to do. What I still wanted was an imaging program that would indiscriminately capture it all -- that, in other words, would restore a complete working system, period.

As an alternative, I tried again to get Partition Image (also called PartImage) via the System Rescue CD, and this time I was able to figure out their download page. The recommended download command, "wget -c address," did not work, and neither did "wget -c /home," which was apparently supposed to be an option. The line under the "Downloading and Burning" heading that said, "Here is the direct download link" was, it turned out, a link for downloading wget.exe, not PartImage. Or instead of using that "very simple and reliable download tool," you could just click on another link, further down on the webpage, and get the download in that even simpler way. The download links were for x86 (i486-PC), SPARC, or PowerPC computers.
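A side note I picked up only later: a quick way to see what architecture a Linux machine reports itself as is the uname command. This is just a sketch; the exact output depends on the installation:

uname -m    # prints "x86_64" on 64-bit Ubuntu, "i686" on a typical 32-bit install
uname -a    # longer form, with kernel version and architecture details

That at least narrows down which of those download links is even a candidate.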
I was new to software for the AMD 64-bit CPU. I wasn't sure if it was considered part of the Intel 486 line of processors. It had been some years since I had last purchased a pre-Pentium CPU that was called a 486. I basically just had to download the ISO and, after that half-hour process, burn the CD, and then try using it to see if I had the right download. I wanted the CD even though PartImage was also available for installation via Synaptic Package Manager, because I wanted something that would restore my system in case Ubuntu wouldn't boot. I rebooted with the newly burned CD and did see something encouraging about AMD64. The CD stopped with a prompt, preceded by instructions that mostly went over my head. One of those instructions said, "Type wizard to run the graphical environment." So I did that. Next, I almost chose the "Graphical desktop using configured Xvesa (should work)," which sounded like it might be more helpful than "default Xvesa (should always work)," but then I decided to gamble on "Graphical desktop using Xorg (optimal but may fail)." It didn't fail. After poking around in the nice GUI for a minute, I clicked on the image of a CD and selected System > Partimage. It gave me a graphical interface which, while not very user-friendly, was manageable. After I completed the information and gave it the command to go ahead, for a while it showed nothing happening, 0% progress, but the hard drive light was burning, so I waited; and after a few minutes, it popped up an information dialog that seemed to say my backup had been created. So I hit Enter and, no, that was just a summary of what was going to happen. Now it started creating my backup. In 5 minutes and 53 seconds, it had "successfully finished," copying 3.92GB . . . to an unknown location. I couldn't find it. It didn't go where I thought I had said it should go. Searches turned up nothing. Well, OK, I hoped at least the CD was good for restoring after a crash. I thought about trying the Synaptic Package Manager (SPM) version from within Ubuntu. But I couldn't find that either. SPM showed it as being installed, but there was no icon for it on the Application, Places, or System menus. I tried searching File System for "partimage*.*" but that, too, turned up nothing, at least not within the first five or ten minutes. ("File system," as I realized after those five or ten minutes of searching, apparently included all files on all drives. I wasn't sure how long a search of my entire computer would take.) They really seemed to want to keep this Partition Image program a mystery to the uninitiated. I would have tried searching just the Ubuntu partition for it, but apparently that wasn't an option; the search option in Ubuntu's File Browser did not seem to offer a way to search just the Ubuntu partition. I wasn't sure of the search syntax in any case; it seemed that possibly I should have been using POSIX expressions rather than the *.* syntax I had used in DOS. This inability to search raised the question of whether I could do better with an alternative to Ubuntu's default Nautilus file manager. One recent discussion thread offered recommendations for xfe, Dolphin, Thunar, PCMan, ROX, and Konqueror. Some of those were described as being designed for the KDE rather than Ubuntu's Gnome desktop environment and would therefore, it was said, be somewhat slow in Gnome. Thunar seemed to receive the most votes in that thread, so I thought I would start with that. It was said to be designed for the Xfce desktop environment. 
At first I tried working through instructions for manual installation, but then I remembered Synaptic and searched for Thunar there. I requested download and installation of everything with Thunar in its name, including several plug-ins. But this, in turn, was going to require installation of a number of additional Xfce-related programs. At that point, I decided to research desktop environments. According to a poll taken in early 2008, 75 of 128 Ubuntu users (59%) were using Gnome as their desktop environment, 39 (30%) were using KDE, 27 (21%) were using Xfce, and 15 (12%) were using some other environment. (Apparently some were using more than one.) The three leaders were, respectively, the default desktop environments for Ubuntu, Kubuntu, and Xubuntu. Xfce was described as a lightweight interface designed for computers with less than 512MB of RAM. I had more RAM and didn't need to give up other features, so I thought I might not start with Xfce. Kubuntu was described as being "focused on including a lot of point-and-click configuration options immediately available to end users." It was possible to install either Xfce or KDE on an existing Ubuntu system, though the latter, especially, sounded like it might come with unwanted additional hassles. The differences between KDE and Gnome sounded pretty cosmetic, for the most part. I decided I was not in a rush to install the KDE desktop. Meanwhile, it was possible to make Xfce sound different and superior to Gnome and KDE in some ways. There were some signs that Xfce was not developing very quickly, though, and that there might be at least some minor adjustments required if I wanted to run some Gnome software in Xfce. To keep things simple, in light of what appeared to be relatively minor advantages, I decided to stay with Gnome, especially if I could find a Gnome-based file manager I liked better than Nautilus.

One website said it was not a problem to install Thunar within the Gnome environment. The steps to make Thunar the default file manager seemed pretty simple. But I couldn't find any information on search capabilities from Thunar's online documentation, and elsewhere it sounded like Thunar would just use the Gnome file-search capabilities. Those capabilities included, by default, Tracker (which may have been related or identical to Metatracker). Tracker came installed and running on my Ubuntu system, as I verified at System > Preferences > Sessions and also in Applications > Accessories > Tracker Search Tool. But Tracker seemed to offer almost no search options, and something on the Metasearch page had made it sound like that program, at least, would search only my home directory. Anyway, it did not turn up the PartImage file I thought I had created, and it did not seem to be searching my NTFS drives. This was all educational, but not productive. I still needed to be able to do a drive image before I would be happy to continue with my Ubuntu and VMware installation.

Wikipedia offered a nice comparison chart that made me think I should take a second look at Clonezilla. I downloaded and burned an ISO image to CD. While that was in process, I looked further for information on the search process, and concluded that the command-line tools find and locate were probably the most powerful search tools available in Ubuntu.
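To give a concrete sense of what those two look like, here is roughly how I might have hunted for the missing PartImage backup from Terminal. A sketch only; the search pattern is a guess, since at this point I did not know what the program had named the file:

sudo find / -iname "*partimage*" 2>/dev/null    # walks the live filesystem; slow but thorough
sudo updatedb                                   # rebuild locate's index now instead of waiting for the nightly run
locate partimage                                # then query the index almost instantly

(find also takes tests like -size and -mtime, which is part of what makes its syntax look intimidating at first.)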
The former seemed to involve relatively complex syntax, though, and the latter depended on indexing that would happen in the middle of the night and therefore would not have picked up a same-day creation like the backup that PartImage had theoretically created somewhere. I concluded that I probably should have just let the interminable Nautilus computer-wide search run. An alternative was to let AvaFind index my system, once I had a working Windows XP virtual machine, and use its remarkable search capabilities for the task. Later, I found that I should have gone to System > Preferences > Search and Indexing and enabled indexing. A more appealing tool was Catfish, which was said to serve as a front end for find, Tracker, locate, etc. A search for Catfish in Add/Remove Applications showed it as having only three-star popularity, but I thought I would give it a try.

By this time, the Clonezilla CD was done. I rebooted the primary computer and, following instructions, tried using Clonezilla to make an image of my Ubuntu partition. I appreciated the option of making a script to automate future backups of this sort. Without that, the text-based user interface could eventually invite nasty errors from those who weren't familiar and/or comfortable with it. This time around, unlike the experiment with PartImage (above), I did get a backup image, totaling only 1.2GB, on the target drive. Whether it would work if I needed to restore it was a question for the future. Having used roughly the same name for the PartImage backup as Clonezilla had assigned to its image backup, I now used Catfish to search for 2008-07-27, which were the first characters in the names of both files (that being the date). Catfish, set to use find, almost immediately found two folders of suspiciously similar name in /var/backup, which was, I think, where PartImage had said it would default to. Upon inspection, it seemed I had two smallish backup folders, the larger of which contained only about 39MB of data, there in /var/backup. I couldn't figure out how to delete them, so I just left them for now. I tentatively concluded that I would not be using PartImage for image backups, at least not until I could find a good tutorial or could otherwise figure out what I, or the program, had done wrong.

I will say that, more than once in this process, I found myself wondering if there wasn't some way to make a Windows-based program like Acronis True Image work for these purposes. It belatedly occurred to me that, when they said that only Windows operating systems were supported, maybe they meant that they would only provide tech support on using True Image with Windows. After all, they did say that the product copied ext2 and ext3 filesystems. Also, as I looked more closely at their datasheet, I realized that they did offer some kind of bootable CD option. I downloaded their free 15-day trial EXE file and thought I would try installing and using it, as soon as I found myself working in Windows again.

About this time, I became aware of some new virtualization alternatives. I had looked into virtualization repeatedly in previous years, and had arrived at a sense that there weren't many possibilities. Times had changed! First, during the foregoing research, I noticed that someone had mentioned that VMware offers its VMware Server product for free, in contrast to the $189 charged for Workstation 6.0, and that Server was also able to create virtual machines.
A comparison of Workstation and Server yielded the conclusion that Server might be adequate for desktop needs. Moreover, it seemed that VMware was not the only company offering free virtualization products. In a comparison against VMware Server, it was suggested that Sun xVM VirtualBox "has the widest range of host system support and has the lightest hardware demands, and excels for single PC personal virtualization needs." Unlike Server, for my purposes, VirtualBox permitted the creation of multiple snapshots, like Workstation. It was suggested, moreover, that VirtualBox was a solid alternative to VMware Workstation. There was also a third alternative: there existed a website, EasyVMX.com, that would let you create virtual machines online, which you could then use in VMware's free Player program. Player, it seemed, was a pretty competent and versatile program in itself. I downloaded VirtualBox. Doing so involved installing the Java runtime environment that, in a Windows system, would have been installed with a click or two but, in Ubuntu, required a half-hour of screwing around to enter various commands. I actually called it quits for the night, halfway through this process, and continued the next day. But then the trail was cold. I got partway through the instructions and got an error message, "368: ./install.sfx.7704: not found. Failed to extract the files." It was an error message that apparently nobody in the world had experienced before, judging from the zero hits I got on a Google search for it. But then it turned out that I didn't even have to use that problematic Sun downloader. So I just downloaded the VirtualBox file. Actually, I chose the immediate installation option, instead of the download and manually install route. This meant I would probably have to download again if I had to reinstall Ubuntu again, but I had little patience for more manual installation instructions at this point. I found a nice list of BASH commands and used that to find "rm" to delete the BIN file that I had downloaded and tried to install for the Java runtime environment. Early on, I was getting the feeling that VBox was going to require more command-line work than had been the case in setting up VMware. Anyway, by this point, the installer had completed and had given me the message, "Same version is already installed." I searched Synaptic Package Manager and, sure enough, there it was. I have no idea why Sun's webpage gave all those other instructions, if it was possible to install it through Synaptic. Then again, I had not previously had an entry for Sun xVM VirtualBox in Applications > System Tools, and now I did, so perhaps the installation exercise made some sort of difference after all. VirtualBox seemed very easy to use. The first tough question was, "Select the amount of base memory (RAM) in megabytes to be allocated to the virtual machine." The choices ran from 4MB to 2000MB. They said, "The recommended base memory size is 192 MB." One writer said that the default used to be 512MB, just a few months ago, but that s/he instead chose 1024MB (i.e., 1GB). It took a little digging to find the Sun VirtualBox community forums, which are not on the Sun website itself. Once there, I went to the Downloads page and got the User Manual. On the question at hand, that manual (pp. 29-30) said,
Every time a virtual machine is started, VirtualBox will allocate this much memory from your host machine and present it to the guest operating system, which will report this size as the (virtual) computer’s installed RAM. Note: Choose this setting carefully! The memory you give to the VM will not be available to your host OS while the VM is running, so do not specify more than you can spare. For example, if your host machine has 1 GB of RAM and you enter 512 MB as the amount of RAM for a particular virtual machine, while that VM is running, you will only have 512 MB left for all the other software on your host. If you run two VMs at the same time, even more memory will be allocated for the second VM (which may not even be able to start if that memory is not available). On the other hand, you should specify as much as your guest OS (and your applications) will require to run properly. . . . So, as a rule of thumb, if you have 1 GB of RAM or more in your host computer, it is usually safe to allocate 512 MB to each VM. But, in any case, make sure you always have at least 256-512 MB of RAM left on your host operating system. Otherwise you may cause your host OS to excessively swap out memory to your hard disk, effectively bringing your host system to a standstill. As with the other settings, you can change this setting later, after you have created the VM.

All of which was informative and appreciated. Given that I presently had 4GB and was willing to get more, and anticipated typically running no more than three virtual machines at once, and having seen a couple of examples where 1GB was the figure that people used, I decided on 1GB for my machine too.

Next, they wanted me to select or create a boot hard drive. The manual (pp. 31-32) explained that, as in VMware Workstation, a fixed-size drive would have slightly better performance than a "dynamically expanding file." The drive, they said, needed to be large enough to hold the contents of the guest operating system and the applications I would want to install -- at least several gigabytes. The setup program defaulted to 10GB, but allowed between 4MB and 2TB. I wanted to have enough space, but I also wanted to allow drive space for backups. I searched the forum for VirtualBox on Linux hosts but didn't find anything exactly on point. It did seem that several people had wanted to know how to enlarge their virtual disks. Most were smaller, but one had started out with 10GB and now wished for more. I suspected that might be a case of installing many programs just on one virtual disk, which I did not plan to do. So I went with the 10GB default. Otherwise, the description of VirtualBox virtual machines sounded much like what I had already covered (in a previous post) regarding VMware. There didn't seem to be any need to install anything like VMware Tools in the process, so in that sense this was simpler. I also found the manual and the user interface easier to understand and work with. One difference was that I now had the option of adjusting video memory, which wasn't the case with VMware. They said the default of 8MB would normally be sufficient, but I had a 256MB video card, so I upped it to 16MB. After configuring my WinXPBasic virtual machine, I tried powering it on. This gave me an error:
Failed to start the virtual machine WinXPBasic. The VirtualBox kernel driver is not accessible to the current user. Make sure that the user has write permissions for /dev/vboxdrv by adding them to the vboxusers groups. You will need to logout for the change to take effect.

It now developed that I had been looking at the wrong pages in the manual during installation. The manual, a PDF document, didn't open its bookmarks pane by default when it opened, and I neglected to look for it. I think I just searched for what I needed when I first viewed the PDF, and thus skipped right by the section entitled "Installing on Linux hosts." But here we were, on p. 17: it said I would have to install some packages on my Linux system before starting the installation. Some systems, it said, would do this automatically when I installed VirtualBox; evidently Ubuntu was not one of them. I searched Synaptic for the first of the two listed packages, Qt 3.3.5 or higher, and saw that there were about a dozen qt3 and qt4 packages available. None of them were installed, and I wasn't sure which ones I needed. I didn't immediately find anything relevant in the VBox community posts, and I noticed that some of the qt3 options (presumably less bleeding-edge than qt4) listed in Synaptic were marked with the Ubuntu logo, so I started selecting those. The first one, qt3-apps-dev, wanted to install a whole bunch of additional stuff that looked related to development, which wasn't me, so I unchecked it and went for the next item, qt3-assistant. That was described as a frontend. That sounded good. I decided to install just that, and leave it at that. The second necessary package listed in the manual, SDL 1.2.7 -- a/k/a libsdl, according to the manual -- had even more possibilities. It looked like I had a couple of them installed already, though, so I decided to try running VBox again without further additions. But I still got the same error. The manual (p. 19) said, "A user can be made member of the group vboxusers through the GUI user/group management . . . ." I was not sure where that was, and a search of the manual turned up nothing. I tried the command-line alternative provided in the manual: "sudo usermod -a -G vboxusers ray," where "ray" was my username. That drew no error messages, but it also didn't solve the problem, unfortunately. I started over again in this section of the manual and noticed that Qt was required only if you wanted to use VirtualBox's main graphical user interface (GUI). Since I had SDL installed already, it seemed that I must have been using their "simplified GUI" (p. 17). So that's possibly why I wasn't seeing any options for adding group members in the GUI.

So, alright, since the manual said I had to have those packages installed before starting the VirtualBox installation, I uninstalled VirtualBox in Synaptic. Qt3-assistant and libsdl were still installed in Synaptic, so I turned right around and tried to reinstall VirtualBox using Synaptic. That gave me an "unresolvable dependencies" error, "Could not mark all packages for installation or upgrade." I went back to the Sun website and downloaded VirtualBox again. The Package Installer ran, but then (once again) said, "Same version is already installed." Maybe it was, but I didn't see any option for it, now, under Applications > System Tools, so I selected "Reinstall Package." But maybe it was just a matter of closing the Package Installer, because after I reinstalled and closed it, I did see the option. I ran VirtualBox and got the same error.
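For the record, the group change that error message was asking for boils down to one command, plus a quick check (a sketch, using my username; new group memberships only show up in a fresh login session):

sudo usermod -a -G vboxusers ray    # add user "ray" to the vboxusers group
groups ray                          # confirm that vboxusers now appears in the list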
Then I found a thread discussing the topic, and it reminded me that I had not logged out to let the change take effect. I restarted Ubuntu and, hey, progress! This time, when I clicked Start in VirtualBox, I got "FATAL: No bootable medium found! System halted." Had I forgotten to recreate my virtual machine after reinstalling VBox? No, it was still there. But I deleted it and recreated it. After doing so, when I clicked settings, I got an error message I had gotten before but had overlooked:
Failed to access the USB subsystem. Could not load the Host USB Proxy Service (VERR_FILE_NOT_FOUND). The service might not be installed on the host computer.

For this, somebody pointed toward a post advising several steps. First, "edit the script /etc/init.d/mountdevsubfs.sh and activate the four lines around line 40 (Magic to make /proc/bus/usb work)." I found the script in File Browser and opened it in Text Editor. Sure enough, there was a set of lines -- not numbered, but I would have guessed around line 30, not line 40 -- that began with # signs. The # sign, I supposed, was used to "comment out" a line -- that is, to make it inactive. So I removed the # signs from the start of each line in that group of lines beginning with that phrase, "Magic to make /proc/bus/usb work." Each line changed color as I did so, there in Text Editor, which apparently meant that the lines were now active. Then I tried to save the file, but I got the error message, "You do not have the permissions necessary to save the file." So, OK, silly me. In Terminal, I entered "sudo -i" to get the permissions, then entered "cd /etc/init.d" to get to the right folder, then typed "gedit mountdevsubfs.sh" to open the file. I removed the # signs from those lines again, and this time I was able to save the file. I forgot to take the next step recommended in that thread: execute /etc/init.d/mountdevsubfs.sh start. Instead, I restarted VirtualBox and, this time, did not get the USB error message. But I still got "No bootable medium found!" I realized it made sense: I had not put my Windows XP CD into the CD-DVD drive. I did that and tried again. Still no joy. Without closing out that dialog box that gave me the fatal error, I selected Devices > Mount CD/DVD-ROM and designated the "Host Drive." Then I selected Machine > Reset. That did it. Now it recognized the WinXP CD.

I went through the Windows XP installation process. I started adjusting things on the finished installation. The virtual machine didn't see my other hard drives, so I wondered how to change that. I looked at the manual and discovered that I had spoken too soon: there was indeed something, in VirtualBox, similar to VMware Tools. They called them "Guest Additions." I needed Windows Guest Additions, discussed in the manual starting at page 53. To install them, in the virtual machine I hit Devices > Install Guest Additions. It was a relief that they didn't seem to need to be installed before I activated Windows (though I had not yet activated it, having now been trained). When that was done, Windows Explorer showed that I now had VBOXADDITIONS_1 as drive D. So my Start Menu shortcuts to permanently installed Windows programs on drive D (i.e., programs that did not make registry changes, and that therefore could be permanently installed on a standalone basis) were all going to have to be changed to point elsewhere. Anyway, what about those other hard drives? Oh, but wait. Skipping ahead to the "Known Issues" section of the manual (p. 180), I saw one issue that would apply to my setup. It said, "A virtual network adapter configured for NAT is slower than an adapter configured for host interface networking." I believed I wanted NAT for security reasons: I wanted Windows to access the Internet via the Linux installation, not directly. I wasn't sure how to verify that this was the case, though. At this point, I did not recall whether that precise question had been posed in the virtual machine setup process.
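One way to check, as I only worked out later, would have been VirtualBox's own command-line tool. A sketch, using the VM name I had chosen; the exact wording of the output varies between VirtualBox versions:

VBoxManage showvminfo "WinXPBasic" | grep -i nic    # the NIC 1 line should show "Attachment: NAT" if NAT is in use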
I was definitely online, so I went to ZoneAlarm.com and installed the firewall promptly. But was I using NAT? Yes! The answer was there in the VirtualBox under Details. So. The other hard drives. The manual (p. 62) said I had three options: (1) Virtual Disk Image (VDI) files, (2) iSCSI storage servers, and (3) raw host hard disk access. VDI meant giving the guest operating system access to a large image file on a real hard disk. So I would apparently have to get my files into that image, and if I lost that image I would lose my files. Option 2 didn't apply; I didn't have iSCSI storage servers. Option 3 sounded much more risky than in VMware: it was "an experimental feature," and on p. 121, they said this:
Warning: Raw hard disk access is for expert users only. Incorrect use or use of an outdated configuration can lead to total loss of data on the physical disk. Most importantly, do not attempt to boot the partition with the currently running host operating system in a guest. This will lead to severe data corruption.

Looking ahead in the manual, I saw a number of command-line entries that I would be having to make in order to get this thing going properly. I didn't like it. My inexperience on the Linux command line, and the warning that this was for expert users only, made it seem that I would be putting my data at unacceptable risk, now and in the future. Stability was a priority. This, for me, was a deal-breaker. I wanted unfettered, low-risk access to physical drives. I noticed that VirtualBox appeared able to use VMDK image files, such as those created by VMware. If, in the future, VirtualBox became more solid in this area of disk access, I felt I would probably be able to make the switch at that point, possibly without having to recreate any of my virtual drives. But for now, it was time to uninstall VirtualBox and look back to VMware. The remaining option, in that case, was VMware Server. A comparison of Workstation and Server suggested several limitations of the latter, of which the key ones, for me, included no multiple snapshots and no cloning of virtual machines. I could see how those could be significant limitations for a software development lab, where they would be spending all day tinkering with different variations on a software theme. But I just wanted to set up a few virtual machines. Couldn't I copy them, with or without cloning, and develop those copies in different directions? There was only one way to know for sure. I opened the VMware Server Virtual Machine Guide and took a look. From what I read there, it seemed that I may have misunderstood the VirtualBox manual. Maybe they were talking about using physical drives as virtual drives. Because here, in the Server Guide, they too were saying that that was an experimental procedure. Well, but why did nobody's manual seem to be saying anything about access to physical drives in a non-virtualized state? I suddenly remembered that possibly I should have just tried to map those drives when I still had VBox going. It had seemed superior to VMware Server, when I was doing product comparisons, but now it was gone. So I had to reinstall it. It wouldn't take as long as before, but . . . .

So, OK, this time I started with Synaptic Package Manager. Now it was showing only virtualbox-ose, the open-source edition of VBox. I went with that. For installation, I marked virtualbox-ose and virtualbox-ose-modules-generic. I wasn't sure what the latter did, but the descriptions of various modules had made this sound like something I would probably need. Unfortunately, this time, when I clicked on Applications > System Tools > VirtualBox OSE, I got an error message:
Failed to create the VirtualBox COM object. The application will now terminate. Could not load the settings file . . . FATAL ERROR: Attribute 'version' has a value, '1.3-linux', that does not match its #FIXED value, '1.2 linux'

In a brief search, I didn't find solutions to this. I noticed, though, as I had not noticed previously, that people were talking about version 1.6, which sounded a lot more recent than the 1.2 or 1.3 cited in this error message. So I marked the virtualbox-ose files mentioned above for complete removal; I downloaded the installation file after all, virtualbox_1.6.2-31466_Ubuntu_hardy_amd64.deb; and I right-clicked and selected "Open with GDebi Package Installer." This time, unlike before, I got a "Configuring virtualbox" dialog that said this:
Creating group 'vboxusers'
Users of VirtualBox must be member of that group in order to have write permissions to /dev/vboxdrv. Otherwise starting of VMs will not be possible.

I told it to go ahead and compile the vboxdrv module, although I wasn't sure what that meant. The installation finished and I tried Applications > System Tools > Sun xVM VirtualBox. It found my previously installed WinXPBasic virtual machine without my having to reinstall it; and when I clicked Start, it loaded Windows XP. When that was done, I tried drive-mapping in Windows Explorer, but no, it didn't recognize any of my hard drives. Back to the VBox manual: ah, yes, as in VMware, I had to do something with folder sharing first. In the virtual machine, as advised by manual p. 59, I selected Devices > Shared Folders and clicked on the tiny +folder icon. Alas, it was not recognizing anything from outside the Ubuntu file system -- except for one folder on an NTFS drive. Somebody mentioned installing Guest Additions. So, OK, those had to be reinstalled. I clicked on Devices > Install Guest Additions. The process completed and the virtual machine rebooted. The option to install them was still there after reboot, but it did nothing further when I clicked on it. Shared Folders still wasn't willing to see anything outside of the Ubuntu file system. The solution was to use Ubuntu's Places > Computer option and then double-click on each of my hard drive partitions, and then click on the Back button to prepare for the next. Once I had opened them within Ubuntu, this apparently made them visible to VirtualBox under /media. Then it was the mapping procedure described in the previous post.

In the VBox virtual machine, I proceeded to move some files around using Windows Explorer. I noticed atypical behavior. Unlike in a native WinXP installation, WinEx in the VM did not always move files when I told it to. I found I could only move a handful at a time. Moreover, it did not notify me if I was moving files to a place where there were already files of the same name. It seemed to overwrite the existing files without asking if that was what I wanted. And then, once I had moved all of the files away from a folder, if I hit F5 (Refresh), they were all back again. In a brief Google search, I didn't find an answer to these bugs.

It occurred to me that it probably made more sense to try building virtual machines in VMware Server than in VirtualBox. My understanding was that VBox was able to import virtual machines created by VMware, but that the reverse was not the case. Also, I thought VMware Player might be more lightweight, for purposes of running virtual machines on the secondary computer, with its 2GB RAM ceiling, and I guessed that Player would probably not play VBox virtual machines as well, if at all. I might not have pursued these thoughts if I hadn't been getting that flaky behavior in Windows Explorer inside the VBox VM. But after a half-hour of screwing around with that, I had to wonder: if they couldn't even get basic file copying and moving right, what were they going to do with my data in the middle of a program? VMware had been in the virtualization business a long time. This was their baby. For Sun, virtualization was a sideshow. I had found VMware impressive; VBox, less so. So I saved the VBox machine state, shut it down, and downloaded VMware Server. I followed a How-To Forge instruction page on installing because there didn't seem to be any instructions in the VMware Server manual or at VMware's webpage.
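The gist of that page, as far as the basic installation went, was the familiar prepare-untar-and-run routine. This is reconstructed from memory, so treat the details as approximate; the package list and the tarball name depend on the guide and on the file actually downloaded:

sudo apt-get install linux-headers-`uname -r` build-essential    # compiler and kernel headers, so the installer can build its modules
tar xvfz VMware-server-*.tar.gz
cd vmware-server-distrib
sudo ./vmware-install.pl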
The instructions lost me almost immediately because I was using a single quote ' instead of a grave accent ` in a Terminal command. I could see, plainly enough, that they were using grave accents; I just had a personal issue with using grave accents where single quotes seemed more appropriate. Once we were past that little hurdle, the installer ran until the point where it said, "We were unable to locate an unused Class C subnet in the range of private network numbers," and so forth. It asked me, "What will be the IP address of your host on the private network?" After some searching and fumbling around, I tried Ubuntu's System > Administration > Network Tools > Devices. There, I saw an IP address, and this is what I entered, and likewise for the netmask question, which came next. I tried the same trick when it asked the same questions a bit later, but now I got a new message:
The new private network has collided with existing private network vmnet8. Are you sure you wish to add it? [no]

I recognized the wisdom of the default answer, "no," but was not sure what I would do after that. It asked again. I took the matter to Google. Even Google was stumped. I scrolled back up and saw that the first question had been seeking an IP address for vmnet8, whatever that was, whereas this second time around, the installer was seeking an IP address for vmnet1. According to an ExtremeTech webpage, "VMnet1 is dedicated to Host Only mode, and VMnet8 is for NAT (Network Address Translation) mode." So it looked like I might have gotten it backwards. Now that I was looking more closely at Network Tools, I saw that I had two different Network Devices. One was Ethernet Interface (eth0); the other was Loopback Interface (lo). Between the two, Ethernet Interface sounded more like something that would be connecting to the outside world via NAT mode, while Loopback sounded like Host Only mode. So maybe I should have entered the Ethernet Interface IP address for vmnet8. I tried entering the Loopback Interface IP address for vmnet1 now, but I mistyped it. Soon I was in a world of pain where all I had was the endlessly repeating question, "What will be the IP address of your host on the private network?" I just typed something, anything, to move along. It all came to nothing anyway, because after saying that it was "Generating SSL Server Certificate," the installer said this:
Unable to get the last modification timestamp of the destination file /etc/vmware/ssl/rui.key. Execution aborted.

And just like that, I was dumped back at the prompt. I tried running it again, thinking that maybe my confused answers had contributed to this aborted run. The installer gave me the option of reconfiguring my network settings, and I said yes, please. This time, I entered the Ethernet Interface numbers for vmnet8, and the Loopback Interface numbers for vmnet1. But that was apparently not the problem; I got "Execution aborted" again. I was told that the solution to this problem was to enter these lines:
sudo touch /etc/vmware/ssl/rui.key
sudo touch /etc/vmware/ssl/rui.crt

I tried that and then ran the installer again. That seemed to work. Now we got as far as the serial number. I entered the one I had gotten from VMware. I received a message saying that the serial number was invalid. I tried again, copying and pasting to avoid typos. Same result. I hit Enter to cancel and got the message,
You cannot power on any virtual machines until you enter a valid serial number. To enter the serial number, run this configuration program again, or choose 'Help > Enter Serial Number' in the virtual machine console.

I also got a message indicating that Bridged Networking had failed on vmnet0, and NAT service had failed on vmnet8. Otherwise, though, they said it had completed successfully, and I was proud. I went to Applications > System Tools > VMware Server Console. The little Ubuntu wheel spun for a while and then disappeared. No action. I tried again. Nope. Server was not running for me. When all else fails, reboot and give it another try. I did; it didn't. The recommended next step was
After the installation, please run:
sudo ln -sf /lib/libgcc_s.so.1 /usr/lib/vmware/lib/libgcc_s.so.1/libgcc_s.so.1
Otherwise VMware Server will refuse to start on Ubuntu 8.04. After the successful installation, you can delete the VMware Server download file and the installation directory:
cd /home/falko/Desktop
rm -f VMware-server*
rm -fr vmware-server-distrib/

So I entered that ln command and tried again. No luck. I typed "vmware" at the command line and got this:
vmware is installed, but it has not been (correctly) configured for this system. To (re-)configure it, invoke the following command: /usr/bin/vmware-config.pl

I entered that line, with a "sudo" in front of it, and that took me back through the same installation questions. I opted to skip networking, just this time, because there were a bunch of messages right before that, saying "This program previously created the file /dev/vmnet0, and was about to remove it. Somebody else apparently did it already." There were messages like that for vmnet 1 etc., and also for parport 2 etc. I thought maybe the previous failure was now being cleaned up, so I just went with all the defaults -- and this time, there were no failures. It ran all the way through. I still didn't have a satisfactory serial number, but that could wait (maybe). I typed "vmware" at the prompt again, and this time it worked. I still couldn't enter a serial number, but I had VMware Server Console. I poked around in it a bit. It looked almost exactly like Workstation.

I downloaded VMware Converter and rebooted with the GPartEd CD. I removed the Hidden flag from the Windows XP installation and rebooted again. This time, when I selected Windows in the GRUB menu at bootup, I went into Windows, in regular dual-boot fashion. Once there, I ran the Converter EXE file. It advised me that I could optionally make a virtual machine from my Windows installation without leaving a footprint -- that is, without Converter being included in the VM -- by running Converter from the bootable CD. But evidently that CD was now part of the nonfree Enterprise version of Converter. I didn't see anywhere to download the CD image for free. It was OK; I figured I could just uninstall Converter from the virtual machine later. So I installed and ran Converter in Windows on the primary computer. It was willing to let me include all of my drive partitions in the virtual machine. I set it to include only the Windows program drive. It finished its job pretty quickly -- didn't time it, but something on the order of 10 to 20 minutes. It gave me a VMDK file of about 15GB because, I guess, the partition it was copying was about that big. The basic Windows installation on that drive took about 5GB. I wasn't sure if that included the pagefile. I decided to stick with the plan of having 10GB virtual machines, and therefore I rebooted into the GPartEd CD, shrank this partition, and tried again.

While that was going on, I examined Wine. If I had to run a Windows-based program, there was more than one way to do it. Running it in a virtual machine was one possibility. Running it on Wine was another. The Wine people maintained an application database with lists of Windows programs whose Wine performance was rated platinum, gold, silver, etc. (CrossOver Linux was a for-profit operation based on Wine that seemed to have more or less the same application database.) For example, Microsoft Office 2007 was rated silver in Wine's database. Silver meant "Applications with minor issues that do not affect typical usage." Excel 2003 had a gold rating on Ubuntu 8.04, and Word 2003 had a gold rating on Ubuntu 7.10. What looked tough about Wine was that people seemed to have to jump through hoops to get programs to work. I did find an article by Tom Wickline, posted in January 2008, that made it seem relatively manageable to install Office 2003 using Wine; but when I looked closer, I realized I didn't really understand it.
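From what I could tell, the core of recipes like his was simply to point Wine at the installer on the Office CD, with the hoop-jumping (extra libraries, font and registry tweaks) coming before and after. A rough sketch, assuming the CD shows up at /media/cdrom; I had not actually tried this:

cd /media/cdrom
wine setup.exe    # or SETUP.EXE, depending on how the disc is mounted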
A discussion thread asking which would be faster for gaming -- Wine or VMware -- drew responses heavily favoring Wine. Likewise for a small poll asking about high-end computing performance. It made sense -- VMware created an entire virtual environment, while Wine just provided a translation. To facilitate the transition away from Windows, I thought I might try my hand at Wine for a few applications. It sounded like Wine basically did not work for some programs (e.g., Adobe Acrobat Pro, Adobe Premiere Elements), so I would be needing virtual machines to some extent, regardless of the Wine situation. Another possibility was to simply replace the Windows program with a Linux-based program. Then there would be no need for either VMware or Wine. This, like the Wine option, seemed likely to call for some time spent learning how to install and/or use the program. In Windows, of course, I had already invested that time: I had found the program and had learned how to use it. So there was a time advantage, at least for the short term, in just installing the Windows program in a virtual machine, rather than scouting out a Wine or Linux-based alternative. For some programs, it would probably make sense to take that route. If I could pass another year or two using the program I already knew, I might find that there had been progress, in Linux or in Wine, toward offering the kind of program I needed. The time investment seemed likely to be fairly small in the case of my word processor. Before going to great lengths to make Microsoft Word 2003 function under Wine or VMware, there was the option of using OpenOffice.org (OOo) Writer, which came included with Ubuntu under Applications > Office. In my brief work with it so far, I had found it to be quite Word-like, so this transition might be easy. If I absolutely needed Word for something, I had already installed it in the dual-boot Windows setup on my secondary computer, and could also install it in a virtual machine; but perhaps I could reduce the number of pressing computer-related tasks, for now, by just using OOo Writer. The one missing thing had been that I couldn't automatically transfer my AutoCorrect entries from Word to Writer. I posted a note to that effect, in the appropriate OpenOffice forum, to revive an old discussion thread on the subject. Someone then devised an AutoCorrect import macro for Writer, but at this writing that was not yet working. Once this came around, I would try Writer; but in the meantime it had to be Word on Wine or VMware. While I was killing time, waiting for "chkdsk /r" to run in the Recovery Console on the main machine (all this booting and crashing seemed to have messed up some disk directories), I also discovered the Compiz desktop effects in Ubuntu on the secondary computer (basically, install compiz-settings-manager in Synaptic), and began playing around with having a rotating desktop, which seemed likely to be faster and more accommodating than switching between windows. Roll the mouse wheel on one desktop and you're in the other one. Very easy. Eventually, of course, I got in trouble with the top bar disappearing from my windows, evidently because I had turned off the Compiz "Window Decoration" option. Having installed Wine on the secondary computer (via Applications > Add/Remove), I now went to Applications > Wine > Configure Wine and set it up to recognize my drive partitions. I discovered that I had been granted only Wine 1.0, which was the last stable version but was very outdated. 
The advice was, instead, to stay up with the current version, first by uninstalling what I had installed (sudo apt-get remove wine) and then using Synaptic to install. But Synaptic was showing just version 1.0, and the official Wine website was advising that it might be safer not to be constantly installing the latest development releases, if version 1.0 worked for you. I decided to refer back to the versions those testers had used when they had awarded gold ratings to versions of Wine that were able to run Word and Excel. For Excel, the one that had worked in Ubuntu 8.04 was Wine 1.0-rc4, and for Word, the one yielding a gold rating in Ubuntu 7.10 (the latest version tested) was Wine 0.9.53. Since Wine 1.0 was supposed to be the stable release, I went back to Synaptic and reinstalled that. I did find a website with clear installation instructions, evidently prepared before Wine 1.0, that insisted people should use version 0.9.37; but since both of the tests just mentioned had successfully used more recent versions, I decided to start with 1.0. I went through those instructions. The result was interesting. "wine" works as a command prefix that tells Terminal (technically, the BASH shell) to run a Windows program. So if you type "wine regedit," that will open the Windows registry editor, assuming you've installed Wine. I completed the installation and got an indication that it had completed successfully. Unfortunately, a lot of what looked like error messages had been flashing by in the Terminal window, and when it was done I did not have a copy of winword.exe (the program that starts Microsoft Word) in my Program Files folder under the .wine directory. A search told me, however, that of course I did have a copy of winword.exe in the Windows partition that I used to dual-boot the computer. So I wondered: could I type "wine winword" in that folder and start Word that way? The answer was no. I got a "module not found" error. But that seemed to me to be the way Wine should work: just set you up so you can run the programs you've already installed on your dual-boot drive. Anyway, judging from the meager contents of my Program Files folder under .wine, nothing had really installed from Office.

I wondered if I could use Wine 1.0 to install anything else. For instance, the website for IrfanView, an image viewer and editor that I used frequently, specifically said that you could use Wine to use IrfanView. So in Terminal I went to the folder where I kept my downloaded copy of IrfanView.exe, and at the prompt, logged in as root, I typed "wine irfanview400_setup.exe." That failed. Following a tip in the Wine applications database page for IrfanView, as well as some other helpful instructions, it seemed that I might need MFC42.DLL, and that I might find that in Winetricks, whose webpage advised me to enter "wget http://www.kegel.com/wine/winetricks" to install Winetricks. I tried that. It seemed to work. Then I typed "sh winetricks mfc42." That finished and said "no errors." Then I tried "wine irfanview400_setup.exe" again. That worked. IrfanView was up and running on Ubuntu! I installed the IrfanView plugins in the same way. But to restart IrfanView was not easy. I found that its program file was not where I had been looking before. It was actually at /root/.wine/drive_c/Program Files/IrfanView. I guessed this was because I had installed those programs as superuser, i.e., root, using "sudo." So now would I have to be logged in as root every time I wanted to use it?
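In hindsight, the cleaner approach would probably have been to run that whole sequence as my ordinary user, with no sudo, so that everything would land under /home/ray/.wine instead of /root/.wine. A sketch of what that would look like, which I had not tried at this point:

wget http://www.kegel.com/wine/winetricks     # fetch the winetricks script
sh winetricks mfc42                           # pull in MFC42.DLL
wine irfanview400_setup.exe                   # run the installer as the normal user
ls ~/.wine/drive_c/"Program Files"/IrfanView  # where the program should then end up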
I found the program file for Word 2003 nearby, too, at /root/.wine/drive_c/Program Files/Microsoft Office/OFFICE11/Winword.exe. And when I tried running it from the prompt, it too worked, after all. I was not too happy with the way Word ran, though. This was not like using it in a Windows environment. I could not identify a Word document in the File Browser and double-click on it to open it in Word. Instead, it appeared I would have to start Word using a "wine winword.exe" command, and then navigate to each individual document to open it. This could be very inconvenient when trying to open multiple files in an extended directory tree. I was also not happy with the amount of time I was spending to try to figure this out. There were many Ubuntu skills and bits of knowledge that I needed, but presently lacked, to work through these sorts of problems effectively. Rather than continue to research the technicalities of setting up each individual program in Wine, I decided to see whether the VMware Server option would be satisfactory for the coming year or so. Having used VMware Converter once, it occurred to me that I could use my dual boot as the test bed on which I would create my basic WinXP virtual machine. That is, instead of installing my basic Windows XP setup in a virtual machine, complete with various tweaks and related programs, I would install that stuff directly on drive C. I would use drive C when I dual-booted into Windows, and I would also use Converter to copy drive C into a basic WinXP virtual machine. Then, hopefully, all I would need to do in VMware Server would be to install a few different programs on each copy of that basic virtual machine -- one for Microsoft Office, one for Adobe Premiere Elements, and so forth. Armed with that concept, I proceeded to flesh out the basic WinXP installation on drive C of the primary computer. I installed various items that I would want to appear in all virtual machines, cleaned up and organized the Start Menu, ran System File Checker and the disk checker, and defragmented. The basic tools I installed included PDF995 (so that I would have a lightweight PDF printer in all virtual machines), 7-Zip (for file zipping and unzipping), drivers and software for my printer, IrfanView for image viewing and editing, Nero and CDBurnerXP, TreeSize, Unlocker, and ZoneAlarm. Even with a 3.5GB system-assigned pagefile, the total contents of the drive were only 6.4GB, leaving 3.6GB for installation of a few more programs in each specialized virtual machine. It seemed adequate. I ran VMware Converter to capture this state-of-the-art basic Windows XP installation. In case I neglected to mention it earlier, Converter had the appreciated feature of installing VMware Tools in the conversion process. I rebooted into Ubuntu and tried to open that virtual machine in VMware Server. I got an error message, "Unable to add virtual machine
Rather than continue to research the technicalities of setting up each individual program in Wine, I decided to see whether the VMware Server option would be satisfactory for the coming year or so. Having used VMware Converter once, I realized that I could use my dual boot as the test bed on which to create my basic WinXP virtual machine. That is, instead of installing my basic Windows XP setup in a virtual machine, complete with various tweaks and related programs, I would install that stuff directly on drive C. I would use drive C when I dual-booted into Windows, and I would also use Converter to copy drive C into a basic WinXP virtual machine. Then, hopefully, all I would need to do in VMware Server would be to install a few different programs on each copy of that basic virtual machine -- one for Microsoft Office, one for Adobe Premiere Elements, and so forth. Armed with that concept, I proceeded to flesh out the basic WinXP installation on drive C of the primary computer. I installed various items that I would want to appear in all virtual machines, cleaned up and organized the Start Menu, ran System File Checker and the disk checker, and defragmented. The basic tools I installed included PDF995 (so that I would have a lightweight PDF printer in all virtual machines), 7-Zip (for file zipping and unzipping), drivers and software for my printer, IrfanView for image viewing and editing, Nero and CDBurnerXP, TreeSize, Unlocker, and ZoneAlarm. Even with a 3.5GB system-assigned pagefile, the total contents of the drive were only 6.4GB, leaving 3.6GB for installation of a few more programs in each specialized virtual machine. It seemed adequate. I ran VMware Converter to capture this state-of-the-art basic Windows XP installation. In case I neglected to mention it earlier, Converter had the appreciated feature of installing VMware Tools in the conversion process. I rebooted into Ubuntu and tried to open that virtual machine in VMware Server. I got an error message:
Unable to add virtual machine: Cannot find a serial number to unlock this version of VMware Server. Please ask your system administrator to run "vmware-config.pl" and enter the serial number. For more information, please read the INSTALL file in VMware Server's documentation directory.
So, OK, I closed Server, searched Filesystem for "vmware-config.pl," opened Terminal, typed "sudo -i" and then navigated with "cd" to /usr/bin, and typed vmware-config.pl. This ran me back through a bunch of installation questions about Server.
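In other words, the fix amounted to re-running the configuration script as root -- and, in retrospect, the cd was probably unnecessary, since /usr/bin is ordinarily on the PATH anyway:

sudo -i              # become root
vmware-config.pl     # re-run VMware Server's configuration/installation script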
It came to this question:
Do you want this program to set up permissions for your registered virtual machines? This will be done by setting new permissions on all files found in the "/etc/vmware/vm-list" file.
This, I suspected, could be where the serial number issue would arise. Unfortunately, being still a bit disoriented in the Ubuntu search and file manager programs, I could not quite figure out where I had put the Server manual; and when I went to download another copy, I found that Firefox on the primary computer was giving "Server not found" error messages for every webpage I tried to open. I thought this might be a temporary problem caused by the vmware-config.pl configuration/installation program, so I downloaded a copy of the manual on the secondary computer. The VMware Server Virtual Machine Guide didn't have anything to speak of on the subject of the serial number, so I tried the Guest Operating System Installation Guide. But it, too, seemed to have nothing. Next, I tried the VMware Server Administration Guide. It had a few references to serial numbers, but nothing that explained what was happening here. So I answered "yes" to the question written above and, sure enough, after asking "In which directory do you want to keep your virtual machine files?" (the default, which I kept, was /var/lib/vmware/Virtual Machines), it told me to enter my serial number. I did. That was the last question. We were done. I was a little concerned that I might have to redo this little process each time I created a copy of a virtual machine. I will just say this was a huge hassle for a bloody free program, and it looked like a lot of people had shared my pain. But now, by golly, Server was willing to start the Windows XP virtual machine. My first mission was to complete the basic WinXP installation. There were some things that I couldn't or wouldn't do while it was still running in native, dual-boot mode. For one thing, I had to install VMware Tools before activating WinXP, and as the notice now reminded me, I couldn't do that until the guest operating system was running. So I did that. Installing Tools required access to some files from the installation CD or, as I directed the program, from C:\WINDOWS\system32\dllcache or C:\WINDOWS\system32\drivers or their parent folders. The Found New Hardware Wizard came up, which usually meant some kind of installation hassle in WinXP. It turned out to be trying to install the Ethernet adapter. I had the latest driver on another partition, but here in Server, as distinct from VMware Workstation, the VM option on the menu was not giving me the opportunity to name shared folders. It's not that it was greyed out; it just didn't exist here. It was OK; I could install from the CD that came with the motherboard. But the Browse button didn't even show the CD drive as an option; it only showed the floppy and drive C for devices on my computer. I tried just typing the drive letter for the CD, but I kept getting the error, "The specified location does not contain information about your hardware." I could have tried typing in the exact directory location, determined by putting the CD in the other computer and navigating around until I found what looked like the right driver; but often these driver lookup efforts were trial-and-error, with the system sometimes informing me that I hadn't selected the right driver. So I gave up on the "Have Disk" option and instead just let Windows figure out what to install. It seemed to be happy; it completed that and went on to the next item, my video driver.
But here, I wanted to adjust the screen, so I clicked a couple of things, including Quick Switch, a button there at the top of the VMware Server screen. That instantly plunged me into an Ubuntu command line on an all-black screen and then, after a moment, it allowed me to log into Ubuntu again and start over. OK, that was weird. I soon rediscovered that I had actually been looking for View > Autofit Guest; but when I tried that after restarting Server, nothing happened; part of the WinXP desktop was still scrolled off below and to the right of the screen. Anyway, I started VMware Server again, and this time I selected Edit > Host Settings. I told it to allocate 3GB of my 3.4GB of total system RAM to virtual machines, and I told it to fit all virtual machine memory into reserved host RAM. So we would see whether three 1GB virtual machines would all be allowed to run simultaneously. Once again, though, I got "You don't have the permission to execute this operation." So I canceled out of that, reopened the basic WinXP virtual machine, and got an indication that I did not have VMware Tools installed. So I started that process again. But up came the Found New Hardware Wizard for my video controller again. It turned out to be searching for a driver for the "VMware SVGA II" video controller, not my actual physical controller. So no CD access was necessary; it was just trying to access the virtual (i.e., not physical) video controller. I allowed it to proceed by its own sense of what was right, but this gave me an error message:
Cannot Install this Hardware. There was a problem installing this hardware: Video Controller (VGA Compatible). An error occurred during the installation of this device. Driver is not intended for this platform.
So, OK, another mystery from VMware. Next, it wanted to install my PCI Bus Master IDE Controller. The big picture seemed to be that VMware Tools would not install, at least not on Server, until all of my hardware had been detected and maybe supplanted by virtual hardware. It was a little unclear. But the IDE Controller seemed to install OK. Now I got repeated visits from the Insert Disk dialog, which was telling me to insert the WinXP CD. At first, I pointed it to dllcache (above), but it kept coming back, so I did insert the CD. But as before, it was still not recognizing the CD drive, so the dllcache option was the only way to go. Next, it wanted to restart WinXP, within the virtual machine, in order to recognize the installed devices, so I let it do that. I tried again on the shared folders option. It still didn't exist. Possibly this was a difference between VMware Server versions 1.0 and 2.0; maybe 2.0 would catch up with Workstation in this regard. Or maybe this was something that came with VMware Tools. I was still getting an indication that I did not have VMware Tools installed, so I went to VM > Cancel VMware Tools install and tried again with VM > Install VMware Tools. WinXP was booting up very slowly, for some reason. Needless to say, I was having serious doubts, by this point, about my theory that VMware specialized in virtualization and was therefore taking it more seriously than Sun, for whom it was perhaps just a sidelight. My browsing around had introduced me to a number of complaints, and while I guess you always have those, it wasn't reassuring. Right now, I couldn't believe how slowly the WinXP virtual machine was rebooting. Something was plainly wrong. Maybe I had confused things by trying to install VMware Tools prematurely. I rebooted it again, canceled the Tools installation again, restarted it again. Still no Tools installation. In my browsing, I had kept seeing that all these other people were successfully using VirtualBox. I had to think that my problems with it had been a fluke. I wondered how it would do at importing the basic WinXP installation that VMware Converter had now made into a virtual machine. I killed VMware and fired up VirtualBox. I couldn't figure out how to import the virtual machine, so I went looking for documentation and advice. Unfortunately, it now appeared that I could not access the Internet on the primary computer. Every webpage I tried to open in Firefox within Ubuntu came back with, "Server not found. Firefox can't find the server at ." I was able to ping my router, and of course the secondary computer (on which I was writing these words) was connecting to Blogger.com just fine. Something in the process of running VMware Server, VMware Tools, and/or VirtualBox had screwed up the network connection.
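In hindsight, the symptom -- router reachable by ping, but "Server not found" for every website -- pointed toward name resolution (or VMware's virtual networking scripts mangling the host's settings) rather than a dead connection. These are the sorts of checks I could have run in Ubuntu at that point, with the caveat that 192.168.1.1 is just the usual Linksys default address, not necessarily yours, and that the host utility may or may not be installed:

ping -c 3 192.168.1.1                  # the router still answers, so the link itself is up
cat /etc/resolv.conf                   # is a nameserver line still present, or did something clobber it?
host google.com                        # does DNS lookup work at all (if the host utility is installed)?
sudo /etc/init.d/networking restart    # restart networking, the usual first resort on Ubuntu of that era

At the time, though, I didn't go down that road.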
I dual-booted back into Windows XP and ran the New Connection Wizard. That achieved nothing, so then I remembered I needed to go through the router installation process. I hadn't intended to connect this WinXP installation directly with the Internet, because I didn't expect to be using it for that. I had just wanted it to be a starter for the virtual machine and a fallback in case of emergency. But now we were on plan B. I inserted the router installation CD and went through the process. Unfortunately, the process terminated prematurely with an error message:
Setup Wizard MFC Application has encountered a problem and needs to close. We are sorry for the inconvenience.
This happened several times. I looked for a solution. Unfortunately, while picking up the router to look closely at its model number (Linksys WRT54GL), I seemed to have punched a button on the front, and now Google searches seemed not to be working on the secondary computer either. When I hit that button, a light started flashing on the front of the router. It didn't stop. I punched it again; no change. I powered down everything -- both computers, router, and modem -- and left them off for a couple of minutes. I came back and turned them back on, and ran the router setup in WinXP on the primary machine again. This gave the same result as before. It also changed nothing on the secondary computer, where I was reduced to writing these words in a text file because I could no longer communicate with Blogger. This led to an extended digression into getting myself a working Linux-compatible router. As hours passed in that unwanted distraction, I found myself thinking about a conversation I had had with a friend, a few days earlier. I told her about my computing woes. She asked why I didn't just buy a Mac. I had to admit that I had not seriously thought about owning a Mac since the 1980s, when Apple acquired a reputation for having expensive stuff that you couldn't adjust to your needs. I was now realizing, however, that I had been mistaken in my understanding of something that had happened in the previous summer, 2007. Then, as now, I had spent weeks on end trying to make the computer work properly. In that case, it was a hardware problem. My solution was to make sure I had two compatible desktop computers, so that when something failed, I could immediately switch parts or software back and forth to troubleshoot the problem. This approach had cost me several hundred dollars for the backup computer, but had paid for itself numerous times when I was checking my own hardware or software or that of a friend. Indeed, even at the present time, I was constantly using one machine to aid with another. I had seen that month or so of hardware-oriented effort, in summer 2007, as a huge exception to the general idea that I would mostly just do my work and avoid computer hassles. That, I now believed, was mistaken. First, I had had to spend several days at a stretch, on several occasions during the past year, dealing with various computer difficulties. I had upgraded my motherboard and had then had to reinstall Windows; I had had a computer virus; etc. In fact, I had come to the point of making a determined effort to install and run Ubuntu only after spending a week reinstalling and configuring Windows XP, only to find that my installation was not working properly. So the computer thing was not just an occasional interruption to my work. It was a huge imposition. I also realized that it had always been that. There had been times -- in 1996, for example, and again in 2000 or 2001, and in 2003, and so forth -- when I had spent literally months in an on-again, off-again effort to get productive work done while dealing with significant hardware and/or software difficulties. The Ubuntu adventure was not going to be an exception to that. I was going to have to continue fiddling with various kinds of hardware and software problems. Possibly virtualization would reduce some of that; possibly not.
Certainly, at any rate, it was not turning out to be the kind of thing where I would just make a quick switch and get back to work. This was going to take some time. That's not to say I had given up on Ubuntu. It still felt much more solid and reliable than Windows. The open and helpful attitude of people in the forums seemed better than anything I had encountered when troubleshooting Windows problems. The software seemed more responsive, so far. There were a lot of things to like about it. But at the end of the day, I needed it to work. So, OK, I needed to keep troubleshooting this router problem; and over the longer term, I needed to keep thinking about what I was willing to do, and what I needed to give up, in the computing area. Twenty-four hours later, I had resolved the router issue and was back in the game. It was time to make this virtualization thing a reality. I started another post for that purpose.