War and Peace: Creating an Ubuntu System with WinXP as a VMware Guest
A year ago, in summer 2007, I investigated various options for transitioning from Windows to Ubuntu Linux. I was especially interested in making sure that I could continue to run Windows applications. The best solution at the time seemed to be to install Windows XP and then install Ubuntu within a VMware virtual machine. In other words, I would learn Ubuntu as though it were just another application within Windows. I would not have to deal with dual-booting between Ubuntu and Windows; I would just stay in Windows and use Ubuntu applications sometimes. Since then, there had been some further developments. This post presents the new things I learned as I looked into this issue again in summer 2008.

One development was that, at the time of this writing, I once again found myself spending a week or more dealing with problems with WinXP. It was a lot of time that I definitely did not want to spend in this way, having already done so a couple of times during the past year. I really did not want to have any more operating system meltdowns or freezeups that would require me to spend days without functionality on my main computer. Another development was that I had moved to a new place and was using DSL. It was less shielded from viruses than the setup I had enjoyed in my previous residence; in fact, I had already contracted a virus. Even after installing a router as a hardware firewall, I noticed that some attackers still seemed to be getting through as far as my ZoneAlarm software firewall (and no further, I hoped). That is, I was newly sensitized to the relative virus security offered by Linux.

Research at this point indicated that Wine was still not great for running Windows programs within Linux -- especially not Word 2003, Excel 2003, and Adobe Acrobat Professional 8. I looked into Xandros, which had recently acquired Linspire; but it, too, did not seem capable of running those programs well. So a virtual machine solution seemed like the only game in town, and that's what some people were saying. The virtualization tool of choice, based on my review in 2007 and on some comments I had heard since then, was VMware. I found the VMware offerings a bit confusing in 2007, though, and in my tinkering it seemed to run pretty sluggishly. I guessed it was probably a fixable problem, but in the end I was not ready to make that kind of commitment to Linux, when it seemed that I was still able to run Windows pretty well, after spending enough time at it. As of this writing in summer 2008, though, VMware seemed to have gone through some additional development, and its desktop offerings had become simpler and more streamlined. You would basically buy VMware Workstation for $189; you would use it to create a virtual machine; and then you would use Workstation or the free VMware Player to run that virtual machine on another operating system.

One concern, the previous year, had been that Ubuntu did not support multiple monitors. Working with two monitors had been extremely efficient for me, and I did not want to give that up. Preliminarily, it looked like Ubuntu did now offer better dual monitor support than it had in 2007, and VMware said it could span multiple monitors too. This time, though, the plan was the reverse of 2007: Ubuntu would be the host operating system, and WinXP would run as a guest inside a VMware virtual machine. VMware offered a free 30-day evaluation. When I got to the page offering a download of VMware for Linux, I saw that I also had a choice between 32-bit and 64-bit options. My understanding was that Windows XP was 32-bit. I had decided against upgrading to Windows Vista, based on reports of hassles that people were having.
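As an aside, for anyone facing the same 32-bit versus 64-bit choice: a quick check from any Linux prompt (a live CD will do) shows whether the processor can actually run a 64-bit kernel. This is my own addition, not something I ran at the time; the "lm" (long mode) flag is what matters.

    grep -o -w 'lm' /proc/cpuinfo | head -n 1   # prints "lm" if the CPU is 64-bit capable
    uname -m                                    # architecture of the currently running kernel (e.g., x86_64 or i686)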
I assumed I could run a 32-bit version of Windows XP within 64-bit Ubuntu, using VMware; but I did want the 64-bit version of VMware to take advantage of my 64-bit processor. I guessed this would require the 64-bit version of Ubuntu, so I downloaded the CD image for that as well. I ran the 64-bit Ubuntu installer with no problems at all, using the relatively complex partitioning scheme that people had advised back in 2007. I would have let Ubuntu do the partitioning automatically, but I had data partitions on the same drive, and I had also seen that Ubuntu didn't recognize the partitions if I created them in PartitionMagic. The installer recognized my NVIDIA graphics card and installed the drivers for the graphics card and the motherboard *much* more easily than had been the case with WinXP, where I had spent many hours fooling with the installation of that card. Likewise for the updates: it took a fraction of the time and hassle to download Ubuntu updates, as compared to the process of installing Windows updates.

And then I rebooted, or tried to. The system showed the Ubuntu logo, and then the screen said "No signal" and hard disk activity stopped. I gave it a few minutes and hit the reset button. Same thing again. Third time around, I chose the "recovery mode" option for bootup. This gave me a Recovery Menu with four options. I decided against "resume normal boot," since that didn't seem to be working, and tried no. 2 on the list, dpkg, "Repair broken packages." That flashed another group of command lines and then said, "Finished. Please press enter." So I did, and I was back at the Recovery Menu again. I decided against the third option, "Drop to root shell prompt," since I would have no idea what to do there, but I did try the fourth option, xfix, "Try to fix X server," in case that was the problem. That took a few seconds and then I was back at the Recovery Menu again. So now it seemed like "resume normal boot" was the option to try. And that did it. I got the Username prompt, and all was good. I was back in Ubuntu. I ran Firefox, and it worked with no problem.

Now it was time to figure out what to do about running VMware and all that. I had already saved the VMware download (a TAR file) on one of the partitions on this machine, so I right-clicked and extracted it to the desktop. I didn't know what to do after that, so I looked at the Workstation User's Manual (contained within the VMware Help option and also downloadable as a separate PDF). At 470 pages, this was really quite the impressive manual. I went straight to page 53, "Installing Workstation on a Linux Host." It recommended doing the unpacking in the /tmp directory, so I cut and pasted the unpacked vmware-distrib folder to /tmp. This much I could do without opening a Terminal window (Applications > Accessories > Terminal), but now I had to do that to run the installation program. It didn't run when I right-clicked on it and chose Open > Run. When I tried it from the command prompt in Terminal, following the manual's instructions, I saw why: I got the message, "Please re-run this program as the super user." I took this to mean the equivalent of the Windows Administrator, but I didn't know how to get there. I got a couple of different opinions from one discussion thread. Experimentation suggested that typing "sudo -i" (here, and elsewhere, without the quotation marks) prompted the system to request my password, which I entered, and then I could change directories and run the installation command as the manual instructed.
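In rough outline, the Terminal side of that process looked something like the sketch below. The tarball name here is just a placeholder for whatever version you download, and the installer script inside the vmware-distrib folder is typically called vmware-install.pl:

    tar -xzf VMware-workstation-6.x.x-xxxxx.x86_64.tar.gz   # unpack the download (placeholder filename)
    mv vmware-distrib /tmp                                   # the manual suggests working from /tmp
    sudo -i                                                  # become the superuser (prompts for your password)
    cd /tmp/vmware-distrib
    ./vmware-install.pl                                      # run the installer and accept the defaults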
The installer asked me, "In which directory do you want to install the binary files?" The VMware manual said, "Accept the default directories." So I understood that the "[/usr/bin]" entry appearing there in Terminal, after the installer's question, was the default. So I just hit Enter, and did likewise for the other directory questions. I just kept hitting Enter in response to each question, except that hitting the spacebar got me through the License Agreement more quickly. I had to type "yes" after that Agreement, and then some more Enters, and we closed with an instruction from "the VMware team" that said I could now run VMware workstation by typing "/usr/bin/vmware." I had a feeling I could probably also run it in the File Browser by just double-clicking on it or something, but that wasn't where the manual was taking me next. According to the manual (p. 56), "Where to Go Next" was to create a virtual machine (p. 89). Or, as p. 89 instructed me, I had the option of importing a virtual machine from another format -- but only if I was using VMware for Windows, which I wasn't. Instead, it looked like I would have to do a new Windows installation inside VMware. Page 89 of the manual started me with the New Virtual Machine wizard; but to find out how to run that wizard, I had to go back to page 66, which instructed me to start Workstation by typing "vmware &" at a Terminal prompt. Instead, I did what I just said -- I double-clicked on the vmware file in the /usr/bin folder in the File Browser -- and that gave me an option of running VMware, which I did. So now I was looking at VMware Workstation. (Right-clicking didn't give me the option of creating a shortcut to VMware, which I would have put on my desktop.) The first option in Workstation was to create a new virtual machine. That option started the New Virtual Machine wizard. It gave me a choice of guest operating systems that I would plan to install in this virtual machine: Windows, Linux, Novell NetWare, Sun Solaris, or Other. I chose Windows. It asked me to specify which flavor of Windows, and it offered about 20 of them, from Windows 3.1 through Windows Server 2008 x64 Edition (experimental). I chose Windows XP Professional. Now we had a choice for Network Connection. "Bridged networking," it said, would give Windows direct access to an external Ethernet network. I wanted Windows to be insulated from viruses. I guessed that I probably wanted option 2, "Network address translation" (NAT), which was described as giving the guest operating system access to the host computer's external Ethernet network connection. It looked like some people were using NAT successfully, so I chose that option. The next question had to do with Disk Size. The best performance, according to manual p. 198, came from the option of allocating all disk space now, rather than the alternative of letting VMware allocate more space as needed for the operating system, program files, and data files. The reference to "data files" confused me: was my data going to have to stay inside the virtual machine? Manual p. 171 said that I would be able to drag and drop, or copy and paste, files between the Linux host and the Windows guest, or between virtual machines. So apparently the only worry, for data files, was that I might want to edit some huge AVI video file in Windows, and I wouldn't have enough virtual disk space for it. I had set my /usr directory at 10GB on installation (above), and now VMware was offering to create an 8GB virtual disk. 
I would definitely have video files larger than that. So now I had to look for a way to enlarge an Ubuntu partition.
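In hindsight, a quick look at how much space each mount point actually had would have caught the problem that surfaced next. A minimal check, had I thought of it, would have been something like this at a Terminal prompt:

    df -h /usr /home    # show the size, used space, and free space of the partitions holding /usr and /home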
The directory "/home/ray/vmware/WinXP/" has less than 150MB of free space. Running out of space in this directory may corrupt the virtual machine's RAM. This is likely to cause the guest operating system to crash. To avoid these problems, VMware recommends you move or delete files to free up space now.This didn't make sense. I had allocated 20GB to that directory. A basic WinXP installation took just a couple of GB at most. Ah, but no. I used Alt-Tab to get to the file browser, which I already had open, and then right-clicked on that WinXP folder and selected Properties. It said the contents of that folder were 3.5GB, and that I had just 1.3MB free space. I had put the 20GB in /usr/bin -- which, as I now saw, contained 99MB worth of files and had 29GB of free space. Oops. A slight misallocation. It seemed I would want to make /home larger. I went to the top of the black screen and selected VM > Power > Suspend Guest. Then I went to the top right corner of the screen, hit the power button, and chose Hibernate. Hibernating seemed to mean death, so I hit Ctrl-Alt-Del, and this enabled me to reboot with the GPartEd CD and resize things to give 15-20GB of space to each of /usr and /home. When I came back to Ubuntu, my hibernation attempt had failed. I just had to start up VMware Workstation again. Once I was in there, it automatically resumed the setup process, but it was basically starting over from scratch. It finished very quickly, though, and then I was at the set of screens where WinXP says, "Welcome to Microsoft Windows .... Let's spend a few minutes setting up your computer." I noticed that my mouse was trapped inside the Windows blue screen -- I could no longer go to the top of the screen to click on the VMware menu items. So, OK, I continued. When it asked if the computer would be connecting to the Internet directly or through a network, I said through a network. And, what do you know, it worked. I began installing updates from the Microsoft website. Meanwhile, about VMware Tools. The manual (p. 115) gave me some instructions, but to follow them I needed to get my trapped mouse out of that WinXP virtual machine. The answer to this and other mysteries lay on manual p. 87, where I learned that Ctrl-Alt would release the mouse cursor and Ctrl-Alt-Enter would toggle full-screen mode on and off. I tried the latter, and that didn't give me what I expected. The Windows window was still the same smallish size, but now it was enframed within VMware Workstation menu, toolbar, and sidebar. Inside WinXP, I changed the screen resolution to 1280 x 1024 (i.e., set up classic view on the Start Menu and then select Start > Settings > Control Panel > Display > Settings), and toggled back to full screen mode, and I had a much bigger Windows window. But as I was saying, VMware tools. Manual p. 115: select VM > Install VMware Tools. The dialog that popped up said this:
Installing the VMware Tools package will greatly enhance graphics and mouse performance in your virtual machine. [Thanks to the 64-bit installation, perhaps, I hadn't noticed any performance problem so far, but whatever.] WARNING: You cannot install the VMware Tools package until the guest operating system is running. If your guest operating system is not running, choose Cancel and install the VMware Tools package later.

So, OK, I was all set to go. I had to reboot the virtual machine for the VMware Tools to be installed; and when I did, the display resolution was back to 800 x 600. I wasn't sure, at this point, whether the resolution could be made permanent at a higher setting. I resumed the effort to install Windows updates and other programs. I couldn't get any more updates without activating WinXP, and it turned out I couldn't activate WinXP online because they felt I had exceeded the number of authorized activations -- which was probably true, because I had experienced a number of dysfunctional Windows installations over the years. So I had to call their 800 number, but that didn't work because the representative said their system was not responding and I needed to call back in an hour. I did that and got the same message again. On the third try, she said it would take two more hours. At least this one was willing to have a conversation with me about India. All the same, we seemed to be accumulating reasons not to use Microsoft.

Yet as I say that, something funky was happening in Ubuntu too. When I clicked File > Close to get out of Windows Explorer within the virtual machine, for some reason the entire virtual machine window closed. Then I noticed that some Ubuntu windows would close if I hit Scroll Lock twice -- which was what my KVM switch required me to do in order to switch Keyboard, Video, and Mouse over to the other computer. It occurred to me that resizing the partition, so that it was now less than 20GB, might have caused a problem for a 20GB virtual machine. So I deleted the first VM and tried again, this time setting it to 13GB, after checking Properties to see how much space there actually was in /home.

I wasn't sure how many of my Windows programs I would need to install, or whether they would all go into /home or instead into /usr, but this did seem to be an incentive to use Ubuntu rather than Windows programs whenever possible, so as to stay within that 13GB ceiling. So, for one, I wouldn't be installing a copy of Firefox in the Windows virtual machine, seeing that there was already one in the underlying Ubuntu operating system. Instead, while WinXP was reinstalling in the VM, I started configuring add-ons etc. for that Ubuntu Firefox installation. Of course, you just know that, when I finally got the thing activated, I had forgotten to install VMware Tools again. So I shut the VM down, but VMware wouldn't let me install Tools until I had Windows booted. So I powered up the VM again and installed Tools. It said I had to reboot, so I did. It went back into Windows OK, so it seemed no harm was done by my failure to install Tools before activating. I started installing WinXP updates again. I had shut off Automatic Updates, for the time being, because I had found that Windows downloaded the updates slowly, and that sometimes what it downloaded would crash. So, for example, I usually didn't use Express install, and I had better luck installing everything else before WinXP Service Pack 3.
Anyway, so far, it seemed like the system was actually snappier and more responsive than it had been when I was just running 32-bit Windows without VMware. While Windows was downloading and installing 106 updates, I went back to tinkering with Firefox. I was getting more weird behavior from Ubuntu. For some reason, in Firefox, I wasn't getting any capital letters, like when I tried to type my name and address into the Google Toolbar auto-fill option. I tried to test this in a Text Editor window, but when I was there, merely hitting the Shift key would close the window. I switched keyboards and still got the same thing. Hitting Shift didn't close down Firefox, only Text Editor. These problems had not appeared on the secondary machine, so I didn't think they were problems in Ubuntu per se; I thought they might be caused by VMware. I decided to restart Firefox, but it wouldn't start. So instead, I started the OpenOffice spreadsheet, to look up something in one of my Excel spreadsheets. But if I hit Alt or Scroll Lock, the spreadsheet would crash. So I suspended the Windows installation process -- actually, I hit the WinXP "Cancel" button, but it didn't seem to be canceling in any great hurry, so then I suspended the virtual machine and rebooted the system. This seemed to take care of the funky keyboard behavior in Firefox etc., but we were right back to the same place, seemingly hung up at update 92 of 106 in the Windows Update process. The hard drive light was on, though, so I hoped something was happening. After a while, I gave up on that and just powered off the VM, and then powered it back on. I ran Windows Update again, and now there were 24 updates yet to go. Fortunately, the weird behavior was gone.

While those updates continued, I looked into Linux compatibility for some of the programs I needed to use. I already knew I would have to install Microsoft Office 2003 and Adobe Acrobat Professional 8 into the Windows virtual machine. Now it seemed that I would have to do likewise with the software for my PDA, my MP3 player, etc. These were not large programs; it just seemed that I would be doing most of my work in the Windows environment for the foreseeable future. Another large program: Adobe Premiere Elements. It looked like Linux video editing programs were still very much in the development stage by comparison. I also discovered the OpenPrinting Database and saw that my Canon printer was completely unsupported in Linux -- that is, Canon offered no Linux support for it. If I wished to use that machine, I would have to use it within Windows. Thus, even if I did find a Linux alternative to Acrobat -- that is, a way to scan and edit documents into PDF format as well as I could do it with Acrobat -- there would still be the problem that my multifunction printer/scanner would not work in Linux anyway.

I decided that, if I could make the VMware thing work reliably, I would look on this whole effort as an opportunity to become familiar with how things are done in Ubuntu. After a year or so of moving files around, using folders and command lines and so forth, I would probably have a better sense of how Ubuntu worked and what I could expect from it. I would still have to work in Windows, for the most part, but I would be doing so within an Ubuntu and VMware framework. The Windows updates did seem to install pretty quickly in VMware. Certainly the shutdowns and reboots were fast. When I was all caught up on Windows updates, I decided I wanted to make a backup of my work.
I got to the point of deciding to try PartImage, since it would enable drive imaging for my various Ubuntu partitions. (It was just about impossible to understand the download instructions.) But then I realized that I didn't necessarily want or need to make a relatively huge drive image of the whole thing -- not yet, anyway. Most of my changes had just involved the Windows virtual machine -- and wasn't that supposed to be saved in a single file, or set of files, that I could just copy to another drive or burn to a CD? I went to /home/ray/vmware/WinXP and saw that, sure enough, there were a bunch of files labeled (in list view) as VMware virtual disks. There were also some other files there. Could I just copy this folder somewhere else and consider that my backup? That was the gist of some advice I saw; they also cautioned that you have to shut down the virtual machine -- you can't just suspend it -- before making a backup. VMware said the same thing:
Virtual disks are the disk partitions of virtual machines. They are stored as a file on the file system of your host operating system. One of the key features of the VMware application is "encapsulation". This means that complete environments are contained in a single file, which can be copied, moved, and accessed quickly and easily. Since an entire disk partition is saved as a file, virtual disks are easy to back up, move, and copy.

So I copied the /home/ray/vmware/WinXP folder to another drive. It was 6.9GB, and it took about 15 minutes. And while I was on the subject of backup, I decided that (partly because of difficulties figuring out how to download the ISO for PartImage, among the various programs available) PING was the best available freeware disk imaging program, for purposes of capturing at least an occasional snapshot of my Ubuntu installation as a whole. So I burned a CD for that.

Now I was a little concerned that maybe I couldn't use my printer at all with this setup. When I had finished making the copy of my virtual machine, then, I started up the WinXP virtual machine, to try printing something. At this point, I discovered that the virtual machine had only three drives available: the floppy (A), the CD/DVD drive (D), and the programs disk (C). Drive C was about 13GB altogether, and about 8.2GB were already taken. So it seemed I would have to do more resizing in GPartEd, once I got some more programs installed and saw more clearly how much space would be needed. Or could I modify the virtual machine's settings to add another hard drive partition?

Anyway, about that printing question. I realized that I had not yet installed the drivers and software for my printer, here in this virtual machine. It just occurred to me that I had made the mistake of leaving the printer on and the USB cable connected to this machine. Yet Windows had not popped up its little balloon or its Found New Hardware wizard to tell me so. I couldn't tell, yet, whether this meant that no hardware would work until Ubuntu said so -- in which case my printer would not work at all. While the software was installing, I unplugged the printer and waited. The drivers and printer software installed without difficulty. I plugged in the printer when instructed. This brought up a VMware dialog:
The specified device appears to be claimed by another driver (usblp) on the host operating system which means that the device may be in use. To continue, the device will first be disconnected from its current driver.

That sounded good, so I said OK. And then, what do you know, Windows did deliver its "Found New Hardware" bubble, for the printer and also for a "USB composite device." So at this moment, it looked like the process was as follows: you install the drivers; Windows tries to see the device; VMware sees that Windows wants to see the device; VMware offers to get out of the way; and if you say yes, please do step aside, then Windows is allowed to see the device, even if Ubuntu itself doesn't quite know what to do with the device. That, anyway, was the operating assumption at this point. If that was correct, I liked it: it meant no more accidental screwups when installing USB devices, where you plug in the cable before you're supposed to and Windows instantly sees the device and installs all the wrong software.

When I created a little test text file and tried to print it, here in my virtual machine, Windows did see the Canon printer. But no paper actually came out of the printer -- which is, of course, the object of the game. The printing process itself was working OK, as I verified by printing my test text file to the Microsoft Document Imaging program: a file was indeed created, and when I viewed that file on the computer I could see my text in it. And then, what do you know, that seemed to break the logjam: I tried again to print to the Canon, and this time it did print OK.

The other half of the printer/scanner question was whether the scanner would work too. I ran WinXP's Scanner and Camera Wizard. It recognized the scanner, and the scanner scanned. But no scanned image showed up in the wizard. I gave it a couple of minutes and then canceled. I had never used this wizard before, so I didn't know if I was having special difficulty or if this sort of malfunction was typical for it. I tried again, and this time I just let it sit for a while, to see if it was just slow or if it was genuinely not working. Meanwhile, I downloaded, installed, and scanned with IrfanView, which I had used for years. Like the WinXP wizard, it did not transfer a visible scan to the monitor. So it seemed like a pretty good bet that Acrobat was not going to do too well with this setup either. But then, once again, I was pleasantly surprised. By the time I finished installing IrfanView, the wizard had indeed downloaded a picture. Granted, it was abysmally slow. It probably took five minutes. This would never do for regular working purposes. But at least the virtual machine could use the scanner, in some sense.

It seemed like the wizard might have slowed down the process, so I killed it and just tried scanning with IrfanView. But no, again there was an abysmal delay of several minutes before the image appeared on the screen. I tried to reboot Windows inside the virtual machine, just in case it needed that to get its head on straight; but it froze somewhere in the shutdown process, so I had to power the virtual machine down and back up. While it was down, I tried adding another hard drive to the mix, but my brief attempt at that was not successful. After powering up the virtual machine again, I tried again with IrfanView. This time, IrfanView said that it could not connect with the scanner. I turned off the entire computer, let it sit for a minute or two, and then turned it on again.
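For what it's worth, here is the sort of check I could have run on the Ubuntu side to see what the host thought was going on with the printer and scanner. I didn't actually do this at the time; it's just a sketch of where I might have looked:

    lsusb                  # list the USB devices the Ubuntu host can see
    lsmod | grep usblp     # is the host's usblp printer driver loaded (and possibly claiming the device)?
    dmesg | tail -n 20     # recent kernel messages about USB devices being attached or released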
Still no luck in IrfanView. It tentatively appeared that I would have to install the printer drivers each time I powered up the virtual machine. To test that, I tried printing another little text file. The Canon did show up as one of the available printers, but its status was said to be "Offline" -- unlike the Microsoft XPS Document Writer, whose status was "Ready." I right-clicked the Canon, there in the Print dialog, and chose "Use printer online." That converted its status to "Ready." I tried printing my test file again. I still got no paper, but at least this time I got a bubble confirming, "This document failed to print." The best I could do on that was to put in a request for a driver I'd have to pay for at TurboPrint. At this point, it was time to make some decisions about VMware and Windows for my immediate purposes. I liked what I saw of VMware; my only problem had to do with the things it couldn't do. There was a good chance that there was a workaround for the printer and scanner problem, if I wished to invest the time to find it. I had other devices and programs as well, and I was concerned that researching them all would be very time-consuming and ultimately only partly successful. This time around, I was more committed to making Ubuntu work than I had been in 2007. I had just lost too much time to Windows malfunctions, viruses, and so forth. I decided to keep developing this approach, with Ubuntu and VMware, on the main computer, and to use the secondary computer for my scanning and, if necessary, my printing. This would not be a very ideal long-term solution, and if something didn't work out eventually, I would probably decide to give up on Ubuntu or buy a different printer or something. But for now, this was an approach I could live with. It would mean having a dual-boot setup on the secondary computer, using Windows for my scanning and printing, at least in the short term. That would entail some Windows security risks. It would also mean a lot of hassle, moving documents between the two computers -- by USB thumb drive, I guessed -- but, again, I thought I could probably work with it and see how it developed. The next step, then, was to see if I could get my other key software and hardware to work in VMware. I started with Microsoft Office 2003. First, I wanted to know if I could get by without it. Ubuntu came with the OpenOffice software suite already installed. OpenOffice was supposed to be able to handle Office 2003 documents. I had recently looked into the specific need of bringing over my set of several thousand AutoCorrect entries from Word. Some people claimed to be able to import tens of thousands of entries, but I had not been able to figure out quite how they did it. As I gathered from reviewing various posts in the relevant OpenOffice forum, the OpenOffice developers were trying to deal with an immense number of bugs, suggestions, and other alterations to the OpenOffice program, and it did not seem likely that this particular problem or feature would be resolved anytime soon. I posted a question on it, just in case; but it didn't attract any immediate attention. So in the meantime it seemed necessary to make Microsoft Office work within VMware. Throughout this process, and especially at this point, I started to feel overwhelmed by the large number of little bits of functionality I had in Windows. I could do this, I could do that. 
There were all sorts of small tasks that, added together, amounted to a lot of things I knew how to do in Windows but had no idea of how to do in Ubuntu -- if, indeed, such things were even possible in Ubuntu. For instance, I had found that DoubleKiller was a very useful way to detect if I had inadvertently created duplicate copies of a document on my hard drives. Was there such a utility in Ubuntu? There was, surely, but would it be as easy to use? And how, anyway, would I get all those gigabytes of data into forms where I could manipulate them in Ubuntu, or where WinXP could see them from within the virtual machine? I already knew part of the answer. I would take it in small steps. I would install what I needed to install in WinXP, and I would learn how to do what I could in Ubuntu. It would be a hassle, but hopefully the hassle would be justified by the payoff of more stable and secure computing. But another part of the answer, as I thought about it, seemed to be that I was attempting to switch paradigms. Windows was based on the idea, or perhaps the pretense, that you could do anything you want. The software might be half-assed; there might be crashes and bugs and security breaches; but, by God, if you wanted Windows to toast a bagel for you, there was probably a way to do that -- most of the time, more or less, to some degree. In Linux, the philosophy seemed to be more like, This is real computing. We do it right, or we don't do it. So you've got your marketing types squared off against your engineering types. You can understand the marketers. They're saying exactly what you want to hear. For ten times the price, they will give you whatever you want. And you will almost have it. So ... was I ready to switch to real computing? It almost seemed like a question of whether I wanted to use the computer as an assistant or as a partner. In the end, my Windows computer was not my partner. It pretended to be. But after all these years, it was still too demanding and unstable to be trusted. It needed defragmenting, constant downloading and installation of virus updates and security updates, reboots to clear its head, drive imaging, running and responding to spyware scans, and so forth. At any moment, it could become dysfunctional to the point of requiring me to spend hours or even days figuring out the problem and the solution. I just couldn't afford that anymore. I needed to move away from that paradigm. I wasn't sure if Ubuntu and VMware would deliver a superior alternative, but it was time to find out. So the next step was to install Microsoft Office in my virtual machine. I was using an academic download of Office 2003, and to install it on the VM, I needed to copy it over from another drive. VMware Tools was supposed to be able to do this: I just had to drag and drop. I did that, and VMware looked like it was copying; but then, after several minutes, it said, "Cannot open file on virtual machine. Aborting the drag and drop operation." I found one discussion thread on this. Respondents offered two possible solutions. One was to use something other than drag & drop; the other was to uninstall and reinstall VMware Tools. The one who said to use something other than D & D said that drag and drop was for small things, and that for large tasks you should instead use a Network Share or VMware Shared Folders. I thought this might be the better answer, because the drag and drop approach did work when I tried moving just one file. So I didn't think my VMware Tools were the problem. 
My belated installation of them did come to mind; but the respondent's advice to uninstall and reinstall made me think that you could indeed install Tools successfully after activating Windows. So I decided to find out what Network Share or VMware Shared Folders were all about. According to p. 173 of the VMware Workstation User's Manual, shared folders could be in the host computer's file system, or they could be network directories accessible from the host. The only references I found to Network Share, in the manual, had to do with mapping network drives. I wasn't on a network, though, so my focus could be on simply sharing folders in the host computer's file system. Sharing folders in VMware Workstation involved using VM > Settings > Options > Shared Folders. I chose the "Always enabled" option and clicked the Add button. There was a warning there: "Shared folders expose your files to programs in the virtual machine. This may put your computer and your data at risk. Only enable shared folders if you trust the virtual machine with your data." This was unclear to me. I would have thought that only the stuff in the shared folders was at risk. But it seemed to be saying that everything on the computer was at risk, including stuff that was not in the shared folders. The manual (or at least the part on p. 173) said nothing on this point. I found an article, however, that said a hacker could gain complete access to a computer if shared folders was enabled. The article, dated February 2008, said that VMware planned to fix this problem in an upcoming release, but so far the fix seemed just to be to change the default from sharing enabled to sharing disabled. The idea behind this threat, as I understood it, was that you could be working at a virtual WinXP machine, and through the shared folders opening you could get into the Ubuntu file system. I didn't plan to let any hackers sit down at my computer and use my WinXP virtual machine, but another possibility would arise if I stuck in a virus-infected USB thumb drive. This still sounded like a relatively remote risk, but I decided that probably the best compromise was not to choose the "Always enabled" option but, instead, to select "Enabled until next power off or suspend." So I would use the shared folder temporarily, to move things into or out of the virtual machine; and then I would disable sharing, or let the system do it upon suspend or reboot if I forgot. Drag and drop might have been easier, but it wasn't working very well, so this was the next option. (Later, I realized that I had to enter a password to get into Ubuntu, but no password to get into Windows because I had disabled that, so maybe it would be easier for someone to access my data through Windows for that reason.) I went ahead and created a folder on an NTFS partition, labeled that folder "Shared," and designated it for sharing in Workstation. Then I stopped and thought about this some more. Was I really going to go through the steps to enable a folder every time I wanted to pass a file between Ubuntu and WinXP? And did I want WinXP not to be able to see all these hundreds of articles, notes files, and all the other stuff I had collected for research purposes? That didn't make any sense. Acrobat, for example, would be operating only in WinXP, and I would want to be using it to read and edit a variety of PDFs. I sure didn't want to be moving documents and folders around all the time: I had spent quite a bit of time arranging and organizing the various files in them. 
So it sounded like I needed, not only to choose "Always enabled" for the sharing option, but also to designate a large number of folders and subfolders for sharing. I decided to try this out. I was able to designate an entire drive for sharing. And it seemed to work. In the WinXP virtual machine, I went into Windows Explorer and found the shared folder under My Network Places > Entire Network > VMware Shared Folders > .host > Shared Folders. Seemed a little redundant, but there it was. I had a problem with it being buried down in so many levels, though. I already had some folders that were nested too deep. Files in those folders had gotten their names truncated in Windows, and the folders had not always been fully functional. Was I now going to be putting those folders several levels deeper? I went into an individual subfolder, there in the Shared Folders directory in WinXP, and was able to run an EXE file and open a TXT file. So the shared folders did seem to function normally. So the scenario was that I was going to be designating most or all of the data folders on my system as shared folders. I would work on their contents in Ubuntu when I could, and in WinXP when I had to. I guessed that I probably should be doing virus scans on them in Windows, in case something went wrong with the Linux firewall, or in case I happened to put a virus in there through a jump drive or some other shared device. I would still be partially maintaining a Windows system, but hopefully at a much lower level of risk. Anyway, the WinXP virtual machine now had access to my Office 2003 installation files. I went ahead and started the installation. It occurred to me that the hassle or unfamiliarity of doing installations in this virtual environment might actually be beneficial, in the sense that it could encourage me to install only those programs that I really needed to use in a Windows version. Be that as it may, installation went ahead. I went online and downloaded the available Office updates. I used the Save My Settings Wizard to restore my preferred settings in Office, and it seemed to run correctly. I used the Autocorrect.dot macro to restore my AutoCorrect entries. I used Word to type a few words, and I was able to save the file. Interestingly, though, I was not able to save that newly created file on the shared drive; I had to save it to drive C, the only hard drive (a virtual one) visible to WinXP in the virtual machine. When I tried to save to the shared drive, I got an error message: "The save failed due to out of memory or disk space." But the shared drive had 7.9GB free. It was a tiny file, so memory shouldn't have been an issue either. Nor had I designated the shared drive as read-only. The idea of saving my files only to the virtual drive was not too appealing. That approach, it seemed, would require either a huge virtual drive or a constant effort to copy files from the virtual drive back to the shared drive. Files left on the virtual drive would be available only as long as VMware kept working. They would not really be files at all: they would just be entries in a VMware virtual drive file. If there was some kind of problem with VMware, those files could cease to exist. Along about this time, as I continued to use Ubuntu and VMware, I also became aware that they were not perfect. 
At the moment of this writing, for example, I was not able to use the Blogger.com website on my secondary computer, because Firefox on VMware had decided to show me only the HTML version of this posting, whereas I wanted to compose these words using the "Compose" version. (This problem persisted even after turning the computer off and back on.) The Alt-Tab means of viewing the open programs also was not working properly: it would not show me the contents of a text file I had open. Meanwhile, on the main machine, in VMware, I had been unable to close a session of Windows Explorer, in a way that I had not experienced within native WinXP. So I wondered if this inability to save a file to the shared drive was a flaw in VMware or in Ubuntu, or if that was how it was supposed to work. I tried the same experiment with Excel, and in that case I was able to save to the shared drive. So I decided it must have been some kind of temporary or permanent problem with Word in this particular arrangement. So I felt that it would be OK to go ahead and enable sharing with the other partitions on my computer.

I faced a different kind of problem with Adobe Acrobat Professional 8. It seemed that my license agreement allowed me two installations. I had been using one installation in Windows on the main computer, and one on the secondary computer. That wouldn't work if I also planned to install Acrobat in the virtual machine on the main computer. I would do so, not to scan from inside the virtual machine (because that didn't seem to be working), but to annotate and otherwise mark up PDFs. The solution, I decided, was to use some other PDF program for scanning, or just to scan JPGs into IrfanView or some other program, and then refine and edit those scans in Acrobat within the virtual machine. Ubuntu came with the XSane Image Scanner, so I started there. But, of course, there were no Ubuntu drivers for the scanner, so XSane reported "No devices available." So the concept was that I would scan into PDF or JPG in a Windows dual-boot setup -- for example, I would use the secondary computer as my Windows-based scan slave, where I could also edit scans in Acrobat if I didn't want to do that in the virtual machine. Armed with this concept, I installed Acrobat in the virtual machine and used it to view a PDF on an NTFS drive without any difficulty.

Next, I needed to install the Palm Desktop software for my Palm Zire 21 PDA, an old model. That went smoothly. I plugged the mini-USB cable into the USB hub connected to the main computer, and plugged the PDA into the cable. I told the PDA to sync itself with the desktop software, but nothing happened. As an easier USB testing device, I plugged a jump drive into the hub, and a window opened up for it in Ubuntu, but there was nothing for it in Windows Explorer in the VM, either under My Computer or under Shared Folders. In the virtual machine, I went to VM > Removable Devices > USB Devices, and there it was. So I clicked that option, and after a minute or so Windows recognized this USB Mass Storage Device. Then it said "Found New Hardware. A problem occurred during installation." The jump drive was still not visible in Windows Explorer. The VMware Workstation User's Manual (p. 357) said,
If you attach a USB drive to a Linux machine, use the above-mentioned procedure to access it. Do not attempt to add a USB drive's device node (for example, /dev/sda) directory to the virtual machine as a hard disk. That is, to add a USB drive, use the Add Hardware wizard to add a USB Controller, not a hard disk.

So, OK, I could live with that. First, I had to figure out which Add Hardware wizard they were talking about. There didn't seem to be one in Ubuntu. There was one in WinXP, but I wasn't sure that made sense. I searched the manual and found this on page 361: "Choose VM > Settings. . . . Click Add to start the Add Hardware wizard." This seemed to be what they meant, but in this case VMware Workstation said that the USB Controller was already "Present," that it was "enabled." It instructed me to "Use the devices menu to connect or disconnect USB services." This seemed to refer back to the Removable Devices > USB Devices option mentioned above. I checked that again and saw that the Lexar jump drive was checked. So I wasn't sure where the problem was.

Well, it had been a while since I had rebooted Windows, so I thought I'd give that a try, there in the virtual machine. When I went to the Shutdown menu, it indicated that Automatic Updates (which I had now enabled) had some updates ready to install, so I chose Turn Off and let it do that. When it was done, I rebooted. VMware was still recognizing the jump drive, and Windows Explorer still wasn't showing it. I was thinking about trying to add it as a shared folder, but when I went through that procedure (VM > Settings > Options > Shared Folders > Add), there was no Lexar Jump Drive to be seen. Ah, but it wasn't visible in Ubuntu's File Browser either. Evidently it was not automatically recognized on reboot. I unplugged it and plugged it back into the USB hub. This time, Windows reported, "Found New Hardware. Your hardware is [installed, I think they said] and ready to use." Sure enough, there it was under My Computer in Windows Explorer. Oddly, though, it still wasn't visible in Ubuntu's File Browser or in VM > Settings > Options > Shared Folders > Add, even though it *was* visible and checked in VM > Removable Devices > USB Devices. Well, whatever. Apparently it was being claimed as part of the Windows domain and, as such, was not going to be visible to Ubuntu. I tried unchecking it, there in USB Devices, and it disappeared from under My Computer in WinEx, and a minute later it popped up as recognized under Ubuntu's File Browser. I went back in and checked it again under USB Devices, and now it was back to being a Windows device. So that's how that worked.

With that little episode under my belt, I tried again to get VMware to recognize the Palm PDA. According to p. 355 of the Workstation User's Manual, it sounded like the procedure I had just gone through was the manual installation procedure. The guest system (WinXP) was apparently supposed to recognize the USB drive automatically when I plugged it in. The User's Manual seemed to say that this would happen if the virtual machine was the active window. So I clicked on the virtual machine and then plugged in the PDA again. This time, Windows recognized both the hub and the Palm PDA. The manual (p. 355) said, "If the physical USB devices are connected to the host computer through a hub, the virtual machine sees only the USB devices, not the hub."
But then I got a bubble telling me that my hardware might not work properly, and the Palm Desktop software didn't jump up and start to synchronize like it was supposed to. So maybe it saw the hub, this time, because the PDA connection wasn't working right. I had the dim sense that rebooting the virtual machine had helped recognize the jump drive, so I tried that again. WinXP was getting slower on these reboots, by the way, now that I was larding it down with all kinds of software. On reboot, we still had no automatic recognition. I went into VM > Removable Devices > USB Devices and there it was, "Palm Handheld." I checked it and nothing happened. The WinXP system tray (bottom right corner of the screen) showed the Palm HotSync Manager as being loaded and set for Local USB, but still no joy. In Windows Start > Settings > Control Panel > System > Hardware > Device Manager, the only yellow exclamation mark was next to USB Mass Storage Device. I uninstalled that and rebooted again. But that was the Lexar jump drive, which I had forgotten to unplug. Windows detected it again. The same problem was back again in Device Manager. I wasn't sure what that was all about. Now the Palm Handheld was gone from VM > Removable Devices > USB Devices. I unplugged both the Palm and the Lexar. Eventually, the yellow exclamation circle disappeared from next to the USB Mass Storage Device. Meanwhile, I was replugging the Palm PDA. It was there, and checked, in VM > Removable Devices > USB Devices, but WinXP still wasn't seeing it. I was stumped. Oh, but correction: it was there in Device Manager after all -- not under USB controllers, but under Palm OS Handheld Devices. It only showed up when it was powered on and USB-connected. It had no yellow exclamation circle; it seemed to be fine. So why couldn't Palm Desktop see it and synchronize with it? I didn't have an answer to that. I tried the next thing. I would also be needing to use that USB hub to download recordings from my digital voice recorder. i installed the software and plugged in the usb cable. oops -- here we have that weird ubuntu failure to recognize capital letters again from the keyboard. hold on while i reboot. ... I was going to say, as far as the digital voice recorder is concerned, I tried the simple approach -- just plug in -- and I got an error message from Ubuntu or VMware. It went by too quickly to read. I may have been clicking on something else at the time. But when I rebooted to eliminate the problem mentioned in the previous paragraph, where for some reason Ubuntu (not just Firefox) was not responding to the Shift key on my keyboard, I got an error message again, and it may have been the same one as before. This one said, "A USB device that was previously attached to this VM could not be automatically reconnected. If the device is still available but resides on a different USB port, you will need to reconnect it manually." That may have referred to the Palm PDA or the Lexar jump drive. Not sure. Anyway, on reboot, I tried plugging in the recorder again. This time, WinXP found the device, and after going through the installation steps recommended by WinXP's Found New Hardware Wizard, it worked: it downloaded the recordings from the recorder. It placed those recordings in My Documents, as usual. By default, My Documents was on drive C. 
I preferred to have it on one of the shared drives, for the reasons described above: I wanted it to be stored, accessible, and backed up outside of VMware, so that it would not be vulnerable to anything that might go wrong with VMware. I thought maybe it would be possible to map the shared drives so that they would have drive letters, so I could move My Documents and other data folders off drive C. But now that I looked, it seemed that rebooting had wiped out the share. Although I had specified the sharing as being always enabled, the Shared Folders folder in Windows Explorer was empty. In VMware, I went to VM > Settings > Options > Shared Folders, and yes, those drives were all still listed there, and Always Enabled was still checked. So why didn't WinEx see them? The manual (p. 450) said, "In a Windows virtual machine, shared folders appear as folders on a designated drive letter." This was not what I had seen. They appeared as folders under the Shared Folders folder. The manual (p. 452) also referred to "VMware Tools control panel and support for such features as shared folders." Shared folders seemed to be supported on my machine, yet I continued to be worried that VM > Install VMware Tools was not greyed out, whereas VM > Upgrade or Change Version was. Did I not have VMware Tools installed? Did I not have shared folders support? According to a How-To Geek webpage, I could see if the VMware shared folders module was running by entering this command, presumably at a Terminal prompt: lsmod | grep vmhgfs. It said that if I entered that and got nothing, then the module was not loaded. I got nothing. They said I could load it with this command: modprobe vmhgfs. I got the message, "Module vmhgfs not found." They said that, if I got an error message (which I assume that was), then I didn't have VMware Tools installed, and I would want to be sure to install them before proceeding. So I went to VM > Install VMware Tools (again). I then re-entered the two command lines just listed, with the same results. But when I went to the VM menu item in VMware, now the option was Cancel VMware Tools Install. This made me think the installation was still in process. Maybe I had prematurely closed VMware or something last time, thereby aborting the installation process. But that line just stayed there, and meanwhile there was an icon (which I think had been there previously, but I overlooked it), and now its tool tip was saying VMware Tools. I right-clicked and opened it. I just poked around among the various tabs there. One, Shared Folders, said, "Your shared folders are at \\.host\Shared Folders\. You can map a drive letter to this path using Windows Explorer -> Tools -> Map Network Drive. You can also access your shared folders from Windows Explorer under My Network Places." And under the About tab, it said, "The VMware Tools Service is running." So I got out of there and checked the VM menu pick again. The Cancel option was still there. I ran those HowToGeek lines once more. I got the same results, and concluded that the advice was wrong, outdated, or irrelevant. Which was fine, but those shared drives were still not visible in WinEx. VM > Settings > Options > Shared Folders had not changed. I removed and re-added one of those shared drives, and now it did show up under Shared Folders. I went back into VM > Settings > Options > Shared Folders, and now I saw that I had been mistaken. There was indeed a change, after all. 
When I selected the other shared drives, and clicked on their Properties button, I saw that they all bore this warning: "The path does not exist on this host." But the one I had removed and re-added did not have that warning. There was nothing on this in the manual, and I didn't find helpful English-language webpages on it. It seemed that the "Always enabled" button was not working. My Google searches turned up nothing. It eventually occurred to me that maybe the solution was to map the drives while VMware was still recognizing them. So, for the one I had just re-added, I selected it in Windows Explorer and then chose Tools > Map Network Drive. It took it a minute to see that partition, but then it did. It had a checkbox, "Reconnect at logon," that was checked by default, and that sounded right, so I left it there. Then I turned off WinXP, powered down that virtual machine, and closed VMware. I turned everything back on and went to see what happened. And yes! That was it. WinEx saw the mapped partition as "Data on '.host\Shared Folders' (E:)," and there was no warning for that drive in VM > Settings > Options > Shared Folders. So I repeated those steps for my other drives, and now I had a partition setup that looked more or less like what I had previously had in native Windows XP. Now, I felt, I was in a position to think about security and backup. I had assumed I didn't need to worry about a firewall or antivirus software, but apparently that was not correct. I had run across several references to the Firestarter firewall, so I began with that. In Ubuntu's Applications > Add/Remove, I searched All Open Source applications for Firestarter, checked it, and clicked Enable and then Apply Changes. Then I installed and configured it using Applications > Internet > Firestarter. I selected Ethernet Device (eth0) and Dial-Out and DHCP. I enabled Internet connection sharing via NAT, and based on one discussion thread, I chose vmnet8 as my local area network device. I finished and this started Firestarter. I went into its Preferences and made a few changes, notably enabling DHCP for the local network. Now I had a "Failed to start the firewall" error that said, "An unknown error occurred. Please check your network device settings and make sure your Internet connection is active." It seemed to be: I was composing this message on my blog using that connection. I exited from that and Firestarter seemed to freeze up. I tried clicking a couple of things, eventually got a message that it was not responding, and tried to bail out. That led to a "Force quit" option, which I selected. After that, I couldn't get Firestarter to start again. But then, after a five-minute delay, it did start. For a moment, its self-declared status was Disabled, but then it changed to Active. I hadn't noticed whether it had an Active status before. Under Network, its eth0 entry for Internet showed activity at various rates, so it did seem to be in business; and when I clicked on the triangle next to Active Connections, at the bottom of the dialog box, it showed 192.168.2.100 -- which, as I recalled from that whole DSL installation effort, was the ID for my router. The number of active connections would increase as I opened additional websites in other tabs in Firefox, and decrease when those pages would finish loading. So far, Firestarter was not immediately reporting any Inbound or Outbound events. 
But a few minutes later, when I happened to look at the top bar on my Ubuntu screen, I saw four red circles with what looked like lightning bolts in them. When I moused over one of them, I got a tooltip that said, "Hit from 172.16.46.128 detected." The others said the same thing. It appeared that each of these corresponded to a Firestarter session. When Firestarter didn't seem to be responding, I had made several retries to start it using Applications > Internet > Firestarter. Eventually each of those retries had seemed to start up a Firestarter session. I had closed those out, I thought. But now I had to right-click on three of these four red lightning-bolt icons and choose Exit. That left me with what was, I believed, just one active session of Firestarter. So, OK, it looked like I had a firewall. Next, antivirus. According to one thread, there was no need for it in Ubuntu per se, although there was a use for it when checking Windows EXE files. This made it sound like I needed a Windows antivirus program inside VMware, but nothing for Ubuntu itself. A blogger said there was no need for antivirus software in Linux at all. But an article pointed out that there are indeed some Linux viruses, and also that Ubuntu antivirus can help prevent accidentally passing on a Windows virus via an e-mail from a Windows user that you forward to another Windows user. I found no antivirus programs in Applications > Add/Remove, but the article pointed toward AVG and Avast antivirus programs for Linux. The AVG updates page listed a number of Linux viruses and worms that its software protected against, so it did sound like at least a vaguely good idea to have some antivirus software in place, despite all the posts saying that nobody had ever had a Linux virus. According to a poll, AVG was somewhat preferred over Avast, so I went with AVG. Unfortunately, their downloads were all for i386 machines, whereas I was using 64-bit Ubuntu. When I tried to open the download for Debian-based distributions (which is what Ubuntu was), I got an error message indicating it was the wrong version. But then it looked like you could edit it to be a 64-bit program. I followed the instructions on how to do that. It took a couple of tries to realize that replacing text, in that primitive editor, meant just using the delete key, one character at a time. They also had documentation, priority updates, and optional updates. I wasn't sure how those updates would work, given that I had altered the program to work on my 64-bit system. Instead, I used the Update option within the program itself. It downloaded a 24.6MB update, which was just the size of the latest update they were offering on the priority updates webpage. So I assumed I was all set for Linux antivirus. There was also the question of running a firewall and antivirus software on Windows XP, inside the virtual machine. A firewall seemed unnecessary, since I hoped I had correctly set up the networking so that the virtual machine was contacting the internet through the Ubuntu installation. But I was curious about that, so I went ahead and installed ZoneAlarm, using the Ubuntu Firefox installation to find and download it. What I actually got was just the setup file, and when I ran that EXE file in the virtual machine, it downloaded the larger ZoneAlarm installation program. So it seemed the virtual machine was already in constant contact with the Internet. As for antivirus, I already had a copy of Symantec AntiVirus, so I installed that. 
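For the record, the general technique behind that AVG edit -- persuading an i386 .deb package to install on a 64-bit system -- can also be done with dpkg-deb instead of a character-by-character edit. The following is only a sketch of that approach, with a made-up filename, not the actual instructions I followed:

# Unpack the 32-bit package and its control files (filename is hypothetical).
dpkg-deb -x avg-linux-i386.deb avgpkg
dpkg-deb -e avg-linux-i386.deb avgpkg/DEBIAN

# Change the declared architecture from i386 to amd64.
sed -i 's/^Architecture: i386/Architecture: amd64/' avgpkg/DEBIAN/control

# Rebuild the package and install the modified version.
dpkg-deb -b avgpkg avg-linux-amd64.deb
sudo dpkg -i avg-linux-amd64.deb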
Partway through the downloading of the latest Symantec AntiVirus updates, I got a WinXP message telling me that I was very low on disk space on drive C. I checked in WinEx and, sure enough. It was time to run GPartEd again and revise my disk space allocations. But GPartEd reported that I had lots of space in most of my Linux partitions. Apparently most of my program installations had been going into /home, which had by far the most stuff, with 9.2GB used. But /home still had more than 8GB free. So that wasn't the problem. Then it occurred to me: it wasn't the physical disk partition that was filling up; it was the virtual machine. I wasn't quite sure how that was possible, though. As WinXP saw it, all of my drives were shared drives except drive C. Those were real partitions. C just lived in the virtual world and was saved in the VMware files. I rebooted into Ubuntu and powered up the WinXP virtual machine again. In doing so, I got a Windows bubble message: "Could not reconnect all network drives." Never were truer words spoken: a look at My Computer showed that, in fact, none of the shared drives had been reconnected. VMware told the tale: once again, I was getting an Install VMware Tools option. Apparently my failure to install them before activating Windows meant that I was condemned to have to reinstall them every time I powered up VMware. Ah, but it was worse than that. After reinstalling VMware Tools, I hit F5 to refresh the Windows Explorer display, but it still showed the shared drives as disconnected. VM > Settings > Options > Shared > Properties gave me that error message, "The path does not exist on this host." Even after VMware Tools were reinstalled, the path was bad; I would apparently also have to reset each of the shared folders each time I rebooted. This also meant that, if I wanted anything to happen involving one of those other drives on startup -- say, to open up a certain file each time I rebooted, which was how I had done things in Windows -- that wouldn't be possible, because the drive wouldn't be recognized. These thoughts favored starting over on the VMware installation, unless maybe reboots would be so rare in Ubuntu as to make it a minor hassle. I postponed that question and went back to the issue of not having much space left on drive C. VM > Settings > Hardware > Hard Disk told me that I had used 8.6GB of the 13GB drive. It also said I had 7.5GB of "System Free" space. So why was drive C running dry? I looked at Properties of drive C in Windows Explorer. It showed 12.8GB of 12.9GB used. Puzzling! The VMware manual didn't seem to have anything on this. I tried Disk Cleanup in WinXP, but it reported only about 55MB of potential savings. I could only surmise that some VMware files were being saved on /home, while others were saved on /usr or elsewhere. No logical disk partition was full, but collectively they were filling the 13GB I had allocated. This, and the thought of doing backup of multiple Linux drives, made me think that I probably should have just gone with the approach of having all my Linux stuff in just one partition plus a swap partition. I had heard that the more complicated partitioning scheme I used would improve system performance, or would make it easier to change some parts of the Linux system without having to change others; but I just wasn't at a point where I could understand and use that approach well, and meanwhile it was complicating this VMware thing. 
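One way to have checked, from the Ubuntu side, where the virtual disk was actually consuming space would have been something like the following. The path to the virtual machine directory is a guess -- VMware Workstation asks where to put each virtual machine when you create it:

# Overall usage of each mounted filesystem.
df -h

# Size of each virtual machine's directory, where the .vmdk files live
# (~/vmware is an example location, not necessarily mine).
du -sh ~/vmware/*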
It was painful to contemplate devoting additional hours to the process of starting over from scratch to correct these flaws, but I decided to do it. Now that I had all these notes typed up, maybe it would be fast. So I started shutting down the system. I ran GPartEd and combined all those partitions. I installed 64-bit Ubuntu in the resulting partition and installed updates. I checked File Browser and confirmed that File System contained 43.9GB of free space. This time, I thought I'd try setting disk size at 50GB, which was somewhat more than I had available. I wondered if it would complain now, or if it would only complain later, when it tried to expand to 50GB and found that there was not enough room (at which point I could use GPartEd and allocate more to it). Again, I told it to split the disk into 2GB files, just in case I had to be moving it around on a jump drive or CDs sometime, and I did not select "Allocate all disk space now." It didn't complain. Things progressed along more quickly this time. I remembered to install VM Tools before activating Windows -- a mildly tricky proposition, since the WinXP installer just leads right on into the activation process. But when I clicked VM > Install VMware Tools, it seemed that nothing had happened. The status bar at the bottom of the VMware screen continued to say, "VMware Tools is not installed in this guest," and I didn't see the VMware Tools icon in the VMware Workstation system tray at the bottom-right corner of the screen. I waited for a while, and then proceeded with the WinXP installation. When it came to the activation screen, I said don't activate now. I finished installation and logged off. VMware said, "You do not appear to be running the VMware Tools package inside this virtual machine." They offered to remind me to install VMware Tools when I powered the machine back up. I said OK. I restarted the virtual machine and tried again. This time, the VMware Tools installation wizard started up. I went through it and rebooted as suggested. Now the VMware Tools icon appeared at the bottom-right corner of the screen -- but in the Windows system tray, not in the VMware system tray; I had been looking for it in the wrong place. Anyway, I shut down the computer. It got hung up, shutting down, and after several minutes I finally just shut off the power and let it sit for a minute or two. I started the computer up again, opened VMware Workstation, and powered on the WinXP installation. The VMware Tools icon appeared in the system tray, and when I opened the icon and selected the About tab, it said "The VMware Tools Service is running." The VM menu pick still gave me the option to "Install VMware Tools," but apparently that was not to be relied upon. With meals and a shower and some other general screwing around, this part of the process -- from the decision to run GParted and combine partitions, up to the moment when I started downloading those 106 Windows updates inside the virtual machine -- had taken almost exactly three hours. It would have been faster if I had been more familiar with things, or if I had not had to re-download the VMware Workstation TAR file. It was going to kill the better part of a day to go through all these steps again. On the positive side, I didn't seem to be getting that flaky behavior from Firefox that I was getting before. I did the drive-sharing routine again. As before, I didn't like that VMware insisted on changing the case of my drive labels: I wanted to call one of them DATA, for example, not Data. 
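Incidentally, when VM > Install VMware Tools appears to do nothing, one host-side check is whether the Tools CD images are present where a Linux-hosted Workstation normally keeps them. I am citing this path from general knowledge of Workstation rather than from notes on this particular machine, so treat it as an assumption:

# The VMware Tools installers are ISO images that Workstation mounts into the
# guest's virtual CD drive; windows.iso is the one a WinXP guest needs.
ls -lh /usr/lib/vmware/isoimages/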
But anyway, I installed Office 2003 again. That, plus downloading its updates, plus a few other minor program installations, somehow took another four hours. More than that, actually. The downloads were slow at times. Once, they seemed to stop altogether. I rebooted and they started up again. I ruefully thought about how OpenOffice came included with Ubuntu, and how its updates, when there were any, had so far been pretty quick. Recreating the system had fixed the disk space problem. Unfortunately, it had not fixed the shared drives problem. This time, Windows Explorer was showing the drives, but when I clicked on one of them, Windows gave me this error message:
An error occurred while reconnecting E: to \\host\Shared Folders\DATA.
VMware Shared Folders: The network name cannot be found.
This connection has not been restored.

The VMware Tools icon was still visible in the WinXP system tray. I couldn't figure out why this problem was there. I posted a question on it in the VMware forum. When I rebooted Windows again, the shares were altogether gone from Windows Explorer, and in VM > Settings > Options > Shared Folders, I still had the message, "The path does not exist on this host." So that was another mystery. This second time around, I discovered that when VMware recognized my jump drive, it did so at the expense of another drive I had already mapped as drive D. In other words, my "real" drive D was nowhere in sight; the jump drive was taking its place as drive D. When I ejected the drive, I got an "unrecoverable error" dialog and WinXP vanished. I powered the WinXP virtual machine up again, this time with the jump drive removed from the USB port. I really liked how fast WinXP rebooted in VMware, and I was also relieved that my original drive D was back where it belonged. But apparently I was going to have to map it to some other drive letter if I expected to keep using that USB thumb drive. This time around, WinXP recognized my Palm handheld PDA, along with the USB hub. It then gave me the bubble that said, "Found new hardware," followed by something about a problem during installation and a statement that my device might not work properly. (It disappeared before I could copy its exact wording again.) I rebooted WinXP in case that would help. It didn't. Meanwhile, something funky was happening with the jump drive. I had been composing this post in Blogger.com, using Firefox in Ubuntu on my secondary computer. As noted above, Firefox on that machine was only showing me the HTML version of this post. I preferred Blogger's Compose view. So I saved this post in Firefox on the secondary machine, pasted a copy of these words into a text file, and opened the draft of the post in Firefox on Ubuntu on my main computer. Well, all of my edits for the past hour or two were *not* saved on the secondary machine. So it seemed that I might need to reinstall Firefox on that machine. I mention the jump drive, in that connection, because I was going to copy this text file backup over to the primary machine and paste it into Blogger. I saved this file and copied it to the jump drive. It was plainly visible in File Browser. I unmounted the jump drive and moved it to the primary computer, plugging it into the same USB hub socket as before. This time, it didn't supplant my regular drive D. It wasn't recognized by WinXP in the VM at all. But Ubuntu did recognize it, though it couldn't see any files on it. I unmounted it from the USB hub and plugged it directly into a USB port on the back of the computer, and made sure it was checked in VM > Removable Devices > USB Devices, but still nothing. I put it back in the secondary machine and yes, that machine did see this text file. Another little mystery. Also, apparently File Browser does not let go of a jump drive when you unmount and remove it. The directory listing entry for the jump drive was still there, unchanged, long after the jump drive had been removed. I didn't see a Refresh option, and F5 didn't fix it. An even better mystery came to light about this time. It seems that the contents of a data drive vanished. I can't quite explain it.
It wasn't a killer -- it was just a drive I was using for backups -- but everything I had put onto it had been replaced by Ubuntu stuff -- by folders with names like .adobe and .cache and so forth. It looked like maybe Ubuntu had decided to start storing some of its program files on that partition. Baffled me. I was just glad it chose that drive instead of DATA. That could have been unpleasant. But then I thought to look at that same drive in Ubuntu's File Browser instead of Windows Explorer. File Browser reported everything being in place, my original folders where I had put them, with no sign of any .adobe or .cache or other Ubuntu-style folders. I checked the drive mapping in WinEx, and it did seem to be pointing to the correct drive. So I was still confused, but in a more refined way. I disconnected and re-mapped that drive in WinEx, and of course that fixed nothing: it was still showing the Ubuntu stuff. No other partition seemed to be having this problem. I restarted WinXP, there inside the virtual machine, and it was still the same. I wondered if maybe VMware or drive mapping were somehow interfering with Ubuntu's ability to recognize that USB drive. I powered down the virtual machine and closed VMware. I tried the jump drive in both USB ports. No luck. I tried a different USB drive. That was the solution. The older Lexar jump drive didn't work reliably with Ubuntu on either machine; the Kingston jump drive did. But then -- what's this? Now the file was visible on the Lexar drive too. Did it come to life when I plugged in the Kingston drive? None of this got me any further toward having Ubuntu or the Windows VM communicate with my Palm PDA. It appeared that the PDA, like the scanner, was something I would have to use on the secondary computer, running native WinXP, until such time as I upgraded to a Linux-compatible model or discovered a workaround. After the usual hassle with the Olympus digital voice recorder, it installed in Windows. Continuing to catch up with other things I had installed the first time around, I installed Firestarter and AVG AntiVirus on Ubuntu, and I installed Symantec AntiVirus and ZoneAlarm on the WinXP virtual machine. I also installed PC Tools Spyware Doctor, which had actually seemed more on-the-ball than Symantec. I ran scans with the antivirus and antispyware programs. AVG seemed a little weird: it looked like it either had to have a window open or else it was closed down -- like it only did scans, and did not continuously monitor the system. (Note, added to this portion of this post two days later: Firestarter has reported at least one serious inbound event.) After antivirus and firewall, the last security-type issue was backup. What I really wanted was one simple tool that would keep everything backed up -- Ubuntu program files, Windows program files, data files, whatever. I had an external hard drive, and I wanted regular backups to that, every hour or two. On WinXP, I had used Second Copy 2000 for this purpose, as part of my larger backup scheme. Librenix offered a page of Linux backup solutions, and from those I decided my first criterion was that I wanted a GUI. I was not unwilling to learn command-line syntax; I just didn't want to spend the time. It might have been different if I had seen a command-line option that seemed suitable for my situation, but I didn't.
So the options I would choose among were TimeVault, which was apparently still in beta and whose installation instructions did not look too easy; Sbackup, which seemed to offer a GUI and the option of backing up locally, and for which I found a seemingly helpful installation page; Flyback, which also included helpful instructions but seemed that it might be oriented toward incremental backups; Restore, whose "lack of local file support" means that "it certainly wouldn't work for home users with only one machine"; BackupPC, which looked like it was designed more for system administrators; and Mondo Rescue. I also found another webpage that confirmed my sense that PartImage would be a good tool for backing up the system drives in an image file, in case something went wrong with the software installation (as distinct from my data files) -- though it also seemed that some of these backup programs might take care of that as well, without requiring a special reboot to do the backup. Of those, for reasons just stated, the most appealing ones included Sbackup and Mondo Rescue. Sbackup appeared to be another name for Simple Backup Suite and, as such, was the only four-star option that came up when I searched Applications > Add/Remove for "backup gnome." So I installed that. Using Sbackup to back up my partitions to my USB external hard drive required me to have the external hard drive turned on. So I turned it on. This produced an Ubuntu error message, "Cannot mount volume." The error message offered details that made me think of looking in the WinXP virtual machine, to see what it said about the external drive. I had recently experienced that funky problem in Ubuntu again, where using the Shift key while attempting to type the first letter of a word into Text Editor caused the Text Editor to close, and had therefore shut down and restarted the computer. This had caused all of my shared volumes to be inaccessible except the strange one that had started displaying real or imaginary Ubuntu files and folders. I went into VM > Settings > Options > Shared Folders and removed all of the shares, then re-added them. It still wouldn't let me add the external drive, and it still showed the Ubuntu files and folders in that NTFS partition. I powered down the WinXP virtual machine, in case that was the problem, and tried again to view the contents of the external drive in Ubuntu, but still no luck. I tried connecting the external drive to the secondary computer, where it came right up in WinXP. I rebooted the secondary computer in Ubuntu and tried to recognize the external drive there. It recognized it without difficulty. I guessed that maybe the external drive had to be connected and powered on before I started Ubuntu. To test this theory, I unmounted the external drive, turned it off, restarted Ubuntu on the secondary computer, and then turned the external drive back on. When it became visible in File Browser, I clicked on it. It showed me its contents with no problem. So one possibility was that this was a difference between the 32-bit version of Ubuntu I had loaded on the secondary computer, and the 64-bit version that was installed on the primary computer. I unmounted and turned off the external drive, reconnected it to the primary computer, turned it on, and rebooted the primary computer. Now the primary computer saw it clearly. So I would apparently have to connect the external drive before booting the primary computer, if I wished to use that drive to back up the primary computer. 
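A few standard Terminal commands would have shown whether Ubuntu was seeing the external drive at the hardware level, regardless of whether File Browser displayed it. These are generic checks rather than steps I recorded at the time, and /dev/sdb1 below is only an example device name:

# Did the kernel notice the USB drive being plugged in?
dmesg | tail -20

# List every disk and partition the system can see.
sudo fdisk -l

# If the partition shows up but is not mounted, mount it by hand
# (device name and mount point are examples).
sudo mkdir -p /mnt/external
sudo mount /dev/sdb1 /mnt/external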
Now that I had the full attention of the external drive, there was the question of what Sbackup would do with it. Unfortunately, I couldn't find it under Applications. I did happen to see Firestarter there, under Applications > Internet, and I clicked on it. What I got was, "Failed to start the firewall. An unknown error occurred. Please check your network device settings and make sure your Internet connection is active." Well, it was -- I had just brought up the Blogger webpage on which I was typing these notes -- and I was disappointed that apparently you had to click on the firewall each time you started the machine if you wished to have an active Firestarter firewall -- and then, of course, I was also disappointed to see that, in this case, that didn't even work. But maybe that was not correct. Aside from the error message, clicking on Firestarter also opened up its main dialog box; and after first saying "Disabled," that box changed to say "Active." It seemed to be noticing my web browsing activity, so possibly the error message was, itself, an error. We were moving incrementally closer toward investigating Sbackup. Continuing down the Applications list, I got to the end of the line, System Tools, and noticed AVG for Linux Workstation. I clicked on it and it, like Firestarter, came to life. I bailed out and it asked, "Do you really want to quit?" So this seemed to be confirmation that it was not a real-time virus barrier, just a post-facto virus scanner. But where, among these applications, was Sbackup? I clicked Applications > Add/Remove > System Tools and there, sure enough, I did have Simple Backup Config and Simple Backup Restore installed on this system. Ah, silly me: they were listed, not under Applications, but under System > Administration. I started to set up Sbackup and then it occurred to me that I wanted to use a different external drive. So I unmounted the one that I had connected and shut it off. I put the other drive into the external drive enclosure, reconnected, and powered it back up. Now, it was not necessary to reboot the primary computer for this one: it recognized it and showed me its contents. So either that particular hard drive was the one with the recognition problem, or the computer had simply been in a different mood some two hours earlier. Returning to Sbackup again, I reviewed the options and decided this was not the direction I wanted to go. Surely it did a fine job of full and incremental backups, but what I wanted was more like a RAID or mirror type of arrangement, where the backup drive would have what Second Copy 2000 had referred to as an "exact copy" of what was on the drive. Even better, I thought, was the rdiff-backup solution of saving an exact copy plus saving (in a subfolder) previous editions of changing files. (Second Copy 2000 would do that too, but only in Windows.) I had taken that approach to a limited extent, but of course disk space could be a problem as you multiply backup copies of files. I decided to take a multipronged approach to backup. First, I would continue to use Drive Image 2002 to back up my Windows XP dual-boot partition. It had proved pretty reliable for me for some years, and at this point PartImage was only in an experimental state for purposes of backing up NTFS partitions. Since CloneZilla described itself as being based on PartImage, I was not confident that it would represent a better solution.
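The "exact copy" arrangement I had in mind maps onto a couple of standard command-line tools. This is a sketch of how they are typically invoked, with example paths, not a configuration I had actually settled on:

# Keep an exact mirror of a data partition on the external drive;
# --delete removes files from the mirror that were deleted from the source.
rsync -a --delete /media/DATA/ /mnt/external/DATA-mirror/

# Or keep a mirror plus reverse-diffs of earlier versions of changed files.
rdiff-backup /media/DATA /mnt/external/DATA-rdiff

# Restore a file as it existed three days ago.
rdiff-backup -r 3D /mnt/external/DATA-rdiff/report.doc /tmp/report-3-days-ago.doc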
If I had needed an alternative to Drive Image 2002 for Windows backup, I would have chosen Acronis True Image, but that was not the situation at present. I did plan to try out PING as well (see above), but would not yet solely rely upon it. As before, I would keep copies of the drive images on the external backup hard drive, and would also occasionally burn one of them to DVD. Backing up the Ubuntu directories (/usr, /home, etc.) called for a different strategy. An occasional PartImage image would probably be a good idea, but I expected there to be lots of configuration changes, and it also seemed that some data files were being saved in those directories. I did not yet have a clear sense of what each of those directories was for, but it seemed like I would be well advised to supplement the image backups with more of a day-to-day backup. For this, I thought it might work to use Sbackup. Otherwise, as someone pointed out, my data files on NTFS partitions seemed to call for two different approaches. Some (e.g., MP3s) would change rarely -- especially these days, when I was busy with other things -- and could probably be adequately captured with an infrequent backup to a nonchanging medium, be it DVD or a hard drive that would mostly just sit on a shelf. It did not seem that I needed to burden my external drive with redundant copies of those unchanging materials, though it might behoove me to do some sort of file count or checksum calculation to verify at a glance that, indeed, nothing had changed within a given folder over a period of weeks or months. Perhaps Unison would prove to be the best solution for that purpose; it sounded like it had some such features built in. With those relatively unchanging materials out of the way, I would have space and inclination to focus, as I should, on using something like rdiff-backup, with its "reverse-diffs." This, I hoped, would permit greater flexibility and attention to keeping multiple generations of those data files that did change frequently. It seemed likely that a good backup system would take a while to develop, but now I thought at least I knew enough about it to go on with the next step of trying out this whole Ubuntu and VMware alternative to a native WinXP installation. At this point, it felt like a number of basic needed functions were working out OK in this VMware setup. It seemed like I was going to be able to make this thing work. On a psychological level, the longer I worked in the Ubuntu environment, the more normal it felt. One thing that I especially liked, I realized, was thinking of Windows XP as just another application. Granted, it was an application with multiple things going on in it. But even that was less true than it had been. As some of those things (at this point, synchronizing the Palm PDA and scanning and printing) were assigned over to the secondary computer, WinXP on the primary computer was less important. It could crash, and I would just keep on working on Firefox or whatever else would be continuing to run in Ubuntu, while WinXP rebooted itself inside the virtual machine (which, as noted above, was much faster than in its native installation). This did not trivialize WinXP, but it put it into context, and that felt good. Anyway, the next thing I needed to figure out was how to make the sound work. Somebody had e-mailed me a WMV movie file, and I wanted to watch it. Windows Media Player (WMP) was showing me the video, but there was no audio. 
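As a side note, the sound-card identification I later went hunting for in Belarc Advisor can also be had from a couple of stock Ubuntu commands; I mention them here because the question of what audio hardware the system actually had becomes relevant just below:

# Show the audio controller(s) on the PCI bus.
lspci | grep -i audio

# List the sound cards and playback devices that ALSA has detected.
cat /proc/asound/cards
aplay -l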
I tried playing it in Ubuntu by double-clicking on the file, and the Totem Movie Player came up by default. It said it needed to add codecs to play the WMV, and pursuing the matter led me to the GStreamer video plugin -- rated five stars in popularity according to Ubuntu's "Install multimedia codecs" dialog box -- so I went with that option. Up came another dialog, "Confirm installation of restricted software," which required me to confirm that it was legal for me to use this software in my country, or that I had a patent license or other permission to use it, or I was using it for research purposes only. Lacking access to a law library at the moment, I guessed that choices 1 and 3 might both be true in this case, and I went ahead with the installation. I mean, with five stars of popularity, surely the world couldn't have *that* many people who were significantly more entitled than I. I need not have bothered, though; Totem couldn't actually play the thing either. First, there was no sound, so this was not just a VMware problem; and second, the video was in very slow motion and couldn't be persuaded to speed up. I wasn't sure why there was no sound in Ubuntu. I right-clicked on the icon in the top right corner of the screen and verified that it wasn't muted. It was set at 80%. I found what looked like a pretty good sound troubleshooting website, and started through its steps. Unfortunately, a crucial link in that webpage no longer seemed to lead where it was supposed to, so I wasn't able to stay on that train all the way to the station. But it did make me think that, most likely, it was a matter of finding Linux drivers for my cheap sound card, or else finding another sound card. I wasn't sure of the exact model I had, so I wanted to fire up Belarc Advisor in WinXP in the virtual machine. Belarc Advisor gave great information on system hardware. I noticed, though, that the virtual machine was responding slowly, both in my attempt to add Belarc Advisor to this new WinXP installation and also, previously, when I wanted to open a PDF in Acrobat. It did respond, both times; it was just slow. Anyway, sound was working OK on the secondary computer, when it was booted into Ubuntu, so I suspected hardware was the problem. But I got a surprise. When I removed the cheap sound card that I had bought because my onboard audio was not working, I found that the onboard sound did work in Ubuntu. I did not need a separate sound card after all. Sound worked in the virtual machine too. Problem solved! Next issue: multiple monitors. When I plugged the second monitor into the primary computer, it just flashed strange stuff. I turned it off while I explored this issue, in case the present arrangement had the potential to damage it. Ubuntu's built-in help (i.e., the blue circle with a question mark on the menu bar at the top of the screen) was of no help, but I found a guide for systems with NVIDIA graphics cards (which was what I had) at Plek Blog. I entered the recommended first two lines into Terminal:
sudo apt-get install nvidia-settings
sudo nvidia-settings

but this produced an error message:
You do not appear to be using the NVIDIA X driver. Please edit your X configuration file (just run 'nvidia-xconfig' as root) and restart the X server.

Running something as root meant, I think, the following, which is what I typed:
sudo nvidia-xconfig

This did a bunch of things very quickly, including giving me a "validation error," before concluding with the assurance that "New X configuration file written to '/etc/X11/xorg.conf.'" So I tried the second line (above) again:
sudo nvidia-settings

but I still got the same "You do not appear to be using the NVIDIA X driver" error message. I bailed out on Plek Blog and tried, instead, the Ubuntu Guide, which advised entering these lines:
sudo apt-get install nvidia-settings
gksu nvidia-settings

the first of which was the same as above. I entered it anyway, because I was not too sure what it did, and I got a statement that "0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded." Which sounds like the very definition of a null activity, an unnecessary and redundant effort. I then proceeded to enter the second of the two commands just shown, and that gave me the same error message as above. I then browsed through several discussion threads in which people jumped through all kinds of hoops, doing things that I did not understand, in order to get their NVIDIA drivers working properly. I was thinking it can't be this hard, so I kept looking. What I tried next began with the advice to enable additional Ubuntu program repositories in System > Administration > Synaptic Package Manager. But I already seemed to have them enabled. Next, I was supposed to install envyng-gtk. (It would have been a different file if I hadn't been using the 64-bit version.) I searched for envyng-gtk while I was there in Synaptic, and I found it. Out of curiosity, I also searched for it in Applications > Add/Remove but did not find it. So Synaptic Package Manager seemed to have access to programs that were not found even in the "All available applications" realm within Add/Remove Applications. I marked envyng-gtk for installation and went through the remaining steps needed to apply it. Next, I went to Applications > System Tools and there it was, EnvyNG. I opened it, told it to install the NVIDIA driver automatically, and watched a dialog box show a bunch of messages as various commands were executed. The last line said, "Exception: ('\nEnvyNG ERROR: The following packages cannot be installed:',)" and that didn't look like the ideal solution. I looked up a few lines earlier and saw this: "E: Unable to lock the administration directory (/var/lib/dpkg/), is another process using it?" I realized that I had left the Terminal dialog window open, so I cancelled this EnvyNG dialog box, closed Terminal, and tried again. But that wasn't it. I closed Synaptic Package Manager and tried again. This time, the program dialog took quite a while and seemed to be unpackaging various things -- showing me that it was 15%, 16%, 17% of the way through, etc. After a while, it said, "Attempting to install the packages," and then it played me the Ubuntu song and I got a dialog that said, "Operation Complete." They recommended restarting my computer at that point, so I did. Well, this certainly had effects. Now the screen on which I was typing these words moved from monitor no. 1 to monitor no. 2. So at least I knew it was possible for the system to use both monitors in Ubuntu. Unfortunately, it was in a distorted resolution and my monitor was saying, "Mode not supported." I couldn't even see the bottom of the System > Administration (or was it System > Preferences?) listing to find the one that would allow me to change the resolution. I had to mouse down to the bottom of the visible ones and then use my arrow key to try the option that was one step below that. That didn't work, so I tried two levels below. That gave me monitor resolution options. I had to switch it to 800 x 600 resolution in order to see this composition area in my Blogger blog. I typed these words and then tried to figure out what to do next.
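In hindsight, Hardy's repositories also carried a packaged NVIDIA driver that might have made the EnvyNG detour unnecessary. If I have the package name right -- nvidia-glx-new was, as I understand it, the one for recent GeForce cards in Ubuntu 8.04, and is worth confirming in Synaptic -- the repository route would look roughly like this:

# Install the packaged NVIDIA driver and its settings tool.
sudo apt-get install nvidia-glx-new nvidia-settings

# Have NVIDIA's tool write a Driver "nvidia" line into xorg.conf.
sudo nvidia-xconfig

# Then restart the X server (log out and back in, or Ctrl-Alt-Backspace).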
That distorted, low-resolution display did not seem superior to what I had had before, so I went into Applications > System Tools > EnvyNG and, this time, I chose to manually select the driver, and I chose one that was one step back from the cutting edge. I got the "Operation Complete" dialog again and, holding my breath, I accepted the offer to restart the computer. This was worse. Now I had nothing but two blank screens. Oh, correction: screen no. 2 said, "Mode not supported" again. I had to hit the computer's reset button and, when it got to GRUB, choose to boot in Recovery Mode. Among the various commands flashing on the screen, I saw one that said something about overwriting a possible customization, and I assumed this meant that whatever EnvyNG had done, Recovery Mode was undoing. But just in case, I went into System > Administration > Synaptic Package Manager, searched for EnvyNG, marked it for uninstallation, and clicked Apply. So I was back at the starting point. I tried again to enter these lines:
sudo apt-get install nvidia-settings
gksu nvidia-settings

and once again got the dialog that said, "You do not appear to be using the NVIDIA X driver." I ran sudo nvidia-xconfig again, and this time I focused on the VALIDATION ERROR that resulted. It said, "Data incomplete in file /etc/X11/xorg.conf. Device section 'Configured Video Device' must have a Driver line." I searched for those terms and found another tutorial that said, among other things, "Write down these instructions, because we're going to be dropping to a command-prompt." I thought that meant that a Terminal would open up. No, it meant that the screen would go black and I would be looking at a DOS-style command prompt, with no clue of what to do next. It happened after the first line or two, which had apparently wiped out my NVIDIA drivers. When I finally bailed out of the command prompt by hitting the computer's reset button and waiting until it finally put me back into Ubuntu, I found myself, once again, looking at something like 640 x 480 resolution (i.e., very big print). So I did as the writer advised: I wrote down those instructions, and then tried them again. And, simply put, they did not work. It occurred to me, as I reviewed NVIDIA's Linux Display Driver Archive, that maybe I was having problems because these various instructions were written for 32-bit rather than 64-bit systems. I tried "sudo nvidia-settings" and got "command not found." So I tried "sudo apt-get install nvidia-settings" and then "sudo nvidia-settings." That gave me the familiar "Please edit your X configuration file (just run `nvidia-xconfig` as root)." Suddenly I wondered if running "sudo nvidia-xconfig" was not the same as running nvidia-xconfig as root. In the process of trying to learn about this, I discovered that Alt-F2 is a fast way of opening up a Run box, so I did that and (following some instructions) tried "gksu nautilus." This gave me a File Browser that supposedly had root privileges -- which, now that I had done it, did not seem to be what I needed. I closed that and followed more advice that suggested using "gksu" rather than "sudo." So the command I entered was "gksu nvidia-settings" again. And of course I got the same error message as before. The large print was getting to me, so I tried installing EnvyNG again in Synaptic Package Manager. It didn't make any difference in the available resolutions. Then I found instructions that seemed oriented toward 64-bit installations. They said that I could manually edit the configuration file by using "gksudo gedit /etc/X11/xorg.conf," so I tried that. It opened the xorg.conf file. The advice was, "Find the 'Driver "nv"' line and change it to 'Driver "nvidia"'" -- but there was no "Driver nv" line. It looked like I was just using a plain-vanilla video driver -- that my previous efforts had wiped out the NVIDIA driver -- and now the question was, how could I get it back? The "sudo apt-get install nvidia-settings" command didn't seem to be doing it. It said, "nvidia-settings is already the newest version." So then I thought about one of the error messages cited above, "Device section 'Configured Video Device' must have a Driver line." What I saw in xorg.conf (using the gksudo command shown above) was this:
Section "Device"
    Identifier "Configured Video Device"
    Boardname "vesa"
    Busid "PCI:2:0:0"
    Driver "vesa"
    Screen 0
EndSection

So, following another bit of advice above, I tried changing "Driver 'vesa'" to be "Driver 'nvidia'" and, when this didn't magically change anything, I rebooted. Still no magical change. A check in System > Administration > Hardware Drivers told me that I had no restricted drivers (which is what the NVIDIA drivers had been) in use on this system, and there was no option of enabling them there. The system was also not autodetecting them, as it had done when I had first installed Ubuntu. I followed advice to run "sudo apt-get update," but that did nothing -- because, I guess, there was nothing yet on the system to update. So back to "sudo apt-get install nvidia-settings" but it said, once again, that I already had the newest version. I was going in circles. On reboot, I was now getting, "Your screen and graphics card could not be detected correctly." I tried installing EnvyNG again, but it said it was just a reinstallation. I ran EnvyNG on the automatic setting again. It wanted me to reboot, so I did. This time, I didn't get that incorrect detection message. The font was truly huge this time. On the positive side, I saw that I now had a new option: System > Administration > NVIDIA X Server Settings. Yes! The Holy Grail! There was nothing in there on screen resolution, though, and System > Preferences > Screen Resolution didn't go higher than 640 x 480. I thought maybe I had to reinstall my monitor driver, because the monitor resolution dialog wasn't recognizing the monitor. Unfortunately, there weren't any Linux drivers on my Acer monitor CD. But as I looked again at the NVIDIA X Server Settings dialog, I noticed (thanks to the Plek Blog pictures) that the Server Display Configuration screen did have resolution information. Still, there seemed to be no way to change it, even though that screen also indicated that the system was indeed recognizing my Acer monitor after all. After I tried once or twice, the Server Display Configuration screen started jumping around, there in that 640 x 480 resolution, so that I couldn't see the button and/or couldn't get it to open the resolution options (assuming there were any). At a loss, I rebooted, in case that would fix things. It didn't. Then I found a tip on how you could edit your xorg.conf file (using the gedit command, above, to open it) to allow higher resolutions. I did that and rebooted. It didn't help. I found two webpages on additional xorg.conf resolution settings I could change (make that three). I didn't get a chance to try those out, though, because I found my solution by following the Binary Driver Howto link to the Fix Video Resolution Howto, where running the Autodetect script was (I think) the answer. This did not solve the problem of multiple monitors, but at least it gave me back my original display resolution -- or, I think, better. Yes, definitely better. This looked more like the best I could get from Windows. But now, what about that second monitor? I went to the Ubuntu guide on configuring multiple monitors with an NVIDIA graphics card and ran "gksu nvidia-settings." As with my attempt, a few minutes earlier, to run it via System > Administration > NVIDIA X Server Settings, this once again gave me the old familiar message about running "nvidia-xconfig" as root. I did that and got the familiar "Validation Error" message. This reminded me to use the foregoing procedure to edit xorg.conf.
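Given how many times xorg.conf got rewritten in this process, one habit I would suggest -- my own suggestion, not part of any guide I was following -- is to snapshot the file before each edit, so that a bad change can be undone without hunting for Recovery Mode:

# Keep a dated copy of the current X configuration before editing it.
sudo cp /etc/X11/xorg.conf /etc/X11/xorg.conf.$(date +%Y%m%d-%H%M)

# If an edit leaves the display unusable, the copy can be put back from a
# console, e.g.: sudo cp /etc/X11/xorg.conf.20080715-2130 /etc/X11/xorg.conf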
But once there, I couldn't figure out what to edit. The monitor section wasn't naming my Acer, but the Device0 section plainly did name NVIDIA. What seemed to be happening, at this point, was that I was slowly learning bits and pieces of the computer language used to convey commands to the system. My experience with DOS, years earlier, had taught me that it can take a while to get this stuff right, and it can be a lot slower to work through than if you have a GUI (graphical user interface); but if you do master it, you can do all kinds of things that aren't possible within the limitations of the GUI. I didn't want to spend the time to learn this language, but there seemed to be no alternative at this point, given my interest in developing an Ubuntu and VMware setup as a partial replacement for Windows. Linux had been like this since I had first looked at it, nine years earlier. It could be another five years, or more, before it resembled the mostly graphical environment of Windows. There was just too much stuff that still had to be done from the command prompt. Unfortunately for me, some of the advice that was offered, for commands to enter at that prompt, carried the risk of taking me backwards rather than forwards. So, OK. How was I going to make NVIDIA X Server Settings work? I ran EnvyNG again, but it said I had the latest drivers -- and then, without any further ado, it said it was in the process of trying to remove them. I canceled that, fearing that EnvyNG may have already put me back into the Stone Age, and rebooted to see where I stood. Sure enough, the resolution was back to (I think) 800 x 600. I retraced my steps from above, regarding the Fix Video Resolution Howto, specifically running the Autodetect script. In fact, this time I just entered this one command: "sudo dpkg-reconfigure xserver-xorg." Then Ctrl-Alt-Backspace to reset, and presto! ... no, that wasn't it. The resolution was still 800 x 600. Editing all three lines didn't make any difference either. Next, I edited xorg.conf, which now looked nothing like the one shown above. I completely replaced its contents with those shown above, saved it, and then tried Ctrl-Alt-Backspace again. This did not make any difference. I no longer had the System > Administration > NVIDIA X Server Settings option, which suggested that EnvyNG had removed that, so I tried to figure out how I had gotten that option previously. EnvyNG had seemed to be the key to that, so I ran that again. Yes! That was it. I had the small resolution back again. And this time, for the first time, when I ran System > Administration > NVIDIA X Server Settings, I got the dialog box, in a manageable resolution, without any error messages. I plugged in the second monitor, which I had unplugged somewhere along the way, and clicked on Detect Displays in the NVIDIA X Server Settings > Server Display Configuration > Display area. The second monitor appeared but was shown with the parenthetical comment, (Disabled). I clicked on it and then clicked Configure. They gave me three options: Disabled, Separate X Screen, or TwinView. TwinView meant basically making two identically sized monitors function as one big screen, while Separate X Screen meant that each monitor would function separately from the other. So in Separate X Screen, you could not drag your word processor from one screen to the other, as you could in TwinView; apparently you would have to close it down in one monitor and open it up in the other.
It sounded like neither of these would be a complete substitute for the DualView option I had been using in Windows XP. Between the two, I had to go with Separate X Screen, since my two monitors were not the same size. Separate X Screen could entail additional complexities. When I hit Apply, I got an error message:
Cannot Apply
The current settings cannot be completely applied due to one or more of the following reasons:
- The location of an X screen has changed.
- The location type of an X screen has changed.
- The color depth of an X screen has changed.
- An X screen has been added or removed.
- Xinerama is being enabled/disabled.
For all the requested settings to take effect, you must save the configuration to the X config file and restart the X server.

I chose the "Apply What Is Possible" option. Nothing special happened. Then I clicked on "Save to X Configuration File." This gave me the option of merging whatever I had just done with the existing contents of xorg.conf. I figured why not, and did it. I got an error message, "Unable to remove old X config backup file '/etc/X11/xorg.conf.backup.'" I didn't think I really cared about that, so I just clicked OK. Then I clicked Quit. This did not give me a second monitor, so I started NVIDIA X Server Settings and tried again. The second monitor was still shown as Disabled. I went through the same steps in a different combination and still no luck. At this point, I noticed the "Enable Xinerama" option. A Wikipedia article made it sound like this overcame some of the drawbacks of separate X screens, so I clicked on that option. Following some advice in a forum posting, I opened xorg.conf again for editing. What do you know -- it was blank! But, oops, I had not typed its name right. Trying again, I got it. The advice was to delete its contents and then select Save Configuration File, then Show Preview, and copy and paste everything from the preview back into xorg.conf. The preview seemed to have picked up everything from the existing xorg.conf, including the references to NVIDIA and to the Acer monitor, so I went with this. So now apparently I didn't need to care when I got that error message about being "Unable to remove old X config backup." But when I clicked Apply in NVIDIA X Server Settings, I still got the same "Cannot Apply" dialog. So I canceled out of that, quit the server settings dialog, and Ctrl-Alt-Backspaced to see what would happen next. And, well, it seems I got TwinView. My mouse would run right across from the one monitor to the other, and not exactly in a straight line; it took a bit of a hop there at the dividing line between screens. But at least it didn't run off the screen altogether. There also wasn't any menu bar, taskbar, or status bar on the second monitor, so it looked a little odd. But, hooray, at least I had double monitors. I opened up OpenOffice.org Writer and dragged it back and forth with no problem; same with the Firefox window in which I was writing these words. When I maximized Writer, it filled the second screen; when I minimized it, it went to a space on the taskbar on monitor no. 1; and when I maximized it again, it went back to the second screen. I rebooted and the system froze. After a minute or two, I hit the reset button and it all came back, both monitors as expected. So here was my final xorg.conf file:
# nvidia-settings: X configuration file generated by nvidia-settings
# nvidia-settings: version 1.0 (buildd@yellow) Thu Jun 5 09:27:12 UTC 2008
# xorg.conf (X.Org X Window System server configuration file)
#
# This file was generated by dexconf, the Debian X Configuration tool, using
# values from the debconf database.
#
# Edit this file with caution, and see the xorg.conf manual page.
# (Type "man xorg.conf" at the shell prompt.)
#
# This file is automatically updated on xserver-xorg package upgrades *only*
# if it has not been modified since the last upgrade of the xserver-xorg
# package.
#
# If you have edited this file but would like it to be automatically updated
# again, run the following command:
# sudo dpkg-reconfigure -phigh xserver-xorg

Section "ServerLayout"
    Identifier "Default Layout"
    Screen 0 "Screen0" 0 0
    Screen 1 "Screen1" RightOf "Screen0"
EndSection

Section "ServerFlags"
    Option "Xinerama" "1"
EndSection

Section "InputDevice"
    Identifier "Generic Keyboard"
    Driver "kbd"
    Option "XkbRules" "xorg"
    Option "XkbModel" "pc105"
    Option "XkbLayout" "us"
EndSection

Section "InputDevice"
    Identifier "Configured Mouse"
    Driver "mouse"
    Option "CorePointer"
EndSection

Section "Monitor"
    Identifier "Configured Monitor"
EndSection

Section "Monitor"
    Identifier "Monitor0"
    VendorName "Unknown"
    ModelName "Acer AL2216W"
    HorizSync 31.0 - 84.0
    VertRefresh 56.0 - 77.0
EndSection

Section "Monitor"
    Identifier "Monitor1"
    VendorName "Unknown"
    ModelName "HIQ N91S"
    HorizSync 30.0 - 83.0
    VertRefresh 56.0 - 75.0
EndSection

Section "Device"
    Identifier "Configured Video Device"
    Driver "nvidia"
EndSection

Section "Device"
    Identifier "Videocard0"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    BoardName "GeForce 7900 GS"
    BusID "PCI:2:0:0"
    Screen 0
EndSection

Section "Device"
    Identifier "Videocard1"
    Driver "nvidia"
    VendorName "NVIDIA Corporation"
    BoardName "GeForce 7900 GS"
    BusID "PCI:2:0:0"
    Screen 1
EndSection

Section "Screen"
    Identifier "Default Screen"
    Device "Configured Video Device"
    Monitor "Configured Monitor"
    DefaultDepth 24
    Option "AddARGBGLXVisuals" "True"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device "Videocard0"
    Monitor "Monitor0"
    DefaultDepth 24
    Option "TwinView" "0"
    Option "metamodes" "DFP: nvidia-auto-select +0+0"
EndSection

Section "Screen"
    Identifier "Screen1"
    Device "Videocard1"
    Monitor "Monitor1"
    DefaultDepth 24
    Option "TwinView" "0"
    Option "metamodes" "CRT: nvidia-auto-select +0+0"
EndSection

Section "Extensions"
    Option "Composite" "Enable"
EndSection

Now I wondered if I could make those dual monitors work in VMware. I started up VMware and powered on my WinXP virtual machine. Just as with Writer, maximizing VMware, or using its full screen option, filled just one monitor, not both. The Workstation User's Manual (p. 162) said you can run a different virtual machine on each monitor and can also have one virtual machine use two or more monitors. I followed the manual's instructions for configuring the thing, but as it turned out their instructions were what I already had set up. When I powered WinXP back up and it was fully prepared to go, the VMware toolbar at the top of the screen (which I had set to not always be on top) gave me the option to "Add a monitor to full screen view." I tried that and, yeah, my Windows wallpaper stretched from sea to shining sea. There were three viewing options in VMware: full screen, normal view (as I called it), and minimized. I went from full screen to the other two and then back to full screen, but now my WinXP session was back to just filling monitor no. 1.
I had to click the "Add a monitor" button again to make the WinXP session fill both screens, and then I could drag Acrobat and Word around so that each of them would have its own monitor. Then I minimized VMware and I was back to having the regular Ubuntu wallpaper, with the Hardy Heron, showing this Firefox session and the Workstation User's Manual. I wanted a way to keep that full, two-monitor WinXP screen setup in place, so that I could switch over here to my browser (or to whatever I had going on in Ubuntu) and then quickly back to the previously arranged WinXP layout with, say, Acrobat in one monitor and Word in the other. Full-screen VMware obscured the Ubuntu taskbar, though, so I couldn't click on the alternate desktop button at the bottom of the screen and switch between layouts that way. Ctrl-Alt-Enter was built into VMware to open up the (next) virtual machine (if you had more than one open), but it didn't seem to work between VMware and the rest of Ubuntu. It looked, at present, like there was no really fast way to switch between Ubuntu and a maximized virtual machine that would continue to fill both monitors. There was a lot to like about this arrangement. For example, I wrote this particular sentence in Word, inside the WinXP virtual machine, and then copied and pasted it directly into Firefox in Ubuntu, with both windows in plain sight. So it felt like the barrier between WinXP and Ubuntu could be pretty thin, once you got used to this basic arrangement. The setup also felt pretty stable -- I was not getting any flakiness from it at this point. Overall, it seemed like I might have graduated into a larger sphere of operation, like the feeling I'd had when I first started working on multiple monitors. It seemed that things would probably get more efficient, once I became familiar with the setup. It also seemed, however, that there would be unexpected traps and failings. For example, I accidentally deleted some files in Windows Explorer. I was looking at this blog post, and meant to delete some characters in it. My cursor was over here, and I thought this was where the deleting would happen. But in fact a folder in WinEx was selected, and that's what got deleted. The unfortunate discovery was that Edit > Undo was greyed out. Those files were gone. I had copies, but I didn't know exactly which ones had vanished; I had sorted them since copying them. So there was going to be some additional work to figure that out. I also found that basic file operations involving Windows Explorer were very slow. The actual file copying itself seemed to go quickly enough, but I could not get ahead of the machine and click on other files that I wanted copied or deleted somewhere else. The virtual machine was simply not responsive in that scenario. In short, it was going to be a transition. But I downloaded an undelete program and I felt that I might be able to make a go of it. And I looked forward to it. There were still other programs that I needed to install, but at this point I felt I had the basic idea and that I had worked through most of the more difficult aspects of the hardware and software transition.
One reader later left this comment:
Regarding the "Cannot open file on virtual machine" error in WM Player:
If you need a big folder to be copied into the VM, you might succeed in a somewhat awkward way (at least it worked for me on a Win XP host and also the WM PC):
1) Set the folder you want to copy into the VM as a shared folder. Do not forget to check the option to map it as a Windows network drive.
2) Copy the files from this shared folder into a folder created within the host OS, and you are done.
Lubos Dobrovoda.