Ubuntu and VMware: Fixing, Fixing, Fixing
Before starting this post, I had written a number of posts on the process of deciding on, acquiring, and installing Ubuntu and VMware Workstation 6. I had chosen Workstation to help me run Windows XP programs, because I still needed some of them. I had gotten bogged down in a couple of places where Ubuntu, VMware, or my hardware did not work as anticipated. Most recently, I had written up the process of reinstalling Ubuntu, which had proved necessary after I played with an uncooperative video card until my system itself was finally screwed up. Reinstallation had turned out to be a lot easier than reinstalling Windows, especially with the use of virtual machines (VMs); and it seemed that I would be much faster at reinstallation in the future, now that I had worked through it once. Now I was at the point of cleaning up some loose ends left over from the reinstallation, and then forging ahead to install the last of my software and make the system function as well as I reasonably could for the present.

First, I discovered that my VPN connection no longer worked within my WinXP VMs. There was not much detail to share on that; I was not sure whether it was because of the reinstallation of my underlying Ubuntu setup or for some other reason. I basically just had to go into Network Connections, in each WinXP VM, delete the old VPN connections, and set up new ones. This, like all tasks required across the entire set of VMs, could be time-consuming. A day or so later, I got an error message in one of my VMs:
VMware Workstation unrecoverable error: (vcpu-0) Unexpected signal: 11
A log file is available in "/media/VMS/VMware VMs/WXMUProjectC/vmware.log". Please request support and include the contents of the log file. To collect data to submit to VMware support, select Help > About and click "Collect Support Data". You can also run the "vm-support" script in the Workstation folder directly. We will respond on the basis of your support entitlement.

I clicked OK, and the VM powered down. I powered it back up, and the same thing happened. I was thinking it might be related to my external USB drive (not a jump drive, but a regular hard drive in an external enclosure), because for some reason my VMs were now failing to see that drive, although Ubuntu itself had no problem in that regard. I went to Workstation's Help > About and clicked Collect Support Data. A dialog came up that said it could take minutes or even hours to collect all the information. I said OK. Then a little black box came up. It said,
Collecting support data
You are not root, some system information can't be collected.
Preparing files: Could not copy /etc/security/opasswd to the tar area.
Preparing files:

Then it listed a bunch of files. It may have said other things, but I couldn't tell -- the listing scrolled off the top of the screen, and there were no scrollbars to go back up with. While it was working, I looked at another powered-down VM. I didn't power it up; I just looked at the summary snapshot that Workstation provides, to see what files I had left open in there. In just a minute or two, when the data collector was done, I looked at the Collecting Support Data box again. It said,
The tar did not successfully complete! If tar reports that a file changed while reading, please attempt to rerun this script.

It didn't give an exit option, so I just clicked the X in the upper right corner of the black box. It hadn't said where the tar file might be. In the About VMware Workstation dialog, I noticed that it said, "Current UI log file: /tmp/vmware-ray/ui-7510.log." I went there in File Browser and double-clicked on that file. At the end -- that is, in the logging of the most recent events -- I saw that there had been maybe a hundred attempts of this description:
Sep 04 06:22:28.773: vmui| connect to /var/run/vmware/ray_1000/1220522556964158_12241/testAutomation-fd: Connection refused

I didn't know what that was about, but "Connection refused" sounded consistent with the failure to detect the external drive, so maybe that was the idea. I powered down or suspended all virtual machines and rebooted Ubuntu. I had noticed, during this early-morning shutdown and also during the previous night's hibernation, that upon closing, Ubuntu gave me a screen full of colors in a low resolution, as if I had connected an unrecognized monitor. On reboot, interestingly, the system noticed that it needed several updates; I wasn't sure why it hadn't recognized that before reboot. Or maybe it had, and I just hadn't noticed.

Anyway, I restarted Workstation and went back to the VM that had crashed. I powered it up. It started with no problems. I ran Advanced WindowsCare 2 and told it to correct the few problems it inevitably found. I looked in Windows Explorer and saw that it still wasn't seeing the contents of drive O, the external drive, although it was seeing the drive itself. Again, Ubuntu was seeing it without difficulty. I powered up another VM and tried there. Same thing there. Both of these were machines that had been powered down, not merely suspended. I went to Workstation's VM > Settings > Options > Shared Folders and looked for this drive O. I clicked its Properties. Everything looked OK. In Windows Explorer, I selected drive O, clicked Tools > Disconnect Network Drive, and disconnected it. Then I clicked Tools > Map Network Drive and indicated the drive and location. No difference: I could see drive O listed in the left-hand folders pane of Windows Explorer, but nothing in the right-hand files pane. Same thing when I went through the same steps in the other VM I had just powered up. I resumed another VM, one that had been suspended, and tried there. Same thing. Ubuntu was still able to access the external drive without a problem.

My principal interest in accessing that drive was to check up on how it was doing as an external backup. I normally ran Second Copy 2000 in a small Windows XP VM for this purpose. Second Copy would periodically verify that the second drive contained an exact copy of the first. It seemed to be time to examine an Ubuntu alternative. My previous review of backup options had uncovered several possibilities for data (as distinct from program) backup, including Rdiff-backup and Unison. But Rdiff-backup would keep multiple generations of things, which was not necessary or appropriate in some cases, and Unison would synchronize, when what I actually wanted was to copy from a source to a target in a one-way arrangement. Unison's description sounded like it would get your go-ahead before mixing files from the two drives. Even so, there was no way that I wanted to risk that some old files on a previous backup would somehow get shoved back into the current collection. And then there was rsync, another option that people sometimes recommended. A Google search had not led immediately to a manual or tutorial that seemed reliable and widely used, and the man page ("man rsync" in Terminal) ran to 3000 lines, with dozens of options. Another Google search likewise led to pages that seemed pretty technical in nature. I would get to that level, perhaps, but I wasn't there yet. For now, I figured I would have two options.
First, for folders or partitions that didn't change much, I would just do a simple copy and paste in File Browser, followed by a comparison of Properties for the source and target. I liked, by the way, that you could have multiple Properties boxes open for different folders at the same time in Ubuntu, and that they would stay put while you poked around in different places to learn more -- unlike Windows, which would all too eagerly shut them down. I also liked that Ubuntu was capable of deleting many gigabytes, containing thousands of files, in a matter of minutes, while doing so could take a half-hour or more in Windows XP. I also liked how Ubuntu would actually let me unmount the external drive. When I tried the Safely Remove Hardware thing in WinXP with my external drive (as distinct from a jump drive), it invariably said that it could not shut down the hard drive right now -- so either I had to reboot the system and disconnect the hard drive during reboot, or just take my chances and shut it down when Windows might suddenly try to write to it for some random reason.

And now I believed I had a theory as to why my VMs were not seeing the external drive. I had labeled it drive O in Windows. That way, it would always be at the same place. Windows would assign drive letters according to how many partitions there were. So if you had three partitions, it would assign them as C, D, and E. If you then plugged in your external drive, Windows would probably label it drive F. But then, if you decided you actually needed four working partitions plus the external drive, the external drive would now be drive G. This would screw up the operation of profiles in Second Copy and other programs, which (in Windows style) would be looking for the drive by letter, not by name (e.g., OFFSITE) as Ubuntu did. Fine. But if I opened up the external drive enclosure and swapped hard drives within it, as I had just recently done, then Windows would no longer remember that it was drive O. It would be back to being drive G or I or whatever Windows thought it should be. So, in this theory, I had mapped drive O, all right, but there was nothing there as far as Windows was concerned. It was a little confusing, but I thought that, if I could persuade the external drive to be drive O again, all of my VMs would see it, and the world would be wonderful.

But how to do that? The only solution I could think of, right now, was to post a question on it, suspend all my VMs, boot WinXP natively, reset OFFSITE (the external drive) to be drive O, reboot into Ubuntu, and try again. But then, as I was suspending the VMs, I noticed that one of them had recognized OFFSITE as drive I. It was seeing it as an actual local drive, not as a network drive. That meant Windows Computer Management (i.e., Disk Management) would see it. It did. I changed it to be drive O. This gave me a message:
The drive letter O: is already mapped to a network share or a local path. In order to see the volume after the operation, you must remove current mapping. Do you want to continue?

Well, current mapping wasn't working for me, so I said sure. Now I went back into one of my other VMs and looked to see what Windows Explorer was making of all this. It said this:
I:\ refers to a location that is unavailable. It could be on a hard drive on this computer or on a network ....

So, OK, evidently it had been seeing the external drive as drive I, and now it didn't find any drive I. I had changed the drive letter in one VM, and the change seemed to be taking effect here in a different one. I hit F5 to refresh but, surprisingly, Windows Explorer repeated that error message, as if it were still looking for drive I, even though it had not previously shown me any drive I. I went into Tools > Map Network Drive, there in WindEx, and mapped OFFSITE to be drive O. But no joy: no listing of files and folders on OFFSITE. I restarted that particular virtual machine, but that made no difference.

I took a look at the other VM, the one that had first seen OFFSITE. It, too, wasn't showing anything for drive O in the right-hand pane of Windows Explorer, though it did show the existence of OFFSITE as drive O in the folders pane. But then, after I hit F5, OFFSITE was back as drive I; and when I went back into Disk Management, I couldn't change it to drive O again, because O wasn't an option. I canceled out of that and rebooted this machine that had originally seen the external drive. On reboot, it recognized OFFSITE as local (i.e., not shared) drive O. I suspended that machine. All of my VMs were now off or suspended. I powered up one, to see if it would recognize drive O. It didn't at first. I let it sit for a while. No change. I rebooted the computer into Windows XP. It recognized OFFSITE as drive D. Using Disk Management, I changed it to O and rebooted into Ubuntu. It didn't help. Nothing had changed. And there were still no responses to the question I had posted.

I started up the VM that had recognized OFFSITE as a local drive. I remembered that people said USB devices were problematic in VMware, and I also knew that I had not been able to get my own USB devices working in it -- that, instead, I had to use the Windows XP dual boot on my secondary computer to make contact with my Palm PDA and my digital voice recorder. This seemed like that kind of situation, where sometimes one VM would recognize a USB device but none of the others would. In Ubuntu on the primary computer, I unmounted OFFSITE. I went around back of the computer, disconnected its USB cable, and plugged in its eSATA cable instead. (The external drive enclosure, a Rosewill, did have that alternative capability.) Now Ubuntu didn't recognize it (in File Browser or in df -h), and neither did either of the VMs. I closed down the VMs and rebooted Ubuntu. File Browser recognized OFFSITE without the usual USB emblem, but it had not been automatically mounted via fstab. My double-click in Ubuntu mounted it now, though, and I could go in and look at its contents.

In VMware, I powered on a VM and went into Windows Explorer. WindEx was still showing O as "Offsite on '.host\Shared Folders' (O:)," so I right-clicked and selected Disconnect. OFFSITE was still visible, with no error messages, in VMware's VM > Settings > Options > Shared Folders. In WindEx, Tools > Map Network Drive recognized OFFSITE, as usual, under VMware Shared Folders > .host > Shared Folders. I mapped it to drive O. Now OFFSITE was visible as a shared drive in WindEx again, but there was still nothing when I tried to look at its contents. No other partitions, whether NTFS or ext3, were having this problem. In Ubuntu, OFFSITE > Properties showed me what the problem might be: root was the owner and group. I had been in a similar position before; I had just forgotten.
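(I could presumably have fixed the ownership from Terminal with a couple of commands -- something like the following, where the mount point /media/Offsite is my guess at the path, not a verified one:

sudo chown -R ray:ray /media/Offsite   # make user ray the owner and group, recursively
sudo chmod -R u+rwX /media/Offsite     # let the owner read, write, and enter directories

But I went the GUI route instead.)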
I logged in as root, typed "nautilus", and went into File System > Media and right-clicked Properties > Permissions for Offsite. I changed the owner and group to ray and changed folder and file access to "Create and delete files." I closed and went back into Permissions again, to be sure. The changes had stuck. I suspended all open VMs and rebooted Ubuntu. OFFSITE was still not automatically recognized. I mounted it and went into VMware and opened a VM. No difference! In the VM that had previously recognized OFFSITE as a local drive, Disk Management now saw only drive C. I could not figure out why WindEx would still show drive O, OFFSITE, as a local drive. I played with Tools > Map Network Drive a bit, but it made no difference. I reposted my question, which had still received no responses, in a VMware forum.

While I was waiting for guidance or insight, I decided to do some of my actual work. I opened Microsoft Word and a session of WindEx in one of my 1GB VMs. I selected 24 PDFs that I wanted to look at while adding notes to the Word document. I hit Enter and hoped that all 24 would open up on the first try, although I knew that Acrobat and/or Windows would ordinarily get confused when trying to open so many PDFs at once, and would miss a few. So I would have to go back to WindEx and, with those same PDFs still selected, hit Enter again. For 24 PDFs, two rounds would ordinarily do it. In VMware Workstation, unfortunately, it took more like five rounds. Then again, on another group of 18 PDFs, all loaded in the first round. So apparently VMware was still getting its act together with the first group, although it had seemed like it was all set and ready to go. Maybe hard drive light activity was not the best or only guide to whether Workstation was prepared to function properly.

Another thing to take care of was VPN. I needed a VPN connection for some of my work projects. I found a webpage that made it sound very simple. I didn't want to disturb the state of things on the primary computer, so I tried it out on the secondary one instead. First, I went to System > Administration > Synaptic Package Manager and searched for pptp. The webpage said I was supposed to get VPN Connection Manager, but I didn't find it there. Doing it their way, I searched in Applications > Add/Remove. Well, and wasn't that interesting: I had gotten the impression that Synaptic was more comprehensive and more reliable, but here was VPN Connection Manager, just like the webpage said. I killed Synaptic, checked this one, and clicked Apply. Then I rebooted. The instructions said to look for the Network Manager icon in my system tray, but I didn't have a system tray. Using Ubuntu's System > Administration > Network menu option, I got a Network Settings dialog, but I did not see any VPN Connections option. Nor was there one under Applications > Internet. Using another guide, I found that my mistake was that I was supposed to left-click on the network icon that, at least in my case, had somehow migrated to my top panel; there, I was to select VPN Connections > Configure VPN > Add. From there, I mostly went with the preferred options. Now, when I went back and left-clicked on that network icon again, I had the connection that I had just set up. I entered my username and password. After a few seconds, I got this:
VPN Connect Failure
Could not start the VPN connection (connection name) due to a connection error. VPN Connection failed

I thought this might be due to Firestarter, my Ubuntu firewall. I went to Applications > Internet > Firestarter > Firewall > Stop Firewall and then tried VPN again. But no, that wasn't it: I still got VPN Connect Failure. On the theory that Applications > Add/Remove was not as reliable as System > Administration > Synaptic Package Manager, I uninstalled VPN Connection Manager in Add/Remove and then went into Synaptic and installed network-manager-pptp. It didn't look like anything had changed, in any of the menu picks or icons mentioned above. I tried making the VPN connection again and still got Connect Failure. Then I found a webpage that made me think: of course, the router. I had a router. Quite possibly it was blocking the connection. For the time being, anyway, I had to use a browser within a WinXP VM to do my VPN work.

It also developed that I needed Google Desktop, or something like it, to do full-text searches of materials on my computer. I was pleased to discover that there was a Linux version -- not available through Synaptic, it seemed, but directly from Google. Then again, correction: they did have a repository version for Ubuntu 7.04 (Feisty). I thought I might give that a try first and see what happened. The first step was to download the signing key, which meant just clicking on the link and saving the resulting file with its default name, which was linux_signing_key.pub. Then I went to System > Administration > Software Sources > Authentication > Import Key File, and pointed to the location where I had saved it. I thought I had saved it to Desktop, and I had, but apparently there was more than one desktop in Ubuntu, so I had to look for it specifically in /home/ray/Desktop. Now, still in that same Software Sources dialog, I went to Third-Party Software > Add and typed "deb http://dl.google.com/linux/deb/ stable non-free" and then clicked Add Source > Close. It said, "The information about available software is out-of-date." I clicked Reload. I closed out and went to Synaptic, and this time a search did turn up google-desktop-linux. I marked it for installation and installed. Then I went to Applications, and there it was: Google Desktop. I clicked on it. Nothing happened. When I double-clicked on the icon that had now been installed on my top panel, I got the Google Desktop Quick Search Box, which I did not ordinarily like; but when I clicked on its Preferences option, the Search Box vanished. I was going to have to get back to that.

In the meantime, I had to sort out another thing. I had swapped out hard drives in my external backup drive enclosure. The drive in there now already had a backup of my hard drive contents, but it was several weeks old, and I wanted to update it. I thought I would try using rdiff-backup for this. They had just come out with a new stable version within the past few weeks, for the first time since 2006, so it looked like a good time to give it a try. I searched for it in Synaptic, but they only had an older, unstable version. I downloaded rdiff-backup-1.2.1.tar.gz and searched my blog for a reminder on how I had previously installed tar.gz files. It turned out that VMware Workstation had come in a tar.gz file, and here is the command I had used in that case:
tar zxpf VMware-workstation-6.0.4-93057.x86_64.tar.gz

In Terminal, I typed "man tar" and got an explanation of what this command had done. First, the -z option had filtered the VMware-workstation zipped archive file through gzip. I guessed this meant that it had unzipped to the point of removing the .gz extension. (I wasn't sure why it hadn't been necessary to put a hyphen in front of the z.) Next, the -x option had apparently extracted the contents of the .tar archive file. So now I probably had a folder bearing the same name as that VMware*.tar.gz file, with the contents of this .tar.gz archive file inside it. Next, the -p option indicated that permissions were to be preserved. The annotation said, "ignore umask when extracting files (the default for root)." I didn't see a reference to umask in the man (short for manual) page, but I did see this:
This man page was created for the Debian distribution. It does not describe all of the functionality of tar, and it is often out of date.

So, oops, it seemed I should be searching for what the man page described as "info documents" pertaining to GNU, of which tar was evidently a part. My Google search led me to a number of pages that weren't helpful, including an Ubuntu package description page that did not seem to have any links to official information pages. Finally, I found a post that seemed to explain it fairly well. They recommended using "tar -zxvf filename.tgz." I didn't have a .tgz extension on this file, so it wasn't 100% applicable, but I was interested in their substitution of -v in place of -p. Since I still did not understand exactly what -p did, I thought it might be wiser to go with -v. The man page said it called for a verbose list of the files processed. I didn't really need that, and I didn't actually like all those filenames scrolling -- I preferred to see less, and (possibly) more important, information from the process -- so I decided to leave it out. Finally, how about -f? The man page said it was short for --file, and explained as follows:
use archive file or device F (otherwise value of TAPE environment variable; if unset, "-", meaning stdin/stdout)

Clear as mud. But since both of the above examples recommended using it, it stayed. So my command was going to be this:
tar -zxf filename.tar.gz

Remembering that you could use ~ instead of typing /home/ray, I went into Terminal and typed "cd ~/Desktop" and then "dir", and that gave me a file listing of my Desktop, which did contain rdiff-backup-1.2.1.tar.gz. I typed "tar -zxf rdiff-backup-1.2.1.tar.gz" and hit Enter. Terminal did not automatically indicate whether the contents of a folder were folders or files, but I saw that I did now have an additional rdiff-backup-1.2.1 entry. I guessed it was probably a folder and typed "cd rdiff-backup-1.2.1". Sure enough, "dir" indicated that there were a bunch of files in there.

The helpful post said I should first make sure I had a compiler by typing "sudo apt-get install build-essential". I was sure I must have one, but since there didn't seem to be any harm in double-checking, I typed that apt-get command. Well, silly me: it said (a) "The following packages were automatically installed and are no longer required" and, after the list of those files, "Use 'apt-get autoremove' to remove them." So I made a note to do that. Then (b) it said there would be seven newly installed programs. So, OK, apparently I had not had a compiler in place after all. I told it to go ahead. I needed the compiler because what I saw unzipped, there in that folder, was not a binary file (which I assumed would be a .bin), in which case some other directions or procedure would have been appropriate. Instead, as I say, I saw a bunch of files, and one of them was named README. The helpful post said that the README would typically contain instructions on compiling and installing. Before I forgot, I ran "apt-get autoremove" as just instructed. I had to do that as root.

The README said that I needed Python 2.2 or later and librsync 0.9.7 or later. I went to Synaptic and searched for those two. I had Python 2.5 but not librsync. Synaptic offered librsync1, version 0.9.7-1build1. I installed that, although then I noticed that the rdiff-backup download page made it sound like I would only need librsync if I was running Windows. Whatever. The README said that, to install, I should type "python setup.py install", and I did. This yielded a large number of error messages, ending with "Error: command 'gcc' failed with exit status 1." "Exit status 1" sounded like it meant I was going to be back at the prompt promptly, and, you know, I was. Now what? The FAQ.html file there in the rdiff-backup folder did not offer any insights. As I poked around in it, I got the sense that rdiff-backup was still in development, or I thought maybe it was not going to work with my 64-bit version of Ubuntu.

As an alternative to rdiff-backup, what people had actually mentioned more frequently was rsync. It did not have the incremental backup features of rdiff-backup, to my knowledge, but it seemed to be an established tool for backup purposes. So for now, at least, I thought I might try that instead. Once again, I did a Google search and got a package details page with no apparent link to any help files. Eventually I found what looked like the official rsync webpage and, after looking at their FAQs and some other pages, landed on their Examples page. It was intimidating. It looked like I would basically have to write a shell script to use rsync. There was too much stuff in those scripts that I didn't understand. I wanted a secure backup; I didn't want to discover, down the line, that -- oh, sorry -- I was using some command wrong and therefore was not getting the backup I thought I would get.
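(From what I could tell, the core of such a script would be a single rsync line -- something like this, where the source and target paths are placeholders of my own, not tested:

rsync -a --delete /media/DATA/ /media/Offsite/   # -a preserves permissions and timestamps; --delete removes anything on the target that no longer exists on the source, making an exact one-way mirror

The trailing slashes apparently tell rsync to copy the contents of the source folder rather than the folder itself. The scripts on the Examples page wrapped that one line in logging, exclusions, and error handling that I wasn't ready to trust myself to get right.)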
I didn't have time, now, to invest in self-instruction in scripting, and even if I had, experience suggested it would be unwise to assume that I then understood it well enough to produce a reliable backup. What I really wanted, in this, was a backup tool that would operate through the copy-and-paste features of File Browser (Nautilus). I wanted to select and copy the files and folders to be backed up; I wanted to go to the target drive or directory and say "Paste"; and if a file already existed in the target, I wanted an option to replace it or to move it (and, optionally, all other preexisting files) to a newly created incremental backup subfolder. That tool, to my knowledge, did not yet exist. I was going to post a question on this, but for some reason I was now unable to log into the Ubuntu forums. So for now it seemed I had to fall back on doing my backup the old-fashioned way: delete the contents of the backup drive and replace them with a copy and paste from the source drive.

At this point, I had a new problem. I hadn't used the computer for about 24 hours, and it seemed that this had somehow screwed things up for web browsing. I noticed it particularly on the secondary computer, where I wasn't running VMware, just Ubuntu. I started Firefox and noticed that a bunch of my tabs from the previous session could not load. I reloaded all tabs, and most did load, but some still didn't. Blogger (i.e., the website on which I post these notes) was only able to go into raw text (HTML) mode, not wysiwyg Compose mode, on the webpage where I would edit my posts. So if I wanted to insert a hyperlink into this particular message, it would look like I was programming in HTML. This wasn't happening in Firefox on the primary computer, where at this moment I was writing these words. But it was weird, because other Blogger pages on the secondary computer would display normally, like there was no problem. For Blogger (and some other webpages), this problem persisted when I reloaded the page, restarted Firefox, or even rebooted the computer.

It looked like some other people had had this problem in Firefox 3.0, but I was still running 2.0.0.16. To fix the problem in 3.0, someone recommended a clean install. That advice described Blogger as a "high-risk application." The idea seemed to be that, if something was going to go wrong in Firefox, Blogger would be one of the first places where you would notice it. The implication seemed to be that I should do a clean reinstallation of Firefox. I used InfoLister to make sure I had a complete, current list of installed add-ons; I had already made sure to export the settings from each add-on that allowed me to do that, saving them to a special folder; and then I went into Synaptic and completely uninstalled firefox-2, rebooted, and installed firefox-2 and firefox-2-gnome-support. (That webpage had advised running NoScript, but I understood that to be for Windows versions of Firefox. They also recommended saving your bookmarks, but I used the Foxmarks synchronizer for that purpose.) It occurred to me that I might have been able to get away without marking firefox-2 for complete uninstallation, but it was too late; I had already done it. But then it looked like nothing had changed: the add-ons were still in place, and the wysiwyg (what you see is what you get) functionality had not returned. I tried changing Blogger's Settings > Basic > "Show Compose Mode for all your blogs?" option to No, saving, and then changing it back to Yes. That didn't help.
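(The Terminal equivalent of that Synaptic reinstallation would, I believe, have been:

sudo apt-get remove --purge firefox-2
sudo apt-get install firefox-2 firefox-2-gnome-support

where --purge discards the system-wide configuration files too. Notably, though, it does not touch the per-user settings in ~/.mozilla -- which would explain why my add-ons survived the reinstallation.)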
People in a thread from 2006 were recommending clearing the cache and especially the cookies, but that sounded like Windows-related advice. I had some other work I had to do, so I had to let that problem sit for the moment. Meanwhile, I had another problem in VMware Workstation. In one of my virtual machines, after hibernating and resuming work, I got this message:
Adobe Acrobat ADBC.api failed to load because JavaScript support is not present.

After I clicked OK, I got another one:
WARNING: The EScript plugin cannot be loaded. The Accessibility Checker can still function, but will be unable to attach comments to documents.

I clicked OK on that, and Acrobat crashed. I got a Windows XP error:
Adobe Acrobat 8.1 has encountered a problem and needs to close. We are sorry for the inconvenience.

I realized, then, that of course this was not an Ubuntu problem; this was just Acrobat and Windows fighting it out. I rebooted WinXP, there inside that virtual machine, and when WinXP came back up, Acrobat was running normally. Probably it was a matter of not restarting XP often enough. Before starting with Ubuntu, I had found that Windows functioned best if I shut it down each night and restarted it each morning, and I had not been worrying about that too much inside VMware: I was just suspending VMs, hibernating the system, and then resuming Ubuntu and restarting VMs as needed. Maybe Acrobat and/or Windows just needed to crash once in a while.

The more bothersome problem was that this virtual machine had now decided to resize itself. I don't know why, but suddenly I had scroll bars, and my full Windows XP desktop was not fitting within the Console View in VMware Workstation. I had had this problem before and couldn't remember what had fixed it -- but then I found that I had recorded the solution, that time, in a VMware forum post. The solution was to go into Windows, there inside the VM, and go to Control Panel > Display > Settings and reset the resolution to 1280 x 827. (I was using a 22" wide-screen monitor.) I tried that again, and it worked. I switched to full screen mode and checked those same settings again, and there the resolution was 1680 x 1050. Back in Console View, it was still 1280 x 827. (In my Workstation View menu, I had both Autofit Window and Autofit Guest checked.)

When I could, I got back to the wysiwyg failure problem in Blogger (and some other webpages) that I had started to address, above. Let me just say, incidentally, that at this point I was just about overwhelmed by the number of little things that were not working in VMware. I think I hit my limit when I spent 15-20 minutes trying to figure out how to move a file from one computer to the other so that I could print it (because Ubuntu could not see my Canon printer; because WinXP on the primary computer could no longer see the data drive that I had converted from NTFS to ext3 so that Word in VMware would be able to save documents to it; etc.). For some reason, my external hard drive was not being recognized by Ubuntu on the primary computer and, being an ext3 drive, was also not recognized by native-booted WinXP on that computer. My jump drive was also not being recognized. It occurred to me, about this time, that Microsoft didn't have to keep inventing new dumbass operating systems like Vista. If they wanted to stay ahead of Ubuntu, all they had to do was deliver greater stability, and save people these gazillions of hours of learning how to fix and tweak every last little thing they do.

I had some real-world work to get done, and I couldn't even blog about some of this stuff, because I was now writing these notes on the primary computer, because Blogger on the secondary computer was no longer wysiwyg. So let's get back to that. I was able, on the secondary computer, to view my blog in wysiwyg format. I just wasn't able to view the editing screen that way. It was strictly text mode, which was problematic primarily because the font was harder to read -- being very small in Ubuntu Firefox, unlike Firefox in Windows -- and it also showed HTML commands, which made reading and proofreading somewhat harder. My Google search drew my attention to a webpage that listed five blog editors.
It was a category of software that I had not known existed. I had just realized that I could probably explore alternative Internet browsers, and I decided I probably should do so when I viewed some of my other open tabs in Firefox and found that this was not just a Blogger-specific problem. But before going in that direction, what caught my eye about these five blog editors was that some of them were standalone, i.e., not Web-based, and therefore did not require me to be online or to have a browser open. This was appealing because Blogger sometimes seemed to return me to an earlier version of a post that I had revised, so that my more recent revisions were lost. I went to Ubuntu's System > Administration > Synaptic Package Manager and searched for, and installed, BloGTK, because it was the one of those five blog editors whose description seemed most fitting for my needs. (The others were GNOME Blog Entry Poster, Drivel Journal Editor, the ScribeFire Firefox extension, and Google Docs.) BloGTK was now available under Applications > Internet. I started to fool with it, and quickly realized that, of course, if I was saving a blog posting in a standalone program on one computer, I would not be able to switch to the other computer and continue editing the post there, because the post would not yet be stored as a draft online. Trying again, I went to Firefox's Tools > Add-ons > Extensions > Get Extensions and searched, in the Firefox tab that opened up, for ScribeFire. I installed it and restarted Firefox, and up came their introductory webpage. I didn't want to experiment with it on this post that I was now writing, so I made a note to get back to it later.

A search for alternatives to Firefox, to be used when I was having trouble with a particular webpage, led me to a page listing "5 Cool Alternatives to Firefox for GNU/Linux" (and some others, in the comments that followed). Of those, one was Konqueror, which was based on the KDE desktop, whereas I was using GNOME. Another was Opera, which I hadn't realized was free for personal use. I had also just recently heard of Epiphany. There were others. I found a page that was somewhat critical of some of these. I decided to try Opera. I searched for it in Synaptic, there on the secondary computer, but didn't find it. So I went to their Linux download page and downloaded the version for Ubuntu 8.04 Hardy Heron. It detected that I was running 64-bit Ubuntu and downloaded a .deb file to my /home/ray folder. I had to review my blog for instructions on how to install a .deb file. What I had done, on one previous occasion, was to right-click on the .deb file and select "Open with GDebi Package Installer." I clicked on the Install Package option that opened up, and it seemed to install. I closed the installer and went to Ubuntu's Applications > Internet and, sure enough, there was Opera. I opened it and pasted in the URL from one of the Firefox tabs that was not displaying properly. It displayed perfectly. I closed this draft on the primary computer and reopened it in Opera on the secondary computer. It looked good -- better than in Firefox, probably better than in Internet Explorer. So there: I had an alternative to Firefox.
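(For future reference, the Terminal way to do what GDebi did would, I believe, be something like this -- the filename here is a placeholder, not the actual name of the download:

sudo dpkg -i opera_9.52_amd64.deb   # install the downloaded package
sudo apt-get -f install             # pull in any dependencies dpkg could not resolve

GDebi essentially does both steps behind one button.)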
After a while, though, I decided that wasn't the full solution. There were too many webpages that weren't working in Firefox at this point, and I really didn't want to do without the add-ons and other features of Firefox that I had gotten used to. I really hesitated, because Opera looked so good: the fonts were great, and the whole interface was very attractive. But -- yeah. I had been using Firefox happily until now; surely it would not be too difficult to uninstall it, or do whatever I had to do, and get it working right again. I found a bunch of advice in a thread that made me think I would try it differently this time. In previous Firefox reinstallations, I had used Synaptic to completely uninstall, and had also deleted the /home/ray/.mozilla folder. (I had to use File Browser's View > Show Hidden Files to see it.) This most recent time, I had not deleted that folder. One of the posters suggested a more conservative solution of just renaming that folder to .mozillaBackup. I thought I might try just that, without also uninstalling and reinstalling Firefox.

It was at this point that I first faced up to something I had noticed in passing previously. There seemed to be more than one "home" folder. In File Browser (i.e., Nautilus), the default display would show me Home Folder and also File System in the left-hand pane. If I went into the File System > home folder, I saw only /home/ray and /home/bin folders. (I had previously created the bin folder; the ray folder had been automatically created.) By contrast, if I went into Home Folder, there was no ray subfolder. Confusing! Worse, there were .mozilla subfolders in both Home Folder and /home/ray. Which one was I supposed to delete or rename? I shut down Firefox and renamed /home/ray/.mozilla to /home/ray/.mozillaBackup. As I suspected, that renamed both of them. They were both the same folder, viewed two different ways. This was one of the things I had disliked about Windows Explorer: it, too, would give me that kind of confusing redundancy.

Anyway, I then restarted Firefox. It came up in plain vanilla format, like a brand-new installation. I went to Blogger.com and viewed the Edit webpage that had caused a problem previously. It was fine now. So there: the problem was not in the Firefox program installation; uninstalling and reinstalling was not necessary. The problem was somewhere in the .mozilla folder. I shut down Firefox and copied .mozillaBackup to .mozillaBackup (copy), and then renamed that copy to .mozillaNew. Wow -- almost 2,800 files in there! Then I took the files in the new .mozilla folder -- the one that Firefox had just created when it could no longer find the old one (i.e., the one I had renamed to .mozillaBackup) -- and copied them into the .mozillaNew folder, overwriting the old ones in .mozillaNew (i.e., selecting Merge All and Paste All when prompted). (Basically, this just amounted to copying over the .mozilla/firefox folder I found there.) I then deleted the new .mozilla folder, since I knew Firefox could easily recreate it if necessary, and renamed .mozillaNew to .mozilla. Then I restarted Firefox. This didn't seem to have helped: I still had a plain-vanilla, brand-new-looking Firefox installation, without my add-ons. I deleted .mozilla and renamed .mozillaBackup back to .mozilla. I restarted Firefox, and it was back in the dysfunctional state I had started with. I wondered if somebody knew which files to fix in .mozilla, and how to fix them, to solve this.
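(For anyone wanting to repeat the experiment, the whole sequence boils down to a few Terminal commands, run while Firefox is closed:

cd ~
mv .mozilla .mozillaBackup    # Firefox builds a fresh default profile on its next start
# ... test Firefox, then to undo:
rm -r .mozilla                # discard the fresh profile Firefox created
mv .mozillaBackup .mozilla    # restore the original profile, problem and all

The copying-and-merging step described above was my attempt to graft the fresh profile's files onto the old one; as noted, it didn't work.)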
I posted a question in the Firefox forum. When I next got back to this post, several hours later, it still had no responses. I decided that this post had gotten long enough, and it was time to start a new one with my next steps in this saga.
2 comments:
My god, I'm just starting the process of doing an Ubuntu/Virtual Box/Vista install.
Trying to do much of the same things you've done in this massive post.
Did you ever get video editing to work? My installs went much more easily -- Vista is running, printers shared, no video capture or video editing installed yet, but I'm only at this two hours. :) Please tell me you had a happy ending to this story.
I'm still doing my video editing in native WinXP. I did find it much easier to install and use VMware on Ubuntu 8.10 -- tried that a few weeks back. I'm hoping to update this post with a re-try after Ubuntu 9.04 comes out.