Sunday, August 31, 2008

Sarah Palin: The Clarence Thomas of 2008

I am a middle-aged white guy. If I were a member of a race, gender, or ethnic group that was underrepresented in presidential politics, I am not too sure I would think well of it if the majority group -- black people, say, or women -- came to me and said, Ray, we are going to give you a consolation prize. We aren't going to nominate a white guy as president, but here's some whack-job white guy to represent you. Take your pick: Richard Nixon? Dan Quayle? Adolf Hitler? This is the sort of thing that could backfire. I could find it offensive that the majority's views were so clueless as to think that these guys would represent who I am and what I want from my government.

So we come to Sarah Palin. Her views are only beginning to emerge from obscurity, and I wouldn't be surprised if half of the complaints I've heard about her already are wrong. Nonetheless, this is no Hillary Clinton. She didn't get where she is by being the best, by fighting her way to the top. She got where she is because she's female. She has the potential to be a great politician. Likewise, the members of Milli Vanilli had the potential to be great singers -- but that's not what their concertgoers were paying for.

It is as if these white guys were saying, OK, you women can't actually come out on top, so we'll throw you a bone to make it seem like you're equal. The thing would have flown better if McCain had at least chosen a woman who had earned national-level respect and had shown national-level capability. If he's going to embrace the Bush legacy so wholeheartedly, why not Condoleezza Rice? Female, and black to boot, and with global experience. There are accomplished women who demonstrate superior credentials for the task of handling the presidency in a crisis.

I predict some women will be supremely displeased at the choice of Sarah Palin -- as I would be, if I were one of them.

VMware in Ubuntu: Backing Up and Restoring

I had been working on completing my VMware installation, detailed in a series of posts, when I made such a hash of multiple monitor installation in Ubuntu (version 8.04, Hardy Heron) that I decided to give up and start over. A kind soul had responded to a question I posted by telling me a few things to help with the start-over process, so that's where this new post begins. My basic setup: 64-bit Ubuntu with dual monitors, installed on each of two computers, both of which were dual-boot setups with Windows XP as the alternative. I used the WinXP installation on my primary computer as the starting point for VMware Workstation 6: that is, I used VMware Converter to create a basic WinXP virtual machine from that native WinXP installation, and then I used Workstation to run and add more programs and adjustments to that basic WinXP VM. So at this point, I had several VMs for different purposes. As I contemplated the project of restoring my Ubuntu setup, possibly from scratch, I was pleased to have those VMs, because I expected those virtual machines to survive and run just fine. That is, unlike Windows, I would not have to reinstall those particular programs; I would just have to reinstall VMware Workstation. I would have to reinstall Workstation, that is, if my first line of defense failed. I had bought a copy of Acronis True Image and had made a backup, but had never before tried to restore from a True Image backup. Acronis was supposed to be able to back up and restore Linux as well as Windows partitions. My Acronis backup was two weeks old at this point, but that was OK; during maybe half of those past two weeks I hadn't been doing much tinkering with the system, and for the other half I had some notes. I was a little concerned about using True Image because someone had said, somewhere, that it wipes out all of the partitions on a drive when you use it to restore one partition. 
I thought that was probably a case of user error, but it was not the kind of thing I wanted to learn from experience. Fortunately, the other main partition on that drive was a backup partition, so I just made sure I had alternate copies of whatever was there. In case the Acronis True Image restore process failed, I followed the advice mentioned above. First, I made a backup of my /etc and /home folders. They said to use a backup program, so I used sbackup, which I had previously installed using Ubuntu's System > Administration > Synaptic Package Manager. Sbackup had created a shortcut in System > Administration > Simple Backup Config. I saw that it had somehow gotten configured to include /var and /usr/local along with the other two folders just mentioned, so I went with that. I clicked Save and told it to run the backup now, and it told me it was doing it in the background. I went off to do something else, and when I returned, it had created a file called Files.tgz of over 300MB. I highlighted those four folders in File Browser and right-clicked for their Properties, and it said they were about 1GB. So I couldn't tell whether they were capable of being compressed that much, or if maybe Sbackup had missed some. I tried opening the flist file that Sbackup had created, thinking it would be a file list, but Ubuntu said it didn't know of a program to open it with. I opened the files.tgz file and saw that it claimed to contain about 670MB of files. Just in case, I also tried to do a straight-across (uncompressed) copy of all of the Ubuntu folders except /media. I checked the Properties of each, and the only one much over 1GB was /proc. But the copy process (a) notified me that it couldn't copy the "ray" folder even though I was running as root, and (b) seemed to freeze after copying 24.3MB, making me wonder whether the existence of multiple copies of system files could confuse Ubuntu as a similar maneuver could confuse Windows. 
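In hindsight, a by-hand tar run would have answered both of my questions -- what actually got archived, and whether the sizes add up. Here is a minimal sketch on a scratch directory (substitute real paths like /etc and /home on the live system, and note that virtual filesystems such as /proc and /sys should be excluded from any straight-across copy -- they are kernel views, not real files, which probably explains the stalled copy):

```shell
# Build a small compressed backup in a scratch directory (stand-in for
# the real run over /etc and /home)
mkdir -p /tmp/bktest/src
echo "sample config" > /tmp/bktest/src/app.conf
tar -czf /tmp/bktest/files.tgz -C /tmp/bktest src

# List the archive's contents -- the check I wanted the flist file to give me
tar -tzf /tmp/bktest/files.tgz

# Sum the uncompressed bytes inside the archive and compare against du
# on the source, to see how much compression (or omission) occurred
tar -tzvf /tmp/bktest/files.tgz | awk '{s += $3} END {print s " bytes in archive"}'
du -sb /tmp/bktest/src
```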
I couldn't think of another way to do a better backup, so I left it at that. The response to my question also suggested running this command: "dpkg --get-selections > installed-packages.txt". They pointed me toward another thread that explained how to use this to reinstall the packages I had installed to date. So I ran that and saved the resulting installed-packages.txt file to another drive. I looked at it and, wow, it listed hundreds of packages. Probably most of them had been installed automatically with Ubuntu, but still -- it looked like an impressive and possibly very useful tool, this dpkg thing. So the time had come to take the leap. No doubt I was forgetting something that would cost me hours of additional work, but I didn't know what else I could do in preparation, and time was marching on. I rebooted the primary computer with the Acronis CD, all the while making notes here on the secondary computer. I was a little nervous, so I started with a look at their help file, which -- to my surprise -- recommended trying to restore data first by running True Image within Windows. This was just the opposite of Drive Image 2002, which I had been using for some years. Drive Image had been much more effective running from the CD. Since this was a test of Acronis as well as an attempt to get my Ubuntu installation back, I decided to start with the CD, which I would use in the worst-case scenario (i.e., if Windows wouldn't run). I chose the Recovery option and pointed it toward the .tib file containing my True Image backup of the Ubuntu program partition. I selected "Restore disks or partitions" and chose to restore only the partition itself, not the MBR (Master Boot Record). I told it to restore to the Ubuntu partition, which it was calling drive H in Windows-speak. This was a little disconcerting because True Image was not reporting drive labels for any ext3 partitions, and I had several of them at this point.
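For reference, the full save-and-replay cycle that thread described looks like this (the replay half is shown commented out, since it belongs on the rebuilt system; apt's dselect-upgrade step is what actually installs everything on the saved list):

```shell
# Save the current package selections to a file kept off the system drive
dpkg --get-selections > installed-packages.txt
wc -l installed-packages.txt

# On the rebuilt system, replay the saved list (commented out here,
# since it belongs on the machine being restored):
#   sudo dpkg --set-selections < installed-packages.txt
#   sudo apt-get dselect-upgrade
```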
I had tried to label all of my partitions, but maybe GParted had failed to label them; I couldn't remember for sure. The backup was the same size as one of the available ext3 partitions, however, and I had already scoped it out in Windows, so I was pretty confident I knew which ext3 partition I wanted to wipe out and replace with the backup -- even though the drive letter saved in the backup no longer matched the one I was planning to wipe out. Finally, I told it to validate the backup archive before restoration. I had already validated it after creating it, but I hated the thought of wiping out a live Ubuntu partition only to discover that the backup was dead, in case something had corrupted that backup file in the past two weeks. I was impressed with all the careful questions True Image was asking me; there wasn't nearly as much of a plunging-off-a-cliff feeling as there had been in Drive Image. After a few minutes, it said, "The data was successfully restored." So I thought, OK, we'll see. I took out the CD and rebooted the computer. Ubuntu acted like it was starting up -- GRUB ran, and then I got "Ubuntu" on a black screen with an orange slider indicating startup progress, so all was well -- and then, suddenly, "No signal" appeared on my monitor, and the computer was still. Not the desired outcome! I waited a minute or two. Nothing further. I punched the computer's reset button. This time, I chose GRUB's Recovery Mode for Ubuntu. In the Recovery Menu, I ran dpkg ("Repair broken packages") and xfix ("Try to fix X server"). Then I selected "Resume normal boot." This time, I got to the Ubuntu sign-in screen. And, just like that, I had Ubuntu back. Right away, it needed a restart to complete its update of something or other. But it seemed to freeze at "System is restarting, please wait . . ." After about ten minutes, I tried hitting Enter, Esc, Ctrl-Alt-Del. No action. Hard drive light was off. So I punched reset.
Once again, we went through login and I had a normal-looking Ubuntu desktop. I started up Firefox and Workstation. Firefox showed me that the first thing I had forgotten to think about was the location of the backup files for Session Manager, the add-on that remembered what webpages I had open. I wasn't sure where Session Manager had been saving those. It was supposed to have been saving them on another partition, but they didn't seem to be there. So I had to fix that and bid goodbye to those webpages I had opened in Firefox during the past two weeks, had not yet read and closed out, and couldn't presently remember. There seemed to be no question, at this point, that the Acronis backup strategy worked, and that I would ideally have used it instead of screwing around for hours on the previous day in my vain attempt to get the Ubuntu multimonitor resolution right. Next time around, I would try to remember to run Acronis before getting into the multiple monitor thing, so if it went bad again, I would be back in business within the hour. I still had to reinstall some other things that I had previously installed. The description of that process appears in a new post.

VMware in Ubuntu: Dual Monitor Nightmare

I was writing a post on assorted fixes to make VMware Workstation 6.0 work within Ubuntu (version 8.04, also known as Hardy Heron). I came to the part that called for installing multiple monitors. That turned out to be a huge hassle. I wrote so much material on it that I decided to break it out into this separate post. See the beginning of that previous post for details on my two-computer system and other aspects of the setup on which I was attempting this installation. I had gotten the multiple monitor setup working briefly, once before, but hadn't tried since starting over, and now I wanted to be able to use that in my projects. I reviewed my log of the previous effort, undertaken about a month earlier, and saw that I had gone through a bunch of failed steps before finally hitting on the solution that would work with my 64-bit version of Ubuntu. This time around, I tried to cut to the chase. I started by using Ubuntu's System > Administration > Hardware Drivers to verify that my NVIDIA graphics card's proprietary drivers were still enabled and in use. Then I went to System > Administration > Synaptic Package Manager, searched for nvidia-settings, marked it for installation, and installed it (Apply). Next, it was System > Administration > NVIDIA X Server Settings. But at this point, I had to stop to explain something about my setup. I was using an IOGear GCS62 KVM switch that, for only $20, had enabled me to use one keyboard, one mouse, and one monitor with two different computers. Since switching to Ubuntu, I had mostly been using this KVM switch only for the keyboard and mouse; I had been just using one monitor dedicated to each computer. But now I plugged the second monitor into the KVM switch, and connected that switch to both computers. So my first monitor was still dedicated solely to the primary computer, but now the second monitor was doing double duty. 
I had previously tried using another KVM switch so that both monitors could go with both computers, but I had found I really didn't need that; I almost never did anything on the secondary computer that would benefit from having two monitors. Rather, I wanted to keep the first monitor fixed on what was happening on the primary machine, so I would stay in touch with my primary workspace. But now I changed things a bit more: I decided to use that second KVM switch for just the second monitor. This way, I would hit ScrollLock twice to switch the mouse and keyboard between computers, and I would punch a button on the second KVM to switch the second monitor between computers. This way, I could see what was happening on one computer when I was on the other, and I could also use both monitors on the primary computer if I so desired. With the second monitor now able to work with both computers, I hit ScrollLock twice. The second monitor was now showing a black screen. I went to the first monitor, connected to the primary computer, and in the NVIDIA X Server Settings dialog (which was presently showing only the primary monitor) I clicked Detect Displays. It showed the HIQ N91S secondary monitor as "(Disabled)." I clicked on that monitor and clicked Configure. It gave me a choice of Disabled, Separate X Screen, or TwinView. My previous notes seemed to say that Xinerama had been a superior solution, but it wasn't clear where I had found Xinerama. I searched Synaptic and saw that a bunch of programs seemed to refer to it, but I couldn't tell which of them (if any) I needed to install. The Wikipedia page I had cited last time said that Xinerama had problems and was being discontinued, and that's more or less what I got at the Xinerama official website too. The new thing seemed to be XRandR; but when I searched Synaptic, it seemed I already had that. 
Or at least I had installed libxrandr2 and x11-xserver-utils, which were two of the seven packages that Synaptic listed in response to my search. One that I didn't have installed yet was gnome-randr-applet, which was described as a "Simple gnome-panel front end to the xrandr extension." So maybe if I installed that, I would have a clickable icon somewhere that I could use to configure or run multiple monitors. I tried it. But it didn't seem to add anything anywhere. Ultimately, it seemed to come back to editing the xorg.conf file. My previous notes contained a printout of what my working xorg.conf had looked like, but I wasn't sure if it was applicable anymore, since this time I hoped to use XRandR rather than Xinerama. In Terminal, I typed "cd /etc/X11" and then "gksudo gedit xorg.conf." I compared it against what I had in my previous notes. It was really different, and I suspected much of the difference had been caused by Xinerama. I also compared it against an Ubuntu Wiki page. Closer, but still different. I decided to make mine similar to the Wiki. I saved xorg.conf. Nothing happened. Apparently I would have to reboot. I did. Or at least I thought I did. I went away and came back to a black screen on the first monitor. I punched my computer's power button and it shut off. I punched it again and the system booted. It showed "Ubuntu" on the screen, and it went through the part where the slider goes back and forth, indicating that Ubuntu is loading. Then the screen went black, said "No Signal," the hard drive light flickered a few times, and nothing further. I punched the computer's reset button, and this time, at GRUB, I selected Recovery Mode. There, I selected dpkg, "Repair broken packages," and xfix, "Try to fix X server," and then "Resume normal boot." But that did not do it; I still had a black screen. Reset button > boot from Ubuntu CD > "Try Ubuntu without any change to your computer." But what's this? Nothing there either. Still a black screen. 
I thought maybe the primary monitor was not supported by the Ubuntu Hardy Heron 8.04 boot CD, so I tried looking at the primary computer using monitor no. 2. That one was flashing weird colors. I tried the reset button again, this time with only the second monitor plugged in (as the primary monitor). This time everything went OK; this monitor was recognized. I opened File Browser and navigated to "115.3 GB Media" (that being the size of my Ubuntu programs partition) and then to /etc/X11/xorg.conf. Weird thing was, xorg.conf didn't seem to have changed. I had saved my changes to it before rebooting, but it didn't look that way. Instead, there was another file, named xorg.conf.20080830154959, and that was the one containing my changes. I had no idea why mine had gotten saved with that filename, but there it was. I thought maybe it was messing up the works, even with that weird name, so I opened up a session of File Browser as root (using "sudo -i" and then "nautilus" in Terminal) and moved that xorg.conf.20080830154959 file to another drive, far, far away. The comments at the top of xorg.conf said that you could recreate xorg.conf by running "sudo dpkg-reconfigure -phigh xserver-xorg." I was going to run that command in Terminal, just out of superstition -- xorg.conf didn't actually seem to be any different than it originally was before I started messing around with it -- but then I thought, no, why don't I try rebooting first, and see if getting rid of that weird xorg.conf.20080830154959 file fixes it. So I removed the CD and restarted Ubuntu from the hard drive. This time it restarted fine. I was back in my good ol' Ubuntu setup. 
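Ubuntu's habit of stashing a timestamped copy is actually a sane one to imitate deliberately, before editing rather than after. A minimal sketch, using a scratch directory as a stand-in for /etc/X11:

```shell
# Make a timestamped backup before editing -- the same naming pattern
# as that mysterious xorg.conf.20080830154959 file
cfgdir=/tmp/x11demo                 # stand-in for /etc/X11
mkdir -p "$cfgdir"
printf 'Section "Device"\nEndSection\n' > "$cfgdir/xorg.conf"
cp "$cfgdir/xorg.conf" "$cfgdir/xorg.conf.$(date +%Y%m%d%H%M%S)"
ls "$cfgdir"
# On the live system, the equivalent would be:
#   sudo cp /etc/X11/xorg.conf /etc/X11/xorg.conf.$(date +%Y%m%d%H%M%S)
```

With a copy like that on hand, a bad edit is a one-command rollback instead of a session with the live CD.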
I thought I should test whether the presence of two different versions of xorg.conf had confused the system at bootup. So I moved the original xorg.conf out to a different directory, renaming it fnoc.grox (xorg.conf spelled backwards), since the experience with xorg.conf.20080830154959 suggested that Ubuntu might go looking for files even when their names weren't quite what they were supposed to be. In its place in /etc/X11, I put xorg.conf.20080830154959, renamed simply xorg.conf. Now, I had to switch my monitors back to their ordinary positions before rebooting -- that is, I had to make sure they were plugged into the appropriate sockets in back of the computer -- because xorg.conf would be telling the computer which monitor was left and right and DVI or VGA. After replugging, I rebooted the primary computer. Both monitors were showing the primary computer's bootup process for a while; but when we got to the Ubuntu login screen, monitor 1 said "No Signal" while monitor 2 showed a login screen that was skewed way to the right. All I could see of its "Ubuntu" lettering was just the left edge. So I thought maybe I was off by one monitor, like maybe left should have been right and vice versa. I tried swapping plugs on the two, but no, now I had nothing on either monitor. I replugged the monitors the right way. The thing was still skewed the same way, but I knew that I had to enter ID and then password to log in, so I did that, even though I couldn't see what I was typing -- it was far off the right edge of monitor 2. That worked, and suddenly I was at a normal Ubuntu desktop on monitor 2. Still nothing on monitor 1. I thought maybe I had not entered the right command for it in xorg.conf, so I sudoed my way back there and stared at it. I remembered that that one website had said,

The third [important thing to notice in xorg.conf] is the name of your device's output name. In the above, it is "Monitor-VGA-0", but it could be "Monitor-VGA0" or something. Look at the output of xrandr to see what name your graphics driver uses.
I had tried running xrandr in Terminal without any insights; but now that I looked again, I saw it was not even providing info on monitor 1. Lacking a clue, I tried the suggestion, typing "Monitor-DVI0" into xorg.conf, and then rebooted. That didn't make any difference. Then I realized that Ubuntu was recognizing that monitor just fine with the old xorg.conf, and all I needed to do was to refer to that monitor in the same way in the new xorg.conf. I looked at the old fnoc.grox (see above), but it didn't make sense to use exactly that same approach in the new xorg.conf. I thought maybe the webpage was not quite on target in its advice on how to edit xorg.conf. Another thread made me think that the missing piece was that I had not yet run EnvyNG, which I had done the previous time around to get the latest NVIDIA drivers. I did that now, starting with a search in Synaptic. It came back with three EnvyNG options. The thread had referred specifically to envyng-gtk, and that was one of the three, so I selected that. It said it also needed envyng-core, so that took care of two of the three packages offered. I went to System > Administration > NVIDIA X Server Settings > X Server Display Configuration. Next to the HIQ N91S monitor, it said "Acer AL2216W (Disabled)." So it did recognize both monitors. When I clicked on the box for the Acer, it said "Configuration: Disabled." While I was looking at NVIDIA X Server Settings, I went down to OpenGL Settings and set it to High Quality. Under "GPU 0 - (GeForce 7900 GS)," I saw that it referred to the HIQ N91S as CRT-0 and to the Acer as DFP-0. I clicked on CRT-0 and adjusted its Digital Vibrance because the Hyundai monitor was noticeably less bright than the Acer; I thought that might help a bit. I clicked on DFP-0 and saw that its resolutions were all listed as "Unknown." 
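For the record, the check that quoted advice is describing -- asking the X server for the real output names before writing them into xorg.conf -- looks like this (the fallback message just keeps the line harmless on a machine with no running X display; the sample names are illustrative, not what my system reported):

```shell
# Ask the running X server which output names the driver actually uses,
# rather than guessing between "Monitor-DVI0" and "Monitor-DVI-0"
xrandr --query 2>/dev/null | grep -w connected || echo "no X display available"
# Illustrative output on a two-monitor setup:
#   DVI-0 connected 1680x1050+0+0 ...
#   VGA-0 connected 1280x1024+1680+0 ...
```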
I tried changing the reference in xorg.conf to Monitor-DFP-0 instead of Monitor-DVI-0; and since that and the EnvyNG thing may both have needed a reboot, I restarted Ubuntu. Unfortunately, these steps had not helped at all. I looked at the xorg.conf shown in my previous post. It was really different from the one I had been working on. I hadn't paid any attention to it because I had thought I would be needing something very different, since I wasn't going to be using Xinerama. But now I noticed that its very first comment line said, "# nvidia-settings: X configuration file generated by nvidia-settings." I tried running nvidia-settings in Terminal, expecting some kind of magic. But, silly me, that just opened up a session of NVIDIA X Server Settings. There, I clicked on the Acer monitor and then on Configure. I clicked on TwinView. I didn't want TwinView, but I had seen a reference to it in the previous post's xorg.conf. I rearranged its position (Absolute, Acer left of Hyundai) and made it the primary display. I clicked Save to X Configuration File and then took a look at the xorg.conf. It said it was unable to create a backup, and xorg.conf did not seem to have changed. I thought maybe the problem was that I had not run nvidia-settings as root. So I did that. This time, it showed the Acer as being (Off), not (Disabled). All of the changes I had previously made in NVIDIA X Server Settings were still there except those under the X Server Display Configuration option, so I made those again and clicked Apply, which I had not done last time. And, whoa, suddenly I had one desktop spanning two monitors, with a big beautiful heron right in the middle. Excellent! I clicked "Save to X Configuration File" and took a look at xorg.conf. This time it made a backup with no problem, and it looked roughly like the xorg.conf that I had saved in the previous post, though I didn't do an exact line-by-line comparison. Here was the full text of it:
# nvidia-settings: X configuration file generated by nvidia-settings
# nvidia-settings: version 1.0 (buildd@yellow) Thu Jun 5 09:27:12 UTC 2008

# xorg.conf (X.Org X Window System server configuration file)
#
# This file was generated by dexconf, the Debian X Configuration tool, using
# values from the debconf database.
#
# Edit this file with caution, and see the xorg.conf manual page.
# (Type "man xorg.conf" at the shell prompt.)
#
# This file is automatically updated on xserver-xorg package upgrades *only*
# if it has not been modified since the last upgrade of the xserver-xorg
# package.
#
# If you have edited this file but would like it to be automatically updated
# again, run the following command:
#   sudo dpkg-reconfigure -phigh xserver-xorg

Section "ServerLayout"
    Identifier     "Default Layout"
    Screen      0  "Screen0" 0 0
EndSection

Section "Module"
    Load           "glx"
EndSection

Section "ServerFlags"
    Option         "Xinerama" "0"
EndSection

Section "InputDevice"
    Identifier     "Generic Keyboard"
    Driver         "kbd"
    Option         "XkbRules" "xorg"
    Option         "XkbModel" "pc105"
    Option         "XkbLayout" "us"
EndSection

Section "InputDevice"
    Identifier     "Configured Mouse"
    Driver         "mouse"
    Option         "CorePointer"
EndSection

Section "Monitor"
    Identifier     "Left Monitor"
EndSection

Section "Monitor"
    Identifier     "Right Monitor"
    Option         "Right Of" "Left Monitor"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "HIQ N91S"
    HorizSync       30.0 - 83.0
    VertRefresh     56.0 - 75.0
EndSection

Section "Device"
    Identifier     "Configured Video Device"
    Driver         "nvidia"
    Option         "NoLogo" "True"
    Option         "Monitor-DFP-0" "Left Monitor"
    Option         "Monitor-VGA-0" "Right Monitor"
EndSection

Section "Device"
    Identifier     "Videocard0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce 7900 GS"
EndSection

Section "Screen"
    Identifier     "Default Screen"
    Device         "Configured Video Device"
    DefaultDepth    24
    SubSection     "Display"
        Virtual     2960 1050
        Depth       24
    EndSubSection
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Videocard0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "TwinView" "1"
    Option         "TwinViewXineramaInfoOrder" "DFP-1"
    Option         "metamodes" "CRT: nvidia-auto-select +1680+0, DFP: nvidia-auto-select +0+0"
EndSection
Now I needed to know an easy way to switch it from spanning two monitors to just one, because as I was writing these notes on the secondary computer, the right monitor was no longer displaying the right half of the desktop on the primary computer. In NVIDIA X Server Settings, I tried setting Resolution to Off and clicked Apply, and that did it: now the whole desktop for the primary computer was contained in the left-hand (Acer) monitor. With these steps taken care of, the remaining thing was to get VMware Workstation to use the whole two-monitor desktop, so that I could have multiple PDFs and Word documents open at once and readily visible. The approach used in my previous post wasn't working now. When I clicked on the Full Screen option in Workstation's Console View, I got the disappearing toolbar at the top of the screen; but the icon that had said "Add a monitor to full screen view" in my previous installation was now saying (well, its tooltip was saying), "Cannot use multiple monitors; click for more info"; and when I did that, I got this:
This virtual machine cannot use multiple monitors for the following reasons: The virtual machine's Display setting must be "Use host settings" or specify more than one monitor with sufficient maximum resolution.
So back in Console View, I went to VM > Settings > Hardware > Display. But the option to change the number of monitors was grayed out. Evidently I had to do this by logging into Workstation as root. I did that, but it was still grayed out. Now I realized I should probably have closed down the virtual machines rather than just suspending them. I was curious whether that was the whole explanation. I killed Workstation and started it up again as just me, not root. I restored one of the VMs and, no, it was still grayed. So apparently the powering down was what mattered: once the machine was powered down (not just suspended), I could make the change without running vmware as root. I tried the host settings option, hoping that maybe this would fix a problem I had had when attempting to run another program (Second Life) within a VM: I had gotten an indication that my drivers weren't up to the job, and had concluded that they were talking about Workstation's native drivers, not the latest NVIDIA drivers that Ubuntu was now supposedly using for me. Once I had set the Display option to "Use host settings for monitors," I powered on that VM and tried Full Screen again. For some reason, the collapsing toolbar at the top of the screen was not expanding. I think it was Ctrl-Alt-Enter that got me back to Console View. I tried Full Screen again. This time the toolbar was behaving properly. Maybe it had just been funky while the WinXP VM was loading. Anyway, there was an icon whose tooltip now said, "Add a monitor to full screen view." I clicked that and the screen flickered, but otherwise I saw no difference. Then I remembered to punch the button on my KVM switch. Yes, there it was: I was able to move a program's window (e.g., Internet Explorer) anywhere on either monitor. But (perhaps because my monitors were not the same size) the resolution on the second monitor was not right.
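For anyone who prefers editing to clicking: those Display settings live as plain keys in the virtual machine's .vmx file. A hedged sketch from my understanding of VMware's config format -- edit only while the VM is powered off, and verify these key names against what Workstation itself writes to your .vmx:

```
svga.autodetect = "TRUE"
# The line above corresponds to "Use host settings for monitors".
# Or specify the monitor count and maximum resolution explicitly:
# svga.numDisplays = "2"
# svga.maxWidth = "2960"
# svga.maxHeight = "1050"
```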
I went back into NVIDIA X Server Settings and saw that the Hyundai (right-hand) monitor was now labeled as just CRT-0 with a resolution of only 1024 x 768. To give it exactly the same vertical resolution as the Acer (so that the Ubuntu bottom panel would not bleed off it, as I had noticed it did before), I changed the resolution from Auto to 1400 x 1050 (instead of 1280 x 1024) and clicked Apply. That didn't work too well, so I changed it to 1280 x 1024 after all. But now I was getting weird effects with windows. In Ubuntu, I moved the NVIDIA X Server Settings window to the right-hand monitor, and then found I couldn't get it back to the left side; and I opened a Terminal session and found that I could not drag it to the right side. And in VMware, I clicked the icon to "Add a monitor to full screen view," but it wouldn't let me drag Internet Explorer to the right-hand monitor. I tried shutting down the virtual machine, but leaving VMware running, and then dragging those other windows around. Still no good. I killed VMware and tried again. Nope. It seemed like telling Workstation to use the host settings had screwed up the host. I powered up VMware and changed that in Display: instead of saying "Use the host," I specified two monitors and the maximum possible resolution, and tried again. Now I got the "Cannot use multiple monitors" error again, but this time there was an additional reason: "The current arrangement of monitors is not supported." I guessed the reason why it also repeated the earlier error message, shown above, was that it did not see me as using multiple monitors because Ubuntu was treating the two monitors as one big monitor. It wasn't like in Windows, where each monitor functioned sort of like a separate desktop with its own wallpaper; here the one piece of heron wallpaper was stretched all the way across both monitors. 
This made me think that maybe the solution was back in NVIDIA X Server Settings, with a Separate X Screen (instead of TwinView) configuration. So I made that change. It wouldn't let me Apply it right then and there; it said I had to restart. So I clicked Save to X Configuration File. It gave me a bit of advice:
Multiple X screens are set to use absolute positioning. Though it is valid to do so, one or more X screens may be (or may become) unreachable due to overlapping and/or deadspace. It is recommended to only use absolute positioning for the first X screen, and relative positioning for all subsequent X screens.
So, OK, I canceled that, clicked on the Hyundai (CRT-0) monitor there in NVIDIA X Server Settings, clicked its X Screen tab, and changed position to "Right of." Unlike TwinView, the Separate X Screen option also gave me an "Enable Xinerama" option. Why were they giving me that option if, as someone had said, Xinerama was on its way out? I thought I'd try it without that first. I tried Apply again, that didn't work again, tried Save to X Configuration File again, and wasn't logged in as root again, so that didn't work again, so I quit NVIDIA X Server Settings again and logged in as root again and ran nvidia-settings again and made the changes again and saved to X configuration file again, and that worked (again). Then I rebooted. But this wasn't good: it seemed I had only monitor no. 2 again. Then I realized the problem was that nvidia-settings had designated monitor 2 as Screen Number 0 and monitor 1 as Screen Number 1, which was backwards. So when I started nvidia-settings, I got an "Invalid display device" error message in Terminal, which continued scrolling other error messages as I continued fooling around with NVIDIA settings until I had probably totally fubared the whole thing. At that point, it occurred to me that maybe I had to swap the cables on the back of the computer again. That took three minutes, down there in the dust and darkness behind that computer, cursing the engineer who designed those DVI connectors. And this, truly, seemed like no improvement at all, because now both screens were black. Ah, but I thought, no doubt I must reboot in order to allow the machine to see the newly configured monitors. Did that via punch of reset button. But this made no difference, either in the priority of monitors upon reboot or in the X Server Display Configuration. Then the problem became clear: xorg.conf was designating the monitors in the wrong order.
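For reference, here is a minimal sketch of the kind of xorg.conf layout that a Separate X Screen configuration produces, using the relative positioning the warning recommended. The identifiers and BusID are placeholders, not my actual values:

```
Section "ServerLayout"
    Identifier "Layout0"
    Screen  0  "Screen0" 0 0                # absolute position for the first screen only
    Screen  1  "Screen1" RightOf "Screen0"  # relative position for the second
EndSection

Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"   # placeholder; the actual bus ID varies per machine
    Screen     0             # first output on this card
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
EndSection
```

The point that bit me here is which physical monitor ends up as screen number 0: swapping the Screen lines in ServerLayout (or the Screen numbers in the Device sections) changes that ordering without touching any cables.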
So I put the cables back where they were, with appropriate imprecations directed at the eternal soul of the aforementioned engineer; and after another punch of the reset button on the primary computer, I decided to take the advice proffered at the start of xorg.conf, namely, to run this command to reset the sucker and start over:
sudo dpkg-reconfigure -phigh xserver-xorg
which I did. Then I rebooted again. I got a "Mode Not Supported" error message from monitor 2 and an otherwise unusable machine. I rebooted with the Ubuntu CD. Mode continued to be not supported, as we now had a black monitor 1 and a weird flashing monitor 2. Deciding that the cables might have been more right the other way, I swapped them once more (remember, third time's a charm) and punched reset again. Upon CD reboot, I had the left monitor (no. 1) working properly, the right monitor flashing (apparently not supported yet), and the generic xorg.conf created by the command shown above. I tried to run nvidia-settings, but it hadn't been installed by the CD boot process. I checked my notes, above, and remembered to try removing the xorg.conf backups from /etc/X11. I did that and rebooted. It booted properly, and this time with monitor no. 1 behaving like monitor no. 1. I typed "sudo nvidia-settings" and got this error message:
You do not appear to be using the NVIDIA X driver. Please edit your X configuration file (just run `nvidia-xconfig` as root), and restart the X server.
I typed "sudo -i" and then "nvidia-xconfig" and got a "VALIDATION ERROR":
Data incomplete in file /etc/X11/xorg.conf. Device section "Configured Video Device" must have a Driver line.
Well, I thought maybe nvidia-settings could help me with that. First, though, I had to figure out how to "restart the X server," since apparently that did not require an actual reboot of the system. According to one source, you would restart the X server by typing "sudo /etc/init.d/gdm restart." Unfortunately, that advice froze my screen after running some commands, ending with "Running local boot scripts" under "Starting VMware services." I found another source that said all I had to do to restart the X server was hit Ctrl-Alt-Backspace. That seemed easier. I punched the reset button (which seemed like another way to restart the X server) and now I got life only from monitor 2 -- and it was a distorted Ubuntu login screen underneath the monitor's self-generated "Mode Not Supported" message. I couldn't even see the Screen Resolution menu pick, to change it, but using the secondary computer as a guide, I counted down 12 menu items (that wasn't right) and then tried again, counting 11 items down from the "Keyboard" menu pick, and hit Enter, and that gave me the Screen Resolution option. The best resolution monitor 2 could support at that moment was 832 x 624, but it was enough. I made changes in nvidia-settings, but each time I would try to save them to xorg.conf, the settings program would disappear. Terminal reported a "Segmentation fault." I tried Applications > System Tools > EnvyNG and chose NVIDIA > Install the NVIDIA Driver (Automatic hardware detection). EnvyNG ran for a while. When it was done, I actually had to do the nvidia-settings thing a couple of times to get it to work through the changes, improve the resolution, etc., hitting Ctrl-Alt-Backspace each time to restart the X server. The second time, though, it didn't seem to have changed anything from the previous time. Meanwhile, I had a bunch of error messages in Terminal. There were too many, and I had no answer for them, so I did not attempt to log them all here.
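The two restart methods can be combined into a small sketch. This helper is my own construction, not from either of those sources: it looks for a display-manager init script in the locations Ubuntu 8.04 used, and prints the corresponding restart command rather than running it.

```shell
#!/bin/sh
# Print the command that would restart the X server on this system.
# Checks for common display-manager init scripts (gdm for GNOME,
# kdm for KDE, xdm as a fallback), as laid out on Ubuntu 8.04.
x_restart_cmd() {
    for dm in gdm kdm xdm; do
        if [ -x "/etc/init.d/$dm" ]; then
            echo "sudo /etc/init.d/$dm restart"
            return 0
        fi
    done
    # No init script found: fall back to the keyboard shortcut,
    # which kills and restarts the X server directly.
    echo "press Ctrl-Alt-Backspace at the login screen"
}

x_restart_cmd
```

Restarting the display manager ends the current desktop session either way, so unsaved work is lost; the reset button, by contrast, restarts the whole machine and is the bluntest instrument of the three.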
I tried going back into NVIDIA X Server Settings again, but again it vanished when I tried to make a certain change -- not sure which one -- and then I got another "Segmentation fault" error in Terminal. I looked for webpages on that, and didn't find any clear ones immediately. What I did find was a growing sense that I had really screwed up in attempting to reinstall. A webpage on NVIDIA drivers, and other pages I looked at, gave me the impression that installing and uninstalling stuff in Ubuntu was not at all necessarily a clean, precise process -- that it could be just as messy as in Windows, if not more so. By this point, plainly, I had a nonworking system, as far as the monitor was concerned -- and I couldn't get very much done without the monitor! One missing piece, as I gathered from the officially supported page on installing NVIDIA drivers, was to go to System > Administration > Hardware Drivers and make sure the NVIDIA drivers were enabled. I had done that previously, but somehow they had become disabled, and I hadn't thought to check. That called for a system restart. On reboot, the resolutions were not right. I had gotten a message, "Ubuntu is running in low-graphics mode," but the configuration options there didn't include my graphics card and monitors, so I just continued past that, and then the system loaded into a tolerable but not great resolution. Meanwhile, I belatedly noticed this advice on that webpage:
The Hardware Drivers tool may not work properly on machines that have previously used third party tools like 'Envy' or manual installation to install previous drivers. You should remove those drivers before attempting to install using the built in tool.
So I went to Synaptic, searched for EnvyNG, and marked for complete uninstallation the two packages there that I had installed. But before doing that uninstall, I went back to System > Administration > Hardware Drivers and disabled the NVIDIA accelerated graphics driver. I had to kill the Synaptic session to do that, and then I had to reboot, and on reboot I got back to the uninstallation of Envy in Synaptic. Then I went back and installed the NVIDIA driver again and restarted again. The problem was not fixed: resolutions were still bad. Another page advised trying System > Preferences > Screen Resolution, but no higher resolutions were available there. That page also advised a number of additional steps, but at this point I thought I might just try restoring an earlier backup of xorg.conf, one that identified two separate monitors and video cards with resolutions (since researching and writing resolution settings was among the numerous steps that webpage was advising). On reboot, the bootup program ran a "Routine check of drives," and I still had lousy resolutions. I edited xorg.conf to reverse the two monitors, hoping that I was doing it right. On reboot, no improvement. Back at the FixResolutions webpage, I went with their autodetect option, which required these commands:
sudo cp /etc/X11/xorg.conf /etc/X11/xorg.conf.custom
md5sum /etc/X11/xorg.conf | sudo tee /var/lib/x11/xorg.conf.md5sum
sudo dpkg-reconfigure xserver-xorg
The second of those commands produced a "No such file or directory" error. I double-checked the typing; I had entered it exactly as instructed. (In hindsight, the likely cause is that the /var/lib/x11 directory did not exist on my system, so tee had nowhere to write the checksum.) I ran the third command anyway, since it looked like something I had tried previously. It asked basic questions about the keyboard and such. When it was done, I tried Ctrl-Alt-Backspace again. As the webpage seemed to anticipate, this step was relatively easy but absolutely unhelpful. The next step they advised was to verify that xorg.conf contained horizontal and vertical monitor resolutions. It didn't, possibly because the previous steps had erased the xorg.conf that I had restored from backup. I restored it again and took a look at it. It looked roughly right. I did Ctrl-Alt-Backspace again to restart the X server. System > Administration > Hardware Drivers said I was using the NVIDIA drivers; but when I ran NVIDIA X Server Settings, I got that message again:
You do not appear to be using the NVIDIA X driver. Please edit your X configuration file (just run `nvidia-xconfig` as root), and restart the X server.
I did exactly that. No improvement. Xorg.conf was still confused about the lines I had corrected previously -- it still had the monitors backwards -- so I tried again to fix that. This did not help. Following some advice further down that webpage, I tried nvidia-settings again, and again got the message that I did not appear to be using the NVIDIA X driver. Continuing with their advice, in Synaptic I searched for and installed xresprobe and then ran "sudo ddcprobe | grep monitorrange." This gave me horizontal (31-84) and vertical (56-77) figures, in that order, that should have appeared in xorg.conf. Those numbers matched the ones given for the Acer monitor in xorg.conf. I wasn't sure how to get the values for the Hyundai. But now I noticed that this webpage was telling me that maybe I hadn't understood how to restart the X server. According to them, I had to log out of the Ubuntu session and then press Ctrl-Alt-Backspace in the login screen. I tried that. It was different, but it didn't make a difference. I had to conclude that the problem was in that error message indicating that I was not using the NVIDIA X driver, when Hardware Drivers said I was. Uninstalling and reinstalling had not fixed that. I disabled those hardware drivers again and went back to Synaptic and reinstalled EnvyNG. I went into Applications > System Tools > EnvyNG and installed the NVIDIA driver again. I rebooted, went into Hardware Drivers and enabled those drivers again. I rebooted and verified that it still just wasn't happening on either monitor. Sooo ... what was the fallback plan? What I would have done in Windows, at this point, might have been to restore a previous working system image using Drive Image 2002. More recently, I had made an Acronis True Image backup of my Ubuntu drive. What would happen if I just restored that whole thing? And was that necessary, or could I just restore or fix part of it?
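For reference, the figures that ddcprobe reported belong in the Monitor section of xorg.conf, roughly as follows. The identifier here is a placeholder, but the sync ranges are the ones ddcprobe gave for the Acer:

```
Section "Monitor"
    Identifier  "Acer Monitor"   # placeholder name
    HorizSync   31-84            # horizontal sync range, in kHz
    VertRefresh 56-77            # vertical refresh range, in Hz
EndSection
```

Getting these ranges wrong (or leaving them out) is one common reason X falls back to low resolutions like 832 x 624, since it then guesses conservative values the monitor is sure to support.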
Was this what people were talking about when they were saying they had recompiled their kernel? Before jumping in over my head, I decided to solicit advice. I posted some of the above questions and went to bed. When I got up, I had a response. Basically, yes, you lose your installed applications if you reinstall Ubuntu, so try again on the NVIDIA thing before bailing out. They pointed me toward another thread which, in turn, pointed me toward a page on RandR -- which referred me to the XRandR manual for details. I will say that the RandR page had a helpful description of the different parts of the xorg.conf file. But that other thread ended with a couple of posts by a newbie who claimed to have tried everything exactly as prescribed and still wound up with unsatisfactory results. If I did decide to reinstall, the response also provided some advice on how to preserve a list of the programs installed, along with their basic settings. The risk of losing all installed applications made VMware seem like an excellent idea, of course, because the virtual machines, with all their Windows programs, would presumably be unaffected. I had to say, I did not like that Ubuntu could become so screwed up in its monitor resolutions, and so seemingly irreparable. This was Windows territory. No doubt it was my fault, in a dozen different ways perhaps. But I really had been trying to find the right instructions and figure out the next steps, and it hadn't worked for me, and it wasn't working for a number of other people. Ubuntu would continue to improve -- I was sure that even the next version, coming out in a little more than a month, would be an improvement, possibly in this area in particular -- and in that sense I had hope in Ubuntu that I did not have in Windows, which after all these years could still dump masses of people in Vista-land and leave them with systems that might not be much more workable than what they might have had with Windows 95. 
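As for preserving the list of installed programs before a reinstall: I don't know exactly which method my responder had in mind, but one common sketch on Debian/Ubuntu systems uses dpkg's selections mechanism:

```shell
#!/bin/sh
# Sketch: save the list of installed packages before a reinstall,
# and feed it back to dpkg/apt afterwards to reinstall everything.
# (A common approach; not necessarily the one recommended to me.)

save_package_list() {
    dpkg --get-selections > "$1"       # one package per line, e.g. "vim  install"
}

restore_package_list() {
    sudo dpkg --set-selections < "$1"  # mark everything on the list for install
    sudo apt-get -y dselect-upgrade    # then actually install it all
}
```

On the old system you would run save_package_list with a filename, copy that file somewhere safe, and run restore_package_list on the fresh install. Per-program settings mostly live under the home directory, so backing up /home covers most of those separately.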
Nonetheless, this was discouraging. Having spent hours screwing around with monitor resolution fixes, I was inclined to try out the reinstallation or restore-from-backup approach. I started another post for that.

Thursday, August 28, 2008

Wild and Crazy VP Prediction: It's Arnold!

I have only a day or so until John McCain announces his pick for vice president. I was thinking, whom could he choose who would really electrify the country, grab the headlines, and relieve people's worry that he might die in office and leave some nobody as president? The first thought that flashed through my mind: Arnold Schwarzenegger. Now, since Arnold can't be president (because he's not native-born), I would guess he can't be vice president either. But if you really wanted to have fun with it, you could choose him anyway and leave everyone else to fight about it. There would be some complaints that McCain doesn't know his ass from his elbow, else he would never make such a goofy mistake; but he could conceivably counter those with an up-front statement of the issue and an indication that he does have an alternative pick in mind, if the law does not permit it. Realistically, it won't be Arnold, and it also won't be Jesse Ventura, though I suspect there's a part of McCain that would find that one pretty intriguing too. But I do think it will be someone who is young, strong, confident, competent -- a real counterpoint to Obama, and equally well (and admittedly) too young to be president just yet.

Second Life: Introduction to a Virtual World

I have been introduced to Second Life (SL) as an area of possible professional development. This post pulls together a number of links and basic bits of information about SL in somewhat haphazard fashion. SL is a creation of Linden Lab, a for-profit company. Revenue for Linden Lab comes from various sources, including the sale of memberships to users, fees charged to users for buying and selling Linden Dollars and for other transactions, sales of land "in world" to users, and land maintenance fees. There are free memberships, which I have, and there are also premium memberships, which cost US$6-10 per month. Land can also be purchased from other residents, at a cost of about L$2 per square meter. Linden Dollars are cheap; the exchange rate is about L$266 = US$1. SL was launched in June 2003. Its economic statistics now claim almost 15 million residents in total, of whom about 473,000 were logged in within the past week. Because a single human can create multiple memberships, and some have been known to create dozens of memberships, the ratio of human users to SL residents is unknown. It does appear, though, that there is substantial actual usage. For instance, over 400,000 residents spent money in SL during July 2008; and while the most common purchase size was less than L$500, there were also some 11,000 transactions involving purchases of L$100,000 or more, including about 600 of at least L$1,000,000. Purchases of that size suggest investments for the future -- which can make sense, in a company that has been estimated as producing profit of $40 million per year. Investment can mean risk, of course. If I read the statistics correctly, the price per square meter of land has dropped more than 10% from July to August 2008.
Risks to SL's continued survival and growth include technical difficulties that can make the user's experience less worthwhile; deteriorating conditions in the general economy, or in users' free time, that may leave people unable to spend time or money on leisure pursuits (although conceivably an escapist pastime, which a virtual world could become, could actually benefit from hard times in the real world); and competition from other suppliers of virtual realities. One commentator describes Linden as "struggling in an increasingly competitive market." While I have not analyzed the statistics in detail, it appears that membership hit a plateau last year, after several years of sharp growth. This may have been responsible for the change to a new Linden CEO this past spring. SL's competitors include Google and Microsoft. The Google offering, called Lively, has been described, however, as being "not a contiguous, immersive, fully user-created metaverse like Second Life" -- as a tool that is presently not very threatening to SL, an instance in which "the 800 lb. gorilla is just saying, 'Me too.'" Microsoft, by contrast, is taking an OpenSim approach. OpenSim is short for OpenSimulator, i.e., currently available open-source virtual reality simulation software whose purpose is to create SL-type virtual environments. An important advantage of open-source software in general, and of OpenSim in particular, is its potentially lower cost. While its developers describe it as alpha-level software, a number of grids already use OpenSim. Microsoft has a concept for management of avatar identities that differs from Linden's. (An avatar is a cartoon-like character that stands for you in the virtual world. Basically, your avatar can walk around, buy things, talk to other people's avatars, and so forth.) 
To Microsoft, the virtual world is an extension or evolution of the user's already existing online presence; it "is not about having an alternate identity divorced from your real life self." Linden has also embraced interoperability and OpenSim in collaboration with IBM, to the extent that avatars from SL are now able to "teleport" from SL to a virtual reality grid created by IBM -- that is, an avatar can disappear from one position, in SL, and reappear in another place, in the IBM grid. What Linden hopes for, from this collaboration, is that, instead of selling virtual land that may be available more cheaply elsewhere, it can offer value-added services that will enable it to earn revenues from a larger pool of users. But there are some indications that Linden is facing a challenge in keeping up with OpenSim's rapid development. My impression, from this introductory scan of a few sources on Linden Lab and Second Life, is that SL is in a position like that of WordStar or Lotus 1-2-3 in the 1980s. It is the pioneering software that introduces a lot of concepts about its sphere to leading-edge users; but those concepts then become available to better-funded or better-designed rivals, who avoid the pioneer's mistakes and improve upon its offering to take the product to the masses. In such spirit, as stated in an article in Computerworld,

It's time to get involved, to get used to the issues, the programming concerns . . . . [W]ithin the next year or two, "virtual world as management interface" should get closer to reality, as a) more APIs and virtual-world representations are built, b) the client and server software gets more provably reliable, and c) client software that can provide scaled-down access for less powerful computers and for handhelds and smartphones becomes available.
My own dabbling in SL thus far demonstrates that basic movement and communication seem possible and relatively stable. I have only begun to determine what particular kinds of uses people might make of it. Linden's own webpage seems oriented toward economic activities like those mentioned above. That page does also mention, however, that users can explore, meet people, own virtual land, create things, and have fun. The Explore webpage does not seem very oriented toward exploration per se, however; instead, it speaks of using the map "to find people socializing"; using the search menu to find events; and viewing people's profiles to learn more about them. That is, it is creations, people, and events that count. The Own Virtual Land webpage sends the same message: the purpose of your land is "to build, display, and store your virtual creations, as well as host events and businesses." What, then, are these creations, events, and people -- what is it, in other words, that lies behind the business transactions, that draws people to enter and spend money in the virtual world? The Creations webpage points toward several other pages, including a Building page that says you can build anything from a navel ring to a skyscraper, alone or as part of a team, and "Imbue all objects with Havok™-powered physics so they respond to gravity, inertia, propulsion, and wind from the in-world weather system." Here, too, though, the commercial interest seems essential: the Create Anything page says you can sell the things you build, and if you don't have time or know how to build something, you can buy it. So, for a more vivid grasp of what these things are all about, or a sense of why they are worth building or buying, one stop would be the SL Exchange (SLX), where (if you enable mature content) you can see (at this writing) thousands of items under these categories:
Animals (5196)
Animations (16109)
Apparel (177447)
Art (22294)
Audio and Video (2814)
Avatar Accessories (68786)
Avatar Appearance (39451)
Building Components (35458)
Business (10225)
Celebrations (12541)
Gadgets (9444)
Home and Garden (92843)
Miscellaneous (3791)
Recreation and Entertainment (15827)
Scripts (2220)
Services (197)
SL Exchange (33)
Used Items (1405)
Vehicles (5810)
Weapons (5750)
What seemed to be most popular, at this point, included textures, tools, and devices to equip your avatar or your virtual land, or otherwise to create an appropriately decked-out virtual life. People were also able to create pictures and postcards of scenes in SL. On the first of over 21,000 pages of SL merchandise at Onrez, lingerie and sexy clothing was the leading topic when sorting by "Top Matches," and jewelry was most prominently featured when sorting for the newest items to appear first. Turning to events, SL's Have Fun page says, "The world is filled with hundreds of games" as well as "dance clubs, shopping malls, space stations, vampire castles and movie theatres." To see the listings, the user is advised to search, within SL, for Events, under these headings: Discussions, Sports, Commercial, Entertainment, Games, Pageants, Education, Arts and Culture, and Charity/Support Groups. Events thus seem to overlap with the concept of meeting people. The Meet People page speaks of a "vibrant society of people" in which it is "easy to find people with similar interests," and also says that, at any time, there are dozens of events where you can "party at nightclubs, attend fashion shows and art openings or just play games" as well as "form groups ranging from neighborhood associations to fans of Sci Fi Movies." Less prominently mentioned is griefing, in which a user deliberately breaks the rules to spoil someone else's online experience. Techniques for this purpose have apparently been sufficiently "common" as to provoke FBI investigation. Fiend Ludwig provides an example of one user's griefing experience, and SL offers a video tutorial. Pulling it together, SL's Showcase provides images and videos of various kinds of scenes and activities within the virtual world. There are, first, separate showcases for Arts & Culture and for Music.
Looking at the latter for a few minutes, I observed the 3D home for the rock band Journey, and the Amsterdam Arena, described as one of SL's biggest techno dancehalls. There seemed to be about 25-30 clubs listed altogether. The occasional dancehall images that I saw, at this point and otherwise, never featured more than a few avatars. Another Showcase category, Hot Spots, listed a similar number of sites, some of which (e.g., H&R Block Island, Weather Channel Island) would not normally have qualified, in the vernacular, as "hot spots." The other two Showcase categories, Tutorials and Photos & Machinima, both seemed to be oriented toward the development of one's portion of the SL world. The latter featured videos, some of which were on YouTube, with music and somewhat interesting graphics. One of them comes, for example, from Princeton University in Second Life. There was, I could see, some potential for artistry and creativity in the SL realm. Certainly I did not come away, however, with the sense that SL shows compellingly where virtual reality is going. It reminded me of my first experiences with CompuServe e-mail -- again, in the 1980s -- when I could feel the excitement of being in instant and yet safe contact with people far away -- in ways that did not and, to some extent, could not happen via telephone. The potential was there, but ultimately most people did not opt for e-mail until some years later, especially in a visually more appealing format. SL has, to me, that rough feeling of sometimes slow resolution, artificiality, and other restrictions upon what one could instead experience in real life. This seems to be the spirit behind, e.g., the Get a First Life webpage, whose parody of SL includes such reminders as, "Go Outside -- Membership Is Free"; "First Life is a 3D analog world where server lag does not exist"; and "Find Out Where You Actually Live." So why do people do it? What are they looking for in SL simulated spaces (called "sims")?
Wikipedia cites real-world applications in areas of education, religion, politics, and the arts. Yet that seems to be markedly different from another Wikipedia article, on emerging virtual institutions, that discusses participation within political, economic, social, and linguistic institutions that exist only in the virtual world. An audio clip that played while I was looking at one of SL's Showcase presentations mentioned the user's freedom to be whatever s/he wanted to be, there in the virtual world. Evidently it goes both ways: some people want to use SL to influence or enhance some kind of training or other interaction or behavior in the material world, while others want to use SL to create or participate in a world that is deliberately different from that world. Presumably people would not spend large amounts of time designing islands and jewelry for a virtual world unless they (or their customers) were intending to enjoy that place for its own sake, much as people have long enjoyed novels and dollhouses for the imaginary worlds they help to create -- quite unlike those who write textbooks and would use virtual reality (VR) to serve some other end. I, personally, am not too interested in devising a virtual world for its own sake. I may be, someday; I can imagine using it to illustrate, or to participate in, a hypothetical social arrangement -- to try out some sociopolitical ideas in the virtual world before suggesting their implementation in actual lives. As I say that, I guess I can also imagine lining up participants for a virtual therapy group characterized by anonymity and freedom to develop one's different personas in a group setting. Surely there is great potential, in the long run. 
My initial reaction to SL was on the negative side, I suspect, because I found the graphics somewhat clunky (although really not bad) and definitely unrealistic (e.g., I could walk through fire; I could fly); and for these reasons, I think, it was difficult for me to feel personally invested in the imaginary world. A default first-person perspective (i.e., seeing through my avatar's eyes, instead of seeing him as he got himself into various situations) might also have helped. These or similar possibilities will likely be explored by SL and/or by other participants in OpenSim eventually. Once again, the WordStar/Lotus/CompuServe examples come to mind. Unless you want to be expert at some program or capability for its own sake, or for some short-term need, the most sensible path would seem to be to keep abreast of developments, but focus on advancing your core skills and interests, and wait until the technology gets to a point where you can use it without investing tremendous amounts of time and money and possibly being on the wrong track or learning details you will have to unlearn later. From that perspective, the important question seems to be, not How can your clients or customers find some use for this technology now? but rather What could this technology (if properly developed) do for your clients or customers that nothing else can do -- what will they eventually see as the "killer" application of this technology? In an apparently notorious recent speech, Mitch Kapor (designer of Lotus 1-2-3) said,
The pioneer era in Second Life is beginning to draw to a close. It has been five years and we are at the beginning of a transition and I think it is an irrevocable transition. And I am hoping what you see now is a slide of a technology adoption curve, a classic bell curve that shows early adopters on the left and then a set of pragmatists as we move from left to right and so on all the way over to the right edge of the curve, we show the laggards. This technology adoption curve is well known for the way to characterize the adoption of these disruptive new innovations. Now, where are we on this? OK, could I have the next slide please. When you see this rez in, you should be seeing a big red vertical arrow just at the margin between the early adopter phase and the pragmatist phase. That is really where we are today and I think that has some very important implications and I want to talk about that for a minute. So the first is, in the earliest wave of pioneers in any new disruptive platform, the marginal and the dispossessed are over represented, not the sole constituents by any means but people who feel they don't fit, who have nothing left to lose or who were impelled by some kind of dream, who may be outsiders to whatever mainstream they are coming from, all come and arrive early in disproportionate numbers.
There appears to be some anecdotal evidence supporting Kapor's hypothesis, at least to the extent that SL and Linden may be representative of only a first phase of a VR revolution. It seems timely, that is, to ask whether one should be in the business of promoting the fantasy or the reality in SL -- promoting, that is, the enjoyment of a virtual world in which one can be whatever one wants, or instead helping people to do better and be happier in the material world. That question goes beyond the scope of what I wanted to achieve in this introduction to SL, however, so I will stop here for now.

Monday, August 25, 2008

Barack, Hillary, Bill, and Biden: I Was Wrong Once

My prediction was wrong. I figured that the only way Hillary Clinton would be playing along so well and willingly with Barack Obama's victory in the primary election contest -- pledging her support and all that -- was if it was somehow in her interest to do so. That part was surely correct. What was wrong was that I thought her interest must have been some sort of deal for her and, later, perhaps, for Bill Clinton. Barack has now chosen Joe Biden as his vice presidential candidate. So there was apparently no promise for Hillary. I can only guess that she saw it as being in her interest to be a good sport and keep her powder dry for the next presidential election, in 2012. Conceivably she also thinks she can still put on a challenge of some sort at the Democratic convention, but I don't think she would seriously bank on that. We'll see what happens at the convention, with her speech and all, but it seems she is increasingly out of the picture. On to the choice of Biden. Will I be right on this one? I find the pick problematic. Biden is definitely a Washington insider. He is also an old white guy. This cannot be highly reassuring to those who see the Obama candidacy as a call for change. Not female. Nothing exciting about him. He's knowledgeable in foreign affairs, which might make him a good secretary of state. He's something of a loose cannon, verbally speaking, which could make him a liability in the election contest. He may help in states like Pennsylvania and Ohio, but he's not at all a household word, and I'm not sure how much people care about the vice president's resume. Unlike JFK's choice of Lyndon Johnson, he won't deliver a key state like LBJ delivered Texas. I'm glad that he takes stands opposing Obama sometimes -- glad that Obama wants something of an opposite number to challenge him -- but that's not really a meaningful criterion for a vice president as distinct from, say, a political advisor. 
Obama seems to have wanted someone who would provide a weight of experience, but that cuts both ways: it could seem, to some people, like the elder statesman is the one who should be running for president, while the relatively inexperienced kid should be his understudy. Certainly the choice of Biden seems unlikely to deliver a bounce in the polls, a wave of excitement, or anything that will get anyone very much fired up. My take on the matter is that Obama is becoming cautious, worried about his weak spots rather than concentrating on his strengths, and that he is therefore making himself weaker in the process. I don't know if Obama should have chosen Hillary, necessarily, but I do think he has suffered from a failure of imagination and courage here, and has diluted his message and appeal as a result. I'm not reading every last word about the election contest, so I may be missing something, but it seems like he's responding gently to McCain's jabs. If, as some say, the role of a vice president is to be the attack dog during the contest, I don't know why Biden (who is McCain's self-styled friend) would be a better choice than Clinton. I would think he needed a seasoned, demonstrably successful fighter. I guess he just didn't want to be dragged down by the Clinton baggage, which is understandable, but he could have gotten beyond that with time; he would have reasonably expected to make his own mark on the presidency and the world. Anyway, some dust is now settling. Bill Richardson didn't get the nod, despite betraying Hillary. John Edwards was evidently not the type that Obama was looking for, so his affair apparently didn't knock him out of the running. I would actually think that the ideal running mate for Obama would have been someone who would make him look conservative and/or thoughtful by comparison -- someone whom he could portray as a foil, a proponent of views he found too extreme -- but maybe I'd be wrong in that. 
What we're going to get, instead, is a campaign of politics as usual, and that's too bad. This wasn't the year for that.

Saturday, August 23, 2008

Pastor Rick Warren, John McCain, and Barack Obama

Pastor Rick Warren of the Saddleback Church interviewed presumptive presidential candidates Obama and McCain this past week. He stated, at the start of the interview, that he would be asking both candidates identical questions, that Obama would go first as a result of a coin toss, and that McCain was in a "cone of silence." It seems that Warren conducted the coin toss by himself, with neither candidate present. Also, Warren told Larry King that he thought McCain would be isolated in the church's Green Room when Obama started answering questions. That way, McCain would not be able to get advance notice of the actual questions; he would have to answer them on the spur of the moment, just as Obama was having to do. Warren says that he knew McCain was not yet actually at the church, shortly before he, Warren, was about to begin questioning Obama. It developed that McCain was actually being driven to the church during Obama's questioning. Thus, it would have been possible for McCain to watch Obama's questioning on TV, hear it on radio, or receive a relay of the questions via telephone, BlackBerry, or other electronic device. When confronted with that reality, Warren told CNN that he trusted John McCain not to do that. It appears that John McCain did know in advance that there would be a question about Supreme Court justices -- that he asked about that question before Warren actually got to it. When he asked that question, Warren seems to have replied by saying, OK, "you got all my questions." I am writing this post because I am surprised that CNN and the New York Times do not seem to have been covering this matter in much detail. They have evidently concluded that it is hard to pin the matter down. In my judgment, Warren made two critical mistakes at the start: he held the coin toss without any representatives for the candidates being present, and he went ahead with the program without confirming John McCain's whereabouts. 
These two mistakes, assuming they were innocent, were sufficient to make it appear that Warren was stacking the deck in McCain's favor. Favoring a Republican would be typical, of course, for an evangelical Christian in recent presidential elections. I noticed, myself, that Warren said he did not manage to get to a question about the environment -- that they ran out of time -- and I wondered why that topic, on which Obama would have an advantage, was the one that happened to be left out. In such regards, Warren has given ammunition to those who consider fundamentalist Christians politically untrustworthy and corruptive of the American system of government. It is not clear to me what Warren could have done, given that major news media were on a schedule. He does not seem to have wrestled with the question. It is incongruous for a minister, presumably committed to truth in some sense of the word, not to acknowledge that, with the entire nation watching, these irregularities would provoke legitimate controversy. In dismissing the issue as "bogus," he seems to be playing a political rather than honest-broker role. I do think that, if Warren were committed to a fair process, he would have done something. He might have deferred the start until McCain was determined to be securely located in the Green Room. He might have called McCain and made an announcement to indicate that the senator was still on his way and the process would be postponed until he arrived. Surely he should not have assured the nation that McCain was in a "cone of silence" when, by his own latest information, he was not. The reporters do not seem to be asking where McCain was during that time. It is not presently clear exactly when he got to the church. How is it possible that the major news networks and other sources, with dozens of personnel swarming the church, would not manage to send so much as a single junior reporter to watch the Green Room and keep an eye on McCain? 
The information I have been able to find, so far, seems to indicate that McCain may have been absent during the bulk of Obama's interview. If that is correct, how would it be possible? Where was McCain starting from, that would make him unable to be even remotely on time for a major media event? There is a question as to whether McCain cheated, with or without Warren's assistance. The focus of that question is on whether he actually listened to the questions being posed to Obama. But if cheating means not playing by the rules, then there is no doubt about it: McCain did cheat, by not being present at the church, as Warren seems to think he was supposed to be. Concerns about cheating would be reduced if McCain's campaign had already addressed these questions -- if, say, they had released a statement explaining why McCain was late and where he was. The best I can find on McCain's website, however, is a piece that says McCain was "off-stage, unable to hear the questions that would be posed to him later." But that is not literally true. He was able to hear them if he turned on the radio or was otherwise able to receive communications -- which he apparently was. The more accurate statement would be that he "did not" hear the questions; but there seems to be nobody who can verify that, and to my knowledge nobody has gotten around to grilling McCain on why he told Warren that he was trying to listen "through the wall," as if to create the impression that he had been in the building the whole time. The available information tends to indicate that there was something shady going on, and that McCain and Warren were comfortable with it. It does not seem there will be much further investigation, though. So evidently this will go down as one more minor piece of political gamesmanship that will fade quietly from the presidential contest -- at least until we come to the debates, at which time there will be another direct comparison of McCain against Obama. 
William Kristol to the contrary notwithstanding, however, Rick Warren is certainly not someone who should be leading future presidential interviews or debates.

Thursday, August 21, 2008

Early Prediction: It's Obama

Bill Clinton's motto in 1992 was, "It's the economy, stupid." And it still is. Especially in these times of increasingly dire talk from Wall Street et al. America is in the throes of adjustment to a developing-nation standard of living -- because the developing nations are the places against which we will be competing in the coming decade. Americans are not necessarily happy about this adjustment. They will be experiencing a lot of pain, discomfort, and unfamiliar territory. John McCain, as the candidate of the incumbent party, is woefully disadvantaged in this regard. As winter approaches and the economy continues to decline, his current parity with Obama will evaporate. That's my early bet.

Tuesday, August 19, 2008

VMware in Ubuntu: More Fixes

This post is the latest of a number of posts on my efforts to install and run Windows XP in a virtual environment on Linux. The version of Linux I chose for this enterprise was Ubuntu, and the virtualization tool I selected, after some testing and experimentation, was VMware Workstation 6. I was now well on the way to finalizing my Workstation installation, having just finished dealing with a number of issues. It looked like there were going to be some rough edges on the final result, such that I might want to reinstall Ubuntu and VMware at some point down the line (such as when they came out with an update). But for the time being, I almost had a complete working system. I started this post and immediately made a number of notes on continuing efforts to finish the project. Unfortunately, Blogger, the website on which I was posting this blog, lost my previous draft. I am not sure whether it did so with the aid of some bug in Firefox, the browser I was using to access Blogger. In any case, there were some notes at this stage of the process that were lost. So in that regard, this will not be an entirely complete log of all changes made to the system. Starting over on this post, then, I wanted to outline, briefly, the nature of the system on which I was doing the installation. I had two computers. I was installing Workstation on what I called the primary computer. It had an AMD X2 64 processor and 6GB of RAM. I was meanwhile using the secondary computer to post notes to Blogger on the process, as I made various changes to the setup on the primary computer. In a few instances, I was also recording efforts undertaken on the secondary computer. Both computers were dual-boot machines, mostly running Ubuntu 8.04 (Hardy Heron) but also capable of rebooting into Windows XP. The WinXP installation on the primary computer was a relatively bare-bones installation that I had used as the raw starting point for virtualization. 
That is, I had set up a basic installation, without Microsoft Updates, Microsoft Office, or other large programs or revisions that, in previous experience, had proved capable of slowing down the system and/or increasing instability. I had then used VMware Converter to create a VMware virtual machine containing that WinXP installation. After rebooting into Ubuntu and starting Workstation, I had then used that WinXP virtual machine (VM) as a starting point, making clones of it, adjusting its features, and adding different Windows XP programs to different clones for different purposes. Meanwhile, I was using the native WinXP installation on the secondary computer as a more elaborate installation, with various programs (especially USB-oriented programs, such as my printer software and my Palm PDA software) running in that boot because I could not connect with the relevant devices from within VMware or Ubuntu. So that was the background situation when Blogger so rudely deleted what I had been writing during the past few days. One thing that needed to happen next, as the story resumes, was that I needed Workstation to display Console View without scrollbars. In other words, there were two views (aside from Minimized) in Workstation. One was Full Screen, and the other was Console View. In Full Screen view, all I would see was the Windows XP virtual machine's desktop. In Console View, I would see a shrunk version of the WinXP VM's desktop inside a Workstation frame. In that frame, I would have the Workstation menus at the top, Favorites (i.e., commonly opened VMs) on the left, and a status bar on the bottom. The problem I was experiencing was that some VMs would not completely show the WinXP desktop inside the frame. Instead, Workstation would supply scroll bars at the right and bottom sides of the Workstation frame, and I would have to scroll around to see different parts of the WinXP desktop. 
It was weird because this would happen in one VM but not in another, even though I had them open at the same time and one was a clone of the other. Workstation supplied several options for this under its menu's View pick, including Autofit Window, Autofit Guest, Fit Window Now, and Fit Guest Now. For me, unfortunately, those options had not seemed to do anything. I had posted a question on it in a VMware forum, but at this point had not received any helpful replies. Since then, I had completely powered down the computer, let it sit, and rebooted, but the problem persisted. I right-clicked on the VMware Tools icon in my WinXP system tray (at the bottom right corner of the Windows desktop) and opened Tools. Its About tab confirmed, "The VMware Tools Service is running." I tried again on the View options, but again nothing happened. According to the Workstation User's Manual (p. 165), this was only supposed to happen "when the Workstation console is smaller than the guest operating system display." I thought maybe the problem was that the resolution was set differently in this VM, as compared to the VM from which it had been cloned. I fired up that other VM and went into WinXP's Start > Settings > Control Panel > Display > Settings. Its resolution was set at 1280 x 827 pixels (on a 22" monitor). In the clone, by contrast, it was set at 1680 x 1050, which would be the right setting for a full desktop on this size of monitor. So it seemed that maybe Workstation was not resizing it when it went from Full Screen to Console view. I tried resizing it manually. That removed the scroll bars and fit everything into the Console. Then I clicked on Workstation's Full Screen option. It was resized to fit the full screen. I went back to Console View, and it was still fixed. I noticed that, as before, in this VM there was no Workstation menu at the top of the Full Screen view -- not even in a minimized mode that would pop up when I moved my cursor to the top edge of the screen. 
To get out of Full Screen view, I had to use Ctrl-Alt-Enter. Now, as I checked, that was true in the other VM (its parent -- i.e., the one from which I had cloned it) too. I went into Workstation's Edit > Preferences > Display and changed it from Autofit Guest to Stretch Guest. That kept the little Workstation menu at the top of the screen, but now there were scrollbars at the side and bottom, and the Windows XP desktop icons were larger. I went to Control Panel > Display and reset the resolution to 1680 x 1050, and that was fine for the WinXP desktop, but now the little VMware menu at the top was gone, and when I did Ctrl-Alt-Enter to get back into Console View, the scrollbars were back. I reset the resolution to 1280 x 827 and tried to change it back to Autofit Guest, but I had no menu at the top of the screen to change it with. I had to toggle back and forth a couple of times and keep screwing around with these options until I did finally get it back to a working state, where the display was properly sized again in both Console and Full views. And now, for some reason, the little menu was appearing at the top of the screen in Full Screen view. I clicked on the button on its left end, to make it minimize when I didn't have my cursor on it, and now it was gone again and wouldn't come back in Full Screen view. It would come back each time I toggled from Console to Full view and back, but it would disappear again as soon as I moved my cursor off it, and wouldn't return until I toggled screen views again. Eventually I discovered that Ctrl-Alt would bring it back. So that solved that problem. Another problem: sometimes the Shift and Ctrl keys would not work in Ubuntu, or would do weird things in VMware Workstation. I noticed that pressing Ctrl-V to paste something into Ubuntu's gedit editor would cause it to shut down, and in Firefox (within Ubuntu) the Shift-arrow options would fail to select text and Ctrl-X would fail to cut text. 
Rebooting the system would solve this problem temporarily, but then it would come back. It didn't seem to be a problem with the keyboard: I was using the same keyboard on both the primary and secondary computers, thanks to a KVM (keyboard-video-mouse) switch. (I had connected the two monitors directly to their respective computers, so at this time I wasn't using the KVM switch for monitors. Only the keyboard and mouse were shared by the two computers.) This problem did not exist within the virtual machine: Windows XP in the VM was still able to cut, paste, use Shift (i.e., capitalize letters), etc. This turned out to be Launchpad bug #195982. The prevailing wisdom was that it was caused by VMware, by something having to do with going into Full Screen mode. The workaround was to type "setxkbmap" in an Ubuntu Terminal session -- which was fine, except I couldn't type anything in an Ubuntu Terminal session because the session would close as soon as I hit the first key. It appeared that VMware had been notified of the problem almost a year earlier and had still not resolved it at this point; scads of people were posting notices on it. It seemed I should be able to create a desktop shortcut to setxkbmap by rebooting Ubuntu and then, before running VMware, right-clicking on the desktop, selecting "Create Launcher," and filling in the blanks. My first try didn't get that far: when I rebooted, Ubuntu gave me a black desktop instead of the heron wallpaper it normally showed, and there was no response to a right-click. I did a cold reboot, the wallpaper was back to normal, and this time I was able to create the launcher. I went into VMware and switched to Full Screen and back and, sure enough, the keyboard was funky and also the Compiz feature of being able to go from one desktop to the other using the mouse wheel was disabled. I double-clicked on the setxkbmap shortcut I had just created and, lo and behold, all was well with the world. End of another problem. 
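For anyone who prefers the command line to the "Create Launcher" dialog, the same kind of launcher can be written by hand. This is only a sketch: the filename, Name, and Comment below are made up for illustration; the one thing taken from the bug workaround is the setxkbmap command itself.

```shell
# Hypothetical command-line equivalent of GNOME's "Create Launcher" dialog.
# Only the setxkbmap command comes from the bug #195982 workaround; the
# file name and labels are invented for this example.
mkdir -p "$HOME/Desktop"
cat > "$HOME/Desktop/fix-keyboard.desktop" <<'EOF'
[Desktop Entry]
Type=Application
Name=Fix Keyboard
Comment=Reset the X keymap after VMware full-screen breaks Shift and Ctrl
Exec=setxkbmap
Terminal=false
EOF
chmod +x "$HOME/Desktop/fix-keyboard.desktop"
```

Double-clicking the resulting icon runs setxkbmap without a Terminal session, which matters here, since the broken keymap was closing Terminal at the first keypress.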
One thing that was very nice about having VMware Workstation, as I was reminded at this point, was that Windows could go ahead and be screwy and it really didn't matter. At the moment of writing these words, I had two VMs open. One of them was basically failing to run. WinXP was up, but it was responding extremely slowly. It wasn't responding to the fix-it tools I would usually run in Windows. So I just killed it. Click on the X and confirm, and the virtual machine is gone, with no effect on my ability to keep right on working in other virtual machines or in the underlying Ubuntu system. An important reason for using VMware was to help in the transition away from Windows. Whenever possible, I wanted to replace WinXP programs with Ubuntu programs. I ran into one such need at this point. I was using Firefox on Ubuntu for my web browsing, and I wanted to view a YouTube video. The webpage would open up, but the YouTube video wouldn't appear. This wasn't a problem of Firefox per se; I had been able to use it to watch YouTube videos on Windows. It seemed that what I needed was Adobe Flash player, and that there was no version of Flash available for 64-bit Linux. But another source said that was not true -- that Flash was not a problem for x64. The previous page in that same discussion featured some debate as to whether 64-bit operating systems were plagued with problems and were not yet receiving much support from companies and developers. I had also recently installed another program whose Read-Me file had said, "A reasonably modern 32-bit Linux environment is required. If you are running a 64-bit Linux distribution then you will need its 32-bit compatibility environment installed." One person in that discussion advised the questioner to go into Terminal and type "sudo apt-get install flashplugin-nonfree" and then reboot. I got an "Unable to lock the administration directory" error. 
I thought maybe the problem was that Firefox was still running, so I closed down all other programs and tried again. It ran this time, but it said, "flashplugin-nonfree is already the latest version." I searched Synaptic Package Manager for flashplugin and saw that this was true. I uninstalled it from Synaptic and rebooted, and then reinstalled it. That didn't fix the problem; I still couldn't play YouTube videos. I thought that what I might do was wait until October (it was now late August) and, when the next version of Ubuntu came out, install that new version in 32-bit form. But that might cause problems for my 64-bit version of Workstation. So that remained unclear. In the meantime, it seemed I would have to continue to open YouTube videos in Windows, either in a VM or in a native WinXP dual boot. Another thing that I had done in WinXP, that I was now trying to do in Ubuntu, so as to reduce reliance on Windows, was to run a script or batch file that would automatically open a bunch of webpages. In WinXP, I had prepared DOS batch files that would run whenever I clicked on an accompanying shortcut. Here is an abbreviated version of what one of these batch files would look like:

:: WEBDAILY.BAT
:: Opens each website I want to visit daily.
@echo off
start firefox.exe
start firefox.exe
start iexplore.exe
start explorer.exe /e,"D:\Path\Name of Folder to Open"
start excel.exe "D:\Path\Spreadsheet to Open.xls"
start notepad.exe "D:\Path\Text File to Open.txt"
start winword.exe "D:\Path\Word Document to Open.doc"
start acrobat.exe "D:\Path\PDF to Open in Adobe Acrobat.pdf"
echo off
cls
echo.
echo Close all other programs when you are ready to run DiskCheck.
echo After running DiskCheck, reboot the computer.
echo Upon reboot, the system will check all drives thoroughly.
echo So don't run it until you're going to be away from the computer for some hours.
echo.
pause
call "D:\Installed Here\DOS_UTIL\DiskCheck.bat"
exit
I didn't know if I would have any comparable diagnostic or utility programs that I would want to run in Ubuntu. WinXP had seemed to need a lot more of that kind of thing. So the last lines of that batch file were offered here just for illustration. Otherwise, what I wanted from this batch file was the ability to open separate tabs in Firefox for each of several webpages (and, ideally, to open Internet Explorer for those webpages that did not display correctly in Firefox); to open a session of File Browser pointed at a particular folder; and to open specified files with specified programs (e.g., OpenOffice Writer instead of Microsoft Word). If I could figure out how to do this, I would have several different scripts for this purpose, just as I had had in Windows: one for websites, files, and folders that I wanted to open every day; one for those that I wanted to open on a weekly basis; one or more for those I wanted to open every couple of weeks, every month, or every several months; and perhaps one for those websites that really only needed to be checked once a year. I decided not to try installing IE View Lite, a Firefox add-on that would apparently permit the use of Microsoft's Internet Explorer within Ubuntu if Wine was installed. There seemed to be a video on it (I wasn't sure -- I wasn't able to watch it!). But I didn't need to go that route at this point, as I was encountering few IE-only websites and could just open those in a WinXP virtual machine if needed. For the other command lines, I posted a question about the Firefox script syntax, and I found sources on command lines to open File Browser and specified files. It sounded like these would be the commands I would need:
firefox
nautilus /media/DATA/Name of Folder to Open
gnome-open Filename.ext
I tried each of these in Terminal first. It turned out, though, that "firefox" meant Firefox 3.0, at least to my system, and Firefox 3 had given me problems previously. I went to Synaptic Package Manager to recall exactly what my version of Firefox was called. Easy enough: it was firefox-2. I tried that and it worked. Next, for the Nautilus option, when I tried it for a folder named Test Folder, I got two error messages: "Couldn't find 'media/DATA/Test'" and (oddly) "Couldn't find '/home/ray/Folder'." The solution there was (as in Windows) to enclose the multiword folder name in quotation marks. I then tried using gnome-open to open files called Test File, with .doc, .txt, and .xls extensions. I created these files as empty files in File Browser. The .doc file opened in gedit, not in OpenOffice Writer as I had expected. I went into System > Preferences > Preferred Applications and found no option to change .doc files there. I had previously discovered the Ubuntu Brainstorm webpages, where people apparently would post their wish-list items for improving Ubuntu, and now I found a thread on there that addressed this issue. Apparently it was common knowledge that it wasn't always easy to specify which program would open which kind of file. This seemed to be something that might improve in a future version of Ubuntu, so I left it at that for now. I tried again with the Test File.txt file, and that, too, opened in gedit. Test File.xls also opened in gedit. Now I tried again, this time creating the .doc file from within OO Writer and the .xls file within OO Calc. This time, when I tried the gnome-open command, the .doc file opened in OO Writer, and the .xls file opened in OO Calc. So the extension, by itself, did not decide which program would be used to open a file; the system would actually look at what kind of file it was and would then instruct that program to open it. Also, for some reason, gnome-open worked better, for me, than nautilus in opening some folders. 
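Incidentally, the pair of "Couldn't find" errors is ordinary shell word-splitting rather than anything specific to Nautilus: without quotation marks, the space in "Test Folder" makes the shell hand the program two separate arguments. A minimal demonstration (no real folders involved, just argument counting):

```shell
# Show how an unquoted space splits one path into two arguments.
count_args() { echo $#; }
count_args /media/DATA/Test Folder     # prints 2 -- the shell sees two arguments
count_args "/media/DATA/Test Folder"   # prints 1 -- quoting keeps the path whole
```

That is why Nautilus went hunting for a folder named "Test" and another named "Folder".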
So, in short, the revised list of needed commands was as follows:
firefox-2
gnome-open "/media/DATA/Name of Folder to Open"
gnome-open "/Path/File Name.ext"
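As an illustration only (the script name and paths here are placeholders, not ones from my actual system), a daily-open script built from those model lines might look like this, written out and made executable in one step:

```shell
# Hypothetical webdaily.sh: an Ubuntu counterpart of the WEBDAILY.BAT file
# shown earlier. All paths are placeholders.
cat > webdaily.sh <<'EOF'
#!/bin/sh
# Open the daily round of webpages, folders, and files.
firefox-2 &
gnome-open "/media/DATA/Name of Folder to Open"
gnome-open "/media/DATA/File Name.ext"
EOF
chmod 755 webdaily.sh    # make the script executable
```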
Using those lines as models, I prepared scripts to open webpages and folders that I wanted to see every day, week, or whatever. Then, using the aforementioned advice, I made each script executable by typing "chmod 755 scriptname" (substituting each script's actual filename) into Terminal. I had to do this as root (i.e., typing sudo -i), else I would get an error message, "Changing permissions of `(filename)': Operation not permitted." Then I right-clicked on the Ubuntu desktop, selected "Create Launcher," and designated, as the Command, the full path to the script. As mentioned previously, I had pretty much accepted that my USB devices (e.g., Palm PDA, Olympus digital voice recorder) would have to be connected to, and updated in, a native WinXP boot. I was using my secondary computer for this purpose, occasionally rebooting into Windows to update and download files. I had not been very successful in getting Ubuntu or VMware VMs to work with these USB devices. I thought I had an exception with my Kodak C653 digital camera. I plugged it into the primary computer, and Ubuntu recognized it and started up the F-Spot program. But then it gave me an error message:
Error connecting to camera
Received error "Could not lock the device" while connecting to camera
So it was back to the secondary machine with that device too, at least for now. I had started out with a number of VMware Workstation virtual machines. I thought I would be using different programs in different machines. This had been reduced to just three machines. One was called WXMUpdated. This stood for Windows XP, Medium-sized (i.e., 1GB RAM), Updated with the latest updates from Microsoft Updates and Microsoft Office Updates (and whatever other programs needed to be updated). It had a couple of ancestors, named WXMOfcPure and WXMOfcUpd, indicating that these were in various stages of having updates added. I was cautious on the subject of updates because it had sometimes turned out that updating would make a Windows installation much slower or less stable. But this WXMUpdated VM was working well at this point. A second VM that I was using pretty often was called WXS-AlwOn. S stood for Small (i.e., 512MB of RAM). I called it Always On because I was running Second Copy 2000 software in it. Second Copy would back up my data files (on shared NTFS partitions, not on the WinXP virtual program drive C) to an external drive every few hours. I had originally thought that this VM would be running a number of minor tasks constantly, but it hadn't turned out that way. The third VM that I was using now and then was called WXSOccnl, to indicate that it was where I installed occasionally used programs. I had thought this would be the place to update data files with USB devices. It was useful now and then, as its name indicated, but really the only productive VM was the WXMUpdated machine. Now it occurred to me that I might want to be using more than one clone of the WXMUpdated machine. I had originally thought that would be where I would work with my primary office-type programs, especially Microsoft Office (especially Word and Excel) and Adobe Acrobat. That much remained true. 
But as I turned away from full-time computer fiddling and got back into my usual work, it seemed that it might be handy to have different VMs for different projects. I could have Word, Acrobat, etc. open in each VM, but the files that I had open would be quite different. In one, I might have a Word document open as I was writing about subject A, and several Acrobat PDFs on that subject open as well. In another, I might have a different Word doc open, addressing subject B, with its own Excel spreadsheets and PDFs. When I wanted to work on project A, I would open that VM, and when I wanted to work on project B, I would open that one. I could suspend them when I wasn't using them, preserving their exact status regardless of whatever else was happening. (I had discovered that the previous inability to use Microsoft Word in Workstation VMs -- giving me error messages of "The disk is full or too many files are open" and "The save failed due to out of memory or disk space" -- vanished when I saved my Word docs on an ext3 partition rather than trying to save them on an NTFS partition. I hadn't yet tried saving them on a FAT32 partition.) The problem with the suspend option was that VMware was somewhat slow at suspending, resuming, and getting Windows programs functioning again after resuming. The time required for suspending and resuming, along with time and space needed for VMs and their backups, was another reason not to expand the VM beyond 15GB if I didn't have to. I had only experimented with this a bit; but when I hibernated VMware with four VMs open, and then powered up again, it took at least a half-hour for the machine to be normally responsive again. The hard disk light was on all that time, as if the machine needed to read the full 60GB (4 x 15GB) of VMs into RAM or a pagefile or something. The concept just described, involving several Project VMs, called for a clone of WXMUpdated. This time, though, I wanted to try to do a better job of defragmenting. 
My understanding was that a clone would seal your fragmentation -- would make it permanent -- so that subsequent defragmentation would fix only the fragmentation that had occurred after the cloning. Defragging a VM was more complicated than defragging in Windows. The advice I got from the Workstation 6 User's Manual (p. 201) was, first, to defragment the VM (using e.g., Windows Disk Defragmenter); then power off the virtual machine and use Workstation's VM > Settings > Hardware > Hard Disk > Defragment; then run a disk defragmentation utility on the host computer. I guessed they were probably thinking of a Windows host, since I had the impression that Linux didn't fragment files, though I saw some advice to run "sudo apt-get clean" at some point to clean out deadwood (although possibly Synaptic Package Manager would take care of it automatically). IT World and Wikipedia confirmed that, while defragmentation could exist in a Linux system, it was primarily an issue in some server environments, not for desktop computers. It belatedly occurred to me to run WinXP repair utilities: especially Start > Run > sfc /scannow and Advanced WindowsCare 2 (with Security Defense, Registry Fix, System Optimization, and Junk Files Clean checked), and also to right-click on the drive in Windows Explorer and choose Properties > Tools > Error-checking > Check Now > Automatically fix file system errors > Start. It wouldn't run that test immediately, so I rebooted after all this other stuff was done. Then I did the defrag steps once more. Finally, I cloned WXMUpdated and called it WXMUProjectA, for whatever I might have going on in there. I also made a WXMUProjectB clone. I thought I might try to use those clones and leave WXMUpdated alone as a backup, since it was running so well and God only knew how long that would last. There were limits to this strategy, though, as I soon found. 
Despite having 6GB of RAM, I got an error message indicating, "Not enough physical memory is available to power on this virtual machine," when I tried opening up WXMUProjectB. At that point, I had two 1GB and two 512MB virtual machines operating, as well as about 50 tabs in Firefox in Ubuntu. I would not have thought that would have used up the RAM. Workstation's Edit > Preferences > Memory said that I had allocated 4384MB of RAM for virtual machines.

I created one other clone -- and made backups of all these -- for multimedia. I called it WXMUMedia. This area of multimedia, including especially the editing of audio, video, and images, seemed to be the last unexplored frontier in transitioning my various kinds of Windows activities to Ubuntu. From this one, I uninstalled Office 2003 and some other non-media programs to make space; WinXP was already using over 13GB of the 15GB virtual hard drive in the WXMUpdated ancestor.

Before I could proceed with installing multimedia programs in the WXMUMedia VM, I decided I needed to convert my primary data partition (called DATA) from NTFS to ext3 format. The reason, as noted above, was that both Ubuntu and Windows (when run in a VMware virtual machine on Ubuntu) could read and write to ext3 partitions, and Microsoft Word, running in a VM, was able to write to ext3 but not to NTFS. This step was a little weird, because it meant that I would no longer be able to directly access my latest files from a native WinXP boot, since Windows itself cannot normally read ext3 partitions. I consoled myself that (a) even if my Ubuntu installation failed, I could still get at those files by booting with an Ubuntu live CD and copying them somewhere else, and (b) I had both an external NTFS backup and an alternate internal NTFS partition where I kept more or less current backups.
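The conversion just decided on amounts to recreating the filesystem on that partition, which GParted does through a GUI. A hedged sketch of the same thing by hand follows -- destructive on a real device, so the runnable part below targets a throwaway scratch file instead, and the device name is the one from this setup. A freshly created ext3 filesystem contains nothing but a lost+found directory.

```shell
# On the real partition it would be (DESTRUCTIVE -- erases all data):
#   sudo umount /media/CURRENT
#   sudo mkfs.ext3 -L CURRENT /dev/sdc5   # device name from this setup
# Safe demonstration against a scratch file:
MKFS=$(command -v mkfs.ext3 || echo /sbin/mkfs.ext3)
img=$(mktemp)
dd if=/dev/zero of="$img" bs=1M count=16 2>/dev/null
if [ -x "$MKFS" ]; then
  "$MKFS" -F -q -L CURRENT "$img"   # -F permits a plain file as target
  echo "formatted scratch image as ext3"
else
  echo "mkfs.ext3 not available; skipping"
fi
rm -f "$img"
```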
So after verifying that those backups were now right up-to-date, I rebooted with the GParted CD and recreated that data partition as ext3, and then rebooted and prepared to copy the data back to it. Unfortunately, Ubuntu had other ideas. I got an error message: "Cannot mount volume. You are not privileged to mount the volume 'CURRENT'." If I let it go for a while, just sitting there, eventually it changed to another message:
Unable to mount location DBus error org.freedesktop.DBus.Error.NoReply: Did not receive a reply. Possible causes include: the remote application did not send a reply, the message bus security policy blocked the reply, the reply timeout expired, or the network connection was broken.
In response to another user's question about that same error message, one source had said, "If your partition is not corrupted, linux might just not be mounting it right, and you can look up how to edit FSTAB to make that work." That reminded me that I had edited fstab before. In Terminal, I typed "df -h" and saw that CURRENT was not listed. I typed "sudo gedit /etc/fstab" -- which I was proud to be able to do from memory -- and saw that, previously, CURRENT had been /dev/sdc5. Nothing had changed in the mounting via fstab, so did that mean my newly created partition was corrupt? I tried rebooting. If that fixed it, I would feel like I was working with Windows all over again: just reboot a confused machine and let it sort itself out, maybe. Fortunately, I still got the same unsolvable error message, proving once again that Ubuntu was a superior operating system. Well, looking again at that error message: if privileges were the issue, why not just log in as root and grope my way toward the partition that way? I typed "sudo -i" and then "cd /media/CURRENT." There was a directory there, and I was in it. I typed "nautilus" and then clicked on the Computer icon at the top of File Browser. That gave me another error message:
Couldn't display "computer:". Nautilus cannot handle computer: locations.
So, OK, still in File Browser (as root), I double-clicked on CURRENT in the left pane. Same "Cannot mount volume" error as above. So I didn't think this was really about privileges. I went through several other discussions, not always solved, and then found one that reminded me of what seemed like the obvious solution: fstab was still designating CURRENT as an NTFS partition, whereas I had changed it to ext3. So I needed to change its line in fstab to match what I had done with the VMS partition, which was also ext3. Taking another look at fstab, I changed it to be:
/dev/sdc5 /media/CURRENT ext3 defaults 0 0
which was what I had for VMS, except with a different device name. I saved that and rebooted. For some reason, on bootup Ubuntu reported, "Routine check of drives: /dev/sdb6 ..." I wondered why it would be checking that drive in particular. Maybe it always checked them all, but it hung on that one for a while, going through it one or two percent at a time. In any case, it went OK, and I was now able to click on CURRENT and see its contents, which were nothing except lost+found.

I couldn't copy to it, though, because root was the owner. I went back into Terminal, did the sudo and nautilus thing again, and had to click around a bit until I found the route I wanted: in File Browser, click on File System > Media, then right-click on CURRENT and select Properties > Permissions. I changed Folder access to "Create and delete files" and changed File access to "Read and write." But then, when I bailed out of Nautilus and tried copying from the backup and pasting to CURRENT, I still got the same error message. I went back into Permissions and, for some reason, the File access was no longer "Read and write"; it was just "--". This time I tried clicking the "Apply Permissions to Enclosed Files" box at the bottom. But I went right back into it after closing, and it was still the same; File access was "--" again. So this time I changed Group to ray and changed its folder and file access too.

Now when I tried right-clicking on CURRENT in ray's (i.e., not root's) File Browser, the Create Folder and Paste Into Folder options were no longer grayed out. I said Paste, and it began copying files. It looked like it was going to be at it for a while, so I went to bed. When I got up, it was done, and a comparison of properties for the source and target folders indicated that all of the files had copied. I tried saving a Microsoft Word document to the CURRENT drive, now that it was ext3 (or HGFS, as Windows saw it through VMware's shared folders), and it worked. Problem solved.
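In retrospect, all that clicking around in Nautilus could have been done with a couple of Terminal commands. A sketch, assuming the mount point and user name "ray" from this setup; the runnable part demonstrates the idea on a throwaway directory rather than the real mount:

```shell
# A fresh ext3 filesystem is owned by root, which is why pasting into
# CURRENT failed until ownership changed. The direct fix would be:
#   sudo chown -R ray:ray /media/CURRENT
# The effect of a mode/ownership change can be checked on any
# directory; a throwaway example (GNU stat assumed):
d=$(mktemp -d)
chmod 700 "$d"            # owner-only access, like the root-owned CURRENT
stat -c '%a %U' "$d"      # prints the octal mode and the owning user
rmdir "$d"
```

Unlike the Permissions dialog, chown changes the owner itself, so there is no need to fiddle with group access at all.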
Since I had not yet restarted VMware Workstation 6, it seemed like a good time to tinker again with the RAM requirement. (This website seems to have lost the note, but I thought I had previously posted my next step in that regard.) Above, I mentioned that I had originally allocated 4384MB of my 5384MB of RAM for virtual machines, leaving 1000MB for the underlying Ubuntu and VMware to run in. With that arrangement, I had been able to run only a net total of 3GB of VMs -- two half-gig (512MB) VMs and two 1GB VMs. (I probably would have been able to run another half-gig machine, but I couldn't run a third 1GB VM.) Since then, however, I had upped that to 4600MB, because I wanted to see if I could get a net total of 4GB worth of VMs to run. That worked: I now had the capacity to run those two half-gig VMs and three 1GB VMs. Not wanting to pinch the underlying layer, I thought I might back it off to 4500MB and see if it still worked. I figured the allocation would be different if you ran half-gig rather than 1GB machines -- two 512MB VMs would presumably require more overhead than one 1GB VM. But these were the machines I had at the time, so I wanted to work with them.

I logged in as root in Terminal, typed "vmware," went to Workstation's Edit > Preferences > Memory option, and changed it to 4500 out of 5384MB. (I had it set to "Fit all virtual machine memory into reserved host RAM," because swapping was so bloody slow.) And it worked: all five VMs powered up. So the sweet spot was somewhere between 4384MB (which wouldn't run the third 1GB VM) and 4500MB (which would).

I powered on the last of those five machines at 7:23 AM. Each 512MB machine was allocated 10GB of drive space, and each 1GB machine was allocated 15GB. Some of the VMs had been suspended; others had been powered down. To get them all up and ready for work, the hard drive light on my computer ran almost nonstop -- flickering at times, but mostly busy -- until about 8 AM.
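The arithmetic behind that sweet spot can be laid out explicitly. The per-VM overhead figure here is only my inference from the numbers above, not anything Workstation reports:

```shell
# Net guest RAM for the five VMs described above:
net=$(( 2 * 512 + 3 * 1024 ))
echo "net guest RAM: ${net}MB"                   # 4096MB
# A 4384MB reservation was not enough and 4500MB was, so VMware's
# overhead for these five VMs evidently fell between these margins:
echo "headroom at 4384MB: $(( 4384 - net ))MB"   # 288MB -- too little
echo "headroom at 4500MB: $(( 4500 - net ))MB"   # 404MB -- enough
```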
Until it settled down and relaxed, I had found, the computer was frequently unresponsive, and it could be unproductive to sit there and try to get anything done. So this was a possibly unavoidable drawback to having so many different virtual machines in use at once. Then again, as I was testing this on this particular occasion, I noticed that the hard disk light became a lot less busy once I started using the keyboard and mouse. So maybe VMware was trained to use the drive for maintenance purposes of some kind when I wasn't using it. I saw that it would go back at it when I switched over here to computer no. 2. It seemed that I needed more experience with Workstation before I could say clearly how much warm-up time the computer would need. Workstation was never as responsive as native Windows, but even 80 minutes after the original startup, the hard drive light was staying busy and the computer was quite slow in responding to some commands.

Anyway, I had discovered a problem with my native Windows boot and with the VMs made from it. When I right-clicked on a drive and chose Properties, I got nothing. That is, I would start up Windows Explorer and, in the left-hand (Folders) pane, right-click on, say, drive D. But after clicking on Properties, it would just sit there; and then, eventually, it would open up the Properties box and tell me that this was a network drive with an HGFS (?) file system, with zero bytes of used space and zero bytes of free space. I posted a question about it in a Windows forum and got a suggestion to try Dial-a-Fix, but I found that Dial-a-Fix wasn't helpful. Another suggestion was to use ShellExView. It occurred to me that this was supposed to be one of the advantages of virtualization -- that I could experiment with this stuff in a virtual machine without damaging my underlying system -- so I installed and tried ShellExView in my WXSOccnl virtual machine, following some instructions apparently posted by a Windows MVP.
The problem, they said, was in the Context Menu items. I sorted by Type and selected them all. The status bar said there were 17. As advised, I reselected half of that list -- the first nine. I then right-clicked on those nine, there in ShellExView, and disabled them. I killed ShellExView, rebooted the VM, and tried the context menu again, to see if it worked properly. It did. So it seemed the problem had been caused by one of the nine I had disabled. I ran ShellExView again and re-enabled four of the nine. On reboot, the context menu still worked OK. So now I was down to five suspects. I reran ShellExView, enabled three more, and repeated the exercise. It ran again. Only two suspects! I enabled one of them, leaving just one disabled, and rebooted. Once again, I got good results from the right-click Properties item. I enabled that last disabled item, rebooted one last time, and the context menu was still working right. Had ShellExView fixed the problem just by running? Ah, no: a closer look revealed that the context menu was reporting the same amount of used and free space for all partitions. In other words, it wasn't working after all. I tried the ShellExView steps again, this time starting by disabling all Context Menu items and rebooting. Still the same result. So we were on the wrong track with this ShellExView approach. I re-enabled all Context Menu items, rebooted, and dropped this question for the time being.

Weird thing, at this point: I was not able to move files from my external (NTFS) drive to any other partition on my computer, whether NTFS or ext3 -- that is, I couldn't do it within one of my virtual machines. When I switched to the underlying Ubuntu layer, there was no problem: I selected and moved the files just like normal. I had no idea why this was. I was also noticing that Workstation was not very responsive when I switched from one VM to another, when I had a full 4GB worth of them open.
That is, it could take a long time to respond to a click telling it to open Windows Explorer or minimize a window. I wondered if this was because I had robbed too much RAM from the underlying Ubuntu layer that VMware was running on -- if it was having to do a lot of swapping every time I switched virtual machines. It sure seemed like the drive was staying way too busy. At this moment, for instance, it had been nearly 2.5 hours since I had started up the computer in the morning, and yet the hard drive was still cranking away. It had to be caused not by the startup load but by the continued load of shuffling all these virtual machines. This was happening even though I had my Linux program files on an entirely different hard drive from my virtual machines. I later observed that, when I was suspending or resuming just one virtual machine, the hard drive activity light died down more quickly and functionality was more consistent.

I think good ol' Blogger lost some more material I had written here, but I'm not sure. Several days passed with me just using my setup as God intended, and when I came back to this log, I couldn't remember for sure what I had last been working on. My reason for coming back was not to report a new effort to install and use software. I was in the middle of a couple of projects that I had to deal with, so I didn't have time for this at the moment. I just wanted to report that I seemed to have run into a problem that nobody else in the world was having. Story of my life!

The problem was this: I was using Second Copy 2000 within my WXS-AlwOn virtual machine to back up my newly changed data files to an external drive, and for some reason I couldn't figure out, it kept insisting on making a backup of D:\lost+found, which I took to be something like the recycle bin. (In fact, lost+found is a standard directory that mkfs creates on every ext3 volume, where fsck deposits recovered file fragments.) It had been created when I reformatted D:, the DATA drive, as ext3 rather than NTFS.
Second Copy 2000 had a feature that enabled you to specify folders that you didn't want to back up, and I had specified lost+found that way; yet while that feature had worked perfectly well with other folders, it wasn't working with this one. So each time Second Copy tried to make a backup of newly changed stuff on DATA, it tried to get into that folder; and each time it did, it came back with an error message in the Second Copy log: "Access is denied - D:\lost+found\". My guess was that this was some kind of problem caused by an imperfect translation between VMware's ability to see the ext3 partition (and to make it available to WinXP VMs) and Second Copy's inability to respond appropriately to that partition.

It was really too bad that I had such a good working system, because my next step was to screw it up. That happened through the attempt to install multiple monitors -- and that, in itself, succeeded pretty well before transitioning to a disaster. This book-length treatment of installing Ubuntu and VMware continues in a separate post on that whole ordeal.