A Two-Computers-Per-User Desktop Arrangement
I was spending a lot of time at my desk, doing word processing and other typical desktop work. For this purpose I was using a customized Windows 7 installation on two networked computers for maximum productivity. This post describes that setup.
I had previously thought that, ideally, I would have four computers: one laptop; one test machine, for hooking up the occasional hard drive or other component for wiping, testing, and the like; and two desktop machines running side by side. Since then, however, I had switched from Ubuntu back to Windows and had found that to be a good move, and I was now doing very little testing and tinkering with hardware. Therefore, I dismantled the fourth computer (the test machine) and sold its parts.
With almost all of my work happening on just two computers, and with a stable Win7 installation on each, the focus now was on getting the most out of them. I was using two desktop computers instead of one because there were still many occasions when a computer would experience downtime: I would be doing drive maintenance or imaging; I would still have to reboot Windows now and then to clear its head or to complete a program installation or upgrade; or Win7 would be running just fine, but some scan or other long-running task would monopolize the machine for practical purposes. I was not yet very impressed with multiple-desktop software, and I was considering a return to VMware or some other virtual machine software, perhaps in a virtual appliance. But I wasn't sure I wanted to revisit the performance issues that had previously prompted me to try a native and/or bootable virtual hard disk or a RAID array as a remedy for the really bad performance I had been getting in VMware. So the second computer was also useful as a simple way of having a fairly solid alternate desktop: I could start a project or leave a set of folders open there and just visit it occasionally, when the primary computer was doing its own maintenance or was otherwise unavailable for a while.
The starting point for this two-computer arrangement was to set up two machines that were almost identical in hardware and software. In previous years, I had thought it best to have dissimilar machines, so as to maximize resources: one machine or the other would have the right hardware or software to deal with almost any kind of system problem. That belief was probably justified for some purposes. Now, however, I had less patience for that approach, and it also seemed less necessary. A lot of the old problems had gone away, and it was much easier to learn to maintain and troubleshoot one set of problems than to master the whys and wherefores of divergent sets of hardware and software. For getting my work done, Windows 7's networking and other capabilities were a significant improvement over the operating systems I had used previously, including Windows XP and Ubuntu 10.10.
The customized Win7 installation (see link above) was not as easy to set up as a canned, plain-vanilla installation, but once I had it in place, it had some advantages. One important step was to make my work files available on both computers. My first attempt was to use a Synology network-attached storage (NAS) unit as a simplified file server, but that hadn't worked out well for me. In the second attempt, I used my home network (basically just a router and cables to the two computers, though a crossover cable might have sufficed even without the router). After some contemplation, I went with GoodSync to keep the two computers directly synchronized with one another. This was an important development. Combined with appropriate program settings (e.g., setting Microsoft Word to AutoRecover files every minute), it meant that, if the computer I was working on suddenly crashed or otherwise became unavailable, I could usually switch over to the other machine and pick up right where I left off.
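To give a flavor of what that synchronization amounted to, here is a conceptual sketch, in Python, of a crude two-way, newer-file-wins pass between the two machines' data folders. This is not GoodSync's actual mechanism or configuration; the UNC share paths are hypothetical, and unlike GoodSync this sketch does not track deletions, conflicts, or changes in real time.

    # sync_sketch.py -- conceptual only: a one-shot, newer-file-wins mirror
    # pass between two folders, suggesting what GoodSync did continuously.
    # The UNC share paths are hypothetical.
    import os
    import shutil

    ROOT_A = r"\\COMPUTER-A\D"   # computer A's shared data partition (hypothetical)
    ROOT_B = r"\\COMPUTER-B\D"   # computer B's shared data partition (hypothetical)

    def mirror_newer(src_root, dst_root):
        """Copy each file under src_root that is missing or newer at dst_root."""
        for dirpath, _dirnames, filenames in os.walk(src_root):
            rel = os.path.relpath(dirpath, src_root)
            dst_dir = os.path.normpath(os.path.join(dst_root, rel))
            os.makedirs(dst_dir, exist_ok=True)
            for name in filenames:
                src = os.path.join(dirpath, name)
                dst = os.path.join(dst_dir, name)
                # two-second slack allows for coarse FAT timestamp resolution
                if (not os.path.exists(dst)
                        or os.path.getmtime(src) > os.path.getmtime(dst) + 2):
                    shutil.copy2(src, dst)

    # run in both directions so the newer copy of each file lands on both machines
    mirror_newer(ROOT_A, ROOT_B)
    mirror_newer(ROOT_B, ROOT_A)

Part of what made a dedicated tool like GoodSync worth using was precisely the things this sketch omits: propagating deletions, flagging conflicts, and watching for changes as they happened.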
I used GoodSync to synchronize my data partition (drive D), not the program partition (drive C). I also used it to synchronize parts of the INSTALL partition, particularly the funky but advantageous shared Start menu. GoodSync did not need to be running on both computers, so I installed it on computer A. As the setup evolved, I found that computer A was handling most of my computer maintenance and other background functions, while I did more of my moment-by-moment productivity work on computer B. In particular, computer A was becoming my backup hub: I would save a file on computer B; GoodSync would copy it to computer A; and then my backup software would copy it to other drives. After a variety of unpleasant backup surprises, I had evolved toward two distinct backup systems, both running on computer A. In the first, I used Robocopy, as part of my customized installation (above), to make frequent, incremental backups to a separate partition on computer A. This was one of the few respects in which computer A differed from computer B in hardware: it had three hard drives rather than two, so as to speed this internal copying (it was faster to copy from one hard drive to another than between partitions on the same drive) and to make it safer (a failure of one drive would usually not affect the other). In the second backup system, I used Beyond Compare to make daily manual backups to an external drive that I could carry or store offsite as needed. These were manual in the sense that I had to click things to make them happen, which meant I could examine, or at least spot-check, what was about to change.
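As a sketch of the kind of incremental pass that first backup system involved, something like the following would do; the paths shown are illustrative (not necessarily those in my actual setup), and the switches are standard Robocopy options.

    # backup_sketch.py -- a sketch of the first backup system: an incremental
    # Robocopy pass from the synchronized data partition to a backup partition
    # on computer A's third hard drive. Paths here are hypothetical.
    import subprocess

    SOURCE = "D:\\"               # the data partition kept in sync by GoodSync
    DEST = r"F:\Backup\D"         # hypothetical backup partition on the third drive
    LOG = r"F:\Backup\robocopy.log"

    subprocess.run(
        [
            "robocopy", SOURCE, DEST,
            "/E",             # include subdirectories, even empty ones
            "/XO",            # skip files that are older at the destination (incremental)
            "/R:1", "/W:1",   # retry a locked file once, waiting one second
            "/NP",            # keep per-file progress out of the log
            "/LOG+:" + LOG,   # append to a running log for later spot-checking
        ],
        check=False,  # robocopy exit codes below 8 indicate success, so don't raise
    )

A pass like this, scheduled to run every hour or so, copies only new or changed files, which is what kept the internal backups frequent without being disruptive.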
Again, I could still use either computer to do my work, since both had the same synchronized files and nearly identical software installations. Nonetheless, as the functions of the two computers diverged, I found that I was not really using both monitors most of the time. On computer B, I tended to open PDFs, Word documents, Excel spreadsheets, Windows Explorer sessions, and webpages, among which I would copy text, links, and other materials. I could have opened some of that on computer A instead, but it was cumbersome to spread that work across two computers, and for the most part it simply was not happening on computer A. That computer, and its associated monitor, were mostly just sitting there, working up a file comparison in Beyond Compare or otherwise doing things that did not really need constant watching.
What I really wanted was to make monitor A available to computer A when I wanted to see what was happening on that machine, but also available to computer B when I was doing my ongoing work there. This called for a keyboard-video-mouse (KVM) switch. The PS/2 type of KVM was better for providing keyboard and mouse input during BIOS setup and in programs that boot from CD (e.g., Acronis Drive Image), which can be at least partly unresponsive to a USB mouse and/or keyboard. Unfortunately, I had not realized that the motherboards I had installed in both computers did not have two PS/2 ports, so I had to use a USB KVM. It also seemed I might have to spring for a more expensive DVI-compatible KVM, since I had gotten poor video performance when connecting the monitor to the computer with an older D-Sub (VGA) cable rather than a newer DVI cable. In recent months, I had been using the KVM only for the keyboard, leaving each monitor dedicated to one computer and experimenting with a separate mouse for each computer, so that I could click without having to transfer keyboard (and, optionally, monitor) focus between machines. It had lately occurred to me, though, that the D-Sub video quality problems might just be due to the quality of the video circuits on the motherboard. So at this point I was planning to get a dedicated video card for each computer and see whether its D-Sub connection would perform acceptably, in which case I could use the USB/D-Sub KVM for the keyboard and for D-Sub video with monitor A. In other words, monitor B would remain dedicated to computer B, but monitor A would run to the KVM and could thus toggle between computers A and B.
This left the problem that, as I had discovered, when I could not see events on computer A, I tended not to use that computer. That was not terrible -- it would still be there as a running backup, ready to jump into service when I needed it, unless it hibernated itself in the meantime -- but experience suggested that, if I could not just glance over to see what was happening on computer A, I would tend not to toggle over on the KVM and take a peek. I thought of two solutions. One was to set up a reminder that would prompt me, every hour or two, to interrupt what I was doing on computer B, toggle the KVM, and look at computer A's events on monitor A; I suspected I might tend to disregard that kind of reminder, but I decided to give it a try (a sketch of one approach appears below). The alternative was to get a small, dedicated monitor that would always display events on computer A, though I realized its tiny resolution would not do a good job of showing everything that would normally appear on my widescreen monitor A. It looked like I could get a monochrome 10-inch Miracle Business MT209A CRT on eBay for $25 including shipping, but I didn't want the clutter or the extra power consumption. The more practical option seemed to be a full-sized monitor dedicated to computer A.
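The reminder option could be as simple as a scheduled task, but in its most minimal form it might look like the following Python sketch: a system-modal prompt at a fixed interval. The 90-minute interval and the message text are placeholders for "every hour or two."

    # reminder_sketch.py -- the reminder idea in its simplest form.
    import ctypes
    import time

    INTERVAL_SECONDS = 90 * 60   # halfway between one and two hours

    while True:
        time.sleep(INTERVAL_SECONDS)
        # MB_SYSTEMMODAL (0x1000) keeps the prompt on top of other windows
        ctypes.windll.user32.MessageBoxW(
            None,
            "Toggle the KVM and take a look at computer A.",
            "Reminder",
            0x1000,
        )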
That's where this matter rested for the time being.
I wound up with a solution involving one monitor for the computer that would mostly handle backup and other administrative tasks, and two monitors for the computer where I did most of my work.
A later post extends some topics discussed in this post to the arena of cloud synchronization.