
Thursday, March 1, 2012

Windows 7: Assigning Process Priorities to Prevent Slowdowns

I had some programs that slowed down my system to the point that I could barely get any work done.  Examples included Beyond Compare (sometimes) and GoodSync (often).  These programs were not usually urgent; I would have been happy with letting them run at a slower pace, so that I could continue to use the machine in a more or less normal way.  This post describes my search for tools that would give me some control over the demands that those programs would place on my system.

The existing tool for this purpose, Windows Task Manager (available via Ctrl-Alt-Del > Start Task Manager, or via Start > Run > taskmgr.exe), would give me information about various processes.  Many other programs -- including various gadgets and other sorts of monitoring tools -- offered that.  A search led to Process Explorer and System Explorer Portable, which CNET and Softpedia portrayed as popular, highly rated replacements for Task Manager.  For basic system resource information, I was presently using (and liked) Moo0 SystemMonitor, and wasn't too interested in adding more of that right now.

What I was more interested in was actual control of processes.  Evidently it was possible to adjust process priorities manually in Task Manager, and perhaps in Process Explorer and/or System Explorer Portable; but it seemed that this would work only as long as processes were running. In other words, it seemed that the Task Manager manual setting would have to be recreated every time a program was shut down and restarted.
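
One partial workaround that wouldn't require extra software:  CMD's START command can launch a program at a chosen priority, so a shortcut or batch file would reapply the setting on every run.  A minimal sketch, with the GoodSync path as an assumed example:

rem Launch at reduced priority every time (path is an example; the
rem empty quotes are START's window-title argument).
start "" /belownormal "C:\Program Files\GoodSync\GoodSync.exe"

That would only cover programs I launched myself, though, not things that started themselves.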

Further investigation led to utilities that would apparently allow me not only to adjust priorities, but to have my adjustments remembered.  Programs I encountered at this point included Prio, Process Lasso, IO Priority, and Process Hacker. Softpedia searches indicated that Process Lasso was commercial, and turned up nothing for IO Priority.  I had seen a few very positive comments about Prio, so perhaps it was up and coming; but it was not a major player at either Softpedia or CNET. Those two sources also disagreed as to whether Prio was a free or a trial version.  On both CNET and Softpedia, Process Hacker seemed to be the leading tool for process adjustment.

The CNET download gave me both .bin and .exe versions of Process Hacker.  I was more interested in a portable version, so I unzipped the .bin, drilled down to the appropriate .exe, and ran that.  It gave me a colorful, hierarchical indication of running processes.

I decided to put it to a test.  I ran GoodSync and did a Ctrl-F in Process Hacker.  That gave me a separate list of files, processes, and threads that apparently involved GoodSync.  But where was GoodSync within the pretty list of colored processes, there in the main Process Hacker window?  I clicked on the Description heading, and that gave me an alphabetical list.  Ah, there it was.  GoodSync was currently using up to 3%, no, 5%, no, make that 10% of CPU resources.  The number kept rising.  In the amount of time it took me to write these words, it was up to 25%.  I had a quad-core processor; maybe 25% (i.e., one core's worth) was going to be the limit.  I clicked to sort on the CPU heading.  At first, GoodSync was solidly in second place as the greatest consumer of CPU resources, behind the System Idle Process.  It seemed that the latter just meant that I had a lot of idle capacity.  That seemed to be the message from Moo0 too:  the CPU was not very busy.

Process Hacker also said that GoodSync was using a steady 468MB in Private Bytes.  A StackOverflow post said that private bytes was "the current size, in bytes, of memory that this process has allocated that cannot be shared with other processes."  Another post in that thread clarified:  this value included pagefile (i.e., virtual) memory.  So it didn't necessarily mean that GoodSync was using almost a half-gig of what the RAM chips in my computer could offer.  The thread seemed to say that the most important value of that number was in whether it was growing, which would indicate a memory leak -- which apparently meant that the program in question kept asking for more and more memory, without reusing or returning what it had already requested and was no longer using.  This didn't seem to be a problem:  the Private Bytes number for GoodSync had not yet risen beyond about 480MB.
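
For a quick spot check without extra tools, WMIC could report comparable memory numbers from a command prompt.  A sketch, with the process name as an assumed example:

rem PageFileUsage (reported in kilobytes) roughly tracks committed,
rem i.e., private, memory; WorkingSetSize is in bytes.
wmic process where name="GoodSync.exe" get Name,PageFileUsage,WorkingSetSize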

I clicked on the I/O Total Rate heading in Process Hacker.  It put GoodSync by far at the top of the list, using 22 MB/s (with variations), when the next most disk-intensive programs were generally asking for less than 20 kB/s.  This concurred with Moo0 -- it was tending to show GoodSync as the chief loader of the hard drive -- but this was giving me much more information about how GoodSync compared to the other programs running.  (I was working in the Processes tab in Process Hacker.  There were also Services and Network tabs.)  I also noticed, by this point, that the Private Bytes number did seem to be creeping upwards, now reaching 520MB.  I wondered where it would be in an hour or two.

So now, could I put the brakes on GoodSync?  I right-clicked on it, there in Process Hacker, and saw at least three relevant options:  Terminate, Suspend, and Priority.  Of these, the only one that would put on the brakes without completely ending progress was Priority.  That one was familiar; I had seen the same options elsewhere, probably in Task Manager.  The options were Real Time, High, Above Normal, Normal, Below Normal, and Idle.  Normal was the default.  I had seen somewhere that Normal was the default for everything in Windows.
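
The same priority levels could also be set from the command line via WMIC, which at least made the adjustment scriptable -- though, like Task Manager, it affected only the currently running instance.  A sketch, with the process name assumed:

rem Numeric priority codes:  64 = Idle, 16384 = Below Normal,
rem 32 = Normal, 32768 = Above Normal, 128 = High, 256 = Realtime.
wmic process where name="GoodSync.exe" call setpriority 16384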

So, sad to say, this thing wasn't going to give me fine-tuned tweaking options, like "40% of Normal" or "Run only when computer is idle" or "Try to finish this process by tomorrow morning."  I realized that not all of these possibilities might be feasible in Windows 7.  And, wow, what about RAM?  GoodSync was up to 905MB of Private Bytes now.  For that matter, Internet Explorer (where I was writing this) was at 870MB (with eight tabs open, and after 50 hours of uptime, according to Moo0).  I was using RizoneSoft MemoryBooster to keep RAM available, but it didn't have options to put restrictions on individual programs, and hitting its Optimize Memory button at this point didn't have any effect upon the Private Bytes reported by Process Hacker for GoodSync and IE.  I guessed that the situation was that memory leaks couldn't be squeezed much -- that you might basically have a choice between letting the program run or shutting it down so that memory could clear.

I dimly recalled that GoodSync might have its own internal option to throttle itself.  A look at this point reminded me:  its Auto > General tab would allow a specific sync job to slow down file copying if the average download speed exceeded a certain value.  I wasn't quite sure how that would work, but I didn't think the problem was with the speed of downloads (i.e., how quickly they would travel across the ethernet cable between computers).  What seemed most demanding about GoodSync was its use of system resources to do its calculations of what needed to be copied.  In other words, I didn't care whether it was slowing me down because of calculations, or downloads, or for some other reason; I just cared that, overall, it was slowing me down.  And at this point I was not seeing anything specifically geared toward that.  And even if I had, I'd still have a similar concern for some other programs.

Further exploration in Process Hacker indicated that I had overlooked another possible location of options.  I right-clicked on the Process Hacker line for GoodSync and chose Miscellaneous.  There, I saw at least three intriguing options:  I/O Priority, Page Priority, and Reduce Working Set.  I decided to see what would happen if I changed a bunch of settings to their minimum values.  So I changed GoodSync right-click Priority from Normal to Below Normal.  I didn't see any obvious difference there, whether I chose Below Normal or even Real Time (the highest setting).  On, then, to Miscellaneous > I/O Priority.  It was set to Normal; I set it to Very Low.  Still no obvious difference in the demands on CPU, I/O, or Private Bytes shown in Process Hacker.  Next, Miscellaneous > Page Priority.  It was at 5 (the highest possible value); I set it to 1.  Finally, Miscellaneous > Reduce Working Set.  It seemed to be just an on-or-off option.  I turned it on.  No visible difference in the numbers.  I decided to turn Reduce Working Set off again, putting it back to what I took to be its default setting, since I didn't know what it did.  Seems I was wrong:  it wasn't a checked, toggling option; evidently I was just reducing the working set further (whatever that meant) every time I hit it.

Well, by this point, as a correction to my impression from a moment earlier, it seemed that something had changed after all.  (It helped to see this when I changed Process Hacker > View > Update Interval to a slower setting.)  Maybe this change was due to my adjustments to the settings, or maybe GoodSync had finished something it was working on, but for whatever reason it had dropped from an I/O Total Rate of 20+ MB/s to more like 3 MB/s.  I suspected that I had now given myself the equivalent of an option to "Run only when computer is idle."  With all these priorities set so low, hopefully the computer would suppress GoodSync when I had other things to do.  The risk now was, no doubt, that GoodSync would never finish its tasks.  I would have to just let it run a while and see how that went.

After closing Process Hacker and letting a few days pass (along with a reboot or two), I returned to see what was happening with GoodSync.  The system had now been up for a little over two days, and at this moment the spinning system tray icon showed that GoodSync was actively at work.  PH showed GoodSync as having 1.14GB in Private Bytes allocated to it.  Its I/O Total Rate was not exceeding about 1.5 MB/s.  It looked like my settings had persisted.  I was not actively using that machine at that point, so I couldn't say whether there would still be noticeable slowdowns.  But it seemed unlikely.  Moo0 was reporting that the CPU, RAM, and hard drives were almost never fully burdened.  And yet GoodSync did seem to be getting its tasks done.

I was not quite sure whether the Process Hacker right-click settings all applied only to the specific process that I had clicked on.  Some appeared to apply to all, or at least multiple, processes.  There was more to learn.  But at least it did appear that Process Hacker had enabled me to slow down a program that was grabbing too large a share of system resources, without noticeably impairing that program's functioning.

Saturday, February 18, 2012

Windows 7: Overlapping Partitions, Entire Drive Is Unallocated, Has No Brain, Still Feels Great

I was using Windows 7.  I had a hard drive containing multiple partitions.  I was in the habit of booting with an Ubuntu Live CD, now and then, to get a GParted view of the drive's condition.  GParted would quickly show me problems with partitions, in a way that seemed superior to what I could get in Windows.

(I was new to Ubuntu 11.10.  Unlike earlier versions, there was no longer an option to start GParted via an easy menu pick, which as I recalled was System > Administration > GParted.  I found it in 11.10 by mousing to the top left button (tooltip:  "Dash Home") and doing a search for GParted.  Once I did that, the Live CD temporarily added it to the toolbar stretching down the left side of the screen.)

This time, GParted gave me the surprising information that the entire drive (which I had just been using, minutes earlier) was unallocated; and when I took a closer look via the right-click Information option, GParted said, "Can't have overlapping partitions."  This post discusses this situation.  (The processes described here unfolded over a period of some days, so there may be some discontinuities in the account, but I think the basic picture comes through.)

I began with a broad search and then a narrower one.  These included repeated suggestions that I go into Terminal (available via search, and also down toward the bottom of the left-side button bar, as above) and type "sudo fdisk -lu."  (That's -lu with a lowercase letter L, not the numeral one.)  I had a couple of drives, and thus had to enlarge the Terminal window (or scroll back up) to see what it was saying about /dev/sda, which was the drive GParted had considered problematic.  As I looked down the list of what fdisk was telling me about partitions, I couldn't figure out what GParted was complaining about.  What I expected to see was something like this:

Device        Start        End
/dev/sda1         63     5000000
/dev/sda2    4999999     8000000
In that example, sda2 would start before sda1 ended.  But I didn't see anything like that.  The numbers in my list made sense.  I also didn't have an error message in my list, like that shown by one user:  "Partition 1 does not end on cylinder boundary."  A later post in that thread suggested typing this in Terminal:
sudo parted /dev/sda unit s print
That just gave me the same "Error:  Can't have overlapping partitions" message.  This was to be expected:  GParted was a front end to parted, so these were just two ways of getting the same report from the same program.

The problem identified in another thread was in the total disk size reported by fdisk.  The top part of the fdisk output said there was a total of 312581808 sectors in /dev/sda, but the list of individual partitions said that the extended partition (and a logical partition within it) ended at 312590879.  The latter was a bigger number than the former.  That is, the partitions were supposedly continuing on past the end of the drive.  There was also a discrepancy between an early line in the fdisk output, which said that sector size was 512 bytes, and a later line, which said that sector size was 2048 bytes.  The advice given in that thread was to use fdisk to delete and recreate the partitions with the correct size.  I would have been inclined to use GParted for that, as it seemed easier, but on reflection I realized that I had probably used GParted to create these partitions in the first place.  But I guess I could have used GParted and then tested it with fdisk again.

But anyway, I didn't have those problems.  The numbers in my fdisk output made sense.  So far, no answer.  I drifted through another thread that pointed me toward TestDisk.  Typing "TestDisk" in Terminal told me that I would have to install the "universe" repository of program downloads in order to install TestDisk.  This might not have been a problem with an installed copy of Ubuntu but I wasn't sure how to do that with a live CD in Ubuntu 11.10.  It appeared that I might have to remaster my Live CD to include the universe repository.  That seemed to be getting pretty far away from the original mission.

It occurred to me that I ought to be able to get similar output from a Windows program -- to see a list of partitions and sectors like that which I could see in fdisk in Ubuntu.  This would not be CHKDSK, which would check the file structure within a partition.  At the moment, I wasn't sure what program I would use for that purpose.
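
One built-in candidate would have been DISKPART, which could print partition offsets and sizes from within Windows.  A sketch, assuming the troubled drive is disk 0, run from an elevated prompt:

@echo off
rem Dump the partition layout from Windows, roughly like fdisk -lu.
(
echo select disk 0
echo list partition
) > "%temp%\parts.txt"
diskpart /s "%temp%\parts.txt"
del "%temp%\parts.txt"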

Before I could pursue that thought, I looked again at the fdisk output.  Now I saw something I hadn't noticed before.  My last partition did not go beyond the end of the drive.  But it did go beyond the end of the extended partition.  In other words, I was supposed to have this arrangement:  Primary Partition 1, Primary Partition 2 (optional), Primary Partition 3 (optional), and then either Primary Partition 4 or Extended Partition; and the Extended Partition was supposed to contain any additional (logical) partitions -- up to the end of the drive, usually.  Mine didn't do that.  The relevant lines from fdisk -lu looked like this:
Device        Start       End
/dev/sda3    7000000    9000000
/dev/sda8    8000000    9500000
In the System column provided by fdisk (not shown here because of insufficient line space), sda3 was the extended partition.  In other words, these numbers were saying that sda8 was starting inside the extended partition, as it should; but it was ending after the extended partition, and that was improper.  It wasn't a question of GParted being unreliable, as I had begun to fear.  GParted appeared to be identifying a legitimate issue.  But GParted wasn't going to help me fix it:  as noted above, it was showing the whole drive as being unallocated, which was incorrect, and the only option it was offering me was to create a new partition in that big unallocated space -- that is, to wipe out all my data and partitions on that drive.

I was thinking that I should double-check GParted using a Windows tool, and that anyway it would be nice to have a Windows-type alternative to this Ubuntu tool.  I assumed Microsoft itself would not be inclined to give me something useful for this purpose.  The last partition was a Linux partition, not a Windows partition -- using ext3 format, I believed, not NTFS -- and Microsoft was not known for doing much that would be helpful in the Linux world.

It seemed that I would have to use fdisk, from the Ubuntu CD.  I wasn't entirely sure how to proceed.  Fdisk was giving information in terms of "blocks," but when I typed "fdisk" by itself at the command prompt, it gave me options in terms of cylinders, heads, or sectors per track, but not blocks.

Then again, as I thought about it, I realized that I actually could go into Windows > diskmgmt.msc and delete that ending Linux partition.  Disk Management did display it.  It surprised me that, if I wanted to use a GUI tool rather than a command-line option like fdisk, the tools offered in the Windows operating system would be more helpful, in this case, than those offered by Ubuntu.

I had been able to see that ending partition in GParted previously.  It had been marked in some way to indicate that it was problematic.  Why had GParted ceased to display it that way?  It occurred to me to boot up with an older Ubuntu CD -- version 10.04 rather than 11.10.  I did that and went into System > Administration > GParted.  But no, it was showing "unallocated" too.  So something had changed.  GParted was wrong, and I was definitely going to have to use another tool to fix the situation.

Before taking the plunge into fdisk, I decided to use this opportunity to play with one or two other partitioning utilities I had burned to CD.  One was Minitool Partition Wizard.  It got a glowing Editor's Review and four stars from 389 voters at CNET.  Unfortunately, it produced "Boot failed: press a key to retry" when I tried to boot my machine with the CD I had burned.  Oops.  Same thing on retry.  Well, evidently it was time to download a newer copy.  They were up to version 7.0.

While that was downloading, it turned out that I had another Minitool Partition Wizard CD, version 5.2.  I tried that.  It loaded OK, and it showed the partitions without a problem.  It was also faster and easier to get to the information using a dedicated partition CD, instead of having to load Ubuntu and find GParted.

So, OK, this was looking promising.  In Partition Wizard, I selected the appropriate drive and clicked the "Show Disk Properties" option.  It didn't report errors.  I wasn't sure if it was even capable of reporting errors.  It said that that last, troublesome partition was actually unallocated space.  As I recalled, I had formatted that partition to be ext3, in case I wanted somewhere to install Ubuntu.  Maybe the Linux partition had deteriorated somehow; maybe that's why it was now problematic.

I decided I could do without that Ubuntu partition.  Since I knew there was nothing in it, I told Partition Wizard to extend the preceding partition to take over this allegedly unallocated space.  But not all of it.  I had become superstitious about running partitions right up to the ends of drives.  Just for good luck, I made that unallocated space into an NTFS drive.  Partition Wizard decided that the last 14MB of that space would have to remain unallocated.  I clicked Apply.  Then I deleted the partition I had just made and replaced it with a relatively small (3GB) ext3 partition (which would not show up in Windows, just in case "unallocated" was a potentially troublesome status at drive's end), putting the rest of the space into the preceding partition.

That last step didn't go swimmingly.  Partition Wizard said that it had "Failed to execute the following command" with "Error Code 36:  Minitool Partition Wizard detects that file system of the partition have errors.  Please use 'Check File System' function to fix it first."  Ah, so there was a tool of that sort lurking there somewhere in Partition Wizard.  But where?  Now I saw why it hadn't popped out at me earlier:  it was in the Partition menu (and also the context menu), but it was greyed out.  I couldn't very well check the file system with a greyed-out tool.  It didn't seem willing to run on any partition or drive on that computer.

By this point, the download of Partition Wizard version 7 had completed on the working machine, so I installed it there and took a look at it.  Apparently it would work, for at least some purposes, while Windows was running.  I doubted it would help the troubled computer, since the partition that I was trying to resize contained my paging file.  The Check File System option was not greyed out.  But now it seemed that, if I wanted a bootable CD, I had downloaded the wrong thing.  The Minitool page for the bootable CD didn't specify that the bootable CD ISO would give me the Check File System option, but I decided to give it a whirl.  I downloaded that ISO and tried to burn it to a blank DVD.  ImgBurn said, "Invalid or unsupported image file format!  Reason:  First image file part is less than 2048 bytes in size."  So, OK, bad download.  I re-downloaded the ISO and tried again.  It was a slow download.  This time, when I tried to boot the newly burned DVD, I got "Unknown keyword in configuration file:" followed by eight junk characters and then "Could not find kernel image:  linux."  I tried a cold boot but still got the same thing.  This DVD was junk.  I looked for other sites to download the ISO from, but they seemed to be the kinds of sites that would install malware.

I did want a bootable CD alternative to GParted, for situations like this.  I looked into Parted Magic, but it didn't sound like it had the power of GParted.  Their Live USB option had TestDisk, though, and would apparently only take about 45MB, so I thought it might be a good option to put on an old 256MB USB drive.  I started up UNetbootin and pointed it toward Distribution > Parted Magic > Latest_Live.  I had already plugged in my USB drive, so I chose that, and indicated that I wanted to reserve 50MB for files to be preserved across Ubuntu reboots.  (I wasn't sure exactly what that was about, but it sounded good.)  It started the process of downloading and installing whatever it needed, onto my USB drive.  It said the download would be 175MB.  Larger than expected.  I decided to install from an ISO instead, so that I wouldn't have to re-download if the first try didn't succeed.  So I downloaded the Parted Magic ISO and then went through the UNetbootin installation process that way.

While that was in play, I rebooted the troubled machine and went into Windows > Start > Run > diskmgmt.msc.  As expected, Disk Management reported no drive problems.  It showed all my partitions, including the smallish ending partition I had set aside for some possible future Ubuntu installation.  I right-clicked on the partition adjacent to the ending partition, which Disk Management showed as "unallocated."  There was an option to "Extend Volume."  I went partway through that.  It looked like Windows 7 was ready to fix the problem.

I installed Partition Wizard on the troubled machine.  I wondered whether it would perform differently than the bootable CD version (above).  It looked like it, too, was ready to go.  Why had life been so hard for the CD?  Another highly recommended alternative was Easeus Partition Master.  I downloaded it from CNET (four stars, 943 voters) and installed it.  Same thing there:  the Windows installed version saw the partitions as expected, and seemed prepared to merge or resize as desired.  But their "Bootable CD" option took me to a webpage that said the free version -- what I had just installed -- wouldn't include a bootable disk option.

By this point, the UNetbootin process was nearly done.  But I decided to reboot with the Ubuntu 11.10 CD to test some other drives first.  To my surprise, GParted was now showing everything as being OK on the previously troubled machine.  Had we fixed something when I wasn't looking?  And now I saw what the problem was -- why I'd gotten that funky fdisk output (above).  The last partition on the disk, the one that I had set aside as unallocated, was not in the extended partition.  It was a primary partition.  Somehow, I had gotten myself into this arrangement:  Primary Partition 1, Primary Partition 2, Extended Partition, and then Unallocated space outside of the extended partition.

Well, I didn't want that, especially not if it was going to confuse GParted or anyone else.  It looked like I was going to have to wipe out the extended partition -- what's 900GB, between friends? -- and rebuild the thing, and have no excuse to test my cool new bootable USB version of Parted Magic (sniff!).  Then it occurred to me to wonder what those Windows programs -- Disk Management and Easeus and Partition Wizard -- were planning to do with this situation.  Were they somehow going to merge that last unallocated space into the extended partition, in some way that GParted wouldn't do?

Or, no, wait.  I was trying to get GParted to merge the unallocated space directly into the last logical partition.  That's not how these things are done.  I needed to merge the unallocated space into the extended partition, and then shuffle that space on down the line, inside the extended partition, to whatever logical drive was most deserving.  Is that what Windows Disk Management was planning to do?

I decided to find out, because if Windows could walk & chew partitions at the same time, I could meanwhile use the troubled machine to work on other things.  Boot back into Win7; back into Disk Management; extend the volume.  It took ... about seven seconds.  Kind of ridiculous.  I checked it with a GParted reboot.  Now GParted was reporting a single partition filling the entire drive, plus a 1MB partition at the end.

Some days had passed since I had started the processes described in this post, and I wasn't entirely clear on exactly what I had done as I reviewed my notes (above).  But I was sure I had set up an empty ending partition, in my superstition that having a little space at the end could sometimes prevent problems, and 1MB sounded like a possibility.

But in any case, the question had recurred:  why was GParted not seeing the multiple partitions that I had just been working in, in Windows?

I decided to try out the bootable Parted Magic USB drive that UNetbootin had concocted for me.  I made sure my BIOS was set to boot USB-ZIP first (instead of USB-FDD or USB-CDROM), and proceeded to boot that Parted Magic USB.  Its boot menu gave me choices between the default, which would run from RAM, and a couple of bootup alternatives, in case my system had less than 312MB of RAM.  It also gave me submenus for Extras, Failsafe, and RAID.  The Extras menu contained options to load various hardware diagnostics (e.g., Hardware Detection Tool, Memtest86+) and boot managers (e.g., GRUB, GRUB2).

I loaded the default (which, as I soon discovered, would load automatically after 20 seconds if I didn't make a selection).  This gave me an impressive desktop:  Parted Magic was offering me at least 20 to 30 utilities (e.g., Disk Health, Monitor Settings, File Manager) plus Firefox.  I was surprised they were able to squeeze so much onto one little USB drive.  I tried out the Firefox:  it worked.  This definitely seemed like a tool worth having.

But then it seemed that maybe I had played with the Parted Magic boot menus too much.  After a first or second reboot, the graphics became kind of buzzy (i.e., unexpected colored dots flashing in various colored spots, and along random horizontal and vertical lines) and off-centered (i.e., with a couple inches of black space on the left edge of the monitor) and began flashing on and off (i.e., intermittent black screen).  I tried a cold boot (i.e., shut the machine down for at least 30 seconds before restarting, to clear memory).  (Incidentally, the shutdown menu gave me the option of saving my current Parted Magic session.  I guessed that this option was possible on a USB drive, which would have space to store such information, but might not have been available on a bootable Parted Magic CD.)  But when I rebooted, I got two unexpected results:  the USB drive did not boot -- instead, I went into Windows -- and now the buzzy and off-center graphics were affecting Windows too.  Hmm ... probable hardware issue.  I tried rebooting with an Ubuntu DVD, without the USB drive plugged in.  But no, same thing there.  It seemed to be getting worse:  the black spells were longer.  A monitor reset didn't help.  I connected the monitor to a different computer.  It worked OK there.  I tried doing a longer cold shutdown -- several minutes -- and booting again, still without the USB drive plugged in.  That worked.  Now I got a normal Ubuntu screen.  I tried booting the USB drive again.  Now it worked.  Without further ado, I went straight into its Partition Editor.  But that turned out to be just GParted.  It showed the same thing as GParted had shown when run from the Ubuntu Live CD.  And I was getting the funky graphics again, and had to do another five-minute shutdown.

I had not yet succeeded in finding a bootable freeware CD or USB drive that would give me a believable impression of the partitions existing on that hard drive.  I booted into Windows to take a look with the installed (as distinct from bootable) versions of Easeus and Minitool that I had installed there.  I expected them to provide a realistic picture, even if all they did was to parrot what Windows was detecting (i.e., multiple NTFS partitions on that drive).  But now, even after a 10-minute shutdown, the graphics were still buzzy, off-centered, and flashing (indeed, mostly) black.

What in the world had happened?  I was tempted to try the bootable USB drive in another computer, to see whether it was the cause of this, but then I decided I really didn't want two messed-up computers.  It did appear that the USB drive had caused it; there had never before been anything of this nature.  The screen was totally black by now.  I had to do a hard reset to see anything.  I wasn't getting any distortion at the bootup phase.  I tried loading the fail-safe defaults in the BIOS.  I got an option to boot into Safe Mode, so I tried that.  I was still getting some buzziness there.  I tried Control Panel > Device Manager > Display Adapters > right-click on the adapter > Update Driver Software.  It said my software was up to date.  I tried right-click > Uninstall the display adapter (but not its driver software).  A reboot into Normal Mode still showed some buzziness here and there (e.g., in a CMD window).  I went back into Device Manager.  Instead of Display Adapter, I had Other Devices > Video Controller (VGA Compatible).  After five minutes or so, I saw a balloon tip telling me that the drivers specific to my video adapter (ATI Radeon HD 4250) had installed successfully, and now that device was visible as a Display Adapter in Device Manager.  But apparently that took us back to the Dark Ages.  After a reboot, the screen was black.

I rebooted into Safe Mode, hoping to do a System Restore.  Funny thing:  the login screen wasn't taking keyboard input.  I couldn't enter my password.  Even if I typed the password and hit Enter, the login screen did not change.  I rebooted and tried the same thing in Normal Mode, though this time I was entering the password into a black screen.  (I saw a flash of the login screen before it went black.)  I got a Microsoft happy sound, which as I recalled indicated that I had succeeded in logging in.  But the screen remained dark.  Moving the mouse, hitting WinKey, etc. brought no joy.  I could see the hard drive light burning away -- there was obviously a lot going on -- but I was blind to it all.

The monitor had VGA and DVI ports.  It was connected via DVI.  I thought I should try a VGA cable and see if that made a difference.  This transition led to the culprit:  loose DVI connector.  No VGA necessary.  Sorry for slandering the good name of the Parted Magic bootable USB drive.  I mean, it still had GParted, and thus continued to be useless for present purposes in that regard.  But at least it hadn't completely fubared my graphics circuits.  At least not as far as I could tell.

Back in Windows Normal Mode, I started Easeus Partition Master 9.1 and MiniTool Partition Wizard Home Edition 7.1, both in their installed forms.  Both saw the multiple partitions on that drive that GParted had been unable to see.  Minitool listed them in alphabetical order by name; Easeus listed them in the preferred alphabetical order by drive letter.  Easeus did, and MiniTool did not, show a 1.6MB unallocated partition at the end of that drive.  Both appeared to be glorified and perhaps enhanced versions of Windows Disk Management.  Both provided an indication as to whether a given partition was primary or extended.

On the drive in question (unlike another drive in that machine), both Easeus and MiniTool were listing the partitions as neither primary nor logical, but rather as "simple" partitions.  I hadn't paid much attention to the difference until now.  I didn't recall requesting any simple partitions; I had always just used primary and extended.  I guessed that Windows Disk Management had converted the primary and extended partitions previously visible in GParted to simple partitions in that little seven-second operation where it "resolved" the previous situation.  The general idea was that there were dynamic disks with simple partitions, and there were basic disks with primary and extended partitions, but the two did not mix:  you could not have a dynamic disk with a primary or extended partition, or a basic disk with a simple partition.

It seemed that GParted was unable to work with simple partitions.  Another way to say this was apparently that GParted would work only with basic (not dynamic) disks, and a dynamic disk was what Disk Management had given me.  I noticed that MiniTool did, but Easeus did not, provide a right-click option to resize a partition, an option also available in Windows Disk Management.

A search led to a MiniTool webpage that said MiniTool could convert a dynamic drive back to a basic drive without data loss.  (Of course, one could always wipe and recreate partitions and then restore data from backup, assuming backup existed.)  It looked like I would have to buy the pro version to get this capability.  Another thread said that this wasn't possible without data loss via Acronis products, though one person did interject that a bit of expert-level hex editing could do it pretty easily.  (I suspected that, if it were that easy, these programs would have been offering the capability, but maybe that was exactly what Minitool was doing.)  The MiniTool webpage said that Partition Magic could do it.  I had used PartitionMagic for years, almost always with good results, but thought it was defunct and incompatible with Windows 7.  (Symantec had bought a good program and let it go to pot.)  So I wasn't quite sure what that MiniTool page was trying to say.

I was out of time for researching this issue.  My present impression was that I could (a) stay with the dynamic disk and use Windows Disk Management or the free MiniTool to resize or delete its simple partitions as needed; or (b) back up and wipe the drive, and then create a basic drive and fill it with primary and extended partitions, using virtually any of these tools, including GParted; or (c) buy the pro version of MiniTool and try converting the simple partition to primary without data loss; or (d) explore that expert editing approach to convert the partition manually.

My principal reason for wanting to be able to use GParted was to have a non-Windows perspective on what was happening on the drives.  This was useful in two regards.  First, until Windows converted the basic disk to a dynamic disk, I had been able to see partition information (with GParted and also with fdisk) and get insight into possible problems on my drive.  Second, GParted gave me a very quick heads-up as to whether there were problems on a drive that would call for CHKDSK /R.  Without GParted, I just ran CHKDSK /R on each partition.  It was a very slow and inconvenient process, but I had the impression that it was better than just using the disk tools available within Windows drive properties.  Its inconvenience tended to discourage doing it.  Being able to boot GParted (with or without Ubuntu) and take a quick look seemed to encourage more frequent disk checks.  Typically, there would be no more than one partition needing this attention.
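
The per-partition CHKDSK pass could at least be queued up in a single batch file.  A sketch, with example drive letters (a check of the system drive would be scheduled for the next reboot rather than run immediately):

@echo off
rem Surface-and-filesystem check on each listed data partition.
rem Drive letters are examples; /R can take hours per partition.
for %%d in (D E F) do chkdsk %%d: /R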

I decided that it would be easier to stay with the dynamic disk, at least for now, and that doing so would give me a chance to learn something about that kind of disk and its simple volumes.  I just had to remember that this was why I was getting those weird results from GParted.  There was also the possibility of an eventual update from GParted or some other tool to handle these volumes.

Saving Disk Space; Finding Types of Files to Shrink

I wanted to save drive space.  The first line of attack was to use a freeware program to find large files.  Among the various possibilities, WinDirStat, TreeSize, and SpaceSniffer seemed to be the most positively reviewed and/or familiar to me.  Among those three, TreeSize and WinDirStat used a directory listing to indicate the largest folders, while SpaceSniffer and WinDirStat used a graphic approach to highlight large individual files.  In other words, WinDirStat offered both.  The graphic approach led me to multiple space-saving solutions faster than the TreeSize approach.  Between the two graphic programs, I found SpaceSniffer's graphic presentation more readable and zoomable than that of WinDirStat, except that WinDirStat made it easier to tell, right from the start, whether a folder was large because it had many small files or a few large ones.  SpaceSniffer also offered a full right-click Windows Explorer context menu, while the right-click options in WinDirStat were more limited.

This sort of program was good for quickly locating large space hogs.  The concept, which I pursued to some extent, was that you would probably free up the most space most quickly by homing in on very large files or folders.  But once I got past the point of being impressed by the graphics, I noticed certain drawbacks.  One was disorientation, especially in SpaceSniffer:  its method of zooming in on a particular folder was not the exact opposite of its method of zooming out.  It seemed like I was coming out by a route different than the one I had gone in on.  So it could be hard to build an intuitive sense of where you were located, with respect to the drive or directory as a whole.  Another drawback was that I could not compare across partitions.  I had to back out and start over, or run a different session of the program, to see clearly that I should be focusing on one drive rather than another.  Another missing dimension of coherence:  file type.  I might never know, from looking at the graphic maps, that I was consistently making PDFs or JPGs that were larger than they needed to be.  Even a grossly large PDF could escape notice when nestled among AVIs ten times its size.  There was also no logging or comparison capability that might alert me to the fact that a certain folder had been growing more rapidly than I would have expected -- or, for that matter, that a certain folder had disappeared since the last time I ran the comparison.

I thought that I might be able to capture the chief benefits of these programs -- that is, their ability to draw attention to large files -- while also adding at least some of those missing ingredients (though admittedly I could not compete with their graphics).  What I wanted, for this purpose, was a simple file list that I could sort according to selected criteria, particularly folder size, file date, file size, and file extension.  This approach seemed likely to require a much larger time investment up front; but once I had the file list, it seemed I would be able to do quite a bit of analysis and revision of files and processes, so as to root out a variety of kinds of disk space waste.  In short, I was seeking a systematic approach that would minimize my preoccupation with the same few large files (which may have had to remain large for good reason), turning my attention instead to other areas where I could make an impact on drive bloat.

The approach I took was to develop a somewhat automated method of generating a list of files across multiple partitions, and then put that list into a spreadsheet.  It took some work to figure out a way to automate the production of a file list that would work simply in a spreadsheet.  The problem was that simple batch commands, with which I had some familiarity, did not want to put all relevant information about each file on a single line.  (While I was particularly interested in the full path, file size, and date, others might have wanted to draw on some other available types of file information, such as file attributes.)
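
(Plain CMD could come close with a FOR loop -- one line per file, at the cost of locale-formatted dates and ragged columns.  A sketch, with example paths, for use in a batch file:

rem One line per file:  date/time, size in bytes, full path.
for /r D:\ %%f in (*) do @echo %%~tf %%~zf %%~ff >> "D:\Current\filelist.txt"

The PDIR approach below avoided both drawbacks.)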

As described in another post, I found the desired solution by installing TCC/LE and running its PDIR command in a batch file. This operation required two parts. First, I had to work up the command that I would run to make everything happen. This command would start TCC/LE and would tell it to open a batch file. To find where TCC was, I used the Properties > Target path I found in a right-click on its Start Menu icon. To see if this would all work, I generated a little batch file called Test File.bat. Then I ran this:

"C:\Program Files\JPSoft\TCCLE13x64\tcc.exe" "D:\Current\Test File.bat"
It worked. Test File.bat ran. So I went ahead and replaced the second part of that command with the name and location of the real batch file that I wanted to run. I called it ListAllFiles.bat, and I put it in a permanent location with other batch files that I would run for various purposes. (For the moment, it was an empty file, but that would change momentarily.) I also made a copy of the Start Menu shortcut that would run TCC. I modified that shortcut's Properties so that its Target line contained the information just described: the path to tcc.exe, and then the path to ListAllFiles.bat, each in quotation marks as shown in the line quoted above. Now I could double-click on that icon to make TCC run ListAllFiles.bat, instead of having to look up the proper command syntax. Once that was done, I just needed to make sure ListAllFiles.bat said what I wanted. I was still working on that, but for now it looked like this:
@echo off
cls
echo.
echo For cleanest results, empty the Recycle Bin before proceeding.
echo.
pause
cls
echo.
echo Notepad may take several minutes to display a large list.
echo.
echo Be patient ...
echo.
:: PDIR requires TCC/LE to be installed
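:: The format spec below puts date (y-m-d), size (with commas), and full path and name on one line per file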

pdir D:\ /s /(dy-m-d zc fpn) > "D:\Current\List of All Files on D and E.txt"

pdir E:\ /s /(dy-m-d zc fpn) >> "D:\Current\List of All Files on D and E.txt"

start notepad.exe "D:\Current\List of All Files on D and E.txt"

exit
The last four lines (double-spaced for clarity, in case the blog wraps them) were where the action was; everything before that just displayed a few informational notices and a comment about PDIR that would not be visible when the batch file ran. I opened the resulting text file, "List of All Files on D and E.txt," in a LibreOffice Calc spreadsheet, since LibreOffice (alternatively OpenOffice 3.3) could accommodate a large number of rows. When I tried that, LibreOffice detected that it was a text file and opened it in LibreOffice Writer instead, but I just copied and pasted from there into Calc. I told it that the imported text had fixed-width columns, and I pointed out where those columns were.  LibreOffice Calc crashed repeatedly during this process -- I had to remember to save frequently -- but in the end it came through.

So I had my table. I added a column to display the extension, extracted it using the RIGHT function (using different values to extract extensions of different lengths, e.g., .html, .px), and saved the extension in an adjacent column (i.e., undisturbed by those calculations of varying extension length).  I added a column in which I could record the date on which I had last examined the file to see whether it was currently feasible to shrink it, and another column for notes.  For instance, I had a 6GB zip file that I couldn't get into right now, because there were things I needed to do and learn before I would be ready for the project that its opening would commence.  That was the point of the "Last Examined" column:  for purposes of making a first pass through the files, I could filter out that one, among many others, as not being of further concern right now.

Now that the spreadsheet was ready, I could do some file sorting.  First, I sorted by date, and also by extension, to find any files whose name or other properties might need to be adjusted, perhaps with the aid of a relevant utility (e.g., SetFileDate, TrID).  Then I sorted by file extension.  I added a Flag column to the spreadsheet, to indicate files that would be worth looking at, to see if I could possibly shrink them.  I put an X in the Flag column for each AVI file.  I suspected that, with or without editing, most AVIs could probably be converted to MP4 or some other more compressed format that would retain about the same apparent quality for a fraction of the space.  Likewise for BMPs (most could probably be JPGs) and WAVs (many could be converted to MP3).  I also flagged all ISOs, which I probably didn't need to keep, and all ZIP (and RAR and 7z) files, because they could hold a lot of unnecessary stuff somewhat removed from notice.

Next, I sorted the spreadsheet in order of declining file size, and flagged all files larger than 500MB.  Within that sort, I filtered for common image extensions (including PDF as well as JPG and PNG) and flagged all files larger than 100MB.  I could see that these steps were going to draw attention to whole folders full of files.  For example, I noticed a folder of mixed MP3 and WAV files, all of which could have been MP3s.  In that instance, I suspected I would probably wind up converting the whole folder at once.
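
A mass conversion like that could be scripted.  A sketch, assuming a command-line encoder such as ffmpeg (my example; nothing above names one) is on the PATH:

rem Convert every WAV in a folder to MP3, keeping folder and base name
rem (%%~dpnf); paths are examples.  Run from a batch file.
for %%f in ("D:\Audio\*.wav") do ffmpeg -i "%%f" "%%~dpnf.mp3"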

I figured I would probably refine this approach, if I went back through it again sometime in the future.  But for right now, these steps resulted in the flagging of about 9% of the total number of files on these partitions -- and those 9% accounted for about 60% of disk usage.  I probably wouldn't have time to work through all of those files individually.  But identifying a few major categories of unnecessarily large files did seem likely to yield some reductions in disk usage that I could achieve through mass conversions and other relatively simple steps.  Finally, while I had concluded that I liked LibreOffice Calc -- it had pretty much stopped crashing as I got more familiar with it, thus probably making fewer finger fumbles -- the focus on just 9% of the files gave me a list short enough to handle in Excel 2003.

Friday, January 7, 2011

Windows 7: Notes on the System Imaging Feature

I decided to make a drive image using Windows 7's system image feature. I had made an image previously, but System Recovery couldn't find it when I needed it.  As I was starting the system imaging process, I saw this warning:

Backing up to dynamic disk gives you limited functionality while performing system image restore.
I wondered whether my previous image would have been visible, when I tried to restore from it, if I had copied my Win7 image to a partition that was not on a hard drive containing any RAID arrays -- more specifically, to what Disk Management (diskmgmt.msc) would show as a "basic" rather than a "dynamic" drive.  I had originally saved the image to a basic drive, but had then moved it to a partition on a dynamic drive.  If I had originally tried to save it to the dynamic drive, maybe I would have gotten this "limited functionality" notification at that point.
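
Windows 7 also exposed this backup machinery at the command line through WBADMIN, and pointing it at a partition on a basic disk would presumably have avoided the warning.  A sketch, with example drive letters, for an elevated prompt:

rem Create a system image of C: on E: (a partition on a basic disk).
wbadmin start backup -backupTarget:E: -include:C: -allCritical -quiet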

My mission at this point was to see whether Win7's system imaging could do what Acronis True Image Home 2011 apparently could not.  Could it restore a bootable Windows 7 partition to a new drive C located on a RAID0 array?

I went into Control Panel > Backup and Restore > Recover system settings or your computer > Advanced recovery methods > Use a system image.  It gave me the option to create a backup now.  I took that, just in case.  I tried saving to a partition on the dynamic drive, just to see what would happen.  It did give me a warning, "The selected volume is on a dynamic disk."  I clicked on its "More information" link.  That opened this dialog:
Set up backup

When restoring a system image from this volume, the disks on your computer cannot be formatted to match the layout of the disks in the backup.  To have full restore functionality, select a volume on basic disk as your backup location.
It wasn't clear whether that meant that the very process of creating a backup on a dynamic drive would have this effect, or if they were just saying that trying to *restore* from a dynamic drive would be difficult.  I decided to save to the partition on the dynamic drive nonetheless.  But as I proceeded, it appeared that I was only setting up an ordinary backup.  The option wasn't available to "Include a system image of drives: PROGRAMS (C:)."  The explanation, which didn't entirely persuade me, was, "Because you are trying to restore your computer to a previous state, you can only back up data files" -- not program files, that is.

I canceled out of that and clicked Restart to continue the system recovery -- bearing in mind, here, that I was hoping to restore to the RAID0 array, which had not yet entered service, whereas this backup program was sounding like it intended to restore to (and overwrite) my current drive C.  I was actually willing to let it do that -- the system had been through a rough night -- but upon rebooting, it said, "Windows cannot find a system image on this computer."  It defaulted to the "Select a system image" option, so I went with that.  Then Advanced.  But still, we weren't able to find those images that I now had on two separate partitions.

So now I wondered if the failure to find my previous system images was due to the fact that I had changed their names.  I thought it made sense to do so.  They were all called WindowsImageBackup.  I changed them to put the date in front, e.g., 2011-01-07 WindowsImageBackup.  Maybe this kind of rigidity was calculated to make the program easy to use.  I suddenly realized that, contrary to my assumption, perhaps later backups would not overwrite the previous WindowsImageBackup folder.  I looked inside the WindowsImageBackup folder, and it did appear that images might be saved in subfolders with different dates in there.

So, OK, back in the partition on the dynamic drive, I changed the name of one of those folders back to plain old WindowsImageBackup.  It was still in a subfolder, not in the root folder.  I went again to Control Panel > Recover system settings or your computer > Advanced recovery methods > Use a system image you created earlier to recover your computer.  I skipped the backup step, this time, and clicked Restart.  The "scanning for system image disks" process still said, "Windows cannot find a system image on this computer."  I canceled that automatic search and again tried "Select a system image" and then Next.  Still no joy.

That was the story for the search for the WindowsImageBackup folder on the dynamic drive.  Back in Win7, I tried renaming the copy of the WindowsImageBackup folder that I had copied to the basic drive.  I went through the same steps again.  "Scanning for system image disks" and this time, praise Jesus and pass the tequila, we had a discovery!  We had liftoff!  We had a window that said, "Select a system image backup."  So the answer to this little mystery was that (a) you were not supposed to change the name of the WindowsImageBackup folder and (b) it had to be on a basic (i.e., not dynamic) drive.  Since this worked, I did not try the alternative of going into the WindowsImageBackup folder, in Win7, and just double-clicking on various files to see if they would spontaneously fire up and run the system recovery software.
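
The practical upshot, for anyone keeping dated copies as I had been:  rename the folder back to its canonical name before attempting a restore.  A sketch, with example path and date:

rem Restore the exact folder name that System Recovery searches for.
ren "E:\2011-01-07 WindowsImageBackup" "WindowsImageBackup"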

So now that we had a system image ready to be restored, the question was whether it would insist on being restored only to the basic drive, or whether I would have an option to restore it to the RAID0 array I had set up to be my boot drive.  Actually, it did not say where it was going to restore.  It said that it was going to restore drive C, but it did not say where it believed drive C was located.  But surely it would not wipe out data partitions, at least.  I went ahead with it.  After five minutes or so, it said, "Restore completed successfully," and it rebooted back into Win7.  It had restored the image to the existing drive C, the one I had booted from.  So there did not appear to be an option to choose a different target drive.

So I now had restored my original hard drive to its original condition as of about midnight the previous evening, before things started getting funky.  Along the way, I had learned a bit about how Win7's system recovery feature functioned.

The remaining unknown was whether I could just munge my various copies of the WindowsImageBackup folder together into one.  Keeping backup copies, I went into Windows Explorer and did a simple copy-and-replace operation to combine those WindowsImageBackups into one properly named WindowsImageBackup folder.  I didn't try to figure out which versions of the same file to keep; I just copied and replaced, more or less at random.

I left this combined WindowsImageBackup on the dynamic drive for the moment.  Then I went into System Image and let it do its reboot thing.  Its scan for system image disks failed to detect the ones on the dynamic drive.  So it seemed to be pretty much confirmed that it didn't want to save to, or restore from, dynamic drives.

I moved the WindowsImageBackup folder, containing the munged combination of two different backups, to the basic drive, and tried System Recovery again.  This time, it found the folder, but it detected only the older of the two backups within it.  So it seemed my file munging process had probably replaced the newer versions of some of the indexing files in that folder with older versions.  Apparently the imaging program looked at those small files when deciding which images were available.  I didn't test whether overwriting older indexing files with newer ones might have shown only the newer images.  Tentatively, it seemed that the best solution, when there were two distinct WindowsImageBackup folders, was just to keep them separate and eventually delete one of them, rather than try to combine them.

Sunday, July 18, 2010

Making Space on a Windows XP System Drive

I wanted to make space on my computer's C drive.  For some purposes, it could be more sensible to just install a bigger hard drive or make a larger partition for drive C.  But for other purposes (in my case, where drive C was in a virtual machine (VM) in VMware Workstation, and I really didn't want to deal with the slowness and overhead of a huge virtual drive), there might be no alternative but to make space.  Drive C had a habit of just continuing to grow and grow, if you let it; I had one that got up to around 35GB.

I started with a general-purpose Web Developers Notes cleanup page.  I was already doing one thing they suggested, which was to use portable versions of various utilities.  IrfanView, for example, was available in both an installed version and a portable version.  Typically, there was no difference in functionality; the main thing was just that you had to create a link to the portable version if you wanted to have it listed in your Start > Programs list.  Portable versions could be run from anywhere, which meant they wouldn't have to be on drive C.  So could installed versions, in theory; but in practice, programs didn't always run correctly, and updates were not always applied, when the program was not located where the programmers expected it to be.  Back in the late 1990s, I did spend an enormous amount of time trying to figure out which installed programs could safely be installed somewhere other than the default location, but ultimately I concluded it wasn't worth the hassle.  In short, if it was a portable version, I put it in a folder labeled "Installed Here" on drive D; otherwise, I installed it on C.  Those who hadn't done this during installation could, as advised, uninstall from C and reinstall on D.

I was also doing another thing they suggested, which was to keep data files (including e-mail) on a separate partition.  Program files went onto drive C if they had to be on drive C, or if they would be a lot less hassle if they were on drive C (e.g., see previous paragraph).  Stuff generated by me and by the rest of the world went on drive D whenever possible.  It helped, for this purpose, to relocate those folders (unwanted by me, at least) that Windows automatically created, including "My Documents" and "My Pictures" and "Microsoft, I Need Your Help in Telling Me Where to Put Everything."  Also, in Microsoft Office programs (among many others), I could change settings to store files by default in a folder that was not on drive C.  Then you could back up drive C once every couple of months -- whenever you had accumulated enough new program installations and adjustments -- using Acronis or some other drive mirroring program, while continuing to back up your drive D (data) partition on a daily if not hourly basis.
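
Relocating those special folders could also be scripted, via the shell-folders key in the registry.  A sketch for My Documents, with the target path as an example:

rem Point "My Documents" at a folder on D:.  Takes effect at next logon.
reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders" /v Personal /t REG_EXPAND_SZ /d "D:\Documents" /f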

Another suggestion was to delete programs that were not being used, via Control Panel > Add or Remove Programs.  It was unwise to remove programs that you needed, or to remove programs whose function was unclear.  No point making extra work for yourself or screwing up your system.  (Incidentally, the command to open Add or Remove Programs was this:

rundll32.exe shell32.dll,Control_RunDLL appwiz.cpl,,,
That was a pretty funky command, and for future reference I saved a webpage containing others like it.  I would be combining these commands in a single batch file (below) for one-click all-purpose cleanup.)

Another space-saving step they recommended, which I rejected for my purposes, was to empty out the browser cache (e.g., in Internet Explorer or Firefox).  Why bother?  It would fill up again -- I would want it to fill up again, so as to load pages faster and save the cookies that would store my login information for many webpages -- and I would still need the disk space to accommodate it.  This step would make sense for a one-time task, like making an image of drive C.  For more enduring space saving, the more sensible step was to go into those browsers and make the cache smaller.

A step they should have recommended, but didn't, was to move the page file.  It could be huge.  After moving it and rebooting, I made sure there was not still a copy on drive C.  There was also apparently a Microsoft utility to protect against performance degradation due to pagefile fragmentation.
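
A quick way to confirm the old copy was really gone (pagefile.sys being hidden from a plain DIR):

rem "File Not Found" here means the page file is no longer on C:.
dir /a C:\pagefile.sys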

Moving the paging file was more complicated in my case, because I was using VMware.  (In other words, those who are not using VMware should skip this paragraph.)  In the case of my virtual machine, it seemed to be preset to about 2GB.  Then I came across a mention of the option of setting up a separate virtual drive, within my virtual machine, for the paging file.  The advice, there, was to go into VMware Workstation for this virtual machine and choose VM > Settings > Hardware tab > Add > Hard Disk > Next > Create a new virtual disk > IDE, Independent, Persistent > Next.  I set the disk capacity at a 4GB single file, which seemed like plenty when combined with the 2GB of RAM I was allocating to the VM.  Then, continuing with the advice, I powered up the VM and, with a series of right clicks, I initialized, partitioned, and formatted that drive, set its drive letter to I (so that it would not interfere with D or other drives I was already using), and set its pagefile.  I varied somewhat from the advice on one point; I set the size of the drive I pagefile to a minimum and maximum of 3.5GB, having heard that making the system enlarge the file could take a hit on performance.  (I originally tried 4GB, but Windows gave me warnings that the pagefile drive was running out of free space.)  Finally, they advised me to reboot again and set drive I to nonpersistent. This, it seemed, would take care of the problem of pagefile fragmentation.

They also recommended defragmenting the hard drive.  I used Smart Defrag for this purpose.  It was supposedly running all the time, but I included it in my batch file anyway because it always seemed to have things that needed to be done whenever I did open it.

They suggested using WinXP's Disk Cleanup (command line:  cleanmgr).  Good idea, but typically this made less space than one might have imagined.  Again, there was a tradeoff:  you could make more space by deleting things that might cost you extra time (e.g., Office setup files) whenever you did next need them.
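
One wrinkle worth knowing:  cleanmgr could store a canned selection of cleanup categories and replay it unattended, which suited the batch-file idea below.

rem Choose cleanup categories once, saved as profile 1 ...
cleanmgr /sageset:1
rem ... then replay that selection without prompts whenever needed.
cleanmgr /sagerun:1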

Another possibility was to run a program to see which files and folders were most space-consuming. Raymond recommended TreeSize Free and JDiskReport, both of which were portable freeware.

Someone at HelpWithWindows.com recommended deleting unneeded old files.  Some of this was already being taken care of, for me, via Advanced WindowsCare 2, which I had included in my Startup folder.  Still, I used a complex command to open a search dialog, where I could search for files matching these patterns.
*.bak
*.chk
*.gid
*.old
*.tmp
*.~mp
*.$$$
*.000
~*.*
*~.*
Not every file coming up in those searches would deserve deletion, but many would.
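
To stay on the safe side, the searches could first be collected into a reviewable list, rather than deleting anything directly.  A sketch covering a few of the patterns, with an example output location:

@echo off
rem Collect matches into a list for manual review; nothing is deleted.
(
dir /s /b C:\*.bak
dir /s /b C:\*.chk
dir /s /b C:\*.old
dir /s /b C:\*.tmp
) > "%temp%\deletion-candidates.txt"
start notepad "%temp%\deletion-candidates.txt"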

These steps, combined, freed up about 2GB (10%) of my drive C.  The batch file I wrote to automate some of these steps looked like this:
start rundll32.exe shell32.dll,Control_RunDLL appwiz.cpl,,,
start "" "C:\Program Files\IObit\IObit SmartDefrag\IObit SmartDefrag.exe"
start cleanmgr
start "" "D:\Miscellany\Installation\Installed Here\TreeSizeFree.exe"
start "" "D:\Miscellany\Installation\Installed Here\JDiskReport.exe"
type nul > %temp%\1.fnd & start %temp%\1.fnd & del /q /f "%temp%\1.fnd"
It ran slowly, but it did tend to automate the steps needed in the process.