
Thursday, November 24, 2011

Freeware: A Thanksgiving Tradition: First Cut

Summary

I decided that Thanksgiving would be a good time to revisit, annually, the question of what freeware I was using, and what an appropriate contribution would be.  I was continuing to develop my customized Start Menu as a repository of links to all of the websites, installed programs, and portables that I used.  So a search of the Start Menu, plus a list of Firefox add-ons, seemed to give me a substantial if not complete list of freeware programs for which some contribution might be appropriate.  I developed that list in a spreadsheet.  At least for the time being, I excluded some programs (e.g., those that I had used previously but wasn't using anymore; those provided by corporations like Google and Microsoft), so as to focus on the ones that seemed most currently entitled to compensation.  I decided on appropriate values for each program, and also decided to do writeups or reviews.  There were a number of programs to cover, so I set up my computer to reopen the spreadsheet weekly as a reminder.  I hoped to be caught up by the next Thanksgiving.

Discussion

My computer, like many, was running a variety of free and paid-for programs.  The motives behind the free programs seemed to vary.  Some programmers evidently hoped their creations would become famous, at which point they could begin selling the software rather than giving it away.  Some supported their work via advertisements.  Some wanted to help others; some just shared a tool that they had invented to address their own needs.


Whether the inventor asked for payment or not, it seemed only fair to pay them something for their work.  There were, however, some problems with that thought.  One was that paying them would cost money.  Most of us, at one time or another, have been tempted not to pay even when we could easily afford it.  In a less piggish vein, there was also the reality that many of us could not afford to pay a fair price for all of the many free tools that a computer system might be running.  We might instead be inclined not to use them, with inferior results for everyone concerned.

A related problem was that it was not clear how much to pay.  Few freeware writers seemed guilty of asking too much.  To the contrary, even the developers of incredibly useful programs tended to ask far less than their programs were worth.  Maybe they were humble, or were underselling themselves; maybe they didn't want to appear too demanding or ridiculous.  For whatever reason, it appeared that freeware compensation provided on an honor-system basis would preferably draw upon an estimation of each program's comparative value, regardless of what the programmer proposed to charge for it.

There was another side to that question of how much to pay.  If I wanted to use Microsoft PowerPoint, I would have to buy a copy.  Depending on Microsoft's internal decisions, I might be able to buy PowerPoint by itself, or I might have to buy a copy of the entire Office suite.  This would be true regardless of whether I wound up using PowerPoint all day, every day, or actually only had a one-time need for it.  In the for-profit market, this issue tended to be worked out on the macro level -- Microsoft's profits depended on charging a balanced price across a large number of potential purchasers -- but not on the individual level.  That is, I would pay the same price as someone whose usage was very different from mine.  But in the honor-system freeware world, I could choose whatever payment plan made the most sense.  I could buy it outright, or set aside money on a per-use basis, or pay an annual license-like fee, or adopt some other basis, as I chose.

Over the past several years, I had written up a couple of blog posts on the question of how to calculate how much I had used various pieces of freeware.  There didn't yet seem to be a widely used system that would help me in this.  I had eventually decided that maybe this would be something to deal with once a year, during the Christmas holiday season, but that didn't work out.  That season tended to be busy, and it also wasn't usually overflowing with spare cash.  So then I came to the idea of pinning this inquiry to Thanksgiving instead.  As I thought about it, that actually seemed like a better connection.  Freeware was a gift, to be sure; but it was a gift to be thankful for.  And if I made it an annual thing, it could boil down to a couple of relatively simple questions:  how thankful am I, based on my past year's usage, and how do I express that?  The answer to the latter question could range from gratitude to cash payment, depending on the situation; I would have to work that out.

For starters, I decided that it would be OK to do this calculation just once a year.  Yes, there would be programs that I had used during the year but had then discarded, and I might forget or unintentionally minimize their importance to me as of Thanksgiving.  But I didn't think that would be a major problem, and I also felt it would be unwise to try to do it more frequently.  An annual tradition could become something to be proud of; but an expectation that I would do this every month could convert the whole thing into a chore.

The next step, I thought, would be to figure out what I was using.  In my case, there seemed to be two principal locations for freeware:  Firefox add-ons and my Start Menu.  The Firefox part was easy enough:  I could just go into Firefox Tools > Add-ons for a list of the extensions and themes in use.  Alternately, as someone advised, I could type "about:support" in the Firefox address bar to get a printable report.  The Start Menu was also easy enough to see:  I could just go to the Windows 7 Start button and write down all of the programs visible there.  My Start Menu was an especially concentrated location for the programs that I would use because I had customized it to include not only links to installed programs but also the complete program folders for portables.  I also had a project underway to convert my Firefox bookmarks to links in the Start Menu (for websites that I considered tools, such as Softpedia) or to items for my Reference list (for informational sites like Wikipedia).  So it seemed that Firefox and the Start Menu would pretty much capture the list of programs I was using. 

I decided to create the list in an Excel spreadsheet.  (I was using Excel 2003.)  I had columns for the name of the program, the version, and the serial number, if any.  I got a good start on this by copying and pasting the results of that Firefox about:support list.  But I had tons of stuff in my Start Menu.  I didn't want to copy all that information manually.  It seemed advisable to automate the process, if possible.

Next, I extracted relevant information from my customized Start Menu.  This could be done manually.  The following comments describe my attempts to automate that process somewhat.  I began by using Windows Explorer to visit the folder where my Start Menu was located.  I had moved my customized Start Menu to a drive other than drive C, so as to share it across my network and to back it up along with my other data; but as far as I could recall, the way to find the Start Menu folder in a more virgin version of Windows 7 would have been to right-click on the Start button and choose Open Windows Explorer.  Once I had the top level of the Start Menu, I went to the address bar in Windows Explorer and selected and copied its path.  Then I opened a command window (Start > Run > cmd) and typed two commands:  first, C: (or whatever the drive letter was for the Start Menu's location), and then "cd " followed by the pathname that I had just copied from Windows Explorer.  (To paste into a command window, I had to right-click on its title bar and then choose Edit > Paste.)  Since this pathname had spaces in it, I began and ended it with quotation marks.  Example:  cd "C:\Folder\Start Menu" and then Enter.  Now I ran a few commands.  Of course, I could save these in a batch file to simplify things in the future (see the sketch below).  The commands were as follows:
dir *.lnk /s /b > "D:\Folder Name\SMProgs.txt"
dir *.exe /s /b >> "D:\Folder Name\SMProgs.txt"
These commands filled SMProgs.txt with directory entries for every shortcut and executable file in my Start Menu folder.  Since the second command was almost identical to the first, the fast way to enter it was to press the up-arrow, use the left arrow to go back and add a second ">" symbol, and change LNK to EXE.  (I chose the /s and /b options for the DIR command based on information obtained by typing "dir /?", and I was able to view the full printout of the resulting information by highlighting the cmd window and pressing WinKey-LeftArrow to make it tall.)  I opened SMProgs.txt and copied and pasted its contents into an empty Excel spreadsheet.  I did search-and-replace operations to remove the .exe and .lnk extensions, and then ran a formula down an adjacent column to detect exact duplicates automatically:  sort by the column to be tested, and then fill down a formula like =IF(A2=A1,"X","") next to it.  (The results produced by such formulas would change if I then sorted the Xs together, unless I first used an Edit > Copy, Edit > Paste Special > Values combination to convert the formulas into values.)  After deleting exact duplicates, I used a reverse-text function with FIND and MID commands to extract the filename and folder into separate columns.  I then sorted on the filename column and deleted duplicates again, which made the earlier duplicate-detection step probably unnecessary.  So, for example, I now had only one entry for a file called Microsoft Excel.  But I still had more than 1,500 rows in the spreadsheet.  Further sorting, editing, and filtering gave me a list of about 450 actually installed programs.
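
As for the batch file idea mentioned above, a minimal sketch would be something like this, with the folder names as placeholders for the real locations:
@echo off
rem rebuild the inventory of Start Menu shortcuts and executables
cd /d "D:\Folder\Start Menu"
dir *.lnk /s /b > "D:\Folder Name\SMProgs.txt"
dir *.exe /s /b >> "D:\Folder Name\SMProgs.txt"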

The automated steps had helped somewhat.  I hoped the process would become faster if I did it again in subsequent years.  But from this point forward, it was a manual process.  Using that list of 450, I added spreadsheet columns to mark purchase dates for programs I had already purchased, to exclude those that I did not intend to pay for (e.g., free Microsoft utilities), and to indicate those that I had actually used, as distinct from those that I might have tried but didn't remember, or had installed because I thought I might need them someday.  In the resulting list of about 150 programs, I looked at the list of about a dozen that I had used but would probably not use anymore.  I barely remembered some of these programs, but a few had been really useful in Windows XP.

At this point, I had to decide what I owed.  I felt there was probably not much of an obligation to the people who had written programs that I had only used on a trial basis, though at least I could write reviews for the benefit of others who might use those programs, if I remembered enough to say something helpful.  So I started with that thought.  My reviews could be on sites like Softpedia or CNET (or Newegg or Amazon, for purchased programs), or perhaps a discussion here in a blog post would be appropriate.  I probably would not bother doing a writeup if there were already many reviews, especially since these programs were increasingly outmoded.  It occurred to me that it might also be helpful if I wrote reviews of purchased programs.  I decided to treat the question of hardware reviews separately.  So now I went back down my list of 150 programs I had used and, in a new spreadsheet column, marked those for which I had enough experience at least to write a brief comment or review.  The result was roughly 50-50:  I could say something about half of the programs, and not about the other half.  I looked at the latter and, not surprisingly, found that I felt no particular obligation to pay anything either.  These programs were generally on their way into my life, or out of my life, but had not yet been and might never be useful to me, aside from possibly a brief exploration at some point.  No doubt the list would change somewhat by the time another Thanksgiving rolled around; I planned to revisit them again next year.

So I focused on the 75 or 80 programs that I had used enough to write something about.  There was a question of what to write, and where to write it.  I had reviewed some commercial programs on various websites (e.g., Amazon, CNET), and had also written about my use of some programs in posts on this blog.  While any serious writeup would be better than no writeup, I preferred writing posts on my own blog, for several reasons.  One was that, here, if I added links to other sites, they would not be removed.  I could also describe a process, and the program's performance in it, in much greater detail than would be acceptable in a typically brief review on someone else's website.  Of course, I also appreciated the opportunity to build up my own site while I was discussing someone else's product.  There was the additional concern that posting reviews on a heavily visited website (e.g., Amazon) could help make it more appealing than another site that I might actually consider better (e.g., Newegg).  In the past, I had sometimes posted reviews across a number of commercial and sharing websites (e.g., TigerDirect, Major Geeks).  This had the drawback of potentially confusing users, who might conclude that such websites were sharing reviews among themselves when they encountered exactly the same review on multiple sites.  It could also appear that I was propagandizing.  And it could be time-consuming for me to post reviews on six to ten websites, when they requested not only a review but also star ratings, statements of pros and cons, bottom-line summaries, and so forth.  I decided, as I had decided previously, that the best approach, where possible, was to do a writeup on my own blog that would provide detail and information beyond what would be allowed on a commercial site, so that I could find it, link to it, and expand on it in the future.

Going down the list again, in another spreadsheet column, I marked off those programs that I had already reviewed or discussed in some detail (e.g., IrfanView) and those for which there did not seem to be much need for a review because they were already well known and I was not using them in any noteworthy way (e.g., Skype).  That cut the list in half again, down to about 40 programs that I hadn't yet reviewed and felt I probably should, for the benefit of the programmers and/or of the users.  I added a line to my WEEKLY.BAT file to open this spreadsheet, as a reminder to write something about one of these programs each week.  There would be weeks when I didn't do it, but I hoped that, come next Thanksgiving, I would have substantially reduced this particular obligation.
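
The line I added to WEEKLY.BAT was trivial -- something like the following, with the path and filename standing in for the spreadsheet's actual location:
start "" "D:\Folder Name\Freeware.xls"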

With the topic of reviews out of the way for the moment, I had to face the matter of money.  The first problem I tackled, in this area, was to decide which programs called for payment.  Among programs that I had used in the past but no longer used, some had been worth paying for.  I wasn't sure how to recall or reconstruct which programs those might be.  I decided to defer that question for the time being, so as to keep this project manageable, and focused strictly on those programs that I expected to continue to use, that I had not yet paid for, and that were of a type for which payment could reasonably be expected.  I excluded those for which I did not yet know enough to write much of a review, on the theory that, in those cases, I was still in something like a shareware trial period.  That is, it seemed unlikely that I would have bought these programs for which I did not have much present use.  Filtering the spreadsheet for these criteria yielded a list of about 50 programs.

Now, given this list of programs for which I should pay something, how much should I pay?  One answer would have been that I should buy the Pro version -- should upgrade from the freeware version, that is -- for those programs that offered that option.  Before reaching that conclusion, though, I decided that payment should ideally be on a sliding scale.  If I were rich, I would want to buy the company or support the individual that had done such good work, in hopes that they would do more of the same.  If I were well-off but not truly rich, I might think that I should pay five or ten times the asking price for the professional version, or maybe buy and distribute five or ten pro licenses, so as to make up for some others who had not yet gotten around to paying, or who couldn't afford it.  If I had only enough money to take care of myself but not enough to cover others, the answer might be to just buy the pro version, with one caveat:  I probably should pay more if the programmer priced it too low.  It vaguely seemed to me that the pro versions of programs I had bought in the past year or two had tended to be around $40, so I tentatively decided that my target payment + contribution for a pro version of a significant program should be in that range, if I could afford it.  In my experience, less significant shareware programs tended to cost around $10-20.  In the commercial market, of course, major programs could cost $100 or more.

Where money was tight, there would be an option of trying to pay or contribute a few dollars per program, so as to cover all programs at the same time, or instead singling out a few for more generous reimbursement.  This question was already decided, in the case of those programs where I needed the pro version and therefore just had to pay what they asked.  But for these ~50 programs, it was up to me.  I decided to start at the low end, with a target of $5 for relatively trivial and $10 or $15 for more significant Firefox add-ons -- for the ones, that is, that saved me money and/or time -- and I assigned these values to those programs in a Target Price column in my spreadsheet.  That accounted for a total of 19 rows in the spreadsheet and a value of $155.  Among the remaining programs, I decided that, on the other end of the spectrum, Firefox had been, for me, an incredibly valuable and complex program easily worth $100 to me, even though it humbly suggested contributions of $5 to $30 -- and that I would probably have had to pay $100 or more to get it, if its major competitors (i.e., Google Chrome and Microsoft Internet Explorer) weren't supported by mammoth corporations that apparently saw the browser as a way to control access to the Internet for profit.  Given that view, I probably wouldn't have paid more than around $20 for Opera, which I used only occasionally.  With thoughts like these, I proceeded to ascribe target values to each of the other ~50 programs on my list -- the question being, again, not what would be the lowest price I could get it for but, rather, what was it worth more realistically, considering such factors as my own need and encouraging software development.  Most of the other programs on my list wound up in the $10, $15, and $20 categories.  I would have to adjust those values if further investigation revealed that there were pro versions I didn't know about, at whatever price they might be selling.  I had also kind of rushed through my estimate rather than focusing on each program.  But for purposes of rough estimation, taking account of everything from Firefox to its add-ons, I estimated a value of $750 for these ~50 programs, for an average of about $15 each.

I wasn't in a position to spend $750 on software right then.  I also didn't want to donate $20 to some program and then find out, later, that they had come out with a pro version or had gone to a for-profit model and were now going to charge me another $30.  I decided that this question of payment would be best completed item by item, as I revisited the spreadsheet on a weekly basis -- so that, hopefully, I would be caught up by the next Thanksgiving.  For now, I decided to start by paying for a few programs and add-ons that I had been using for years and for which I was most appreciative.  As I began to focus on those programs, I realized that this drawn-out, weekly approach would probably counteract a bit of stinginess that had set in as I was rushing through my valuation of those programs -- that, in other words, I would probably tend to bump the prices up closer to an average of $20 as I proceeded.  And so it seemed I was set for the coming year's worth of weekly returns to the spreadsheet, with writeups and contributions as circumstances warranted.

Monday, January 17, 2011

Windows 7, Vista, and XP: Networking Four Computers

I had installed Windows 7 on two computers.  Here, I'll call them computer A and computer B.  Both were connected in a basic wired network by ethernet cable to a router.  The router I was using was a Belkin Connect N150, model no. F7D5301.  I did not then know of the possibility of turning my computer into a WiFi hotspot, but I probably would have gone with the purchase of a router even if I had known of it.

I wanted to make the computers visible to, and able to share files with, one another.  I also had a laptop running Vista and an old Windows XP desktop.  This post describes the steps I took on the software side, after sorting out prior hardware issues.

I started with just computers A and B connected to the router.  I ran a search, but it seemed like most of the webpages that were coming up had to do with troubleshooting.  I wasn't smart enough to get into trouble yet.  But then, as I started looking at some of those sites, I wondered whether possibly the computers were already visible to one another, merely by being connected to the router and not complaining about any IP address conflicts or anything like that.  In Windows Explorer on computer A, I clicked on Network.  I got an error up by the menu bar:

Network discovery and file sharing are turned off.  Network computers and devices are not visible.  Click to change.
So I clicked and selected "Turn on network discovery and file sharing" and chose only the home network (not public) option.  Within about 30 seconds, Windows Explorer on computer A was showing both computers.  Well, that was easy.  So now I ran the router's setup software on the Vista laptop and on the WinXP machine, and then plugged them into the router.  I changed the WinXP machine's name and rebooted it.
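
Incidentally, it appeared that the firewall side of that clickable option could also be turned on from an elevated command prompt.  A sketch, assuming English-language rule group names, and covering only the firewall rules rather than the underlying discovery services:
netsh advfirewall firewall set rule group="Network Discovery" new enable=Yes
netsh advfirewall firewall set rule group="File and Printer Sharing" new enable=Yes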

Changing that name left an outdated entry for the WinXP machine's previous name in the displays on several computers.  On the WinXP machine, it produced an error message:  "[Oldname] is not accessible.  You might not have permission to use this network resource."  The outdated entry wouldn't go away when I refreshed (F5) the view.  Note that I was still down in the Network section of the Windows Explorer folders pane.  I was not using the Homegroup option up topside.

At this point, a search led to the advice to go into my router's firmware, via its built-in webpage, and look for a way to configure its list of devices.  Apparently that's where the old name was being stored.  Finding that place in my router required me to go into its installed software.  Doing that required me to reinsert the Belkin CD and try to run the router utility from there, because the link that its installation had added to my Start Menu wasn't working.  The CD appeared to be only for simple installation purposes:  the choices were to reinstall the software or quit.  Eventually I figured out that Belkin expected me to have autoplay running on my CD drive; otherwise it would not show me its advanced features option.  So I went into Control Panel > Autoplay, but there I saw that Autoplay was already on by default for all devices.  I had only noticed the advanced features option by accident -- I had left the CD in the drive when I rebooted the machine after an update of some other software.  Taking another approach, I downloaded an upgrade of the router's software, installed that, and used its Advanced Settings option to get into the router.  Once there, though, I did not see any entry for the old computer.  By this time, a couple of hours had passed -- I had been working on something else -- so maybe the router had refreshed itself.  Hitting F5 to refresh the Network entries in Windows Explorer showed that they, too, had forgotten that old machine by now.

At this point, the two computers (A and B) running Win7 were showing Network entries for all four computers, plus administrators on computers A and B, plus the router.  The Vista laptop and the WinXP machine were seeing the Win7 computers, but not each other.  Oddly, the router's DHCP client list was showing the names of three computers, but not the Vista laptop.  The Win7 computers were able to go online; the others were not, even after a reboot.  So far, networking in Win7 was looking a lot easier than it had been in Vista or XP.  I wasn't interested in investing a lot of time in this, and I had been thinking of dismantling the XP machine, so probably that would be the solution to part of this networking problem.

To summarize, networking in Windows 7 appeared to be a matter of plugging computers into a router, maybe fiddling with the router's settings or upgrading its software (neither necessary in this case), and turning on network file sharing when that option came up.  Networking problems with Vista were still an unknown.  I had previously been able to just plug the laptop into the router and go online, and had not tried to share files among other computers.  But then I noticed that someone (I) had set the TCP/IPv4 settings on the laptop to something other than automatic.  Setting them back to automatic took care of it.  The Vista laptop saw everything.  Its eyes were opened.  That solved the problem for the WinXP machine too.

The remaining question:  was I actually able to do anything among these computers?  On computer A, I double-clicked on the icon for computer B.  It said,
\\Computer B is not accessible.  You might not have permission to use this network resource.
I searched on that and, as recommended, went into Control Panel > Network > Change advanced sharing settings > Home or Work (in the Win7 computers).  There, I made sure everything was on except password-protected sharing.  The only change I actually made was to turn off the requirement for a password.  And that did it.  Computer A was now able to see the contents of computer B.  A similar change on the Vista laptop had the same effect.

Now I could see "Users" or "Users Share" folders.  How to get full access to one computer's contents from another?  The first step seemed to be to right-click on a drive or folder, in Windows Explorer, and choose Share with > Advanced sharing > Sharing tab > Advanced sharing > Share this folder.  I did that with one drive on computer A.  I tried to view it on computer B, but I got an error:
Windows cannot access \\ComputerA\SharedFolder
You do not have permission to access \\ComputerA\SharedFolder.  Contact your network administrator to request access.
For more information about permissions, see Windows Help and Support
I clicked on the Help and Support link.  It said this could be because
You haven't created or joined a homegroup
You're not using a homegroup, and the folder or printer you're trying to access has not been shared
Network discovery is turned off
Password-protected sharing is enabled
The computers aren't in the same workgroup
Your computer doesn't have the latest updates for your router
Of these, it was true that I hadn't joined a homegroup; I wanted to see about doing it without, primarily because I had seen a few notes of people having complaints about homegroups.  The folder had been shared.  I had turned on Network Discovery and password-protected sharing.  I had now installed the latest downloads from Belkin for the router.  I tried to access \\ComputerA\SharedFolder from the laptop.  Same error message -- except that, instead of pointing me toward Windows Help and Support, it said this:
No more connections can be made to this remote computer at this time because there are already as many connections as the computer can accept.
That sounded like old-school networking voodoo.  I didn't want to go there.  I just wanted the computers to link up.  I decided to try the homegroup option, since that seemed to be the only option that might help.  I reasoned that Microsoft had probably created the homegroup thing because the non-homegroup approach to networking was such a cluster.  Ah, but then I discovered a possible reason for that last sentence in the Vista error message.  I had set the Advanced Sharing properties > "Limit the number of simultaneous users" option to 2.  I thought that was probably all I would need.  I tried setting it to 20, where it was before.  While I was there, I clicked on the Permissions button and gave Full Control to Everyone.  I closed out of there and took another look in these other computers.  Now I got more error messages.  On computer B, the attempt to look at \\ComputerA\SharedFolder gave the same error as before.  But on the Vista laptop, it produced this error:
Windows cannot access \\ComputerA\SharedFolder
Check the spelling of the name.  Otherwise, there might be a problem with your network.  To try to identify and resolve network problems, click Diagnose.
I did that.  It came back with this:
\\ComputerA\SharedFolder is available but the user account that you are logged on with was denied access.
Well, it was true -- as I discovered, back in the Advanced Sharing properties dialog -- that if I clicked the Caching button, I got a new option that said this:
All files and programs that users open from the shared folder are automatically available offline
But I decided that meant that files from computer A would be copied to computer B and stored there somewhere, if I looked at them on computer B.  I didn't want that.  I had enough clutter already.  I set the caching to "No files or programs from the shared folder are available offline."  I tried the reciprocal step of setting up a partition on computer B, to see if it would be available on computer A or on the laptop.  No joy.

I came across a webpage that made me think perhaps I hadn't explored the requirement (above) of making sure the computers were all on the same workgroup.  In Control Panel > System > Computer Name on the XP machine, it was just WORKGROUP.  Same thing on the laptop and the Win7 computers.  So that wasn't the explanation.
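
For what it's worth, a quick way to double-check the workgroup name from a command prompt, on XP as well as Win7, seemed to be the following; its "Workstation domain" line shows the workgroup:
net config workstation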

So, OK, maybe it was time to try homegroups.  On computer A, in Windows Explorer, I went to the Homegroup link at the top of the folders pane.  That opened up an option to join the homegroup that Windows detected already existing.  I did that.  It told me I needed to get the password from Ray on computer B.  That was odd, because I was Ray, and I didn't have any such password.  I also didn't want the homegroup to have a password.  It was just me here.

Upon seeing a page that showed something different from what I was seeing, I decided to back up here.  Before going on to the homegroup option, I noticed that, when I right-clicked on the drive I wanted to share and selected "Share with," I wasn't seeing a list of groups or people.  But, ah, when I clicked on a folder instead of a drive, I did.  Hmm.  But surely it was possible to make a drive visible to other computers?

I did a search and found a thread where people were having exactly the problems I've described here.  At this point, the last post in that thread suggested starting with Start > Run (or just type) fsmgmt.msc.  I did that, on computer A.  I went into Shares and double-clicked the drive I was trying to share.  This gave me pretty much the same stuff as before, except in its Security tab.  There, I selected Users > Edit > Full Control > Apply.  I clicked back and forth a couple of times to make sure it took.  The first time, it didn't.  I okayed out and tried again to access this drive from computer B.  No luck.  But now I wondered what kind of user this was, this person on another computer.  The answer seemed to be that it was an Everyone.  In other words, the groups or users presently listed in that Security tab were Authenticated Users, SYSTEM, Administrators, and Users; and since those all had full permissions and the person (me) trying to access computer A from computer B *still* couldn't get on, apparently I had to be in the Everyone class.  So, as advised, I clicked Edit to add Everyone, and then gave Full Control to Everyone > Apply > OK.  This would have been just as easy to do back in the drive > right-click > Share with > Advanced dialog, actually.  Now that Everyone was allowed to see everything, surely someone on computer B would now be able to see the contents of the shared drive on computer A.  And it worked.  Woo-hoo!  So the missing part here was that I needed to add Everyone in that special place.
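
In hindsight, it seemed that the whole arrangement -- the share itself, plus Full Control for Everyone at both the share and NTFS levels -- could probably have been set up from an elevated command prompt on the Win7 machines.  A sketch, with made-up share and folder names:
rem create the share with full share-level permission for Everyone
net share SharedFolder=D:\SharedFolder /grant:Everyone,FULL
rem give Everyone full NTFS permission on the folder, its subfolders (CI), and files (OI)
icacls "D:\SharedFolder" /grant Everyone:(OI)(CI)F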

It was getting late, and I'd had too much fun for one evening, so I decided not to press the point and work through it with all of the machines.  I had faith that this was approximately the answer at least for letting other computers see what was on a Win7 machine, and that's all I really wanted to achieve here anyway, as I had nothing interesting on the laptop or the WinXP machine.  There were some other interesting networking possibilities I was eager to get on with, as described in some other posts on this blog that I wrote up about the same time as this one.

Thursday, January 13, 2011

Windows 7: Native Virtual Hard Disk (VHD) Boot

I was interested in making Windows 7 Ultimate run faster.  One possibility that came to my attention was using native virtual hard disk (VHD) boot in connection with RAID0.  This post discusses that.

A virtual hard disk was just a hard disk in virtual form.  Normally hard disks were physical.  A virtual hard disk, like a virtual machine (VM), would exist as a file (or a set of files) on a physical hard drive, not as a physical device in itself.  In other words, the user would have a physical hard drive in his/her computer, and somewhere on that drive s/he would have a VHD.

The "native" part of "native VHD boot" meant that the user would have direct access to the VHD.  Ordinary VM software (e.g., VMware, VirtualBox, Virtual PC) would require the user to install an operating system (e.g., Windows 7, Ubuntu) to serve as host (also called "parent").  Then the user would have to install the VM software (e.g., VMware) to manage the guest, and then install another operating system (e.g., Windows XP) inside the VM software as guest.

The VHD concept seemed to be that the host layer would be removed from the equation.  There would be a virtual hard drive that could be loaded or unloaded, perhaps without rebooting the computer.  The computer would be running the "guest" machine (e.g., a Windows XP installation) in virtual form.  For example, a packaged Windows XP installation could run on computer A, or on computer B, without having to be reinstalled and without having to install a host and a VM program like VMware.  In addition to simplicity, this scheme would presumably have the advantage of improving performance by removing extra layers and tasks.
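
For illustration, the recipe I kept seeing for creating a native VHD boot entry ran roughly as follows, where the VHD path is a made-up example and {guid} stands for the identifier printed by the first command:
rem clone the current boot entry and note the new {guid} it reports
bcdedit /copy {current} /d "Windows 7 (VHD)"
rem point the new entry's boot devices at the VHD file
bcdedit /set {guid} device vhd=[D:]\VHDs\win7.vhd
bcdedit /set {guid} osdevice vhd=[D:]\VHDs\win7.vhd
rem let the boot loader redetect the hardware abstraction layer
bcdedit /set {guid} detecthal on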

In Windows 7, native VHD boot was apparently available only in the Ultimate and Enterprise versions.  This meant that it could be out of the reach of ordinary users who were not inclined to spring for such expensive software, unless perhaps they had access to it through their jobs or through academic versions.  It did not appear that Ubuntu, for one, would be developing a comparable capability in the near future.  There were hypervisors like VMware's ESXi that apparently provided somewhat similar bare-metal functionality, but ESXi seemed unavailable for my purposes.

I was interested in the possibility of using native VHD boot with RAID0.  This interest arose from a desire to make Windows 7 and other operating systems run quickly.  New installations of operating systems would typically perform well, I found, but then time and the addition of more programs and other demands would slow them down.  So, for example, I had found that Windows XP running as a VM inside VMware Workstation on an Ubuntu host was much more stable than native WinXP, but its performance had slowed dramatically.  My attempts to fix it (even through reinstalling both Ubuntu and VMware) had failed.  After several years of experimentation, I had therefore recently given up on Ubuntu for my purposes.

VHD seemed to offer a good way to run Windows 7 in a RAID array.  I had found that Win7 did not, itself, support booting from the software RAID0 arrays that Win7 was capable of creating.  In other words, a Win7 user could use Win7 to create RAID0 arrays, and could speed things up somewhat by putting some programs and data files on such arrays; but the user could not boot Win7 from a Win7 software RAID0 array, and would therefore have to install and run some Win7 program files from a basic hard drive.

As an alternative, the user could create a RAID array using hardware, either on the motherboard or on a third-party controller.  Hardware RAID had potential advantages of performance, bootability, level support (i.e., the ability to choose versions other than RAID0 and RAID1, including some I had not previously heard of), multiple operating system access (i.e., the data would remain available regardless of how the machine was booted), and reliability.  The performance story was a bit controversial.  I had seen indications that motherboard RAID was no faster than software RAID, and I had heard some claim that RAID0 was no faster than non-RAID, but my own experience (in Ubuntu) was consistent with tests indicating that even a software RAID0 was markedly faster than non-RAID.  Cost was a consideration.  While RAID5 was perhaps ideal, a Newegg search indicated that a SATA RAID5-capable controller would cost hundreds of dollars.  Removing the RAID5 constraint led me to a $30 Rosewill item.  I didn't need that myself; I had motherboard RAID 0, 1, and 10.  For reasons of cost, noise, and heat, I wasn't going to be installing four drives to have RAID 10 (a/k/a RAID 1+0, which Wikipedia (apparently backwardly) called a stripe of mirrors, as distinguished from RAID 0+1, a mirror of stripes).  So for my purposes, it was a choice between hardware RAID0 or software RAID0; and since I was now using Win7 rather than Ubuntu, the desire to boot in RAID0 meant I would be using motherboard RAID -- unless VHD gave me a good alternative.

At this point, I had a general sense of what VHD sounded like.  Among other things, it sounded great.  But it seemed that I wouldn't run into the actual drawbacks and impossibilities until I created my own VHD and tried to use it.  As described in another post, the first step in doing so was just to create and play around a bit with a VHD, within a regular, running Win7 system.
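
As a preview, the DISKPART commands for that first experiment looked approximately like this (entered inside DISKPART at an elevated command prompt; the file name and 25GB maximum are examples):
create vdisk file="D:\VHDs\test.vhd" maximum=25600 type=expandable
select vdisk file="D:\VHDs\test.vhd"
attach vdisk
create partition primary
format fs=ntfs label="VHD-TEST" quick
assign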

Wednesday, January 12, 2011

Windows 7: Upgrade Installation to Win7 Software RAID0 Array

I was trying to install an upgrade version of Windows 7 on a RAID0 array.  This post contains some notes on what I learned about the possibilities.

I had a new basic hard drive.  I started by installing Win7 on that drive.  The upgrade version of Windows 7 required a previous version of Windows to be installed.  It was not enough just to have the previous disc or serial number.  I was interested in upgrading from Windows XP.  To accomplish this installation, then, I had to install my copy of Windows XP and then upgrade from there.

Having done that, I used Disk Management (diskmgmt.msc) in Win7 to create a couple of Windows 7 software RAID0 arrays on two other empty hard drives.  Unlike other RAID solutions, Win7 was willing to create multiple arrays and single-drive partitions on a pair of drives being used in a RAID0 array.
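
For reference, the DISKPART equivalent of those Disk Management steps was apparently along these lines (disk numbers and size are examples; both disks first had to be converted to dynamic):
select disk 1
convert dynamic
select disk 2
convert dynamic
rem size= is the space taken from each disk, in MB (about 50GB apiece here)
create volume stripe size=51200 disk=1,2
format fs=ntfs label="PROG-FUTURE" quick
assign letter=P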

I hoped to install Win7 into one of those arrays (which I called PROG-FUTURE), and to put my data into another.  Of course, since this was RAID0, I planned to have a good backup scheme for the data.

I went ahead and copied my data into that RAID0 data array.  Later, when it came time to try to install the Win7 upgrade to the PROG-FUTURE array, it seemed that this might have been a mistake.  An attempt to install WinXP to PROG-FUTURE got as far as the point where the installer recognized the various partitions on my drives.  It saw the entire hard drive as a single dynamic disk.  In other words, WinXP might have been willing to install to at least one of the two drives I was using for my RAID arrays.  It gave no sign that it would install itself in any array format to two drives simultaneously.

I was not sure whether an attempt to install WinXP, Win7, or any other operating system to a dynamic drive would run into problems.  There did exist a Dynamic Disk Converter program, and probably others like it, that would apparently be able to convert the dynamic disk to a basic disk format.  I could not say how well such programs would work.

It had occurred to me that perhaps I could use the Universal Restore feature of Acronis True Image Home 2011 (ATIH) to restore a working Win7 installation to the PROG-FUTURE array.  My attempts along those lines did not succeed.  As far as I could tell, ATIH was not capable of restoring a RAID0 array.

Another possibility was to use Ubuntu 10.10 to copy Windows 7 program files from a Win7 installation on a basic drive to the PROG-FUTURE array.  This did not appear feasible at this time, however, because Ubuntu evidently could not see the Win7 RAID0 array as such.  I also wasn't sure whether the resulting partition would actually boot.

An attempt to install directly from the Win7 upgrade CD to the PROG-FUTURE array failed early in the process, when I received this error message:

Windows cannot be installed to this hard disk space.  The partition contains one or more dynamic volumes that are not supported for installation.
It appeared, in other words, that Windows 7 could not be installed to a software RAID0 array created by Win7 itself.  I found a thread suggesting that there were ways to make it work, but it seemed that the process was tricky and prone to problems.  It appeared that the array would probably better be created from some other software or by using a RAID0 controller on the motherboard or on a separate controller card.  Another possibility that I had not heard of previously was native virtual hard disk (VHD) boot.

Monday, January 10, 2011

Acronis True Image Home 2011: Restoring Windows 7 to RAID 0: FAIL

I had installed Windows 7 on a regular hard drive -- what Win7 calls a "basic" drive.  I had made an image using Acronis True Image Home 2011 (ATIH).  Now I was trying to restore that image -- using the ATIH Universal Restore feature installed by the Acronis Plus Pack -- to a new, empty software RAID0 array that I had just created in Windows 7.  This post describes my efforts.

For perspective, I could perhaps restore an image of a Windows 7 installation onto a hardware RAID0 array, and I might be able to restore an Acronis image of a RAID0 array to a non-RAID partition.  Acronis promised that the Plus Pack would enable me to restore to a striped software RAID0 array.  But now that I had made the purchase, a search led to a thread suggesting that the promise was false.

Here's how the effort unfolded.  After booting the ATIH CD, I went into Recover My Disks > Browse (wait) > select the .tib file to restore (or, in my case, the first of the five DVD-sized files comprising the backup).  I went on, OK, Next, and came to "Recover whole disks and partitions" and "Use Acronis Universal Restore."  It gave me the option of adding device drivers.  On the assumption, at that point, that drivers were not necessary in a software array, I clicked Next.  I indicated the partition I wanted to recover, and specified New Location as the dynamic volume that I had created for this purpose.  (Dynamic volumes were listed at the bottom of the screen -- I had to scroll to see them all.)  I clicked Next, and here is the message I got:

You are about to recover a partition containing OS files.  If the recovery destination is an existing non-active dynamic volume, then the system will be unbootable because activation of dynamic volumes is not supported.  Are you sure you want to continue?
I searched and found only a few links containing that statement about activation.  A post by an Acronis employee in an Acronis forum said, "The issue will be resolved in a future update of our software."  To see if the most recent update had resolved it, I went to the Acronis site and downloaded the latest build of ATIH.  I was doing this from an Ubuntu live CD, so I had to save the download (an .exe file) to a USB drive and jump it over to my laptop, running Vista, to install the .exe so I could burn an updated CD that would hopefully have better news for me in terms of restoring to a Win7 RAID0 array.  This was a lot of fooling around.

While that was happening, I went ahead with the next step, using my present copy of ATIH.  Alas, a new error:
There may not be enough free space on the system partition to boot up your operating system after recovery.
This was an odd message.  I had two 50GB partitions in my new RAID0 array.  The backup I had made was from just one partition of either 40GB or 50GB.  I clicked OK there and Acronis stopped.  It didn't try to see whether the restore would fit.  Evidently it was not set up to think in terms of RAID0 arrays.

I guessed that even the updated ATIH would not try to fix this.  But it was going to be a while before I could find out for sure.  The installation on the laptop was taking its sweet time.  After seemingly completing most of the installation, it aborted with this message:
Installation Incomplete

The installation was interrupted before Acronis True Image Home 2011 could be installed.
I didn't know who or what interrupted it.  The laptop was just sitting off to the side, doing its own thing.  I wasn't touching it.  I tried again.  Now I got a new error message:
The error was encountered while the installation.
That's really it.  That was what the message said.  It provided technical details that I didn't understand.  Toward the bottom, it said, "A possible reason might be that you do not have enough privileges."  OK, Vista.  Even though I was running as Administrator on the laptop, that was not enough, and possibly the good people at ATIH couldn't have noticed that until we were at the end of the installation process.  I was unfortunately not knowledgeable enough about the solution and was quite tired and not very patient with the idea of researching that question in order to resolve this tangent from a tangent.

I gave up on that Vista errand and just installed the upgraded ATIH on a Windows XP machine.  No permissions issues.  The upgrade took a while, but then it was successful.  Now I needed to remember how to install the Plus Pack.  My previous post about Plus Pack led me to an Acronis instructions webpage.  The basic process seemed to be to install ATIH, install the Plus Pack, and then go into the new Start Menu entry for Plus Pack and click on Acronis WinPE ISO Builder.  That required me to "Specify a path to the folder with the WinPE files."  A search for that exact statement yielded only a post where someone was trying to combine ATIH and Acronis Disk Director on one CD.  A different search led to an Acronis page instructing me to download the Windows 7 Automated Installation Kit (AIK) from Microsoft.  This was a 1.7GB download.  Was this really what people had to do if they wanted to use Acronis Universal Restore?

I went back to the search and tried a different Acronis page.  This page said that Plus Pack had three benefits, and it pointed me to three separate webpages describing those benefits:  it would support dynamic drives and GUID partition tables (GPT); it would facilitate Universal Restore between dissimilar hardware, including virtual machines; and the WinPE part would create bootable rescue media.  The page on dynamic drives led to a page on RAID support, cited above, that led in turn to a table, summarizing the kinds of RAID support provided by various versions of Windows.  It said that ATIH Plus Pack supported restoring to RAID0.

The page regarding Universal Restore said that it would work only if "You have created Acronis Bootable Media (standard, WinPE, or BartPE) after the installation of Acronis True Image Home 2011 Plus Pack."  So apparently there were three different kinds of Acronis bootable media.  To see more about that, I went back into the Start Menu, on the XP machine where I had just installed ATIH, and chose the option to start up ATIH.  In its main screen, I went to "Create bootable media."  So this was going to give me the standard variety.  I burned it to a CD.  Much easier than creating WinPE media.

With that CD, I was ready for the next step.  Acronis said that I would need drivers for the hard drive controller or the chipset, in .inf, .sys, or .oem forms -- extracted, if necessary, from .exe, .cab, or .zip files.  The last time I had played with Acronis Universal Restore, I hadn't understood this driver situation and, as I dimly recalled, part of the problem was that I didn't know which drivers I should get and how I should extract them.  It was clearer to me, now, that there was no getting around it:  I had to have exactly the right drivers.  I have written up that pursuit in a separate post, for those whom it puzzles as it puzzled me.

Not to say that I came to a clear understanding.  I just made a stab at it.

With the drivers collected in a folder, I booted the Universal Restore CD.  When I got to the Drivers Manager step, I clicked Add Search Path and pointed to that folder.  Oddly, when I did, Acronis reported, "No items to display," even though I had just put 16 driver files in there.  I guessed that this meant it had not refreshed its view at that point.  It did not have an option to do so.  I proceeded to designate what I wanted to restore and where I wanted to restore it.  Once again, sadly, I got that error message indicating that "activation of dynamic volumes is not supported," followed by that error indicating that there might not be enough space to boot the operating system after recovery, and once again that was the end.

The working conclusion, at this point, was that ATIH did not support Windows 7 program partitions in RAID0.  I could install Win7 manually in that kind of partition, and perhaps I could use ATIH to back up a manual installation, but I could not use ATIH to restore any such backup to that location.

I clicked on the Help button in Acronis and browsed its contents, to see if I could get a clearer idea about all this.  They didn't seem to have any information on it.  I posted a question about this on an Acronis forum.  Responses to that question tentatively supported the working conclusion that ATIH does not restore images to Win7 software RAID0 arrays.

Windows 7: Choosing Partition Wizard to Resize Partitions

In the process of installing Windows 7, I found that I needed to make some space on a drive.  In Windows 7, I ran Disk Management (Start button > type "diskmgmt.msc" > right-click on the partition to be shrunk > Shrink Volume).  I got a message:  "Querying volume for available shrink space, please wait ..."  Then it said that it could only shrink the volume by 5182MB.  This was absurd:  Disk Management itself showed that the volume had hundreds of gigabytes of free space.  The dialog said this:

You cannot shrink a volume beyond the point where any unmovable files are located.
In that dialog, I clicked on the "Shrink a Basic Volume" link.  It offered advice on using the command-line DISKPART tool.
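
For reference, the DISKPART route went roughly like this (the volume number and size are examples), though it reportedly ran into the same unmovable-file limit as Disk Management:
list volume
select volume 2
rem report the maximum reclaimable space, in MB
shrink querymax
rem shrink by the desired amount, in MB (here about 400GB)
shrink desired=409600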

I did a search and found references to various third-party programs for this purpose.  I had previously used GParted, which was available either standalone or as part of the Ubuntu live CD.  I was now seeing some indications that GParted was not the best Windows 7 partition manager.  I had already tried using it to shrink this partition, but it had not been able to make much space either.  I also shared the complaint that it was slow.  I preferred slowness over losing data, but it seemed that people were having good luck with alternatives.

Among the various free and paid alternatives, it seemed that the freeware program Partition Wizard was getting good reviews.  I downloaded and installed it.  I also downloaded the .iso file for its bootable CD, and burned that to a disc for standalone functionality, which would ordinarily be capable of doing more than a program could do while Windows was running.

(In Windows 7, burning the CD was just a matter of right-clicking on the .iso and choosing Burn disc image.)  I used it to reboot the target computer.  My first usage, to move a large partition, was faster than a similar operation had been in GParted.  I proceeded to use the moved and resized partitions without problems.  Partition Wizard appeared likely to be my partition tool going forward.
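
(A footnote on the burning step:  Windows 7's built-in disc image burner could apparently also be invoked from a command prompt, with the burner drive letter and .iso path here being examples:)
isoburn /Q E: "D:\Downloads\PartitionWizard.iso"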

Monday, September 27, 2010

Dual-Boot RAID 0: Ubuntu 10.04 and Windows XP

I wanted to set up a SATA RAID 0 array that would function like any other dual-boot system:  I would turn on the computer; it would do its initial self-check; I would see a GRUB menu; and I would choose to go into either Windows XP or Ubuntu 10.04 from there.  This post describes the process of setting up that array.

With no drives other than my two identical, unformatted SATA drives connected, I turned on the computer.  The BIOS for my Gigabyte motherboard did not give me the obvious RAID configuration option I had hoped for.  I hit DEL to go into BIOS setup.  Nothing jumped out at me.  Desperate for guidance, I turned to the manual.  I was looking at an Award Software CMOS Setup Utility.  The manual directed me to its Integrated Peripherals section.  There, I set OnChip SATA Controller to Enabled, OnChip SATA Type to RAID, and OnChip SATA Port4/5 Type to As SATA Type.  I hit F10 to save and exit.

According to the manual, that little maneuver was supposed to give me an option, after the initial boot screen, to hit Ctrl-F and go into the RAID configuration utility.  Instead, the next thing I got was this:

Press [Space] key to skip, or other key to continue...
I didn't do anything.  It scanned my drives and then led on to the Ctrl-F option.  I rebooted and tried it again.  Hitting the space key led to the same result.  Ctrl-F opened the AMD FastBuild Utility.  I hit option 2 to define an array.  This gave me a list of my two drives, labeled as LD 1 and LD 2.  Apparently it wasn't supposed to show anything:  LD was short for "logical disk set," and the utility was essentially showing two separate arrays, each having one drive.  So although the manual didn't say so, it seemed that I needed to get out of here and go into option 3 to delete these arrays.  I did that and then went back into option 2.  Now I was looking at a blank list of LDs, just like in the manual.

So now I was ready to prepare my array.  In option 2, I hit Enter to select LD 1.  This defaulted to RAID 0 with zero drives assigned to it.  I arrowed down to the Assignment area and put Y next to each of the two drives listed.  Now it said there were two drives assigned.  But now I had a couple of things to research.  The screen was giving me options for Stripe Block, Fast Initialize, Gigabyte Boundary, and Cache Mode.  The manual didn't say what these were.

I did a search for information on the Stripe Block size.  I found an old AnandTech article that took the approach of choosing the lowest stripe size where performance tended to level out -- where, that is, increasing the stripe size another notch did not increase performance.  For the RAID controllers they were testing, it looked like performance kept increasing right up to the range of 256KB to 512KB, for those controllers whose options went that high.  Mine only gave me a choice between 64KB and 128KB, so I chose the latter.  A more recent discussion thread seemed to support that decision.

Regarding the "Fast Init" option, a search led to some advice saying that slow initialize would take longer but would improve reliability.  A different webpage clarified that the difference was that slow initialize would physically check the disk and would be suitable if you had had trouble with the disk or if you suspected it had bad blocks.  I decided to stay with the default, which was Fast Init ON.

The "Gigabyte Boundary" option would reportedly make the larger of two slightly mismatched drives in an array behave as though it were the same size as the smaller one.  The concept appeared to be that, if you were backing up one drive with another (which was not the case with a RAID 0 array), you would use this so that the larger drive would never contain more data than the smaller drive could copy.  Mine was set to ON by default.  I couldn't quite understand why anyone would need to turn it off, even if the drives were the same size.

Finally, the "Cache Mode" option was apparently capable of offering different choices (e.g., write-back), but mine was fixed at WriteThru, with no alternatives available.  As I understood it, write-through waits for data to actually reach the disk before reporting success, while write-back reports success as soon as the data is in cache -- faster, but riskier if the power fails mid-write.  So I thought about it for a long time and then decided that this was acceptable to me.  I hit Ctrl-Y to save these settings.  Now I was back at the Define LD Menu, but this time it showed a RAID 0 array with two drives and Functional status.  That seemed to be all I could do there, so I exited that menu.  I poked around the other options on the Main Menu.  I seemed to be done with the FastBuild Utility.

Next, the manual wanted me to use a floppy disk to install the SATA RAID driver.  I could have just gone ahead and done that -- I still had a floppy drive and some blank diskettes -- but I thought surely there must be a better way by now.  Apparently there was:  use Vista instead of WinXP.  But if you were determined to use XP, as I was, the choices seemed to be either going through a complex driver-slipstreaming process or using the floppy.

There was, however, another option.  I could buy a RAID controller card, for as little as $30 or as much as $1,000+, and it might come with SATA drivers on CD.  This raised the question of whether the RAID cards actually had some advantage beyond their included CD.  My brief investigation suggested that a dedicated RAID card could handle the processing task, taking that load off the CPU, but that there wasn't much of a processing task in the case of RAID 0.  In other words, for my purposes, a RAID controller card wouldn't be likely to add any performance improvement.  Someone said it could even impair performance if it was a regular PCI card (as distinct from, e.g., PCIe) or if its onboard processor was slower than the computer's main CPU.  There did seem to be a portability advantage, though:  moving the array to a different motherboard would require re-creating it, in at least some cases, whereas bringing along the controller card would eliminate that need.  The flip side was that the card might fail first, taking the array with it.

Further reading led to the distinction between hardware and software RAID.  An older article made me think that the essential difference (since all approaches involve both hardware and software) was that software RAID is handled by the operating system and runs on the main CPU -- which raised the question of whether dual-booting would be impossible in a software RAID array, as a generally informative Wikipedia article suggested.  To get more specific, I looked at the manual for a popular motherboard, the Gigabyte GA-MA785GM-US2H.  That unit's onboard RAID controller, plainly enough, was like mine:  it depended upon the operating system.  Wikipedia said that cheap controller cards provide a "fake RAID" service of handling early-stage bootup, without an onboard processor to take any of the load off the CPU.  FakeRAID seemed to get mixed reviews.
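
For reference, the two flavors also look different from inside Ubuntu.  An array defined in the BIOS utility (fakeRAID) is handled through the dmraid layer, whereas OS-level software RAID is built with mdadm.  A rough sketch of the commands involved -- the device names are examples, not my actual setup:

    # FakeRAID: check whether Linux sees the BIOS-defined array
    sudo dmraid -r
    ls /dev/mapper

    # Pure software RAID 0, by contrast, would be created along these lines:
    sudo mdadm --create /dev/md0 --level=0 --raid-devices=2 /dev/sda /dev/sdb
    cat /proc/mdstat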

An alternative, hinted at in one or two things I read, was simply to set up the RAID 0 array for the operating system in need of speed, and install a separate hard drive for the other operating system.  I was interested in speeding up Linux, so that would be the one to get the RAID array.  I rarely ran Windows on that machine, so any hard drive would do.  A look at older, smaller, and otherwise seemingly less costly drives led to the conclusion that I should pretty much expect to get a 300GB+ hard drive, at a new price of around $45.  Since I was planning to use Windows infrequently on that machine, it was also possible that I could come up with some kind of WinXP-on-USB solution, and just boot the machine from a flash drive whenever I needed access to the Windows installation there.

I decided that, for the time being, I would focus on just setting up the Ubuntu part of this dual-boot system, and would decide what to do about the Windows part after that.  I have described the Ubuntu RAID 0 setup in another post.

Tuesday, April 29, 2008

Update: Recorded Speech to Text Conversion

Every now and then, I run into someone who is doing qualitative research and wants to know about speech-to-text conversion. Here is an update that will answer some of the questions I have been asked in this area during the past year or so. I know some people will be doing interviews this summer, so maybe this will help.

The basic idea is that you have recorded something -- an interview, perhaps -- and now you want to connect your recorder to your computer and have the interview be automatically converted into a Word document. If the computer knows when to insert a colon rather than a semicolon, so much the better. Unfortunately, that's still a dream, according to an article published last week by James A. Martin of PCWorld. Then again, a review by Nate Anderson of Ars Technica indicates that Nuance's Dragon NaturallySpeaking, a leader in this area, has made great advances in recent years. Anderson provides an illustration of several paragraphs in which Naturally Speaking actually outperformed his own typing at the keyboard -- provided that he first sat down with a microphone and trained the software to understand his pronunciation. Minimum training requires a few minutes; best results come after several months of use. So the program may do pretty well with your side of the interview -- but less well, most likely, with the words uttered by the interviewee, who will ordinarily be doing most of the talking. (Note: I have seen "Naturally Speaking" spelled both with and without an internal space, so you may want to try both if you're searching for more information. See, e.g., http://tinyurl.com/58238m for suggested search syntax.)

In theory, you could train the software to understand the voice of the interviewee instead -- by, e.g., having him/her read a section of text into the recorder and then teaching Naturally Speaking to understand it later, in the privacy of your own communal workstation. In that case, I think it would be your questions, not his/her answers, that would require repair afterwards (though perhaps you could avoid that by simply cutting and pasting pre-typed versions of your questions into your interview transcript). Naturally Speaking allows for multiple user profiles, so apparently this would require merely adding and training another voice account, without having to delete the previously entered account.

Best results in this area have traditionally come from dictating into a high-quality microphone directly connected to the PC. Now, however, Anderson says that Naturally Speaking does an "acceptable" job even when the recording was made using the internal microphone on a mediocre MP3 player. Martin's point, in his article, was that the Sony ICD-MX20DR9 digital voice recorder (now about $230, plus $25 or so for an additional memory card) comes with a copy of Naturally Speaking (so you don't have to buy a copy separately), was designed for use with Naturally Speaking, and is listed as a compatible model on Nuance's Hardware Compatibility List. Among the numerous categories on that list (e.g., Headset Microphones (legacy)), two cover recorders specifically. On the Recorders (legacy) list, only five old models get a three-star rating. On the Recorders (current) list, by contrast, the Sony ICD-MX20 (I assume the DR9 suffix simply means that Dragon Naturally Speaking is included with the hardware) gets six stars -- and is the only recorder that gets more than five stars.

I suspect each additional star means, in practice, a somewhat higher percentage of accuracy -- a difference that could translate into many hours spent, or saved, seeking out and correcting text. So if you were doing your interviews within the next few weeks, one approach would be to shoot first and ask questions later -- i.e., buy the Sony, train it, experiment with it, and see if it saves you a ton of typing. That might make less sense if you were doing your interviews in a noisy environment, though.

There are some differences among versions of Naturally Speaking (ranging in price from $60 (standard) to $1,200 (legal)). I'm not sure which version comes with the Sony ICD-MX20DR9. In a detailed review of the software (i.e., not the Sony), Cade Metz of PC Magazine says the "results were pretty darn good," and describes the option of correcting errors by voice rather than keyboard -- which may be available only on the more expensive versions (I'm not sure). Elsa Wenzel of ZDNet seconds Metz's view that Naturally Speaking is the best consumer voice recognition program available. As is often the case, however, users' opinions vary dramatically -- possibly as a function of having the right hardware and/or doing the required system training in the recommended manner. Off the topic of interviews, but of social work relevance: Rita Zeidner of the Washington Post points out Naturally Speaking's productivity implications for persons with disabilities.