• Sorting out the revolution in PC backups: Part 2




    TOP STORY

    Sorting out the revolution in PC backups: Part 2

    By Fred Langa

    In Part 1 of this two-part series, I gave an overview of the five major types of backup technologies available today for Windows PCs. This week, Part 2 shows the enormous speed differences in backup methods; it also includes some real-life scenarios to help you pick the best method for your needs.


    The full text of this column is posted at windowssecrets.com/top-story/sorting-out-the-revolution-in-pc-backups-part-2 (paid content).

    Columnists typically cannot reply to comments here, but do incorporate the best tips into future columns.

    • #1439959

      In the first article, Fred suggested OneDrive as a backup option for a PC. Unfortunately I cannot see any way to make this work from the desktop. My data (apart from pictures and music) are in my 20GB Data (D:) partition. I created a new OneDrive partition (S:) which I synced to my OneDrive folder in the cloud. Twice per week, I sync D: to S: (mirror) using FreeFileSync.

      What I need is for OneDrive to mirror changes in S:, but not to redownload files to my PC that are no longer in S: (i.e., that I have deleted or moved in D:). I have had some discussion of this on another tech site (http://answers.microsoft.com/en-us/windows/forum/windows_8-files/how-to-use-skydrive-to-backup/83f83187-2fab-4d68-9933-b3fd5f1fa459), but the consensus there is that OneDrive from the desktop is not capable of a one-way sync and thus cannot be used as a backup option. One respondent indicated that the Metro OneDrive app is capable of one-way syncs, but because I work only on the desktop, this is not much use.
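
      For anyone wondering what the “mirror” step boils down to, here is a minimal one-way mirror sketch in Python (drive letters as in my setup above; FreeFileSync itself adds verification, filtering, and much more):

      [code]
      import os, shutil
      from pathlib import Path

      def mirror(src: Path, dst: Path) -> None:
          """One-way mirror: make dst match src exactly. Copies new/changed
          files; deletes files in dst that no longer exist in src."""
          for root, _dirs, files in os.walk(src):
              rel = Path(root).relative_to(src)
              (dst / rel).mkdir(parents=True, exist_ok=True)
              for name in files:
                  s, d = Path(root) / name, dst / rel / name
                  if not d.exists() or s.stat().st_mtime > d.stat().st_mtime:
                      shutil.copy2(s, d)  # copy2 preserves timestamps
          for root, dirs, files in os.walk(dst, topdown=False):
              rel = Path(root).relative_to(dst)
              for name in files:
                  if not (src / rel / name).exists():
                      (Path(root) / name).unlink()
              for name in dirs:
                  if not (src / rel / name).exists():
                      (Path(root) / name).rmdir()

      mirror(Path("D:/"), Path("S:/"))
      [/code]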

    • #1439978

      In the second part you compared the speed of backups. I want to mention something I did not notice in your article.

      I have used different backup systems and programs over the years, and I have used different destinations. As disk sizes have increased and backups have grown bigger, I have started to use a NAS drive, and in a multidrive unit I have used RAID 5. The problem there is that it is not so fast if you store thousands of small files on it, which matters because some backup programs compress each file and store them individually. I use Genie Timeline for day-to-day backup, as it is nice to have automatic versioning, but it is really slow against the RAID 5 NAS. So for that I am using an external single drive connected via eSATA.

      The NAS drive is very good for storing “normal” backups, which are usually one or more big files. There, too, I have noticed big differences in speed between programs. I usually schedule the backup to run once a week, when I am not working with my computer.

    • #1439989

      Differential / incremental backups can greatly increase speed.

      Note that any assessment of cloud-backup speed must consider the upload data rate available on the particular internet connection, contract, technology, and server load: the usual issues. For me, on ADSL, it would not be something I would consider as a main backup option.

    • #1440006

      I fundamentally disagree with Fred’s assessments. Here’s why:

      First, is speed really the most important factor? I think not. Convenience and reliability are most important. The world’s fastest backup is no use if you don’t actually do it, or if you can’t retrieve the file you just lost.

      Second, there are many reasons to back up your system. One is to recover from a disaster such as fire, flood (in the UK!), or theft. Another is to recover from a less serious issue such as a hard drive failure. Another is to recover the file you modify constantly, the one you know had the data you need last Wednesday but somehow no longer has it. Those are the ones I use most. Backup speed isn’t the time to back up a full drive; it is the time to perform a routine backup or restore.

      In Windows 7 and 8, the “previous versions” feature helps with the last of those, but it uses space on your primary drive or requires you to use Windows’ backup tool.

      Until the advent of cloud backup I used to back up commercial servers (up to 1TB of data) to removable tape cartridges, on a cycle of incremental backups every night and a full backup every weekend, with a complicated tape re-use sequence (Towers of Hanoi) that maximised the time a file version remained available on a backup. Tapes were stored both off-site and on-site in a fire safe. Not something a home user could maintain. Smaller servers and individual PCs were backed up to USB hard drives, which were swapped out periodically. At home, my laptop was lucky to get a proper backup once a week, and my wife’s laptop once a month if I remembered. Those backups took hours and tended to make the system pretty unusable while running. Not good.
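
      (For the curious, the Towers of Hanoi rotation is simple to compute; a sketch in Python, assuming tapes lettered A, B, C, …: tape A is used every other day, B every fourth, C every eighth, so older file versions survive exponentially longer.)

      [code]
      def hanoi_tape(day: int, tapes: int = 4) -> int:
          """Which tape (0-based) to use on 1-based backup day `day`."""
          level = (day & -day).bit_length() - 1   # trailing zeros of `day`
          return min(level, tapes - 1)

      # First eight days with four tapes: A B A C A B A D
      print("".join("ABCD"[hanoi_tape(d)] for d in range(1, 9)))
      [/code]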

      For the past year I’ve been backing up the data on my own and my wife’s laptops using Livedrive cloud backup. Once in a while I also run a Windows image backup to USB to preserve the system and software. Based on the success of that strategy I’ve moved one of my clients’ servers to Livedrive as well (there is also a simple local USB backup). Between the two laptops there is around 150GB backed up, and the server has around 300GB.

      Fred’s estimates for upload times are wildly optimistic for a UK cable or ADSL user. Livedrive offer a two-week free trial; I barely finished uploading my laptop in those two weeks. That is around 240 hours for 100GB. However, the software allows prioritisation of certain file-type groups, so I had a usable backup of most of my work in a day or so. My theoretical upload speed was a little shy of 1Mbps (on a 20Mbps connection), or about 100KB/s, but clearly I wasn’t achieving that constantly. For the commercial server I started over a holiday period, but as the ADSL line also had to carry their VoIP phone traffic, email, and 15 people using the internet, I later throttled the upload to 40% of the theoretical capacity (50KB/s). The initial upload took almost 3 weeks; the last week pushed up some 0.5GB video files.

      But Fred suggests that restoring a backup would be as slow. It isn’t: the A in ADSL means Asymmetric, and downloads are far faster than uploads. When I bought my wife a new laptop recently, her full backup downloaded overnight. Initially I aimed to use the Windows transfer tool, but that was predicting over 20 hours and I thought it time to test the backup!

      While using USB backup I was constantly bedevilled by full-disk errors. Running incremental backups to a USB drive using Windows backup seems to multiply the space requirement out of all proportion. Filling a 1TB drive backing up a laptop with 300GB of data plus programs doesn’t take long, and once it is full it is useless; I’m not even sure you could do a restore. So another disk, another full backup…

      Livedrive, on the other hand, is a breeze. Once the initial upload is over, it scans for changed files every hour and uploads the changes in minutes. No user intervention is required; it backs up in the library, in Starbucks, on the train. Some of my regularly updated files have 30 or more versions stored, and I have on several occasions retrieved old copies to check on past changes or recover accidental deletions. (You can restore to the original location or to a new file, and deleted files are kept for 30 days.) With a 2TB storage limit I don’t need to worry about running out of space.

      So in summary, online backup wins massively because it is fast (enough), completely unattended, very regular, very reliable, has no practical storage limits and has saved my sanity. It also works as cloud storage: I can access my files from any PC, anywhere.

      And no, I don’t work for Livedrive, other online backup services are available and may well be better. And OneDrive, GoogleDrive, Dropbox etc are NOT online backup.

      Ian.

      • #1440011

        Hi Fred,
        I see from your table that you did not include eSATA external storage. As a point of interest, on my Windows XP system, imaging my 300GB of data with Macrium Reflect completes in 2 hours 7 minutes, considerably less than the 100Mbps Ethernet timing.
        Stuart.

    • #1440016

      Fred needs to get a different internal drive, configure his system properly, or do better extrapolation.

      I can back up 300GB of data in considerably less time than 9 hours copying from one internal drive to another. The files I backed up were ISO images of software, not easily compressible, which really doesn’t matter, as a straight copy to another disk drive does not compress. Even a USB 3.0 drive takes considerably less than 9 hours. Since his disk-to-disk copy times are not much less than his 100Mbps Ethernet times, something is seriously wrong with his calculations.
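
      For what it’s worth, the back-of-envelope arithmetic behind that comparison (ignoring protocol overhead):

      [code]
      # 300 GB copied in 9 hours, expressed as an effective transfer rate:
      gb, hours = 300, 9
      mb_per_s = gb * 1000 / (hours * 3600)                      # ~9.3 MB/s
      print(f"{mb_per_s:.1f} MB/s = {mb_per_s * 8:.0f} Mbit/s")  # ~74 Mbit/s
      [/code]

      That is roughly 74Mbps, in the same ballpark as saturated 100Mbps Ethernet, while an ordinary internal SATA drive should sustain 100MB/s or more.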

      I would also submit that 100Mbps Ethernet is the exception rather than the rule, as all current devices support 1Gbps.

    • #1440017

      Here’s an approach that gives me two instant copies outside of my main computer and a cloud-based off-site backup. All for $40/year with unlimited backup space. It’s a combination of two solutions:

      1. The first solution is simple: I use an external eSATA drive along with Memeo Backup Premium, which lets me choose source and destination folders. Memeo Backup automatically detects changes to files in the source folder and copies them to the destination folder. So I get a real-time external backup to a separate physical drive.

      2. I also use PogoPlug. This is a cool little external device that you plug into your network, then plug a USB drive into it. Install their little software client and the USB drive shows up as a drive on your main computer. Any or all of your local computers can use this drive. You then sign up for PogoPlug’s UNLIMITED backup service, which is currently $40/year. Configure the PogoPlug backup service to back up your USB drive to the PogoPlug cloud. It first does a full backup of your USB drive, then constantly watches it and copies any changes to the cloud. Finally, create a new plan in Memeo to automatically back up your pictures, music, and video to the PogoPlug USB drive. The PogoPlug software then copies the files to the cloud automatically.

      A few notes…

      1. Since the eSATA drive is local, with a very fast connection, files get copied to it very quickly.

      2. The USB drive is network-connected, so files get copied to it more slowly, but the software does not slow down your computer at all, and no user intervention is required.

      3. The PogoPlug copy-to-cloud is extremely slow. It took a couple of weeks to copy my 1TB of pictures, music, and videos. But does it matter? My main computer was switched on and off during this time; only the tiny PogoPlug box stayed on. And in the meantime, I had TWO external copies of my data.

      4. the PogoPlug web site lets you view and share your pictures and music with anyone. It even has a decent but not too fancy viewer/player built in.

      Overall, minimal cost, three complete data copies, one off-site, together with easy sharing and remote access.

      If anyone has a better system, I’d sure like to see it…

      • #1440018

        What apps would you recommend for backing up to Google Drive?

      • #1440428

        Have you verified that the same version of your file has been backed up on both destinations? As I understand it, Windows files have an archive attribute. (From Wikipedia, “Archive bit”:) “On Windows and OS/2, when a file is created or modified, the archive bit is set, and when the file has been backed up, the archive bit is cleared. It is by use of the archive bit that incremental backups are implemented.

        Backup software also provides the option to do a full backup while clearing archive bits – that is, to back up all files regardless of their archive bit status, and to clear the archive bit of all files processed by the backup. This allows for the creation of an initial full backup that will be supplemented by incremental backups in the future.”

        So if your first software clears the bit when you back up, how does your second software know to back it up?
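
        For the curious, the archive bit is an ordinary Windows file attribute that can be inspected or cleared directly; a minimal Python sketch (Windows only):

        [code]
        import ctypes

        FILE_ATTRIBUTE_ARCHIVE = 0x20
        k32 = ctypes.windll.kernel32
        k32.GetFileAttributesW.restype = ctypes.c_uint32

        def archive_bit(path: str) -> bool:
            """True if the file changed since the bit was last cleared."""
            attrs = k32.GetFileAttributesW(path)
            if attrs == 0xFFFFFFFF:            # INVALID_FILE_ATTRIBUTES
                raise ctypes.WinError()
            return bool(attrs & FILE_ATTRIBUTE_ARCHIVE)

        def clear_archive_bit(path: str) -> None:
            """What a backup tool does once it has copied the file."""
            attrs = k32.GetFileAttributesW(path)
            k32.SetFileAttributesW(path, attrs & ~FILE_ATTRIBUTE_ARCHIVE)
        [/code]

        Which is exactly the problem raised above: two archive-bit-based tools share one flag, so at most one of them should clear it. (Many tools can instead select files by timestamp, sidestepping the conflict.)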

    • #1440029

      Hi Fred,
      One other communication option not mentioned is eSATA. Many notebooks support this option via a combined USB/eSATA port, and it costs only about $10 to add it to most desktop systems: it is just a connection from a spare SATA port on the motherboard to a back plate. eSATA (external SATA) requires no additional electronics, just a different cable. As such, transfer speeds should be equivalent to those of your second internal drive option.

      Odds are good that an eSATA backup option will cost a lot less than a USB 3.0 option for backup.

    • #1440117

      Good info so far, but a backup is no good unless you can restore from it. Do you have any data on restore capabilities? Or is that coming next week? Thanks.

    • #1440119

      Hello, I have been using a slightly different method for backups, but it seems to work well.

      When I get a new system, I purchase a second hard drive of the same size and type, which I then initialize as a bootable drive.

      I use an external drive adapter and a CD startup disk. I then simply do a disk copy from the active drive to my backup drive. This way I always have a usable drive in case of a problem. I usually do this about every week or two, depending on how much new work I have done.

      No special software to purchase, and the adapter and spare drive cost about $100 for my last system.

      Alex

    • #1440122

      Isn’t there a danger, when using cloud backup or a permanently connected external hard drive, that if your computer is affected by ransomware the backups too will be encrypted? I think an external hard drive that is disconnected after a backup is the safest option.

      • #1440132

        I have to support the NAS solution mentioned early in the thread. For years I used online backup: first Dell’s version, which disappeared; then AVG LiveKive, which created a huge multi-gigabyte file in my AppData folder that they could never explain; then CrashPlan, which claimed that you could back up both locally and online with the backups synced. The only problems were that, when I used my NAS drive for the local copy, I had to create a special connection to it using a command batch file with mklink. Of course, when my hard drive crashed I couldn’t get the connection back again. To add insult to injury, the online restore process skipped files that their algorithm judged as inactive, and I had to go to my online file structure and reactivate all the files.

        My current backup consists of data backup and imaging using Acronis True Image 2014 on the NAS drive, with incremental backups. Twice a month I also make an image using Windows disc imaging on an external USB drive. Since we live in an apartment building, fire and flooding are less of a risk; files on the NAS are encrypted, and I have enough security software and settings to make me feel OK.

        What hasn’t been mentioned in these articles is how your backup choice affects your restore choice. When you make an image in Windows, you are asked whether you want to create a system repair disc; you have to say yes, otherwise you are out of luck. You only have to do this once. Similarly, if you use third-party backup software, you need to know how to restore the disc/data and keep a hard copy of the user’s manual.

        • #1440222

          One other point – I would not use my online backup as a sharing centre. If I want to share photos, I’d set up a shared folder for those images. Opening files can cause corruption. Rare, but possible.

        • #1440228

          Oh – and the reason I use different software for my data backup? So I can get immediate access to my data, even if the OS is down. That’s not the case if I have the files in a Windows image that only the same OS can open.

        • #1440998

          Hi, Fred – I really enjoy reading your articles and have been a long-time reader, from back in the days of LangaList. In your latest backup study, the timing table caught my eye. It seemed to me that the transfer rates were really slow, for the internal HD and for USB 3.0 in particular. My recent tests with even a lowly USB 2.0 external hard drive and HD Tune showed a consistent transfer rate of 30 MB/s (USB-limited); with a USB 3.0 thumb drive, the transfers went up to 75-80 MB/s. And internal SATA III HD transfer rates across a variety of drives routinely run from 200 MB/s (outer tracks) down to 75 MB/s (inner tracks, slowest drive).

          I also recently switched my home network to Gb Ethernet, so I can’t comment on 100 Mb speeds anymore, but I’m particularly sensitive to transfer rates, since I have a number of high-capacity systems on my network that are all backed up by a homebuilt Windows Home Server, and during baselines, a LOT of data gets passed over the network during the backup window. 100 Mbit Ethernet just wasn’t cutting it, prompting the move to gigabit.
          So it seemed that your results were at odds with my experience. Just to be sure, I ran a copy test of 16.6 GB (8,328 files in 1,079 folders, mixed document data) from a USB 3.0 thumb drive to C:\test. The copy completed in 14.5 minutes. During the copy, I saw 2-5% CPU utilization overall, with 58% of my 16 GB of memory used by other apps (IE with 6 tabs open; Chrome with 30 tabs open in 4 windows; Word with 3 docs; Excel with 1 workbook; AOL; Foxit Reader with 2 PDFs; Notepad; Calculator; and Windows Explorer: 9 GB of memory in use in all). I also run MSE as my real-time AV.
          I then thought, AHA! It’s because my C: drive is a Samsung 840 Pro SSD; I should repeat the test to a regular hard drive. So I repeated it, this time to my H: drive, a lightly loaded (125 GB used) 2 TB Hitachi HDS723020BLA642 on which HDTune shows read speeds of about 150 MB/s in that area of the disk (it’s slow compared to my other drives). The test completed in 12 minutes and 15 seconds (oddly, faster than to the SSD).

          That’s more data than in your 15 GB backup test, yet the transfer completed in less than half the time (~19 to 22 MB/s, respectively). Would that be attributable to the backup internal procedure overhead vs. the overhead of a simple copy? With so low a CPU load, it’s hard to imagine the CPU being the bottleneck. It just seemed curious to me. Or could it be that there’s fixed overhead prior to the start of the backup that makes extrapolated times inaccurate?

          For reference, I ran the same test over my gigabit network between the machine with the USB 3.0 thumb and my Windows Home Server – both with Gb NICs; the test completed in 29 minutes and 45 seconds – disappointing for gigabit Ethernet, but perhaps limited by the target hard drive on the server.

          I agree on cloud speeds – too slow for backup practicality, but excellent for security from disaster and convenience of anywhere access. But encrypting before transfer introduces a randomizing component that makes compression dysfunctional; compression should be run first, then encryption. There’s also risk involved in either process, because a single bit error in either resultant file can render the backup unusable – both encrypted and compressed files are hybrid data and metadata, so if the metadata is damaged, the encryption or decompression algorithm can’t reconstruct the data. But compression is very attractive when you are talking about pushing data out of a broadband spigot over the net. Less is more.
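
          (That ordering is easy to demonstrate. A quick Python illustration, with zlib standing in for the backup tool’s compressor and random bytes standing in for ciphertext:)

          [code]
          import os, zlib

          text = b"backup data compresses well " * 4000   # compressible plaintext
          cipher_like = os.urandom(len(text))             # ciphertext looks random

          print(len(zlib.compress(text)) / len(text))         # small: compresses well
          print(len(zlib.compress(cipher_like)) / len(text))  # ~1.0: no gain at all
          [/code]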

          Anyway, thanks for years of insightful analysis that keeps us techies thinking about these things.

        • #1441051

          Hi Fred,
          Notwithstanding your caveat that my backup times would differ from yours, I have to say that mine differ radically. Were you using plain Windows Explorer copying, or maybe a slow imaging program? Acronis True Image 2010 just imaged and verified my Win7 C: and D: drives in 33 minutes. They total 94 GB, and the image was 66 GB. This Win7 Ultimate computer has a 3.5-year-old Gigabyte motherboard, a 4-core 3.4 GHz AMD Phenom chip, and 4 GB of RAM. The HDDs are only occasionally defragged. I used to use a USB 2.0 external Buffalo drive, but now I use an ordinary internal SATA HDD that plugs into a trayless mobile rack. My computer has two of these racks: the first holds the “permanent” HDD with its three partitions, and the second holds the backup disk. After making an image, I shut down the computer and remove the backup HDD to off-site storage. Then I use another backup disk to receive more frequent images for convenient on-site storage.
          Windows was unstable when I first started using the mobile racks, but I noticed a little sticker on them that said “3.0 Gb/s”, guessed that they were not qualified for a 6 Gbps connection, and modified them by removing the little plastic pass-through data connector on the back side and gluing an ordinary SATA cable in its place, thus removing a connector pair from the circuit along with its extra impedance discontinuities, reflections, etc. That easy fix seemed to solve the instability problem.

    • #1440125

      I happened to have the opportunity to “back up” a drive this weekend. I run a karaoke show and had all my music on an external WD 2TB Passport. I maintain a spare backup drive, which had recently failed. The last time I cloned the drive, with less than 700GB of MP3 and zipped files (each zip holding a pair of .cdg and .mp3 files; about half and half MP3 and zip in total), it took a day and a half from one Passport to the other over USB 2.0. This time, with about 1.04TB of files, it completed over USB 3.0 in only five and a half hours, a major improvement! So I’m not sure why your test of 300GB over USB 3.0 took as long as 10.3 hours. My two drives were connected to the same USB 3.0 card, since my motherboard doesn’t have native USB 3.0; I don’t know if that made a difference.

    • #1440176

      Fred – I’m a little surprised. Your old backup reviews always talked about the practicalities of restore. For example, you used to use an older version of an imaging tool because the newer one was less useful for a real-world restore.

      I would not put a system image in the cloud, not because of time (even if it’s a factor) but because of the practicalities of restore. How can you restore a system image if you need Internet access and special software running to get to it? You need an OS running to get your OS back. Fail.

      A local off-computer system image on an automated routine, supplemented by an occasional off-site copy of a full system image is the most functional solution. And you need an imaging tool that can restore from a bootable DVD. Otherwise, what’s the point?

      I still use a separate data partition to most easily back up all the data on an automated routine, mainly locally. It can be sent to a different folder on the same backup location as the image. But I use different software for this. (on Win7) That can be supplemented by a secondary online backup for key files but in my tests, I would not want my data to depend on on-line backup. There are still way too many possible points of failure with it to make it any kind of primary backup. And you’re using 3 layers of encryption to store them? Good luck if any one of those fails.

    • #1440231

      I’m disappointed a “best effort” wasn’t made for the local Ethernet backup … 100 Mbps is something I stopped using years ago (along with USB 2.0 and ADSL :).
      I’d be shocked to hear that your PC doesn’t have gigabit Ethernet. Gigabit switches are $30 or so, and routers, cable modems, and WAPs have all been available with gigabit ports for years.

      Secondly, I wish you’d tried BackBlaze. They seem to be one of the few “good guys” out there … publishing the specs on their hardware, and making hard-won data available to users (e.g., failure rates on various hard drives). I’ve used them on my Mac: easy to install, quick backup, unobtrusive, easy to restore, and … support listens and talks to me!

    • #1440232

      Re: “At a glance, it’s apparent that large, whole-system data sets will be problematic […] but two-plus days to restore a system with a few hundred gigabytes of data is probably unacceptable.”

      That’s why BackBlaze offers, at a reasonable cost, the option of sending you a disk drive with your data on it.

    • #1440269

      Fred,

      It would be helpful and interesting to know how much personal time you spend in doing backups each day or week. I don’t mean how long the computer runs, but how much of your personal time is occupied with this task.

      Tim

    • #1440378

      Hi Fred,

      Some additional thoughts.
      I’m a home user. Business users with large data sets will have quite different requirements.

      Simplicity: I’m cautious of incremental types of backup. They reduce backup times, but at the cost of having lots of extra backup files, and they require any restore to use the software they were created with, which creates additional dependencies. I favour backups that create ZIP files that standard ZIP utilities can unpack (a bare-bones sketch of the idea follows below). I keep my various backups over long periods of time, because I have found that I may lose or corrupt a file yet only discover this much later. But if the backup utility is no longer supported, I can no longer access old backups; I have direct experience of this. I have been using GFI Backup Freeware for many years, and GFI have ceased to support it. I have just “upgraded” to Win 8.1, where I find that its Volume Shadow Copy (VSS) support does not work. This is not critical to me, but I shall have to consider migrating to a new product some time. Because all of the backups are in ZIP format, though, I shall not lose any of them.
      My general approach is to copy small data sets in their entirety daily (Word/Excel etc.). For large data sets (e.g., photos) I do use incremental backup, at the cost of more awkward recovery.
      I make one exception: IMAGE BACKUP. The data set is large, and recovery must be done with the source backup utility (in my case Paragon), so differential backup is sensible there. Also, the useful life of an image backup is short, as I find I can make fresh images daily with no effect on my work.
      Current disk costs: these affect how I now work. My new PC (Win 8.1) uses an SSD for ALL of my programs and data sets (180GB at £80). This lets me back up to an internal disk (3TB at £80). Backup speed is very fast and convenient. Whilst recognising that using an internal (or permanently connected external) disk has some weakness concerning machine failure, I feel it is better to take this risk than to fail to remember to take backups.
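
      As a bare-bones illustration of the ZIP approach, in Python (the paths are placeholders; a real utility adds scheduling, logging, and retries):

      [code]
      import zipfile
      from datetime import date
      from pathlib import Path

      def daily_zip(src: Path, dest: Path) -> Path:
          """Copy a small data set in its entirety into one dated, standard ZIP."""
          out = dest / f"data-{date.today():%Y-%m-%d}.zip"
          with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as zf:
              for p in src.rglob("*"):
                  if p.is_file():
                      zf.write(p, p.relative_to(src))
          return out

      daily_zip(Path("C:/Users/me/Documents"), Path("D:/Backups"))
      [/code]

      Any standard ZIP utility will be able to unpack the result years later, which is the whole point.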

      Have you heard the adage? “There are two types of computer user: those that regularly take backups and those that are yet to have a disaster.”

    • #1440382

      The trouble with constantly backing up all your data, rather than using an incremental scheme, is that you regularly back up maybe 100 or 1,000 times as much data as necessary, because the vast majority of your files don’t change between backups. A good full-then-incremental backup system gives you a single save set with multiple versions of changed files, much easier to restore from than trawling through many full backups.
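
      To make that concrete, here is a minimal full-then-incremental sketch in Python, keyed off file modification times and writing one ZIP per run (real products track deletions too, and often use the archive bit or a change journal instead):

      [code]
      import zipfile
      from pathlib import Path

      def backup(src: Path, archive: Path, since: float = 0.0) -> float:
          """since=0.0 gives a full backup; feed the returned timestamp back
          in on later runs to capture only files changed since then."""
          newest = since
          with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
              for p in src.rglob("*"):
                  if p.is_file() and p.stat().st_mtime > since:
                      zf.write(p, p.relative_to(src))
                      newest = max(newest, p.stat().st_mtime)
          return newest
      [/code]

      A restore then means unpacking the full set and layering each incremental on top: the single-save-set convenience (and the extra software dependency) discussed above.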

      I like the earlier post that suggested fast copy to an external drive then online backup of the external drive. I’ve investigated something similar but have yet to do a full trial. I suspect the snag will be persistence of deleted files – sometimes you really want deleted files to go away and with this scheme they might never do so.

      Ian.

    • #1440394

      Hi Fred,

      Just a note on your backup suggestions. I back up every night using Backup Maker (open source and free). I then create DVDs once a month using the backup copies (which are zip files). I keep one copy of these DVDs in a separate building locally (I have a woodshop out back) and send a second copy to a relative of mine that lives in Michigan (I live in California). I figure that will protect me from “The Big One” and any other calamities that I can think of.

      Thanks for all of the excellent info you have provided for these many years.

      Charlie J.

    • #1440459

      In defense of Fred, IMHO:
      Fred keeps all conditions the same across the various backup methods, so the speed data he presents is *relative*, not absolute. Relative data tell us the ratio between methods, and that ratio remains valid.
      At a glance at the table, we can compare the speeds of the different methods relative to one another: 2.3x faster, 5.1x faster, etc.
      It is a way to compare, not to measure absolutely.

    • #1440474

      Fred, you are incredible with your pieces, explaining details in a clear manner that even a semi-novice can understand. Part 1 of this series hit the nail on the head; however, I’m still in the dark on a few things. Your advice on backup procedures is extremely valuable.

      I started my computer work in the late 1960s, laboring at an IBM 029 keypunch, and as long as I could protect my deck of punched cards, my data and programs were safe. Today, with all of the threats, hardware failures, and technical problems, I do regular backups. My typical procedure is to select my most recent and changed data files and copy them to my 600-GB external hard drive, overwriting the older files there; that always includes Outlook PSTs, Excel files, documents, my calendar, and other changed files.

      Now, with my recently purchased WD My Book desktop external hard drive (USB 3.0, 4 TB capacity), I am wondering whether routinely creating a “disk image” is the best way to back up, because I often change settings in my programs. I know that my new external HD does automatic backups with the appropriate settings, but I don’t know what those backups entail or include. The thought of reformatting my internal 1-TB HD and reinstalling everything is terrifying; I don’t have that kind of time.

      Therefore, given my lack of understanding of exactly what a “disk image” is, would you do one of your classic detailed pieces on disk images: exactly what they are, how to put them on my external HD, and then, considering the worst, how to reformat my internal HD and put the image back exactly as I backed it up, without needing to reinstall any programs or settings? I may be old-fashioned, but I don’t yet trust cloud backups or the permanency of CDs and DVDs as viable backups. My OS is Windows 7 Pro, SP1. Thanks for your great work.

    • #1440613

      I do a LOT of video, WAVs and hi-res photos. It’s literally terabytes over the years. I’ve put everything on DVDs because they’re cheap and standard. I’m constantly stunned at how little thought is given to archiving. A photo can easily be kept in good shape for 50 years. I’ve lost digital media in less than 5. So… my concerns are:

      a) The longevity of DVDs? I’ve tried to keep them dry, dark, etc., but I haven’t seen any reliable info on how long they last before they deteriorate.

      b) If I -were- to move them over to hard drives, how long can an un-powered SATA drive retain data reliably? Years? Or do they need to be kept powered?

      c) My other concern about moving to hard drive storage is how long the -standard- will be around. IOW: how long before ‘SATA’ or ‘NTFS’ is an issue?

      Your thoughts? Suggestions?

      • #1440734

        Thanks for that. Boy, is that disquieting. Every so often I pull one out and it works, but I’ve also had a few that were filled with errors after only a few months, and there doesn’t seem to be a way to know in advance.

        What about Hard Drives? Do they retain data when powered off? For how long? Is that a much better long term solution?

    • #1440647

      I think it’s all guesswork, anywhere from 5 to 50 years, for DVD longevity.

      Bruce

    • #1440650

      Hey Fred,
      Thanks for the work you put in, BUT how about doing some tests on the top 10 or 12 programs themselves? Three or four free versions and the rest paid. Make Ian help you. Everybody’s setup is so different that I’d rather see functionality and reliability tests than such in-depth speed tests. I know, I know, but if you asked, a whole bunch of us would give a dollar each for the results. You’ve got mine. Where’s the PayPal button?

      REMEMBER
      “if you don’t play well with others,
      you could end up playing with yourself”

    • #1440893

      Having grown up decades ago in the old mainframe days, I’m always concerned about taking backups on the fly, and how a backup of a database file (such as Outlook files and Access files, among many others) could be useless if an index or pointer is being updated at the exact instant the backup is being created. As a precaution I always shut down every window on my computer before manually taking a backup. I have never seen this issue addressed when talking about PC backups, but I believe it is a real issue. Can anyone put my mind at ease?

    • #1440922

      A couple of points on longevity. It is my firm belief that writing with good ink on a good substrate, and making many copies, is a reliable way to preserve information. It has been proven to work for up to 1500 years. Pressing marks into clay tablets may work for 5,000 years in a hot, dry climate. There are some residual problems with language, but scholars have proven themselves very good at solving such problems.

      Once you move to digital recording, the longevity of the media is the least worry. How many people have a device that will read half-inch computer tape? Punched tape? Cards? 12-inch optical disks? 5.25-inch floppies? If you want to keep access to your digital data, keep it on a technology that your next computer will support, and copy it to a new technology when one appears. Keep several copies. And then there is formatting: stick to the lowest-tech form. Archive databases as CSV text files. Somewhere in the future, be prepared to convert all your photos from jpeg to kpeg or whatever comes next, and avoid manufacturer formats, which could disappear tomorrow.

      So think about whether your backups are for day-to-day convenience or true archiving. And plan accordingly. I have my grandfather’s photos on paper, but not many of my father’s on colour slide film, and very few of my own from the beginning of the digital age which I failed to copy from zip disks while I still had a computer that would read them. But I’m not crying over my lost photos the way I would over last month’s lost business data.

      And as for backing up open files, most decent backup software will handle open files, but to back up a live database you should use a specialist tool, usually provided with the database, that backs up a consistent snapshot plus transaction logs. Even then, the database design must anticipate backup and wrap changes inside transactions.
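
      (SQLite is a handy illustration: it ships an online-backup mechanism of exactly this kind, exposed in Python’s standard library since Python 3.7, which yields a consistent snapshot even while other connections are writing:)

      [code]
      import sqlite3

      src = sqlite3.connect("live.db")      # database in active use
      dst = sqlite3.connect("snapshot.db")  # consistent point-in-time copy
      with dst:
          src.backup(dst)                   # sqlite3's online-backup API
      dst.close()
      src.close()
      [/code]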

      Ian.

    • #1441076

      Which folder should I back up my documents from in Win 7, the Libraries My Documents or the User My Documents? I am the only user on this computer and they look like they hold the same info.

      • #1441081

        Which folder should I back up my documents from in Win 7, the Libraries My Documents or the User My Documents? I am the only user on this computer and they look like they hold the same info.

        Either, because yes, they’re the same. But backing up the library is probably better practice; then you can add any miscellaneous document folders to that library:

        “If you’ve been working in Windows 7 for a while and storing documents to the My Documents folder, when you open the Documents Library, you’ll see those documents.”
        Get to Know Windows 7 Libraries Inside and Out

        Bruce

        • #1441316

          I have a question regarding the method of USB-connected external drives for backups. I have several users set up with USB-connected external drives and weekly scheduled backups. It won’t help them in case of fire or flood, but it will handle the most common cases (hard drive failure or human error). But if some ransomware got into their machines, I’m guessing the backups would get encrypted as well. Is there a strategy to prevent this, e.g., a batch job to set the savesets to read-only?
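
          A sketch of that read-only idea in Python, with an honest caveat: the read-only flag stops accidental overwrites, but malware running with the user’s rights can simply clear it again, so this is a speed bump rather than a defense (the folder path is a placeholder):

          [code]
          import os, stat
          from pathlib import Path

          def protect_saveset(folder: Path) -> None:
              """Flag every finished backup file read-only.
              On Windows, os.chmod toggles the read-only attribute."""
              for p in folder.rglob("*"):
                  if p.is_file():
                      os.chmod(p, stat.S_IREAD)

          protect_saveset(Path("F:/Backups"))
          [/code]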

    • #1441627

      @Dr.Whom: re ransomware scrambling attached USB backup drive(s) as soon as you plug them into the infected PC.
      Very VERY good point.
      Suggestions?

    • #1442070

      The variation in the throughput of backups is quite spectacular. Clearly there are a number of significant variables involved.

      One variable that has not been mentioned is the cluster size (also called allocation unit size) of the logical volume. I have always formatted my large volumes with the maximum cluster size. At the cost of slight wastage of disk space (not relevant on large-capacity disks occupied by large files), a large cluster size means fewer physical chunks of data need to be read, and the file index (the NTFS MFT, perhaps) will be significantly smaller. How this actually affects throughput I don’t know, as it’s sure to depend on how the disk firmware actually reads data. No one has quoted cluster sizes.
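
      If anyone wants to check their own volumes, the cluster size can be read with a single Win32 call; a small Python sketch (Windows only):

      [code]
      import ctypes

      def cluster_size(root: str = "C:\\") -> int:
          spc = ctypes.c_ulong()    # sectors per cluster
          bps = ctypes.c_ulong()    # bytes per sector
          free = ctypes.c_ulong()
          total = ctypes.c_ulong()
          if not ctypes.windll.kernel32.GetDiskFreeSpaceW(
                  root, ctypes.byref(spc), ctypes.byref(bps),
                  ctypes.byref(free), ctypes.byref(total)):
              raise ctypes.WinError()
          return spc.value * bps.value   # allocation unit size in bytes

      print(cluster_size("C:\\"))        # typically 4096 on NTFS
      [/code]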

      Several rules seem to come through in this conversation:
      – Is a backup for short-term recovery of data or for long-term archival? For me it is both. For archival purposes, considerations of longevity (recording medium, recording format, care with proprietary formats) matter far more than speed/throughput.
      – I like my backups to be at file level, with the output in ZIP format. Then I can recover either single files or whole backups with equal ease, and I can do this long after the product that created the backup has ceased to operate.
      – How often do you take backups, and how long do you retain them? I tend to take backups daily and retain some of them for years. I have a presumption against incremental or differential formats, because I do not wish to be dependent on the backup utility’s restore mechanism.
      – Understand the situation concerning frequently open files (databases, email clients, Access).
      – Consider ease against risk. Yes, ransomware is a concern, but if backups require media to be specifically mounted for the backup, there is a real risk that backups won’t be taken. Ransomware can be largely defeated by practicing safe computing. Backups can be copied to remote/removable media as a separate process on a regular but not necessarily frequent cycle.
