  • Storing a system image on One Drive

    Posted by dmt_3904 on the AskWoody Lounge


    This topic contains 19 replies, has 12 voices, and was last updated by  jabeattyauditor 1 month ago.

    • Author
      Posts
    • #1997900 Reply

      dmt_3904
      AskWoody Plus

      I’m thinking of creating and storing a system image on OneDrive, in my Personal Vault, which has 2FA – as opposed to storing it on an external hard drive, which I can do, but hardware fails. I believe OneDrive Personal Vault is secure, and security would be my primary concern. Are there any other concerns I’m not thinking of? I’d like to know what you think – good idea or bad? Thanks.

       

    • #1997904 Reply

      PKCano
      Da Boss

      How much space do you have on OneDrive? Those images can be quite large (many GBs).

      1 user thanked author for this post.
    • #1997914 Reply

      joep517
      AskWoody MVP

      IMO, not a good idea. System images are large files (multi-GB) and will take an extremely long time to upload.

      Then, how do you propose to use the file if you need to restore a system? Yes, an external hard drive can fail but you should test the image after creating it.

      --Joe

      1 user thanked author for this post.
    • #1997949 Reply

      anonymous

      I think a few good quality USB flash drives and a small fire safe (to store them in) is a much better option. It keeps everything local and within your full control with zero reliance on an Internet connection.

    • #1997977 Reply

      anonymous

      Maybe not bad as a secondary, if-all-else-fails type of backup. You have to consider that during upload and download you could potentially get file corruption.

    • #1998011 Reply

      TechTango
      AskWoody Plus

      The current file size limit on OneDrive appears to be 15 GB for both personal and business accounts, including “vault” files. My ShadowProtect image files are around 25 GB, and there is a lot of stuff I don’t keep on the OS drive.

      Unless your image backup is made with the Windows built-in app, most imaging software makers have cloud services available that would eliminate the file size restrictions.

    • #1998037 Reply

      JohnW
      AskWoody Plus

      I would think local drives would be the best solution, especially if you rotate a couple of drives to cover the hardware failure aspect.

      That eliminates the internet issues with multi-GB image files, and the need to place the backup image onto a local drive before attempting any restore process.

      Ideally you would secure one copy of your image and place it offsite in a secure location, such as at a friend or family’s home, your office, or safe deposit box. This would cover you in the event of a fire, flood, theft, etc.

       

    • #1998136 Reply

      Paul T
      AskWoody MVP

      The chance of your external disk failing at the same time as you need to restore is extremely small. You’re more likely to lose both in a burglary or fire.
      If you are feeling extra paranoid, a second copy of the external disk stored off site would be better, methinks.

      cheers, Paul

      1 user thanked author for this post.
    • #1998584 Reply

      mn–
      AskWoody Lounger

      IMO, not a good idea.

      That depends.

      I can see several possible use cases for this, none very common, but once in a while you do run into the uncommon situations. There are even a couple of scenarios where I actually would recommend doing this, but not very many of those.

      (They all involve having ‘net access but no access to the physical backup media when needing to do the restore, for various reasons.)

      System images are large files (multi-GB) and will take an extremely long time to upload.

      … that’s just a matter of planning.

      Now, the right way to do this is to use a multi-file image. Say a Clonezilla image with fragment size set to something reasonable like 500 MB, or one of the others that do split images.

      Yes, it’ll take time to transfer, but individual file size restriction isn’t a problem that way and the sync system will do a reasonable job of it.

      Oh, I’d also password-protect the image if your chosen tool has that capability. (Clonezilla does have it.)
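      The split/reassemble idea isn’t Clonezilla-specific. As a rough sketch (the file naming and chunk size here are illustrative, not Clonezilla’s actual on-disk format), it amounts to something like:

      ```python
      import os

      CHUNK_SIZE = 500 * 1024 * 1024  # 500 MB fragments, as suggested above

      def split_image(path, chunk_size=CHUNK_SIZE):
          """Split a large image file into numbered fragments for cloud sync."""
          parts = []
          with open(path, "rb") as src:
              index = 0
              while True:
                  chunk = src.read(chunk_size)
                  if not chunk:
                      break
                  part_name = f"{path}.part{index:03d}"
                  with open(part_name, "wb") as dst:
                      dst.write(chunk)
                  parts.append(part_name)
                  index += 1
          return parts

      def join_image(parts, out_path):
          """Reassemble the fragments into the original image before restoring."""
          with open(out_path, "wb") as dst:
              for part in parts:
                  with open(part, "rb") as src:
                      dst.write(src.read())
      ```

      The point is that the sync client only ever has to retry a single 500 MB fragment after a dropped connection, rather than a whole multi-GB file.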

      Then, how do you propose to use the file if you need to restore a system? Yes, an external hard drive can fail but you should test the image after creating it.

      Well of course you’ll need to download it all to local media before doing the restore, which can be its own headache… but possible.

      And, you need to have a copy of the necessary tools too. Beware of the chicken-and-egg problem where you’d need to extract a copy of the tools from inside the image to be able to restore the image.

    • #2003467 Reply

      John
      AskWoody Plus

      For a solid image and/or file backup strategy, here’s one:

      1. Back up weekly to a dedicated 2TB USB drive which is used only for image backups. After the image backup completes, disconnect the power supply to the USB drive, which isolates the drive from ransomware attacks.
      2. Then periodically (once a quarter or once a year), copy the most recent image backup stored on the external USB drive to a flash drive. Take that flash drive offsite to a bank safe deposit box or some other external location. That helps you out in case fire, earthquake, theft, etc. destroys your computer.
      • This reply was modified 1 month ago by  Microfix.
      • This reply was modified 1 month ago by  John.
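      As a rough illustration of step 2’s schedule, a small script could flag when the offsite copy is overdue (the path and the 90-day interval below are hypothetical, matching “once a quarter”):

      ```python
      import os
      import time

      OFFSITE_INTERVAL_DAYS = 90  # "once a quarter" from step 2 above

      def offsite_copy_due(offsite_image_path, interval_days=OFFSITE_INTERVAL_DAYS):
          """Return True if the offsite copy is missing or older than the interval."""
          if not os.path.exists(offsite_image_path):
              return True
          age_days = (time.time() - os.path.getmtime(offsite_image_path)) / 86400
          return age_days > interval_days
      ```

      Run at login or from the task scheduler, it’s a cheap reminder that the safe-deposit-box copy has gone stale.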
      • #2003693 Reply

        JohnW
        AskWoody Plus

        Regarding the frequency of #2, I would recommend doing it more often than quarterly or yearly. But this would be an individual requirement based on how often there are significant changes to the OS, applications, and/or hardware.

        Obviously, user data and docs can be easily backed up to a cloud storage service on a daily basis, so there’s really no need to haul them offsite to protect them.

    • #2003589 Reply

      AlexEiffel
      AskWoody_MVP

      Just a word of caution regarding flash drives or SSDs. I think that depending on the type of memory cell, they can degrade quite fast when left unplugged for a while, so to me using a mechanical hard disk seems like a better idea if you are going to leave the unit in storage for many months.

       

      1 user thanked author for this post.
      • #2003686 Reply

        JohnW
        AskWoody Plus

        That is a very interesting point about flash/SSD that I don’t believe I have seen much discussion about!

        I do use 1TB and 2TB external spinning drives for my image backups and other archived data. Mostly due to the cost factor of price/GB.

        I have also thought that for long term archival of static data, you should rotate/copy to new media occasionally to prevent having a bunch of old drives that you can no longer read someday. 🙁

        • #2003888 Reply

          mn–
          AskWoody Lounger

          That is a very interesting point about flash/SSD that I don’t believe I have seen much discussion about!

          I *have* mentioned that old-style SSDs had integrated backup batteries… because their data retention while powered off was even shorter. Some were really SSHDs, with integrated spinning HDD for long-term retention… and enough battery to commit all data to that.

          That was back when one of those with 16 GB capacity had a 6-digit price tag in dollars and required its own rackmount enclosure.

          I have also thought that for long term archival of static data, you should rotate/copy to new media occasionally to prevent having a bunch of old drives that you can no longer read someday.

          Exactly – this is a very widely applicable general rule, and the interval varies by technology.

          Or media production batch in some cases – some writable CD/DVD media would go bad in less than 3 years from writing.

          1 user thanked author for this post.
        • #2004075 Reply

          AlexEiffel
          AskWoody_MVP

          The whole topic of long term data retention is very interesting to me and I have a hard time finding complete reliable information about it.

          For example, I wanted to maybe use 2 SSDs in a mirror instead of HDs on my next station using ReFS, but I am not sure I understand clearly what happens if one drive starts to have issues when those are SSDs. Also, is the data periodically refreshed (rewritten) to prevent the cells discharging, just by the drive being plugged in or active, and does the OS have anything to do with it? Maybe ReFS detects the issues with SSD cells that haven’t been written to in a long time and rewrites them on the fly when it detects inconsistencies during a read, thus making an SSD-only setup viable. At this point, after just a quick look into the subject, I don’t know if an SSD-only setup with ReFS is a good idea vs. mechanical HDs, but I was really looking forward to eliminating mechanical HDs and their sometimes very slow wake-up time from sleep. Maybe I should just set up that NAS and leave the data there, but I liked the idea of an autonomous station, because convincing people to have a separate NAS unit is not easy.

          My priority is long-term data storage, in and outside the PC. I have a lot of static data in the PC that I never rewrite. I like to use some PCs for 10-15 years when possible, and I don’t mind having to replace a failing drive unit as long as the data remains safe. It is not clear to me what happens when a ReFS drive fails. Can you read the other drive by itself to copy the data, or do you need to add a new drive first to rebuild? Can you rebuild on another station if the original one fails? I once moved a drive to a different station and it was seen as raw, so unreadable, but maybe that was because of a different ReFS version…

          I used LTOs and DAT, but the problem is you need a backward-compatible unit that is still working to read them, so time ends up destroying everything at some point. You have good media, but you can’t read it.

          M-discs seem a pretty good solution, but the capacity is not great. Also, the future for Blu-ray readers doesn’t seem very bright.

          1 user thanked author for this post.
          • #2004094 Reply

            JohnW
            AskWoody Plus

            I image my system SSD daily, and my two internal SATA data partitions weekly, onto a pair of spinning USB drives. So I know that data is refreshed on a regular basis.

            It’s the old 1TB spinning drives that I keep archived offline, containing old digital photo archives, and backups from old PCs where I did not migrate all of the existing data to the new systems. Plus a few images from PCs that have been retired.

            I wonder how long it’s safe to keep them lying around. A couple of them are actually mirrors (copies), so if one of them fails, I am still good.

    • #2004668 Reply

      Paul T
      AskWoody MVP

      Lost data (bit rot) can only be detected by regular checks, meaning your data needs to be on a live system. Scheduled data scrubbing is the usual method.

      You could consider ZFS instead of ReFS. It’s standard on Ubuntu, FreeNAS, XigmaNAS etc. Then you don’t have to rely on MS and Windows.

      cheers, Paul

      1 user thanked author for this post.
      • #2004721 Reply

        mn–
        AskWoody Lounger

        SSDs come in many kinds; some of them do keep better if power is on… others don’t.

        Lost data (bit rot) can only be detected by regular checks,

        Well, it’d be more accurate to say that it can only be detected after the fact, but it would usually be gradual, so detecting it while only very little has rotted is beneficial. While you still have other copies of it 😉

        Bitrot detection on a single disk is the best that many traditional systems could do, even with a checksum-enabled RAID/mirror, but modern systems may keep checksum-enabled copies of blocks or files on a single disk, so they might even be able to recover from it. ZFS and Btrfs can, at least – ReFS has checksums, but I’m not sure whether it keeps multiple copies per device.
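        The detect-by-checksum idea is generic – this is not ZFS/ReFS code, just a sketch of how a stored hash manifest catches a flipped bit during a scheduled scrub:

        ```python
        import hashlib

        def file_digest(path):
            """SHA-256 of a file, read in 1 MB blocks so large images don't fill RAM."""
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for block in iter(lambda: f.read(1 << 20), b""):
                    h.update(block)
            return h.hexdigest()

        def scrub(manifest):
            """Compare current digests against a stored manifest; return changed files."""
            return [path for path, expected in manifest.items()
                    if file_digest(path) != expected]
        ```

        Build the manifest when you write the archive, store it alongside the backups, and any non-empty result from a scrub means it’s time to restore that file from another copy.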

        1 user thanked author for this post.
      • #2005009 Reply

        wavy
        AskWoody Plus

        Also open source vs MS proprietary and

        There are no tools to repair or recover a ReFS filesystem. Third party tools are dependent on reverse engineering the system and (as of 2014) few of these exist.

        https://en.wikipedia.org/wiki/ReFS#Stability_and_known_problems

        🍻

        Just because you don't know where you are going doesn't mean any road will get you there.
        • #2005010 Reply

          jabeattyauditor
          AskWoody Lounger

          In 2014, there were no repair or recovery tools for ReFS.

          We’re nearing the end of 2019; we have third-party tools in addition to those provided by Microsoft.

          When NTFS was first introduced, there were no third-party tools for it either.

          1 user thanked author for this post.

