• What kind of backup do you need?


    Author
    Topic
    #2484566

    Seeing the devastation in Florida reminds me that having key records stored someplace else is wise. But there are two things to always keep in mind: S
    [See the full post at: What kind of backup do you need?]

    Susan Bradley Patch Lady

    3 users thanked author for this post.
    Viewing 9 reply threads
    Author
    Replies
    • #2484577

      I still use Microsoft SyncToy to sync my Documents folder to another drive on my PC. That folder is then synced to the cloud via Sync.com.

      • #2484750

        I also image my PCs weekly, if not more often.  I keep a copy of the latest image, plus a copy of all files from the user account, on a separate fast read/write 256 GB USB drive.  My most sensitive documents are encrypted using 7-Zip.  Nothing is foolproof, but this works for me.

         

        1 user thanked author for this post.
    • #2484631

      Hardened Windows.  There are three copies of my data on two PCs, as well as one synced with OneDrive.

      Create a fresh drive image before making system changes/Windows updates, in case you need to start over!
      We all have our own reasons for doing the things that we do. We don't all have to do the same things.

      1 user thanked author for this post.
    • #2484633

      Ransomware cost US schools and colleges $3.5 billion in 2021. And some of those colleges have been forced to close.

      What assurance do we have that records stored on a cloud service – potentially exposed to ransomware – are safer than those saved on an external USB drive?

      Has anyone done a statistical analysis of the risk of losing data stored in the clouds?

      1 user thanked author for this post.
      • #2484635

        What assurance do we have that records stored on a cloud service – potentially exposed to ransomware – are safer than those saved on an external USB drive?

        In my experience anything that has online exposure is not secure.  For that reason I have a single copy of my data stored in OneDrive, three copies on my PCs, and two more copies stored on HDDs offline.  Ransomware is not a concern for me.

        Create a fresh drive image before making system changes/Windows updates, in case you need to start over!
        We all have our own reasons for doing the things that we do. We don't all have to do the same things.

        1 user thanked author for this post.
        • #2484636

          And remember when you say “online exposure” that’s also the local PC being exposed as well.  If you can go on the Internet on that device, it too is online.

          Susan Bradley Patch Lady

          1 user thanked author for this post.
          • #2484672

            And remember when you say “online exposure” that’s also the local PC being exposed as well.

            Of which I am well aware.  On the other hand, the two stored HDDs that contain multiple drive images are offline, and remain offline even when I am transferring my latest copies—I unplug my router from my switch.

            Create a fresh drive image before making system changes/Windows updates, in case you need to start over!
            We all have our own reasons for doing the things that we do. We don't all have to do the same things.

      • #2484637

        In my mind it all depends on how you set it up.

        Are appropriate permissions set on the info?

        Is it encrypted?

        Is two-factor authentication/good authentication set up?

        Just because it’s there doesn’t mean it can’t be made secure. As with on premises, it’s all in how you set it up.

        Ransomcloud Attacks in 2022: What to Know About Cloud Ransomware (symmetry-systems.com)

        Susan Bradley Patch Lady

      • #2484707

        What assurance do we have that records stored on a cloud service – potentially exposed to ransomware – are safer than those saved on an external USB drive?

        Ransomware detection notifies you when your OneDrive files have been attacked and guides you through the process of restoring your files.
        Ransomware detection and recovering your files

        If many of your OneDrive files were deleted, overwritten, corrupted, or infected by malware, you can restore your entire OneDrive to a previous time. Files Restore helps Microsoft 365 subscribers undo all the actions that occurred on both files and folders within the last 30 days.
        Restore your OneDrive

        SharePoint Online and OneDrive for Business Protection have built in features that help protect against ransomware attacks.
        Versioning: As versioning retains a minimum of 500 versions of a file by default and can be configured to retain more, if the ransomware edits and encrypts a file, a previous version of the file can be recovered.
        Recycle bin: If the ransomware creates a new encrypted copy of the file, and deletes the old file, customers have 93 days to restore it from the recycle bin.
        Preservation Hold library: Files stored in SharePoint or OneDrive sites can be retained by applying retention settings. When a document with versions is subject to retention settings, versions get copied to the Preservation Hold library and exist as a separate item. If a user suspects their files have been compromised, they can investigate file changes by reviewing the retained copy. File Restore can then be used to recover files within the last 30 days.
        How Microsoft mitigates risks from a ransomware attack

        Windows 11 Pro version 22H2 build 22621.898 + Microsoft 365/Edge

      • #2487424

        Re:  “a statistical analysis of the risk of losing data stored in the cloud”

         

        I honestly do not believe a robust statistical analysis is needed.

        Just search the Internet for “data breaches”, for example:

        Google “data breaches”

        finds 12 million results today!

    • #2484630

      Well, I think this will be my first post, so I’m no longer a lurker I guess!

      My syncing setup is a bit of a mish-mash built up over time, which I probably need to rationalise / sort out one day.

      I use SyncThing to auto-sync between our home devices (2 x laptops, 2 x mobile phones, 1 x PC, 1 x QNAP NAS). I use (paid) SyncBackPro to run manual backups for the things I don’t autosync to the NAS, and which I don’t need instant remote access to. Finally, I use (free) sync.com to sync some other things that I want access to on our mobile devices.

      I then sync from the NAS to (paid) Backblaze nightly using one of the NAS backup apps.

    • #2484680

      And then there are privacy considerations in the clouds.

      Do you understand what Microsoft can do with your data that is stored on OneDrive?

      I don’t.

      Take a look at Microsoft’s Privacy Statement at  https://privacy.microsoft.com/en-us/privacystatement

       

      1 user thanked author for this post.
      • #2484715

        Encrypt it before it gets there.  Then they can’t read it.

        Susan Bradley Patch Lady

        2 users thanked author for this post.
        • #2484720

          Ok, dumb questions – well, 2 of them. First, how do I encrypt/decrypt my data?  I have MS365 and use OneDrive – not for backup, just for online access to a few files.  I have backups on a USB drive (which I haven’t updated lately), a Macrium Reflect full backup to a 1 TB Seagate HD (which I back up monthly; that timeline is fine for me), and offsite backup to Carbonite, which is convenient but expensive. Carbonite also has online file access, but it’s not as easy to use as OneDrive.

          I like OneDrive; I know there are potential security issues, but that goes with any online service.  Carbonite expires 8/2023, and I would like to use OneDrive afterwards, but I have some of the same concerns expressed here. So, if I can encrypt/sync to OneDrive, keep the Macrium backup on the hard drive, and copy files to USB, that should be sufficient. I’d also save a copy of Windows and keep a list of apps to reload.  Macrium is a full backup – the other two are files/data only.

          Q2 – what is sync? Does it sync file updates/changes automatically from the PC to a cloud service – like Carbonite? Thanks! Donna

    • #2484709

      And then there are privacy considerations in the clouds.

      Do you understand what Microsoft can do with your data that is stored on OneDrive?

      I don’t.

      Take a look at Microsoft’s Privacy Statement at  https://privacy.microsoft.com/en-us/privacystatement

       

      I use Dropbox to back up vital documents to the “cloud”. My PC is backed up using 200 GB+ full boot drive image files that are locally stored.

      However, before adding a file to my Dropbox I always use 7-Zip to encrypt the file/folder with AES-256 encryption, so all that the “cloud” service can see is a “blob” of bits in that archive file. Privacy remains intact!

      If I need to retrieve the data from the cloud, I can then just open the archive on my PC using the password for the encrypted file.
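
      For anyone wanting to follow the same approach, the encrypt-before-upload step can also be done from the command line. A minimal sketch (the archive and folder names here are hypothetical; 7-Zip must be installed):

      ```bat
      rem Create an AES-256-encrypted .7z archive before copying it to the Dropbox folder.
      rem -p with no value makes 7-Zip prompt for the password; -mhe=on also encrypts file names.
      "C:\Program Files\7-Zip\7z.exe" a -p -mhe=on VitalDocs.7z "E:\Private\VitalDocs\*"
      ```

      The .7z format uses AES-256 by default when a password is supplied; without -mhe=on the file names inside the archive remain visible to anyone who can read the archive.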

    • #2484708

      SyncBackPro is a great tool.  Just works.  Have not updated it in many years.  Can back up to many different cloud companies along with FTP and network connections.

    • #2484762

      In answer to the question “What kind of backup do you need?” our reply is external drives.

      In the end, the need to store data in the cloud comes down to risk mitigation.

      In our case the risk of onsite data loss is low.

      • It is highly unlikely that we will be impacted by a solar storm.   https://www.reuters.com/article/column-power-solarflux-idINN1E76H0UV20110817
      • Our exposure to a storm surge is highly unlikely – living hundreds of miles from the ocean.
      • Flood damage is unlikely: we are located in a river valley but sit about 300 feet above flood elevation, basement flooding during heavy rain is limited by a sump pump, and the basement door is at ground level, allowing any flowing water to drain outside.
      • Nominal exposure to wind damage due to location in the river valley and nominal exposure to tornado damage.
      • Nominal exposure to forest fires, since over the last five years we have removed more than 25 dead ash trees (90-plus years old) from the property and replaced them with a mowed meadow.
      • All work stations are protected by high-end uninterruptible power supply systems.
      • Structures are fitted with smoke detectors and fire suppression systems.
      • All work stations are backed up daily to internal drives as well as weekly and monthly to external drives with some external drives stored offsite.
      • Located in an area with extremely low crime rate and real-time facility security system.
      • Our greatest risk is storm-related downed trees taking down power lines and blocking area roads, with the longest outage in the last 30 years lasting 10 days; such disruptions are now mitigated through the use of an onsite generator, an onsite well, and an ample food supply.

      So, for us, cloud storage, with its inherent weaknesses, offers little additional security.

      2 users thanked author for this post.
    • #2486661

      Terabyte Unlimited’s Image for Windows/Linux for disk images.

      Scooter Software’s Beyond Compare for comparing/syncing files and folders (containing volatile data) between images.

      Backups are stored offline.

       

       

      1 user thanked author for this post.
    • #2487382

      Our routine backup procedures make frequent use of 2 home-grown Windows BATCH programs:

      GET.bat

      and

      PUT.bat

      I don’t know the Forum policy about posting actual BATCH code here, so I’ll wait for Susan’s answer and provide a simple example instead.

      First of all, we maintain a very strict distinction between the C: system partition and all other NTFS drive letters on any one workstation.

      What follows are simplified examples of how we back up our various databases stored on drive letter E:, which is dedicated to archiving private data files, databases, drive images, etc.

      in Windows Command Prompt:

      E:

      cd \

      put belkin X

      is parsed to execute the following command line:

      xcopy belkin X:\belkin /s/e/v/d

      An optional extra argument allows “/l” to modify the behavior of XCOPY:

      put belkin X /l

      is parsed to execute the following command line:

      xcopy belkin X:\belkin /s/e/v/d/l

      The “/l” command line option prevents any changes and merely reports what WOULD happen withOUT “/l” on the XCOPY command line.

       

      GET.bat is the exact opposite of PUT.bat.

      get belkin X

      is parsed to execute the following command line:

      xcopy X:\belkin belkin /s/e/v/d

      likewise, “/l” can be added to the command line (as above):

      get belkin X /l

      is parsed to execute the following command line:

      xcopy X:\belkin belkin /s/e/v/d/l
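
      The actual PUT.bat/GET.bat code isn’t posted here, but based on the parsing described above, minimal sketches might look like this (hypothetical, not the author’s actual files):

      ```bat
      rem PUT.bat — sketch only, not the author's actual code
      rem Usage: put FOLDER DRIVE [/l]   e.g.  put belkin X /l
      rem %1 = folder name, %2 = target drive letter, %3 = optional /l (list only)
      xcopy %1 %2:\%1 /s/e/v/d%3
      ```

      ```bat
      rem GET.bat — sketch only: same arguments, opposite direction
      xcopy %2:\%1 %1 /s/e/v/d%3
      ```

      Note how appending the optional third argument directly after /d yields “/s/e/v/d/l”, matching the expanded command lines shown above.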

       

      Our website has become so large that we wrote a BATCH sequence that updates only a specific sub-folder, and does so for every partition where copies of that same sub-folder are maintained for extra redundancy.

      In the following example, assume copies of folders and their sub-folders are maintained on E:, G:, and I: as follows:

      E:\belkin\drivers

      G:\belkin\drivers

      I:\belkin\drivers

      The BATCH program we run to update all “drivers” sub-folders is DOSUB.bat, like this in Command Prompt:

      E:

      cd belkin

      dosub drivers

      The latter is parsed to execute the following command lines:

      xcopy drivers G:\belkin\drivers /s/e/v/d

      xcopy drivers I:\belkin\drivers /s/e/v/d

      And, of course “/l” can also be appended to the command line, same as above.
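
      Again, the real DOSUB.bat isn’t posted, but a sketch matching the example above (run from inside E:\belkin, with the G: and I: targets hard-coded as assumptions) could be as simple as:

      ```bat
      rem DOSUB.bat — sketch only; G: and I: target paths hard-coded as in the example
      rem Usage (from E:\belkin): dosub SUBFOLDER [/l]
      xcopy %1 G:\belkin\%1 /s/e/v/d%2
      xcopy %1 I:\belkin\%1 /s/e/v/d%2
      ```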

       

      One of the amazing and very useful features of XCOPY is its ability to treat network drive letters the same as local NTFS drive letters.   Thus, in the examples above, X: could be a network drive letter, just as easily as a local NTFS drive letter.

      Over time, these procedures have helped a lot to keep data and hardware independent of each other.

      If any one storage device should fail, we don’t cry over spilled milk because all data files “lost” with that failure are readily available on multiple NTFS drive letters that are still working AOK — either on the same workstation and/or on other nodes in our home lab LAN.

      Hope this helps.

      • #2487387

        Rather than become committed to any given third party’s philosophy for backups, we chose to write and maintain our own BATCH programs.

        This was very EZ for me, because I have prior professional experience doing lots of system development and database management.

        With a minimum of experience with XCOPY, such BATCH programs are easy to write and maintain on almost any Windows PC.

        On our home lab LAN nodes, a different super-simple BATCH sequence is designed to maintain copies of top-level folders ONLY, like this in Command Prompt:

        E:

        cd \

        bak belkin

        is parsed to execute the following command lines:

        xcopy belkin G:\belkin /s/e/v/d

        xcopy belkin I:\belkin /s/e/v/d

        The one restriction I impose upon myself is to require all top-level folder names to have NO EMBEDDED BLANK SPACES.

         

        I’ll repeat the essential code in BAK.bat here, because it is so trivial; it only takes a minute or so to write and test this BATCH code, using NTFS drive letters that are available on your computers, e.g. an external USB enclosure and/or a network drive letter:

        rem BAK.bat

        E:

        cd \

        xcopy %1 G:\%1 /s/e/v/d%2

        xcopy %1 I:\%1 /s/e/v/d%2

      • #2487462

        These custom BATCH files were also intended to minimize wear on host SSDs.

        A separate discovery we made with XCOPY helped us confirm that the “/l” command line option had the positive effects of improving overall performance and minimizing wear.

        Compare A and B now:

        A:

        xcopy website x:\website /s/e/v/d

        B:

        xcopy website x:\website /s/e/v/d/l

        xcopy website x:\website /s/e/v/d

         

        The absence of the “/l” command line option appears to require NTFS to enable a file-level WRITE LOCK on every discrete file it encounters.

        WRITE LOCKs are standard in database management systems, to prevent collisions between 2 or more concurrent writers.

        Because the “/l” command line option only reads and reports – copying nothing – multiple readers are supported and there is no need to enable a WRITE LOCK on every discrete file.

        Moreover, the first command line in B above also reads file attribute entries into memory.

        Then, the second command line in B can access those attribute entries that are memory-resident, further increasing the overall speed of B, as compared to the overall speed of A — EVEN THOUGH A has only one command line and B has two command lines.

        Bottom Line:  SSD wear is reduced over time by making B the default whenever updating multiple archived copies of any top-level folder, e.g. “website.mirror” (cf. DOSUB.bat discussed above).
