Basically title. I'm in the process of setting up a proper backup for my configured containers on Unraid, and I'm wondering how often I should run my backup script. Right now I have a cron job set to run on Monday and Friday nights; is this too frequent? What's your schedule, and do you strictly back up your appdata (container configs), or is there other data you include in your backups?
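For reference, a twice-weekly schedule like that is a single crontab entry; the script path below is hypothetical:

```shell
# Run the backup script at 01:30 on Mondays (1) and Fridays (5)
30 1 * * 1,5 /boot/config/scripts/backup_appdata.sh
```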

  • battlesheep · 2 months ago

    I back up all of my Proxmox LXCs/VMs to a Proxmox Backup Server every night, plus sync these backups to another PBS in another town. A second Proxmox backup runs every noon to my NAS. (I know, the 3-2-1 rule isn't quite reached…)

  • @Darkassassin07@lemmy.ca · 3 months ago (edited)

    I run Borg nightly, backing up the majority of the data on my boot disk, including Docker volumes and config, plus a few extra folders.

    Each individual archive is around 550 GB, but because of the de-duplication and compression it's only ~800 MB of new data each day, taking around 3 minutes to complete the backup.

    Borg's de-duplication is honestly incredible. I keep 7 daily backups, 3 weekly, 11 monthly, then one for each year beyond that. The 21 historical backups I have right now would be 10.98 TB of raw data. After de-duplication and compression they only take up 407.98 GB on disk.

    With that kind of space savings, I see no reason not to keep such frequent backups. Hell, the whole archive takes up less space than one copy of the original data.
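That retention policy maps directly onto Borg's prune flags; a minimal nightly sketch, with hypothetical repo and source paths (a negative keep value means "keep all"):

```shell
#!/bin/sh
# Hypothetical repo location and source dirs -- adjust to your setup
export BORG_REPO=/mnt/backup/borg

# Create tonight's archive; zstd compression, stats printed on completion
borg create --stats --compression zstd \
    ::'{hostname}-{now:%Y-%m-%d}' \
    /var/lib/docker/volumes /etc /home

# Apply the retention policy: 7 daily, 3 weekly, 11 monthly, one per year forever
borg prune --keep-daily 7 --keep-weekly 3 --keep-monthly 11 --keep-yearly -1

# Reclaim the space freed by prune (borg >= 1.2)
borg compact
```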

    • @FryAndBender@lemmy.world · 2 months ago (edited)

      +1 for borg


                          Original size    Compressed size    Deduplicated size
      This archive:           602.47 GB          569.64 GB             15.68 MB
      All archives:            16.33 TB           15.45 TB            607.71 GB

                          Unique chunks       Total chunks
      Chunk index:              2703719           18695670

    • Sips' (OP) · 3 months ago

      Thanks for sharing the details on this, very interesting!

  • @Xanza@lemm.ee · 3 months ago

    I continuously back up important files/configurations to my NAS. That's about it.

    IMO people who mirror/back up their media are insane… It's such an incredible waste of space. Having a robust media library is nice, but there's no reason you can't just start over if you have data corruption or something. I have TB upon TB of media that I could redownload in a weekend if something happened (if I even wanted to). No reason to waste backup space, IMO.

    • @groet@feddit.org · 2 months ago

      It becomes a whole different thing when you yourself are a creator of any kind. Sure, you can re-torrent TBs of movies, but you can't retake that video from 3 years ago. I have about 2 TB of photos I took; I classify that as media.

      • @Xanza@lemm.ee · 2 months ago

        It becomes a whole different thing when you yourself are a creator of any kind.

        Clearly this isn’t the type of media I was referencing…

    • @Appoxo@lemmy.dbzer0.com · 2 months ago

      Maybe for common stuff, but some don't want 720p YTS or YIFY releases.
      There are also some releases that don't follow TVDB aired order (which Sonarr requires), and matching 500 episodes manually with deviating names isn't exactly what I'd call 'fun time'.
      And there are also rare releases that just aren't seeded anymore in that specific quality, or present on Usenet.

      So yes: backing up some media files may be important.

      • @Xanza@lemm.ee · 2 months ago

        Data hoarding random bullshit will never make sense to me. You're literally paying to keep media you didn't pay for, because you need the 4K version of Guardians of the Galaxy 3 even though it was a shit movie…

        Grab the YIFY; if it's good, then get the 2160p version… No reason to data-hoard like that. It's frankly just stupid considering you're paying to store this media.

        • @Appoxo@lemmy.dbzer0.com · 2 months ago

          This may work for you, and please continue doing that.

          But I'll get the 1080p, moderate-bitrate version of whatever I actually want in the first place, rather than grabbing whatever I can to fill up my disk.

          And as I mentioned: matching 500 episodes (e.g. Looney Tunes and Disney shorts) manually isn't fun.
          Much less if you also want the exact release (for example with music) of a certain piece of media and need to play detective on MusicBrainz.

          • @Xanza@lemm.ee · 2 months ago

            Matching 500 episodes (e.g. Looney Tunes and Disney shorts) manually isn't fun.

            With tools like TinyMediaManager, why in the absolute fuck would you do it manually?

            At this point, it sounds like you're just bad at media management more than anything. A 1080p h265 video is at most 1.5-2 GB. That means with even a modest network connection (500 Mbps, let's say) you can realistically download about 5 TB of data over 24 hours… You could redownload your entire media library in 4-5 days if you wanted to.
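The back-of-the-envelope math holds up; a quick sketch (assuming the full 500 Mbit/s is sustained and ignoring protocol overhead):

```shell
awk 'BEGIN {
  mbps = 500                               # assumed link speed, megabits/s
  bytes_per_day = mbps / 8 * 1e6 * 86400   # bytes/s times seconds per day
  printf "%.1f TB per day\n", bytes_per_day / 1e12
}'
# prints: 5.4 TB per day
```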

            So why spend ~$700 on two 20 TB drives, one used purely for redundancy, when you can simply redownload everything you previously had (if you wanted to) for free? It'll just take a little bit of time.

            Complete waste of money.

            • @Appoxo@lemmy.dbzer0.com · 2 months ago (edited)

              I prefer Sonarr for management.
              The problem is the auto-matching: it just doesn't always work.
              Practical example: Looney.Tunes.and.Merrie.Melodies.HQ.Project.v2022

              Some episodes are either not in the correct order, or their names deviate from how TVDB sorts them.
              Your best regex/auto-matching can do nothing about it if Looney.Tunes.Shorts.S11.E59.The.Hare.In.Trouble.mkv should actually be named Looney.Tunes.Shorts.S1959.E11.The.Hare.In.A.Pickle.mkv to be automatically imported.

              At some point, fixing multiple bad matches becomes so tedious that it's easier to just clear all auto-matches and start fresh.

  • @truxnell@infosec.pub · 3 months ago

    Daily backups. Currently using restic on my NixOS servers. To avoid data corruption, I take a ZFS snapshot at 2 am, and after that restic backs up my mutable data dirs both to my local NAS and to Cloudflare R2. The NAS backup folder is synced to Backblaze nightly as well, as more of a cold store.
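The snapshot-then-backup pattern might look roughly like this (dataset name, paths, and bucket are all hypothetical):

```shell
#!/bin/sh
# Snapshot first so restic reads a frozen, consistent view of the data
zfs snapshot tank/appdata@restic-nightly

# ZFS exposes read-only snapshots under the hidden .zfs/snapshot directory
SRC=/tank/appdata/.zfs/snapshot/restic-nightly

# Back up to two repositories: local NAS and an S3-compatible bucket (R2)
restic -r /mnt/nas/restic-repo backup "$SRC"
restic -r s3:https://<account-id>.r2.cloudflarestorage.com/backups backup "$SRC"

# Drop the snapshot once both backups succeed
zfs destroy tank/appdata@restic-nightly
```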

  • itsame · 3 months ago

    Using Kopia, backups are made multiple times per day to Google Drive. Only changes are transferred.

    Configurations are backed up once per week, manually, and stored for 4 weeks. Website and Nextcloud data is backed up every hour and stored for a year (although I've only been doing this for 7 months).

    Kopia is magic, recommended!
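Per-path retention like that is handled by Kopia's policy system; a sketch with hypothetical paths:

```shell
# Hourly snapshots of web/Nextcloud data, kept for roughly a year (24 * 365)
kopia policy set /srv/nextcloud --keep-hourly 8760

# Weekly config backups, kept for 4 weeks
kopia policy set /etc/configs --keep-weekly 4

# Take a snapshot; only changed blocks are uploaded to the repository
kopia snapshot create /srv/nextcloud
```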

    • @IsoKiero@sopuli.xyz · 3 months ago

      Yep. Even if the data I'm backing up doesn't really change that often. Perhaps I should start backing up files from my laptop and workstation too. Nothing too important is stored only on those devices, but reinstalling and reconfiguring everything is a bit of a chore.

  • @zero_gravitas@aussie.zone · 2 months ago (edited)

    Right now, I have a cron job set to run on Monday and Friday nights, is this too frequent?

    Only you can answer this. How many days of data are you prepared to lose? What are the downsides of running your backup scripts more frequently?

  • Scrubbles · 3 months ago

    Boils down to how much you're willing to lose. Personally, I do weekly.

  • slax · 2 months ago

    I have:

    • Unraid backing up its USB flash drive
    • Appdata backed up weekly by a Community Applications plugin (CA Appdata Backup), which I then copy with rclone to an old Box account (100 GB for life…). I did have it encrypted, but it seems I need to fix that…
    • A parity drive in my Unraid array (8 TB)
    • I'm trying to work out how to use rclone to back up my photos to Proton Drive, so that's next.

    Music and media aren't too important yet, but I would love some insight.
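For the Proton Drive step, rclone ships a protondrive backend; a hypothetical sketch (remote name and paths are assumptions):

```shell
# One-time setup: create a remote named "proton" interactively,
# choosing the "protondrive" backend and signing in
rclone config

# Dry run first to see what would transfer, then sync for real
rclone sync /mnt/user/photos proton:photos --dry-run
rclone sync /mnt/user/photos proton:photos --progress
```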