Upgrading a self-hosted server (1)

Welcome

Hi, I’m starting a series of posts that will follow the upgrades I’ll be doing to a self-hosted machine that serves as a NAS and also runs all kinds of self-hosted software. I’m lazy, so it will probably take time; don’t expect me to post too often.

About me: I’ve been using Linux exclusively for personal use (both desktop and servers) for about 20 years now. I’ve used several distributions over the years, built my own stuff from source (including kernels), and done Linux From Scratch. I’m not a Linux expert or professional sysadmin, but I know my way around it, and I can learn what I don’t know. So don’t be afraid to make any suggestions, no matter how complicated.

The current state of the machine

  • It’s a PC with an i5-7400 CPU; its built-in GPU supports H.264 hardware encoding and MPEG-2, VP8, VP9 and HEVC hardware decoding (this will come in handy for video transcoding; see the quick checks after this list).
  • Only 4 GB of RAM; I have ordered a 2x16 GB dual-channel kit.
  • The system drive is a 32 GB Transcend M.2 SSD. It’s SATA rather than PCIe, unfortunately, but it will do fine for the time being.
  • The OS is Ubuntu Server 16.04 LTS using Expanded Security Maintenance for updates.
  • It’s currently running SSH, NFS, Samba, CUPS, OpenVPN, Emby and Deluge on bare metal. Some of them come from distro packages, some from binary releases straight from the developer.
  • There are 6 HDDs forming 3 pairs of RAID 1 arrays. 6 drives was a limit I chose from the beginning, and the case and motherboard were chosen accordingly (cage for 6 drives and 6 SATA connectors).
  • My ISP provides a public dynamic IP and allows port forwards.
  • I have a router that I’ve recently upgraded to the latest OpenWrt, so it also runs Linux, can install packages, has a web admin interface and can do some interesting stuff.
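
For reference, a few quick commands to verify the hardware and RAID details above (assuming the RAID 1 pairs are mdadm software arrays):

sudo apt install vainfo   # VA-API tool that lists the codec profiles the iGPU can handle
vainfo
free -h                   # confirm the RAM situation
cat /proc/mdstat          # status of the mdadm arrays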

What I’d like to do

  • Increase the RAM to 32 GB.
  • Stick with a Linux distro, as opposed to a NAS-tailored OS, Unraid etc.
  • Install Debian Stable on an SSD, most likely via debootstrap from the Ubuntu system.
  • Add a GRUB menu entry that passes through to the other system, so I can keep them both around for a while (a rough sketch of the debootstrap/GRUB steps follows this list).
  • Use docker-compose and possibly Portainer for as many of the services as makes sense (see the compose sketch after this list). I’m not sure it’s worth bothering to make containers for things like SSH, NFS and Samba.
  • Add more services. I’d like to try Jellyfin, NextCloud and other stuff (trying to degoogle for example).
  • I’d like to find a better solution for accessing services from outside the LAN. I’m currently using OpenVPN, which is nice for individual devices but gets complicated when you want an entire remote LAN to have access (to let smart TVs or a Chromecast use Emby/Jellyfin, for example). I’m hoping Authelia + a reverse proxy will be able to help with this (a rough sketch of the kind of rule I mean follows this list).
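
For the debootstrap/GRUB items, the rough shape of what I have in mind is below; the device name, mount point and suite are placeholders, not a tested recipe:

sudo apt install debootstrap
sudo mkfs.ext4 /dev/sdb1            # the SSD partition the new Debian will live on
sudo mkdir -p /mnt/debian
sudo mount /dev/sdb1 /mnt/debian
sudo debootstrap stable /mnt/debian http://deb.debian.org/debian

# chroot in to finish the basics (fstab, hostname, kernel, users, sshd...)
for d in dev proc sys; do sudo mount --bind /$d /mnt/debian/$d; done
sudo chroot /mnt/debian /bin/bash

# afterwards, back on Ubuntu, let GRUB detect the second install and add a menu entry for it
sudo update-grub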
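
For the docker-compose item, a hypothetical starting point for one service (Jellyfin here; image, ports and paths are assumptions, not a tested config):

mkdir -p ~/stacks/jellyfin && cd ~/stacks/jellyfin
cat > docker-compose.yml <<'EOF'
version: "3"
services:
  jellyfin:
    image: jellyfin/jellyfin
    ports:
      - "8096:8096"
    volumes:
      - ./config:/config
      - /mnt/media:/media:ro
    devices:
      - /dev/dri:/dev/dri    # expose the iGPU for hardware transcoding
    restart: unless-stopped
EOF
docker-compose up -d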
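
And for the Authelia + reverse proxy idea, the building block I’m counting on is its access_control section, which can match source networks; the schema below is from memory and the domain/CIDR are made up, so it would need checking against the docs (the time-limited part isn’t shown, that’s still an open question):

# just printing the shape of the configuration.yml excerpt, not applying it anywhere
cat <<'EOF'
access_control:
  default_policy: deny
  rules:
    - domain: "emby.example.com"
      policy: bypass            # no login prompt for requests matching this rule
      networks:
        - 203.0.113.45/32       # the remote LAN's public IP
EOF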

What I’m not interested in

  • Not interested in using Plex. I’ve used it for a couple of years and it’s a fine piece of software, but I don’t like that they now mandate access through their server and that they inject ads.
  • Not interested in changing the filesystem or the RAID setup for the HDDs. RAID 1 pairs give me enough redundancy. The HDD upgrades are very simple. I’m fine with losing 50% of capacity.

Any and all suggestions and comments are welcome! Even if they’re about things I said I’m not interested in. It’s always possible there are things I haven’t considered.

  • vegetaaaaaaa@lemmy.world

    For SSO I use old-school LDAP (OpenLDAP) because it is mature and integrates with anything (reverse proxies, most web applications, various file sharing/VoIP services…).

    As a general recommendation, I suggest using some kind of config management tool for your setup: it makes it easy to replicate the setup (in case it goes down), bring up/tear down test environments, store and version your configuration, and test and roll back changes… I use Ansible [1] for this, as it can manage any kind of infra or deployment method (bare metal, VM/VPS, container-based…). I’m currently managing a few dozen servers with it.
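
    To give an idea of the workflow, the smallest possible example looks something like this (hostname and playbook name are placeholders):

      sudo apt install ansible
      # ad-hoc connectivity check; the trailing comma turns the host into an inline inventory
      ansible all -i 'nas.example.lan,' -m ping
      # the real configuration lives in playbooks/roles and gets applied with something like
      ansible-playbook -i inventory.ini site.yml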

  • binwiederhier@discuss.ntfy.sh

    Install Debian Stable on an SSD, most likely via debootstrap from the Ubuntu system

    What an interesting way to install a new system. I’ve only ever done that for image building purposes. Why would you do that instead of just installing it from a flash drive?

    Also: it sounds like you’re manually installing things. I would suggest Ansible or something similar, so that reinstalling isn’t so brittle and manual.

    • lemmyvoreOP

      What an interesting way to install a new system. I’ve only ever done that for image building purposes. Why would you do that instead of just installing it from a flash drive?

      It minimizes downtime for the system. You can run the debootstrapped system as a guest and take your time with the initial setup and configuration.
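
      One way to do that, for example, is to boot the debootstrapped directory with systemd-nspawn; the path below is just where I happened to put it:

        sudo apt install systemd-container   # provides systemd-nspawn
        sudo systemd-nspawn -b -D /mnt/debian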

      Interesting tidbit, I don’t use actual flash drives anymore: I download an ISO to my phone and use the DriveDroid app to make the phone look like a bootable flash drive to the PC.

      it sounds like you’re manually installing things. I would suggest Ansible or something similar, so that reinstalling isn’t so brittle and manual.

      If you mean the original install, that’s something that only happens once every 5-10 years (and that only because I got taken with trying other distros instead of sticking with Debian continuously since 2003, like I should have).

      I’m not sure if I understand Ansible correctly, but attempting to replicate a system install 5-10 years later probably won’t yield the same result, the repos having moved on and so forth.

      It’s also going to be a very basic Debian system + docker-compose, not a big chore.

      If you mean for recovery purposes, I take periodic system snapshots and can restore from those.

  • LanyrdSkynrd@lemmy.world

    If you’re going to try Authelia and a reverse proxy, I recommend using SWAG. It’s a Docker container that bundles nginx, fail2ban and geoip restrictions, integrates with Authelia, and has premade config files for most of the self-hosted software people run. The config files are especially useful since they include comments describing the settings you need to change within the services themselves, like the external domain in Emby for example.

    https://docs.linuxserver.io/images/docker-swag
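
    If it helps, spinning it up looks roughly like this; the domain, paths and validation method are placeholders, and the linked docs have the authoritative list of options:

      docker run -d --name=swag \
        --cap-add=NET_ADMIN \
        -e PUID=1000 -e PGID=1000 -e TZ=Etc/UTC \
        -e URL=example.com -e SUBDOMAINS=emby,nextcloud -e VALIDATION=http \
        -p 443:443 -p 80:80 \
        -v /opt/swag/config:/config \
        --restart unless-stopped \
        lscr.io/linuxserver/swag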

    • Lasso1971@thelemmy.club

      Do you think there’s any advantage to using SSO if all your external-facing services already have built-in 2FA (e.g. Nextcloud)? I use Vaultwarden, so it’s not like any passwords need to be remembered. It just seems like extra setup.

      • lemmyvoreOP

        What attracts me to Authelia is the ability to whitelist an IP for a limited time (2-3 hours) so everything in the LAN behind that IP can access, say, Emby. A person over there on their WiFi logs into Authelia and that’s it; they can stream my Emby to their Chromecast wherever they are…

      • LanyrdSkynrd@lemmy.world

        I think SSO is less important than having everything behind the reverse proxy. The importance of the proxy is that if there is a security hole in the web server component of your service, it cannot be exploited without a second flaw in the proxy. It’s an additional layer of abstraction and security that doesn’t add a ton of overhead.

        An attacker would have to find an exploit in nginx, which is used by most of the big tech companies, so it is well secured compared to the services many of us selfhost.

        Another advantage of using SWAG is being able to use fail2ban and geoip restrictions. Any ports open to the IPv4 internet get scanned by security services and malicious actors many times each day. It’s nice to be able to have nginx refuse connections from any of them that repeatedly fail to log in, or that come from outside your geographic region.
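
        If you want to see that in action you can peek at the jails SWAG’s bundled fail2ban is running; the container and jail names below are assumptions, and the bare status command lists whatever jails are actually configured:

          docker exec swag fail2ban-client status                  # list the active jails
          docker exec swag fail2ban-client status nginx-http-auth  # show banned IPs for one jail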