Hopefully someone can shed some light on this idea, or point me to something that fits the use case and need. I am looking for a basic operating system that can be updated across multiple devices, like a living OS.

For instance: I have a high-end desktop PC running the same operating system as a laptop or tablet, but kept in live sync, meaning apps, files, and changes made on one system are the same on all devices. I’ve looked at cloning drives, and have done it, but it’s far too slow and cumbersome.

Essentially, I would switch devices based on how much hardware power a task needs, while the same living operating system stays synced across all of them, so all data and capabilities remain the same any time something is needed.

Maybe I’m being far-fetched, or what have you, and this might possibly be the wrong sub, but I assumed it would almost fall under selfhosted. I’ve considered a NAS and I’m open to other ways to structure the concept. ALL IDEAS WELCOME, feel free to expand on it in any way. Dealing with the different operating systems and architectures of various devices is wildly difficult sometimes: software availability, mobility, power requirements (not watts, but processing power), cross-compatibility. I’ve seen apps that sync across devices, but some desktop and mobile apps aren’t cross-compatible. And when you self-host so many services that function well across networks and devices, after years of uptime you sort of forget the configs of everything, and it’s a nightmare when a single app or container update causes a domino effect. Thanks everyone, hopefully this is helpful to others with similar needs.

  • Deckweiss@lemmy.world · 11 days ago

    I run this somewhat. The question I asked myself was - do I R-E-A-L-L-Y need a clone of the root disk on two devices? And the answer was: no.


    I have a desktop and a laptop.

    Both run the same OS (with some package overlap, but not identical)

    I use Syncthing, plus a Syncthing server on a VPS, to sync some directories from the home folder: Downloads, project files, bashrc, .local/bin scripts and everything else that I would actually really need on both machines.

    The Syncthing VPS is always on, so I don’t need both computers on at the same time to sync the files. It also acts as an offsite backup this way, in case of catastrophic destruction of both my computers.

    (The trick with Syncthing is to give the same directories the same folder ID on each machine before syncing; otherwise it creates a second dir like “Downloads_2”.)
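
    If you ever end up managing a machine declaratively (on NixOS, for example), the same trick can be pinned in config so the folder ID never drifts. A rough sketch, assuming the NixOS services.syncthing module; the user, paths and device ID are placeholders, and the option layout can differ a bit between releases:

    ```nix
    # Rough sketch: pinning a Syncthing folder ID declaratively (NixOS module).
    # User, paths and device ID are placeholders, not real values.
    {
      services.syncthing = {
        enable = true;
        user = "alice";                # hypothetical user
        dataDir = "/home/alice";
        settings = {
          devices = {
            # The always-on VPS node that keeps everything in sync.
            vps.id = "AAAAAAA-BBBBBBB-CCCCCCC-DDDDDDD-EEEEEEE-FFFFFFF-GGGGGGG-HHHHHHH";
          };
          folders = {
            downloads = {
              id = "downloads";        # same ID on every machine, so no "Downloads_2"
              path = "/home/alice/Downloads";
              devices = [ "vps" ];
            };
          };
        };
      };
    }
    ```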

    That setup is easy and gets me 95% there.

    The 5% that isn’t synced is packages (which are sometimes only needed on one of the computers and not both) and system modifications (which I wouldn’t even want to sync, since a lot of those are hardware-specific, like screen resolution and display layout).


    The downsides:

    • I have to configure some settings twice. Like the printer that is used by both computers.

    • I have to install some packages twice. Like when I find a new tool and want it on both machines.

    • I have to run updates separately on both systems. I have been thinking about also setting up a shared package cache somehow, but was ultimately too lazy to do it, so I just run the update twice.


    I find the downsides acceptable; the whole thing was a breeze to set up, and it has been running like this for about a year now without any hiccups.

    And as a bonus, I also sync some important documents to my phone.

      • corsicanguppy@lemmy.ca · 11 days ago

        Yeah. And the full root-disk clone thing is honestly gonna be more trouble than value. Ensure the big stuff is the same - packages (not necessarily a perfect match, as above, but same version where installed) and general settings - and then sync the homedir.

        God help me, I’m thinking Gluster between 2-3 machines, running a VM off that (big files, so lock negotiation isn’t an issue) and having it commandeer the local video card for gaming. It’s doomed but it’ll be fun ha ha learning ha ha.

        There are exciting ways to ensure some settings and configs are kept the same, too, when they’re outside that synced home space: Ansible if you like thunky 2002 tech, Chef or Salt for something newer but overkill, or mgmtconfig if you want modern decentralized peer-to-peer reactive config management.

    • Ulrich@feddit.org · 11 days ago

      This is what I was going to suggest. Have all computers running the same OS and then just sync the home directory with SyncThing.

  • just_another_person@lemmy.world · 11 days ago

    You’re describing a number of different things here, but you’re thinking about it in an overly complex manner.

    You need a centralized file store like a NAS, with a workspace exported from it that mounts on each machine, and then some sort of domain/directory service to join it all together. If you want the different desktops’ settings and such synced, you can achieve that with this setup, or you can go a step deeper and use an immutable distro of some sort, committing a revision on one machine and keeping the same revision checked out on all your other machines (it works kinda like a git repo). That will likely present issues if the hardware isn’t all the same, though, so if you go that route I would probably just keep it simple.

    A user-experience example would look like this:

    • setup all your files on your centralized storage
    • join one machine to your domain (you can use LDAP, Samba+LDAP, NFSv4 domains…whatever)
    • login and have it pull your userinfo from the domain
    • your network mounts and user preferences will be pulled down and put in place

    Obviously this is simplified for the purposes of this post, but it should give you a direction to start investigating. Simplest path you can test this with is probably Samba, but it will be fairly limited and just serve as a starting point.
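
    To make the network-mount piece concrete: on a declaratively managed client this is just one stanza (NixOS syntax below; on any other distro it’s the equivalent fstab entry). The host name and export path here are made up:

    ```nix
    # Rough sketch: mounting a hypothetical NAS workspace on each client (NixOS).
    # "nas.lan" and "/export/workspace" are placeholder names.
    {
      fileSystems."/mnt/workspace" = {
        device = "nas.lan:/export/workspace";
        fsType = "nfs";
        # Mount lazily on first access instead of blocking boot when the NAS is down.
        options = [ "noauto" "x-systemd.automount" ];
      };
    }
    ```

    The directory-service part (LDAP/Samba) is the bigger lift and isn’t covered by this sketch.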

    Edit: if these concepts are a bit much for you, maybe consider getting a NAS with a good UI to make managing it much simpler. Synology has this baked in already, and I think Qnap does as well: https://www.synology.com/en-global/dsm/feature/active_directory

    • OhVenus_Baby@lemmy.ml (OP) · 11 days ago

      The immutable distro idea is nice. As a start, I’ve begun putting /home on a separate partition and syncing it across devices. I’m working on setting up a NAS now to make the process more longer-term friendly; by working I mean acquiring drives for storage, I currently have about 6 TB. I just didn’t fully know the process and what it entails software-wise besides Tailscale. I’ve self-hosted game servers and some minor stuff. I was thinking about using Synology, but their hardware is wildly expensive; I really only need the drive bay, and I can connect it to my server PC. I’ll do a deeper dive after work.

  • sntx@lemm.ee · 9 days ago

    I’m running such a setup!

    This is my nixos config, though feel free to ignore it, since it’s optimized for me and not others.

    How did I achieve your described setup?

    • nixos + flakes & colmena: sync system config & updates (rough sketch after this list)
    • impermanence through btrfs snapshots: destroy all non-declarative state between reboots to avoid drift between systems
    • syncthing: synchronise ALL user files between systems (my server at least is always online, which reduces sync inconsistencies from only having a single device active at a time)
    • rustic: hourly backups from all devices to the same repos; since these are deduplicated and my systems are mostly synchronised, I have a very clear record of my file histories
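
    For anyone curious what the flake & colmena part looks like, here is a rough, stripped-down sketch (hostnames and module paths are made up, not taken from my actual config):

    ```nix
    # Rough sketch of a flake + colmena layout; hostnames and paths are placeholders.
    {
      inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";

      outputs = { self, nixpkgs, ... }: {
        colmena = {
          meta.nixpkgs = import nixpkgs { system = "x86_64-linux"; };

          desktop = { ... }: {
            imports = [ ./hosts/desktop.nix ./modules/common.nix ];
            # Build and activate locally when running colmena on this machine.
            deployment.allowLocalDeployment = true;
          };

          laptop = { ... }: {
            imports = [ ./hosts/laptop.nix ./modules/common.nix ];
            # Pushed over SSH when the laptop is reachable.
            deployment.targetHost = "laptop.lan";
          };
        };
      };
    }
    ```

    Running colmena apply (or colmena apply-local on the machine you’re sitting at) then builds and activates the same locked revision on every host, which together with impermanence keeps the systems from drifting apart.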

  • BastingChemina@slrpnk.net · 10 days ago

    If I understand correctly, you want to have the same OS, software, configuration and files on different devices.

    You could have a look at nixos.

    I’m doing something similar, I have a computer in my office for work and a laptop at home for personal use and a bit of work.

    I have a config shared between the two computers: they share common modules but also have their own host-specific parts.

    This way, when I configure VSCode for example, the configuration is synced to both.
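
    Very roughly, the layout looks like this (file and option names here are illustrative, not my actual repo):

    ```nix
    # modules/common.nix -- everything both machines share
    { pkgs, ... }: {
      environment.systemPackages = with pkgs; [ vscode ];
      # shared editor/shell/tool configuration lives here, so a change lands on both hosts
    }
    ```

    ```nix
    # hosts/laptop.nix -- one host pulls in the shared module plus its own specifics
    { pkgs, ... }: {
      imports = [ ../modules/common.nix ];
      networking.hostName = "laptop";
      # host-only settings (displays, power management, ...) go here
    }
    ```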

    For syncing my files I’m using a Synology NAS.