Two-way Syncthing,
or three-way with an always-on server (like a Raspberry Pi or the cheapest VPS with just enough storage), so that you don't have to have both computers on at the same time (that's what I'm doing currently and it works great).
I use Syncthing for this purpose all the time. I seamlessly move between my work PC, home PC, and laptop. I sync my data directories and most of my config settings; some are different per system (monitors, etc.). 10/10, highly recommend.
Question, if you don't mind: is it theoretically possible to use Syncthing on the root directory of a given Arch install, somehow blacklist hardware-specific components, and basically have a running clone between both systems? I'd never heard of Syncthing before this, but it sounds intriguing.
I'm not sure it's technically possible, but even if it is, it would be a nightmare of resolving conflicts manually, since a lot of system files are constantly written to and read from, and it would mess everything up if Syncthing overwrote a file at the same time.
Syncthing could do it.
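For the "blacklist hardware-specific components" part, Syncthing reads ignore patterns from a `.stignore` file in the synced folder's root. A rough sketch of what that might look like for a whole-root sync; the list is illustrative and almost certainly incomplete, with pseudo-filesystems and machine-identity files as the obvious exclusions:

```
// .stignore at the synced folder's root ("//" starts a comment)
// Pseudo-filesystems and volatile state
/proc
/sys
/dev
/run
/tmp
// Machine-specific files
/boot
/etc/fstab
/etc/machine-id
/etc/hostname
```

Even with these excluded, the conflict problem described above still applies to constantly-written files like logs and databases.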
As a sysadmin, I would try making the PCs hypervisors and syncing a VM? Might be over-engineered, but I think it would work.
Regardless of what technical solution you decide to rely on, e.g. borgbackup, Syncthing or rsync, the biggest question is "what" you actually need. You indeed do not need system files, and you probably don't need applications either (you can fetch those back anyway), so what's left is actually data. You might then want to sync your ~ directory, but that might still conflict with some things, e.g. ~/.bashrc or ~/.local, so instead you might want to start with individual applications, e.g. Blender, and see where it implicitly (or you explicitly) saves the .blend files and all their dependencies.
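A small sketch of that "see where files are saved" step, assuming you've used the apps in question during a normal work session beforehand (the depth and time window here are arbitrary choices, not from the comment):

```shell
# List files under $HOME modified in the last day; running this after a
# work session helps spot where each application keeps its data.
find "$HOME" -maxdepth 3 -type f -mtime -1 2>/dev/null | head -n 20
```
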
How I would do it:
- over the course of a day, write down each application I'm using, probably a dozen at most (excluding CLI tools)
- identify for each where its data is stored, and possibly simplify that, e.g. all my Blender files in a single directory with subdirectories
- using whatever solution I have chosen, synchronize those directories
- test on the other device while on the same network (it should be much faster, with a better chance of fixing problems)
Then I would iterate over time. If I often had to move and couldn't really iterate, I would make the entire ~ directory available, even though it's overkill, and only pick from it on an as-needed basis. I would also ensure I exclude some directories that could be large, e.g. ~/Downloads.
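The "synchronize those directories" step could be sketched with rsync, one of the tools mentioned above. The paths below are placeholders created just for the demo; in practice the destination would be a remote path like user@otherpc:Projects/blender/ over SSH:

```shell
# Stand-ins for ~/Projects/blender and the other machine (demo only)
SRC=$(mktemp -d)
DEST=$(mktemp -d)
echo 'scene data' > "$SRC/scene.blend"
mkdir -p "$SRC/.cache" && echo junk > "$SRC/.cache/tmp"

# -a preserves metadata, --delete mirrors removals on the destination,
# --exclude skips directories flagged as large or machine-specific
rsync -a --delete --exclude '.cache/' "$SRC/" "$DEST/"
ls "$DEST"   # scene.blend only; .cache was skipped
```

Unlike Syncthing, rsync is one-shot and one-way, so a conflict on the destination can't propagate back automatically.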
PS: I'd also explore Nix for the system and applications side of things, but honestly only AFTER taking care of what's actually unique to you, i.e. your data.
Thank you for the detailed response! Yes, deciding what data to sync and how to avoid conflicts has been troubling me the most.
I think I might narrow it down with test VMs first, to skip the transfer part, before I actually use it "in production".
Honestly, a very imperfect alternative that's been sufficient for me for years is... NextCloud for documents.
There are a few dozen documents I need regardless of the device, e.g. national ID, billing template, but the vast, VAST majority of my files I can just get on my desktop... which is why I replied to you in depth rather than actually doing it myself. I even wrote some software for a "broader" take on resuming across devices, including offline, namely https://git.benetou.fr/utopiah/offline-octopus (a network of NodeJS HTTP servers), but same: that's more intellectual curiosity than pragmatic need. So yes, explore with VMs if you prefer, but I'd argue you should remain pragmatic, i.e. sync what you genuinely need rather than an "idealized" system that you don't actually use, yet makes your workflow and setup more complex and less secure.
This should work, as on Linux you can also share a home directory. In my experience (using the same home partition for different installations) there might be minor issues, like an additional plasmoid not working on both systems, although this was on two different distros, so you may not experience any issues.
There is the overkill method of clustering VMs with Proxmox. You could work from a cloud instance of your distro. There is NixOS, where you would declaratively define your whole system and then back up and import your home folder when switching between the PCs. Since you are using an immutable distro already, you can probably skip Nix and use a OneDrive-type solution set up to sync your home directory. I haven't used it myself, but other people have suggested Syncthing, and it seems like it would work for your use case and be the simplest option.