How do you guys quickly sync your settings (especially bash aliases and ssh keys) across your machines?

Ideally I want a simple script I can run on every new server I work with. Any suggestions?

44 points

I suggest you don’t sync SSH keys. That’s just increasing the blast radius of any one of those machines being compromised.

10 points

Exactly this. Don’t move private keys between machines. Generate them where you need them; it’s not like they cost anything.

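A minimal sketch of that per-machine workflow; the directory here is a throwaway stand-in for ~/.ssh so it is safe to run as-is, and the key comment is just a convention:

```shell
# Generate a fresh ed25519 key for this machine rather than copying one over.
keydir=/tmp/sshkey-demo
rm -rf "$keydir" && mkdir -p "$keydir"

ssh-keygen -q -t ed25519 -N "" -C "$(whoami)@$(uname -n)" -f "$keydir/id_ed25519"

# Then push only the *public* half to the servers you need, e.g.:
#   ssh-copy-id -i "$keydir/id_ed25519.pub" user@server
ls "$keydir"
```

The private key never leaves the machine it was generated on; only the `.pub` file travels.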
3 points

I mean, you want to copy the public keys that represent your machines, right?

4 points

Fair point, but I would equate that with syncing the authorized_keys file rather than thinking about how to sync the keys.

2 points

Right. Use some kind of centralized authentication like FreeIPA.

For bash aliases, I just pull down a .bashrc from github gists.

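That gist-pull bootstrap can be sketched like this; the gist URL is hypothetical, and a local file:// URL stands in for it so the sketch runs offline:

```shell
# Stand-in "gist": a local file fetched over file:// so this runs offline.
src=/tmp/bashrc-demo-src
printf 'alias ll="ls -la"\n' > "$src"

# On a real machine this would be the raw gist URL, something like:
#   curl -fsSL https://gist.githubusercontent.com/<user>/<id>/raw/.bashrc -o ~/.bashrc
curl -fsSL "file://$src" -o /tmp/bashrc-demo

grep ll /tmp/bashrc-demo
```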
10 points

OP should just generate a unique SSH key per device (+ user).

4 points

Agreed. I’ve probably got 100 keys registered with GitHub, and for 98 of them the private key was destroyed long ago due to OS reinstalls or whatnot. Format a machine, new key. New machine, new key.

2 points

I’d rather have 2 to 3 SSH keys (for critical, mid, and test systems) that are regularly rotated than 1 key per machine. I’m not going to juggle 50 SSH keys, nor enter my password every time I jump hosts.

1 point

Is the URL easy to remember?

20 points

Dotfiles go in git, SSH keys are state.

I’m looking to migrate to home-manager though because I use Nix on all my devices anyways.

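The "dotfiles go in git" half of this is often done with a bare repo plus a one-line wrapper. A runnable sketch against a throwaway directory standing in for $HOME (the `dot` name and paths are arbitrary):

```shell
# Bare repo holds the history; the work tree is (a stand-in for) $HOME.
demo=/tmp/dotfiles-demo
rm -rf "$demo" && mkdir -p "$demo"
git init -q --bare "$demo/.dotfiles"

# In a real setup this function (or an alias) lives in your shell rc.
dot() { git --git-dir="$demo/.dotfiles" --work-tree="$demo" "$@"; }

printf 'alias ll="ls -la"\n' > "$demo/.bashrc"
dot add .bashrc
dot -c user.email=you@example.com -c user.name=you commit -qm "track .bashrc"
dot log --oneline
```

On a new machine you would `git clone --bare` the repo and check the files out with the same wrapper.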
4 points

Home manager is great

1 point

I also have multiple versions of my bash_profile with syntax specific to the OS. It checks whether we’re on macOS or Linux with a kernel check and then reads the appropriate ancillary bash_profile for that platform. Anything that can live in the main bash_profile with the same command on both platforms lives there; anything that needs to be system-specific goes in the other one.

I have all my important functions as individual files that get loaded with the following:

function loadfuncs() {
	local funcdir="$HOME/.dotfiles/functions/"
	# macOS drops .DS_Store files here; clean them out so they don't get sourced
	[ -e "${funcdir}.DS_Store" ] && rm "${funcdir}.DS_Store"

	local i
	for i in "${funcdir}"*; do
		# quote the path so filenames with spaces survive
		source "$(realpath "$i")"
	done
}
loadfuncs

1 point

Interesting way to go about it. Though when I’m at the point where I need differences between Linux and Darwin, I’ll probably do that at the home-manager level.

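That split might look roughly like this in home-manager; a hypothetical fragment, with package and alias choices purely for illustration:

```nix
{ pkgs, lib, ... }: {
  # Shared packages, plus macOS-only extras.
  home.packages = with pkgs; [ ripgrep ]
    ++ lib.optionals stdenv.isDarwin [ coreutils ];

  # Platform-specific shell bits, instead of branching inside bash_profile.
  programs.bash.initExtra = lib.optionalString pkgs.stdenv.isDarwin ''
    alias ls='ls -G'
  '';
}
```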
1 point

Just for fun, here’s how I’m checking that (this was written in 2016 and may need adjusting, as I haven’t been keeping up with Linux for a while):

function oscheck() {
	if [[ "$(uname -s)" == 'Darwin' ]]; then
		osType=Darwin
		return 0
	elif [[ "$(uname -s)" == 'Linux' ]]; then
		osType=Linux

		if grep -q CentOS /etc/os-release; then
			export theDistro=CentOS
			return 0
		fi

		if grep -q Ubuntu /etc/os-release; then
			export theDistro=Ubuntu
			return 0
		fi

		printf "  %s\n" "Error: osType tested true for Linux, but did not find CentOS or Ubuntu." ""
		return 1
	else
		osType=Unknown
		return 1
	fi
}
oscheck
18 points

I’m surprised no one mentioned ansible yet. It’s meant for this (and more).

By ssh keys I assume you’re talking about authorized_keys, not private keys. I agree with other posters that private keys should not be synced; just generate new ones and add them to the relevant servers’ authorized_keys with ansible.

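A hypothetical playbook along those lines, using the `ansible.posix.authorized_key` module; the host group and key path are assumptions to adapt to your inventory:

```yaml
# distribute-keys.yml — push a public key into authorized_keys on every host.
- name: Distribute my public key
  hosts: all
  tasks:
    - name: Authorize the laptop key for the login user
      ansible.posix.authorized_key:
        user: "{{ ansible_user }}"
        state: present
        key: "{{ lookup('file', '~/.ssh/id_ed25519.pub') }}"
```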
2 points

If the keys are password protected… eh, why not sync them?

Also, SSH certificates are a thing; they make this kind of setup much easier than updating known_hosts and authorized_keys all the time.

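A runnable sketch of the certificate flow, using throwaway paths; names like `laptop` and `alice` are placeholders:

```shell
dir=/tmp/sshca-demo
rm -rf "$dir" && mkdir -p "$dir"

ssh-keygen -q -t ed25519 -N "" -f "$dir/ca"           # the CA keypair
ssh-keygen -q -t ed25519 -N "" -f "$dir/id_ed25519"   # a per-machine user key

# Sign the user key: certificate identity "laptop", valid for principal
# "alice", for 52 weeks. Produces id_ed25519-cert.pub next to the key.
ssh-keygen -q -s "$dir/ca" -I laptop -n alice -V +52w "$dir/id_ed25519.pub"

ssh-keygen -L -f "$dir/id_ed25519-cert.pub"   # inspect the certificate

# Server side: one sshd_config line then trusts every key the CA signs:
#   TrustedUserCAKeys /etc/ssh/ca.pub
```

Servers only need the CA’s public key, so adding or rotating machine keys never touches authorized_keys.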
1 point

Passwords will be brute-forced if it can be done offline.

Private SSH keys should never leave a machine. If a key gets compromised without you knowing, in the worst case you will revoke the access it has once the machine’s lifespan is over. If you copy one key around, it may get compromised on any of the systems, and you will never revoke the access it has.

And you may not want to give all systems the same access everywhere. With one key per machine, you can have more granularity for access.

1 point

> Passwords will be brute-forced if it can be done offline.

Set a good high-entropy password; you can usually even tie it to your login password with ssh-agent.

> Private SSH keys should never leave a machine.

If this actually matters, put your SSH key on a YubiKey or something.

> If a key gets compromised without you knowing, in the worst case you will revoke the access it has once the machine’s lifespan is over.

People generally don’t sit on keys, so this is worthless. Also, knowing people I’ve worked with… no, they won’t think to revoke it unless forced to.

> and you will never revoke the access it has.

Just replace the key in authorized_keys and resync.

> And you may not want to give all systems the same access everywhere

One of the few reasons to do this, though this tends to match not “one key per machine” but more like “one key per process that needs it”.

Like yeah, it’s decent standard advice… for corporate environments with many users. For a handful of single-user systems, it essentially doesn’t matter (do you have a different boot and login key for each computer lol; the SSH keys are not the weak point).

2 points

I use Ansible for this as well. It’s great. I encrypt secrets with Ansible vault and then use it to set keys, permissions, config files, etc. across my various workstations. Makes setup and troubleshooting a breeze.

10 points

This looks popular: www.chezmoi.io

2 points

+1 this; it is amazing. The scripting features are the cherry on top.

10 points

Git and GNU stow.

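The stow layout looks roughly like this. Since stow may not be installed everywhere, the final `ln -s` below reproduces the relative symlink that `stow -t "$home" bash` would create for this package (paths are throwaway stand-ins for $HOME):

```shell
home=/tmp/stow-demo
rm -rf "$home"
mkdir -p "$home/dotfiles/bash"          # one stow "package" per topic
printf 'alias ll="ls -la"\n' > "$home/dotfiles/bash/.bashrc"

# With GNU stow installed:  cd "$home/dotfiles" && stow -t "$home" bash
# which results in this relative symlink:
ln -s dotfiles/bash/.bashrc "$home/.bashrc"

ls -l "$home/.bashrc"
```

Unstowing a package just removes its symlinks, which is what makes this tidier than making $HOME itself a repo.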
3 points

I love this solution; I’ve been using it for years. I had previously been using the home-directory-as-a-git-repo approach, and it never quite felt natural to me and came with quite a few annoyances. Adding stow to the mix was exactly what I needed.

1 point

Ditto – I’ve been keeping a central-to-me git repo for my settings for years. On any new machine I run ‘git clone ; ./settings/setup.sh’, then my pulled-down .profile does a git pull on login.

1 point

This is the only answer for me. Bonus points if your .login file does a background git pull.

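A self-contained sketch of that login hook, using two throwaway local repos so it runs without a network; in a real .profile or .login only the last two lines matter, and the paths are assumptions:

```shell
base=/tmp/pull-demo
rm -rf "$base" && mkdir -p "$base"

# Stand-in for the central settings repo you'd normally host remotely.
git init -q "$base/origin"
( cd "$base/origin" && printf 'alias ll="ls -la"\n' > .bashrc \
  && git add . \
  && git -c user.email=you@example.com -c user.name=you commit -qm init )
git clone -q "$base/origin" "$base/settings"

# The .profile/.login part: pull quietly in the background at login.
( cd "$base/settings" && git pull -q ) &
wait
```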

Linux

!linux@lemmy.ml