DudeWithaTwist
This is all from the POV of a person who just installs Ubuntu and expects it to work. So yes. I meant major releases of the distro and I assumed they weren’t going to fuss with backports.
I figured modules that already exist in the kernel would be updated, but I’ve seen new modules added in later versions of the kernel. And since distro releases seem to stick with a specific kernel version, you would need to completely update your distro to get that support.
Genuinely curious since I haven't used Debian-based distros in a while, I always thought new modules were installed via the `linux-kernel` package. Are kernel modules installed separately?
WSL is good if you need Linux/GNU tools. `file`, `grep`, `find`, and the occasional CMake compile are my typical use cases.
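To give a concrete sense of what I mean, here's a quick sketch of those everyday one-liners (the `/tmp/demo` path and file contents are just made up for illustration):

```shell
# Set up a throwaway directory to poke at (hypothetical example).
mkdir -p /tmp/demo
printf 'hello world\n' > /tmp/demo/notes.txt

# file: identify what a file actually is, regardless of extension.
file /tmp/demo/notes.txt

# grep: search recursively for a string, with line numbers.
grep -rn 'hello' /tmp/demo

# find: locate files matching a name pattern.
find /tmp/demo -name '*.txt'
```

Nothing fancy, but having these available in a real POSIX environment on Windows is the whole appeal.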
I wouldn’t consider it anything more than a tool. Try installing Linux in a VM or old computer if you want to try switching.
As others have said, Ubuntu is great for non-technical users. The only issue I could foresee is drivers. Apt loves to brick itself after one mistake. Since apt's packages lag behind upstream, it may not support new hardware, forcing you to download drivers elsewhere, which is a recipe for disaster.
Just wondering why, because you need some justification to take the harder route. OAuth2 is enterprise-level, developed by Meta, Google, and others to be top-notch. `basic_auth` works to dissuade intruders.
Unless you have a stalker trying to infiltrate your network, I can only imagine this will cause more headaches than it’s worth.
I've always used Nginx for my reverse proxy and its `auth_basic` directive for password protection. For a homelab setup, I'm not sure why you'd need anything heavier.
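For reference, the whole setup is just a couple of directives in the server block. This is a rough sketch, not my actual config; the hostname and upstream port are placeholders, and the password file would be generated with something like `htpasswd -c /etc/nginx/.htpasswd youruser`:

```nginx
server {
    listen 443 ssl;
    server_name app.example.com;  # placeholder hostname

    location / {
        auth_basic           "Restricted";          # enables HTTP Basic Auth
        auth_basic_user_file /etc/nginx/.htpasswd;  # htpasswd-format credentials
        proxy_pass           http://127.0.0.1:8080; # placeholder upstream app
    }
}
```

That's the entire "auth layer": two lines on top of the proxy you already have, which is why I find it hard to justify standing up an OAuth2 flow for a homelab.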
Do everything in moderation. Including moderation.
Helped me realize I don’t need to be perfect all the time.