Pause for Professional Development 2

I'm still working through my professional development, so I haven't done much reading that would fit into my more operations-research-oriented writing. I should probably try to merge the two somewhere along the way, but I feel like the short life expectancy of software knowledge means it isn't particularly relevant there.

I do want to expand on my notes at some point. I like the idea that there are levels of expertise with various languages/packages/technologies, and that trying to ingest all of X at once can be extremely daunting. I'm not entirely sure what the next subject in those should be. I'm thinking I should start with the lower-level things and build up: cover the knowledge that people probably assume you already have.

I think I may try to tackle VMs or containers, that sort of thing. I've spent a lot of time trying to work around/through the Windows/Linux divide. Most of my time has been spent in Windows environments, but I started seeking out Linux in an effort to build consistent workflows. Fast forward a few years, and I've come to realize that Linux is a step in the right direction, but only a half measure. Problems with older OS versions, RHEL vs Debian vs ?, Python 2/3 issues, x86 vs ARM, etc. can be constant.

I thought at one point that WSL, and then WSL2, might be the magic bullet I had been searching for. While it has lots of potential, there are small hurdles, such as connectivity issues, that prevent it from being the fix I was looking for. At this stage, I primarily lean on Docker for my local development. While Docker Desktop has turned into an excellent resource, it only gets me halfway there.

I had a little time this weekend, so I decided to repurpose an old router and PC and create a homelab. I managed to set up a nice little Proxmox install, a PiHole in a container to experiment with and to act as DNS, and an Ubuntu VM to help with configuration. I put it all behind the old router to keep it somewhat isolated from my normal network. To make it easy on myself, I did put the Proxmox interface in the DMZ so I can reach it from my normal network. The machine doesn't have a ton of cores, but it should be enough for me to tinker with.
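For reference, standing up a PiHole-style LXC container on Proxmox from the command line looks roughly like the sketch below. This is a hypothetical example, not a record of my exact commands: the VMID, template filename, bridge name, and IP addressing are all placeholders, and the template must already be downloaded to local storage.

```shell
# Create an unprivileged Debian container for PiHole (all values are placeholders).
# The template name here is illustrative; use whatever `pveam available` lists.
pct create 200 local:vztmpl/debian-12-standard_12.2-1_amd64.tar.zst \
  --hostname pihole \
  --cores 1 \
  --memory 512 \
  --net0 name=eth0,bridge=vmbr0,ip=192.168.2.10/24,gw=192.168.2.1 \
  --unprivileged 1

# Start it and confirm it is running.
pct start 200
pct status 200
```

A static IP on the container makes sense here, since other machines will point at it for DNS.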

Eventually, I would like to build a little Raspberry Pi Kubernetes cluster, but I want to create a PoC in Proxmox first. While I feel pretty confident in my Ansible skills, I want to explore provisioning through something like Terraform, so my plan is to use Terraform and possibly Ansible to provision my K8s nodes. While I stood everything up manually this weekend, ultimately I want to make sure everything is provisioned via code.
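As a rough idea of what that Terraform side might look like, here is a minimal sketch using the community Telmate/proxmox provider. Everything specific in it is an assumption: the API URL, node name (`pve`), and cloud-init template name (`ubuntu-cloud`) are placeholders for my setup, and attribute names can differ between provider versions.

```hcl
terraform {
  required_providers {
    proxmox = {
      source = "Telmate/proxmox" # community provider; version pinning omitted for brevity
    }
  }
}

provider "proxmox" {
  pm_api_url  = "https://proxmox.local:8006/api2/json" # placeholder URL
  pm_user     = "root@pam"
  pm_password = var.pm_password # supplied via a variable, never hard-coded
}

# Clone three small VMs from a cloud-init template to act as K8s nodes.
resource "proxmox_vm_qemu" "k8s_node" {
  count       = 3
  name        = "k8s-node-${count.index}"
  target_node = "pve"          # placeholder Proxmox node name
  clone       = "ubuntu-cloud" # placeholder template name
  cores       = 2
  memory      = 2048
}
```

Terraform handles getting the VMs into existence; Ansible could then take over for anything inside the guests, like installing the K8s components.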

Given the nature of k8s, I feel like the nodes themselves should be relatively clean, so it doesn't make sense to do what I have in the past and distribute my Ansible configuration via ansible-pull. But if I run a control node within Proxmox, it consumes a core, and I only have eight. Once I have my Pi cluster, that shouldn't be a problem, so maybe that's extra motivation to make it happen.
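For context, the ansible-pull pattern I'm deciding against is essentially a crontab entry on each node that pulls a repo and applies its own playbook. The repo URL and playbook name below are placeholders, not my actual setup.

```shell
# Every 30 minutes, pull the config repo and run the local playbook against this host.
*/30 * * * * ansible-pull -U https://example.com/homelab-config.git local.yml >> /var/log/ansible-pull.log 2>&1
```

It works well for pet machines, but it bakes configuration tooling into nodes that, in a K8s cluster, I'd rather keep disposable.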