Re-approaching Productivity

I bought David Allen’s Getting Things Done (GTD) oh so many years ago (I think when I bought my first Kindle!), but I never finished it. I’m sure a big part of it was getting distracted — there’s no denying that — but I think I’m also a bit hesitant to revisit it. “Productivity” is a scary word for me, and I think part of that is its abuse within the framework of capitalism.

I think I’m ready to revisit it now. When I was learning from Manager Tools, I was just beginning my drift even further leftward from my already liberal identity. During that time, I was cognizant that the primary audience was business owners, managers, and stakeholders — not someone in the stumbled-into managerial role I had at the time. I have a feeling something similar will hold for GTD, and therefore I need to take a similar approach to learning from it.

So I’m deciding to try out the audiobook version of GTD, which will give me the guide rails I need and also give David Allen’s updated version a chance. I don’t expect him to even mention capitalism, but I do expect a bit more self-awareness. Probably Elizabeth Warren-levels of awareness.

I think my own environment and context has changed significantly enough that the concepts I’m currently tackling — productivity, equity, mental health, neurodivergence, Left ideologies — are driving me to unify them in some way or another. I don’t expect something neat and tidy here, but — in the tradition of Grace Lee Boggs — I think exploring how they’re related will push me to a greater understanding of how we can continue to evolve and push for change.

Windows Development

With the recent advent of .NET 5, I’m taking the plunge and setting up my gaming PC as a development workstation. I haven’t had to set up a Windows PC since last year, so I figured it was time to revisit my setup and document what I did. Here we go!

Scoop

Chocolatey is usually the go-to package manager on Windows, but for a while I’ve been watching the cool things being done over at Scoop. After installing it, I wanted to do a few more things.

sudo command

It’s generally a pain-in-the-ass to run CLI things with elevated permissions, and in the past I’ve just run PowerShell as an admin. With the introduction of Windows Terminal, it’s once again difficult to just run things as admin. And that’s fine, really — we shouldn’t be running everything as admin anyway. So to make our lives easier, we can run the following in an elevated PowerShell terminal:

scoop install sudo --global

Then later — when we’re back in a non-elevated PowerShell in Windows Terminal — we can just do sudo <command> and be on our merry way. The sudo command will take care of prompting us to accept the elevated permissions (which we’re used to doing on Windows anyway with that second-nature pop-up) and then run the command, no-sweat. Tadaa! Then later when we want to install global packages like vim we can run the following without a second thought:

sudo scoop install vim --global

Default Git Config

There are a few approaches to this. The first is to figure out where your config file is and just edit it directly. You can figure out where each config is located by running the following:

git config --list --show-origin

The second way is to just run a bunch of commands, since that way you don’t need to worry about getting the syntax right. Here’s a list of commands I usually run:

git config --global core.editor "vim"
git config --global alias.co checkout
git config --global alias.b branch
git config --global alias.ci commit
git config --global alias.s status
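Either way, the end result is the same — those commands just write entries into the global config file (~/.gitconfig, which on Windows lives under C:\Users\{Username}), so after running them it contains roughly this:

```ini
[core]
	editor = vim
[alias]
	co = checkout
	b = branch
	ci = commit
	s = status
```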

PowerShell Setup

Next I want to activate some nicer features in PowerShell, so I need to edit my PowerShell profile. Since its location is stored in $profile, I can just do:

vim $profile

I can then add the following, for starters:

Import-Module posh-git
Import-Module oh-my-posh
Set-Theme Zash

Set-Location ~/Workspaces

Obviously there’s lots more customization you can do. Scott Hanselman wrote a good post to get you started.

AutoHotkey

AutoHotkey is a lifesaver for anyone wanting to do keystroke customization on Windows. I specifically use it to rebind my caps lock key to escape. Below is the script I launch when Windows starts. (The script isn’t actually Ruby, but I wanted to get some syntax-highlighting working for y’all.)

#SingleInstance force
Capslock::Esc

Conclusion

There’s a lot of customization to be had here! My next steps will probably be to automate setup to some degree — I’ve already done that with my dotfiles repository for macOS. Now that I’ve become much more of a .NET developer over the years, I think it makes sense for me to be able to do development on either my MacBook Pro or my Windows gaming rig, whichever feels more comfortable, subject to my whimsy.

PowerShell 7.1

Howdy! It’s been a while. Just wanted to write a quick update about setting PowerShell 7.1 as my new default PowerShell.

  1. Install PowerShell 7.1
  2. Find the PowerShell shortcut. I usually do this by searching using the Windows menu.
    1. Mine was here: C:\ProgramData\Microsoft\Windows\Start Menu\Programs\PowerShell
    2. If you installed via the Microsoft Store it’s slightly more complicated but still possible, I just don’t have the steps on-hand.
  3. Open a new Explorer window and go to C:\Users\{Username}\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Windows PowerShell to find the shortcut that Windows uses when you right-click the Windows icon and choose “PowerShell”
  4. Replace that shortcut with a copy of the one for PowerShell 7.1
  5. Test it out!

These steps apply to any version of PowerShell, but I’ve found them especially useful for 6 and 7. I wanted to get this figured out as I set up Vagrant and Docker on my gaming PC so I can mess around with them in Windows. Cheers!

Edit: I tried the Windows Store installation of PowerShell and it just worked! It must update something similar to what I set for the above steps. To clarify a bit, here’s the target that I had for the “Windows PowerShell” shortcut located above:

"C:\Program Files\PowerShell\7\pwsh.exe" -WorkingDirectory ~

Destiny Weapons

I recently got back into Destiny 2 and now that I hit the power level cap pre-raid, I’m trying to get on top of the legendaries that I constantly grind out and place in my vault. Gotta keep up on space! Datto has a good overview of weapon perks, and I think I’ve found a good loadout that works for me in PvP.

I’m still absorbing weapon perk info, but I’m pretty decent at knowing what feels good for myself. So below are the weapons I enjoy using. The following screenshots come from Gunsmith.io.

Last Perdition

This is my favorite pulse rifle, hands down. It has good range (especially with the masterwork) and I’m able to take down enemies from across the map, save for running into set-up snipers. At first I wasn’t wild about Full Auto Trigger System, but it’s starting to grow on me since I don’t need to worry about manually keeping my DPS up. It also does void damage which is nice if I’m running with Nezarec’s Sin.

Blasphemer

Because I’m running a pulse rifle in my energy slot, this shotgun is my go-to pairing for PvP. It has a range masterwork, so it mostly gets the job done. I’m still not that great at using it in PvP (except as a counter), so I’m not sure what weapon mod to put on this.

Bad Omens

This rocket launcher has been great as a stand-in for Truth (until I get it) and also helps with Gambit kills when invading b/c of the blast radius. It’s also good in PvE for hitting lots of adds (like grinding out the rocket kills for the Truth quest, haha), but I bet I could use a better direct-damage rocket launcher. I might have one somewhere, but for now this one has surfaced to the top of my heavy slot.


I think that’s about it for my thoughts! Wanted to get these in, try blogging again, and add some more images for fun. Catch ya around!

Deleting Old Git Branches — PowerShell Edition

I did a post on this earlier (I’ll link it once it’s uploaded, perhaps), but it relied on me being in a Unix-like system. Because I do Windows development right now, I wanted a way to easily do this in PowerShell. StackOverflow certainly has answers, but I cobbled together my own version below that works for the toolkit I have.

git branch --merged | rg -v "(^\*|master)" | %{ git branch -d $_.trim() }

ripgrep has been in my toolkit for years at this point, so I used it to filter out master. The last part of the pipeline is PowerShell-specific — % is an alias for ForEach-Object — and was the piece I was missing, since I was used to using xargs before. Tada! Works like magic.
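For comparison, here’s a sketch of the Unix-flavored version I was used to — grep standing in for rg, and xargs standing in for ForEach-Object. (The throwaway repo and branch names here are purely for demonstration.)

```shell
# Set up a throwaway repo with one merged branch to demo against
cd "$(mktemp -d)"
git init -q
git symbolic-ref HEAD refs/heads/master   # make sure the demo branch is "master"
git config user.email demo@example.com
git config user.name Demo
git commit -q --allow-empty -m "init"
git branch old-feature                    # same commit as master, so it counts as merged

# The actual cleanup: delete merged branches, skipping the current one and master
git branch --merged | grep -vE "(^\*|master)" | xargs -n 1 git branch -d
```

Like the PowerShell version, this only deletes branches git considers fully merged (-d, not -D), so it won’t eat unmerged work.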

Generating Stored Procs for DbUp

Retrieving the Procs

I recently tasked myself with setting up DbUp to add database migrations to a .NET project at work. It was straightforward enough to generate the table definitions with DataGrip but I needed to get my hands dirty to create the stored procedures. To get a lay of the land, I checked out my favorite schema in SQL Server, INFORMATION_SCHEMA.

SELECT * FROM INFORMATION_SCHEMA.ROUTINES

The columns I care about here are ROUTINE_NAME and ROUTINE_TYPE, the latter of which I want to make sure is always “PROCEDURE” — which it is in my case. (yay) ROUTINE_DEFINITION is also worth paying attention to, but it’s capped at 4000 characters, so I need to query the sys schema to make sure I get the full procs. Below is the information I need.

SELECT 'NEW PROC STARTS HERE', o.name, sm.definition FROM sys.sql_modules sm
INNER JOIN sys.objects o
ON sm.object_id = o.object_id AND o.TYPE = 'P'

I take this output in DataGrip and download it to a TSV I can parse later to create a set of SQL scripts that can be run to create the procs in my database.

Creating Temporary Proc Files

At this point it’s worth noting that if I had a SQL GUI that let me select all the procs and download them as a set of scripts, I would totally just do that. So I want to reiterate what I’m trying to accomplish so I don’t go down a rabbit hole:

  • Each proc needs to be its own SQL script named after the proc
  • I want to parse this TSV without altering it
  • I shouldn’t have to do this again

That last point is worth discussing — it’s true I won’t have to do this again for this project (since future procs will be in source control, with DbUp for change management); however, if this were worth turning into a generic tool, I’d want to make this parsing code re-runnable. I used to work at a company that would’ve definitely benefited from such a process, but alas, they never encouraged me to go down this path and clean up their bad practices.

Exploratory Parsing

To kick things off, I looked for my NEW PROC STARTS HERE text to figure out what kind of formats I had to deal with. In short, it looked a bit like this:

NEW PROC STARTS HERE	SelectSomeStuff	"create procedure

The only thing I could count on was that the proc would actually begin right after that first double-quote. From there, newlines would be introduced that were part of the proc itself, and we’d only know we’d hit the end when we reached another NEW PROC STARTS HERE delimiter.

The Script

Here’s what I cobbled together using the guidelines above.

class ProcFile
  attr_accessor :name, :contents

  def initialize()
    @contents = ""
  end
  
  def write_to_file
    File.open("#{@name}.sql", "w") do |f|
      f.write(@contents)
    end
  end
end

proc_start = "NEW PROC STARTS HERE"

new_proc = ProcFile.new()
File.readlines("procs.tsv").each do |line|
  # Make the generated scripts re-runnable
  line.sub!("CREATE PROCEDURE", "CREATE OR ALTER PROCEDURE")

  if line.include?(proc_start)
    # Header format: NEW PROC STARTS HERE\tProcName\t"This Is The Proc
    r = /^NEW PROC STARTS HERE\t(.+)\t"(.+)$/

    # Flush the previous proc before starting on a new one
    unless new_proc.contents.empty?
      new_proc.write_to_file()
    end

    match_data = r.match(line)

    new_proc = ProcFile.new()
    new_proc.name = match_data[1]
    new_proc.contents = match_data[2]
  else
    # Accumulate the proc body, skipping lines that are just the closing quote
    new_proc.contents += line unless line =~ /^"$/
  end
end

# write final file
new_proc.write_to_file()

Translating to a DbUp-Friendly Structure

Here’s the CLI tool syntax I wrote to create new database migrations from scratch:

$ customtool generate migration --name CreateNewTable

And this would create something like 20200214200931_CreateNewTable.sql and put it in the Migrations directory and everyone would rejoice. But unlike table definitions and other types of migrations I’d like to run, I want to treat my procedures more like code where the whole thing gets re-run anytime I know it changes. Therefore, my tool needs a new syntax.

$ customtool generate procedure --name SelectSomeStuff

This command instead deposits generated procs in the Procedures directory. Order no longer matters, so there’s no timestamp component to the filename. And when I want to run them, I just do:

$ customtool deploy procedures
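To make the two naming schemes concrete, here’s a rough sketch of the filenames — hypothetical, since I’m assuming the timestamp prefix is just the current time in a yyyyMMddHHmmss format based on the example above:

```shell
name="CreateNewTable"
stamp="$(date -u +%Y%m%d%H%M%S)"         # e.g. 20200214200931 (assuming UTC)
echo "Migrations/${stamp}_${name}.sql"   # migrations: ordered by timestamp
echo "Procedures/SelectSomeStuff.sql"    # procedures: just the proc name, no ordering
```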

Caveats

It should be obvious, but I want to note it here — the script I wrote mostly worked. It didn’t quite work when the SQL syntax wasn’t capitalized correctly or spaced consistently across all the scripts. To test the stored procedures I converted, I ran them locally and assumed that any conversion issues would show up as syntax errors rather than still being valid SQL. You’re always better off doing a healthy dose of testing — even though this is still in progress for me, I plan to run the CREATE OR ALTER scripts on the lower environments so we can verify everything works as intended now, rather than run all the procs, introduce some small bug, and have to worry about it later.

It’s also worth noting that this is exactly why stored procedures aren’t popular in frameworks like Ruby on Rails. They’re so difficult to test and do change management on! I really only encountered them when I entered .NET land, and then I was horrified by how often the engineers I worked with thought they were a good idea. Yes, you can get performance gains from them. But you’re almost always better off doing something else that’s a bit more testable, just so you can sleep at night.

Addendum: System.CommandLine

Wasn’t going to write much here, except to say that I’m using the new System.CommandLine library to write my CLI tool. It’s still in pre-release, but the functionality it currently provides is more than enough for me to write my tool without incurring too much of a headache either parsing the input myself (i.e., using no library at all) or learning a confusing library API, which was my experience with Command Line Parser.

Revisiting Management

I hopped to a new company a few months ago and am now an individual contributor. And while I’m glad I no longer have to manage people for the time being, I know it’s something I’ll revisit down the road — at the behest of the company I now work for, out of my own renewed desire, or perhaps both. I also continue to reflect on how well I managed my teams over the last few years, which is what has prompted recent reflection as well as this blog post.

In 2018 I became a manager for the first time. My boss at the time — who also happened to be our team’s architect, my mentor, and now my friend — sent me to Dallas for the Manager Tools Effective Manager and Effective Communicator conferences. They took place over two days, and I learned a lot about how to approach leading and managing a team. Mark Horstman broke down a lot of the myths of management I had inadvertently steeped in over time, and expanded on a lot of the intuitive lessons I had learned but not yet grounded in rationale or experience. His talk at USI gives a good taste of what it was like to not just listen to him, but work with him on management and communication.

In the talk he goes over the four behaviors that matter and the tools his company developed to enact those behaviors. I’ve listed them below, with the corresponding tools in parentheses.

  1. Know Your People (One on Ones)
  2. Talk About Performance (Feedback)
  3. Ask For More (Coaching)
  4. Push Work Down (Delegation)

I’ll be honest — I haven’t thought about these in this way since Mark first taught them to me. And for better or for worse, a lot of those behaviors were enacted by the org structure we had at the time. Here’s how my boss fared with these behaviors with me as his direct:

  1. He was excellent about doing 1:1s with me and getting to know me. He did them consistently and knew about my interests outside of work as well as how well I was doing with schooling.
  2. 1:1s helped facilitate this, as did his involvement in our sprint retros. He also filled out and discussed quarterly reviews with me. Each of these feedback loops (weekly, bi-weekly, quarterly) helped me learn from my mistakes.
  3. This didn’t happen often, but I would seek out coaching during 1:1s and I remember when I had to have a talk with one of my directs and he challenged me to take that on while stepping outside my comfort zone.
  4. Work was naturally pushed down because he was involved with planning our sprints.

At the end of 2018 I got a new boss — a remote boss, an overworked boss, a boss I had no prior history of working with. During that time I didn’t reflect on these behaviors, but looking back I notice something scary:

  1. He was not consistent with 1:1s every week. It took over a month to get them started, and he only started after I kept pushing it. He also frequently canceled 1:1s if he had a meeting with me at some other time during that week, figuring that was a valid substitution. (Narrator: it was not.)
  2. Outside of spotty 1:1s, he was not involved with our sprint and therefore had little context for our sprint retros. We had quarterly reviews that were based on a list of accomplishments he had each of us write up and send to him. Then these would be hastily written up and discussed before they were due to HR.
  3. This sometimes happened, but instructions were vague and expectations were never set.
  4. Work was only pushed down when someone was breathing down his neck. And it never aligned with what our team was doing or with what product wanted, so getting it into a sprint was a mess.

Going back to his talk, Mark spells out the two most important responsibilities of a manager: results and retention. The first boss did both of these things; the second boss did neither. Which brings me to my next reflection: how well I did as a manager. Without going into another breakdown of the four behaviors for each of my directs — relationships are with people, not teams! — I can tell you that I was focused so much on retention that I papered over a lot of the bad that my own boss was doing to make my job impossible.

Retention was most important in my eyes because we were bleeding a lot of people in 2018 (both those let go and those looking for greener pastures) and HR wasn’t doing a damn thing to help us hire good candidates. Mark even mentions it — hiring the wrong person is worse than hiring no one at all. But we as a company weren’t trying to do more with less, we were doing less with less with the expectation of more. It was devastating.

Towards the end of my time at that company, I started to focus more on results when I realized that retention of my people wasn’t enough to combat the lack of retention in the rest of the company. That first boss I mentioned — the one who did everything right — was eventually stretched thin and chose to leave, along with most of the other architects in the company. And I soon learned that other teams were bleeding people as well. We got a new CEO halfway through 2019 but by then we had lost a lot of blood and it was going to take time to close the wound and begin to heal. And that’s when I decided to leave.


I don’t really have a happy ending or anything here. I’m still working through the details and reflecting on where I was and where I am now. I should probably get my copy of The Effective Manager back and maybe even purchase a copy of The Effective Hiring Manager.

I also know it’s two in the morning, so I’m not going to make good decisions right now, especially purchasing decisions. And I’m not currently a manager — I should be focusing on being an effective IC. Which probably means getting better sleep, eating healthier, making sense of my Todoist tasks and sticking to a budget. 🙂

Therapy

2020 achievement unlocked — I filed a claim for my most recent therapy session.

A few things made this possible. One was just having a better job — not only do I have energy I didn’t have before, but I have benefits that cover therapy. I had actually forgotten about this — and about why I opted for an HSA in the first place — until I went back to check on old claims with my old insurance. It also helped that the site for my new healthcare is a lot easier to use than my old one.

Is this what it feels like when your employer takes care of you? Is this the fabled workplace of yesteryear that my parents’ generation used to depend on?

I was trying to figure out if I should go back to therapy. If I do, I want to have a plan in place. Something to work on. I haven’t been in a position before where therapy was something that I could regularly have — I mostly just saved up money for when I really needed it, such as going through rough relationships or enduring shitty jobs.

My partner and I are learning what it’s like to work at companies that care in substantive ways. Sometimes that reaction is surprising — I was stressed out this weekend after our company party and when I talked to my mom about it, she suggested I was feeling guilty and I initially balked at that insinuation. Now that I have calmed down a bit, it doesn’t sound so far-fetched. Either way, she suggested I go back to therapy to talk about it and this might be a good starting topic.

Upgrading PHP

WordPress now has a health check you can run against your site to make sure it has all the latest bells and whistles. You can find it under Tools > Site Health — this is new to me, but I haven’t used WordPress in years.

My health check had some straightforward items — including removing themes I wasn’t using. But there were two that I knew I would have to get my hands dirty for — upgrading PHP to at least 7.3, and installing the PHP extension imagick. I had to bumble around the Internet to figure out all the steps, so I figured I’d detail my findings here to make sure I understood what I did.

Installing PHP

I decided to go straight to PHP 7.4 since it’s the latest and this WordPress blog doesn’t have a lot of customization. On my host, DigitalOcean, Ubuntu 18.04 comes with PHP 7.2 by default. So the first thing was to SSH into the box and start getting the relevant packages.

$ apt-get install software-properties-common
$ add-apt-repository ppa:ondrej/php
$ apt-get update
$ apt-get install php7.4

The software-properties-common package was already installed, but I’m pretty sure it’s what enabled me to add the Personal Package Archive (PPA) on the next line. It looks like Ondřej Surý maintains the PPA for PHP — seems odd for one person to, but I saw multiple sources cite this repo, so I went ahead with it. Then I ran a standard apt-get update and installed PHP 7.4.

For a sanity check, I ran php --version and was surprised it was on 7.4! But alas, this wasn’t enough for WordPress to start using it. So next I had to figure out how to get off of PHP 7.2.

Loading PHP Via Apache

This part was cool b/c I learned more about how Apache works! The /etc/apache2/mods-available directory holds the list of available mods for Apache to use, including php7.2.load and the newly installed php7.4.load. My gut told me I had to enable PHP 7.4 and disable 7.2, so that’s exactly what I did.

$ a2dismod php7.2
$ a2enmod php7.4
$ systemctl restart apache2

Loading Remaining WordPress Libraries

There was a DigitalOcean tutorial that suggested I install the following commonly-used WordPress PHP extensions.

$ apt-get install php-curl php-gd php-mbstring php-xml php-xmlrpc php-soap php-intl php-zip

Of course that wasn’t enough. After making the Apache configuration changes above and restarting, I was told I needed to install the MySQL extension.

$ apt-get install php-mysql

This worked! Now that I had WordPress running on 7.4, I went ahead with the remaining imagick extension.

$ apt-get install php-imagick

That’s it!

Execute Program

I recently started getting into Execute Program — it comes free with a subscription to Destroy All Software, and I was feeling a bit down about work. (More on that later.) I’m making my way through the JavaScript Arrays course as a mix of refresher and new material. Once I’m done (or close to it), I’ll attempt TypeScript, since there are some things I want to build in React using it and it’ll be useful at work.

Speaking of work and why I was feeling down: I had a pretty bad week last week. I was really struggling with the codebase I was working in and wasn’t quickly learning from my mistakes. This week has started off better; I attribute that both to a better attitude and my amazing co-workers. Everyone is really supportive of each other and that goes a long way towards getting myself out of a rut. On top of that, over the weekend I reflected on where I was mentally and I think that’s helping me figure out how to ground my problem-solving.

So that’s why I’m continuing to do the JavaScript learning. It’s useful. It’s fun. It’s productive. Of course I’d love to create my fantasy football API or build a React project in TypeScript but those projects take a lot of energy and right now I only have so many spoons to give.

I’ll catch y’all later. Hoping to make this a more regular thing.