I recently got back into Destiny 2, and now that I’ve hit the pre-raid power cap, I’m trying to get on top of the legendaries I constantly grind out and stash in my vault. Gotta keep up with vault space! Datto has a good overview of weapon perks, and I think I’ve found a loadout that works for me in PvP.

I’m still absorbing weapon perk info, but I’m pretty decent at knowing what feels good to me. So below are the weapons I enjoy using. The following screenshots come from Gunsmith.io.

Last Perdition

This is my favorite pulse rifle, hands down. It has good range (especially with the masterwork), and I’m able to take down enemies from across the map, save for when I run into a sniper who’s already set up. At first I wasn’t wild about Full Auto Trigger System, but it’s starting to grow on me since I don’t need to worry about manually keeping my DPS up. It also does Void damage, which is nice if I’m running with Nezarec’s Sin.

Blasphemer

Because I’m running a pulse rifle in my energy slot, this shotgun is my go-to pairing for PvP. It has a range masterwork, so it mostly gets the job done. I’m still not that great with shotguns in PvP (except as a counter), so I’m not sure what weapon mod to put on it.

Bad Omens

This rocket launcher has been great as a stand-in for Truth (until I get it), and its blast radius also helps with Gambit kills when invading. It’s also good in PvE for hitting lots of adds (like grinding out the rocket kills for the Truth quest, haha), but I bet I could use a better direct-damage rocket launcher. I might have one somewhere, but for now this one has surfaced to the top of my heavy slot.


I think that’s about it for my thoughts! Wanted to get these in, try blogging again, and add some more images for fun. Catch ya around!

Deleting Old Git Branches — PowerShell Edition

I did a post on this earlier (I’ll link it once it’s uploaded, perhaps), but it relied on being on a Unix-like system. Because I’m doing Windows development right now, I wanted an easy way to do this in PowerShell. Stack Overflow certainly has answers, but I cobbled together my own version below that works with the toolkit I have.

git branch --merged | rg -v "(^\*|master)" | %{ git branch -d $_.trim() }

ripgrep has been in my toolkit for years now, so I used it to filter out master (and the currently checked-out branch, which git marks with an asterisk). The last piped part is the PowerShell-specific piece I was missing: % is an alias for ForEach-Object, which fills the role xargs used to play for me. Tada! Works like magic.

Generating Stored Procs for DbUp

Retrieving the Procs

I recently tasked myself with setting up DbUp to add database migrations to a .NET project at work. It was straightforward enough to generate the table definitions with DataGrip, but I needed to get my hands dirty to script out the stored procedures. To get the lay of the land, I checked out my favorite schema in SQL Server, INFORMATION_SCHEMA.

SELECT * FROM INFORMATION_SCHEMA.ROUTINES

The columns I care about here are ROUTINE_NAME and ROUTINE_TYPE, the latter of which I want to make sure is always “PROCEDURE”, which it is in my case. (yay) ROUTINE_DEFINITION is also worth paying attention to, but it’s capped at 4000 characters, so I need to query the sys schema to make sure I get the full procs. Below is the query that pulls the information I need.

SELECT 'NEW PROC STARTS HERE', o.name, sm.definition FROM sys.sql_modules sm
INNER JOIN sys.objects o
ON sm.object_id = o.object_id AND o.TYPE = 'P'

I take this output in DataGrip and export it to a TSV, which I’ll parse later into a set of SQL scripts that can be run to create the procs in my database.

Creating Temporary Proc Files

At this point it’s worth noting that if I had a SQL GUI that let me select all the procs and download them to a set of scripts, I would totally just do that. So I want to reiterate what I’m trying to accomplish so I don’t go down a rabbit hole:

  • Each proc needs to be its own SQL script named after the proc
  • I want to parse this TSV without altering it
  • I shouldn’t have to do this again

That last point is worth discussing. It’s true I won’t have to do this again for this project (future procs will be in source control, with DbUp handling change management). However, if this were worth turning into a generic tool, then I’d want to make this parsing code re-runnable. I used to work at a company that would’ve definitely benefited from such a process, but alas, they never encouraged me to go down this path and clean up their bad practices.

Exploratory Parsing

To kick things off, I looked for my NEW PROC STARTS HERE text to figure out what kind of formats I had to deal with. In short, it looked a bit like this:

NEW PROC STARTS HERE	SelectSomeStuff	"create procedure

The only thing I could count on was that the proc itself begins right after that first double quote. The lines that follow, newlines and all, are part of the proc body, and the only way to know a proc has ended is to hit the next NEW PROC STARTS HERE delimiter.

The Script

Here’s what I cobbled together using the guidelines above.

# Represents a single stored proc: its name plus the accumulated SQL text.
class ProcFile
  attr_accessor :name, :contents

  def initialize
    @contents = ""
  end

  # Write the proc out as <name>.sql in the current directory.
  def write_to_file
    File.open("#{@name}.sql", "w") do |f|
      f.write(@contents)
    end
  end
end

proc_start = "NEW PROC STARTS HERE"

new_proc = ProcFile.new
File.readlines("procs.tsv").each do |line|
  line.sub!("CREATE PROCEDURE", "CREATE OR ALTER PROCEDURE")

  if line.include?(proc_start)
    # NEW PROC STARTS HERE\tProcName\t"This Is The Proc
    r = /^NEW PROC STARTS HERE\t(.+)\t"(.+)$/

    # Hitting a new delimiter means the previous proc is complete, so flush it.
    unless new_proc.contents.empty?
      new_proc.write_to_file
    end

    match_data = r.match(line)

    new_proc = ProcFile.new
    new_proc.name = match_data[1]
    new_proc.contents = match_data[2] + "\n" # restore the newline the capture drops
  else
    # Skip the line that is just the TSV cell's closing quote; everything else is proc body.
    new_proc.contents += line unless line =~ /^"$/
  end
end

# write final file
new_proc.write_to_file
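To use it, I save the DataGrip export as procs.tsv next to the script (that’s the filename hard-coded above) and run it with plain old ruby. It leaves one .sql file per proc in the working directory, named after the proc, ready to drop into the project.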

Translating to a DbUp-Friendly Structure

Here’s the syntax for the CLI tool I wrote to create new database migrations from scratch:

$ customtool generate migration --name CreateNewTable

And this would create something like 20200214200931_CreateNewTable.sql, put it in the Migrations directory, and everyone would rejoice. But unlike table definitions and other run-once migrations, I want to treat my procedures more like code, where the whole definition gets re-run anytime I know it changes. Therefore, my tool needs a new syntax.

$ customtool generate procedure --name SelectSomeStuff

This command instead deposits generated procs in the Procedures directory. Order no longer matters, so there’s no timestamp component to the filename. And when I want to run them, I just do:

$ customtool deploy procedures
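The tool itself lives in the .NET project, but just to make the two filename conventions concrete, here’s a rough Ruby sketch (the helper names are made up and not part of the tool):

def migration_filename(name)
  # One-shot migrations get a UTC timestamp prefix so they run in order,
  # e.g. 20200214200931_CreateNewTable.sql
  "#{Time.now.utc.strftime('%Y%m%d%H%M%S')}_#{name}.sql"
end

def procedure_filename(name)
  # Procedures are re-runnable, so there is no ordering prefix,
  # e.g. SelectSomeStuff.sql
  "#{name}.sql"
end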

Caveats

It should be obvious, but I want to note it here: the script I wrote mostly worked. It didn’t quite work when the SQL wasn’t capitalized correctly or spaced consistently across all the scripts. To test the stored procedures I converted, I ran them locally and assumed that any conversion issues would show up as syntax errors rather than still being valid SQL. You’re always better off doing a healthy dose of testing. Even though this is still in progress for me, I plan to run the CREATE OR ALTER scripts in the lower environments so we can verify everything works as intended now, rather than discover later that re-running all the procs introduced some small bug.
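As a concrete example, the exact-match sub! in the script above is the first thing I’d loosen if I revisit this. Here’s a minimal sketch of a more forgiving substitution, assuming the only variation is casing and whitespace:

line.sub!(/create\s+procedure/i, "CREATE OR ALTER PROCEDURE")

That one regex would have covered a lowercase create procedure like the one in my sample line; anything weirder would still need a manual pass.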

It’s also worth noting that this is exactly why stored procedures aren’t popular in frameworks like Ruby on Rails. They’re so difficult to test and do change management on! I really only encountered them when I entered .NET land, and then I was horrified by how often the engineers I worked with thought they were a good idea. Yes, you can get performance gains from them. But you’re almost always better off doing something a bit more testable, just so you can sleep at night.

Addendum: System.CommandLine

I wasn’t going to write much here, except to say that I’m using the new System.CommandLine library to write my CLI tool. It’s still in pre-release, but the functionality it currently provides is more than enough for me to build the tool without the headache of either parsing the input myself (i.e., using no library at all) or learning a confusing library API, which was my experience with Command Line Parser.

Revisiting Management

I hopped to a new company a few months ago and am now an individual contributor. And while I’m glad I no longer have to manage people for the time being, I know it’s something I’ll revisit down the road, whether at the behest of the company I now work for, out of my own renewed desire, or both. I also continue to reflect on how well I managed my teams over the last few years, which is what prompted this blog post.

In 2018 I became a manager for the first time. My boss at the time (who also happened to be our team’s architect, my mentor, and now my friend) sent me to Dallas for the Manager Tools Effective Manager and Effective Communicator conferences. They took place over two days, and I learned a lot about how to approach leading and managing a team. Mark Horstman broke down a lot of the management myths I had inadvertently steeped in over time, and expanded upon a lot of the intuitive lessons I had learned but not yet grounded in rationale or experience. His talk at USI gives a good taste of what it was like to not just listen to him, but to work with him on management and communication.

In the talk he goes over the four behaviors that matter, and the tools his company developed to enact those behaviors. I’ve listed them below, with the latter in parentheses.

  1. Know Your People (One on Ones)
  2. Talk About Performance (Feedback)
  3. Ask For More (Coaching)
  4. Push Work Down (Delegation)

I’ll be honest — I haven’t thought about these in this way since Mark first taught them to me. And for better or for worse, a lot of those behaviors were enacted by the org structure we had at the time. Here’s how my boss fared with these behaviors with me as his direct:

  1. He was excellent about doing 1:1s with me and getting to know me. He did them consistently and knew about my interests outside of work as well as how well I was doing with schooling.
  2. 1:1s helped facilitate this, as did his involvement in our sprint retros. He also filled out and discussed quarterly reviews with me. Each of these feedback loops (weekly, bi-weekly, quarterly) helped me learn from my mistakes.
  3. This didn’t happen often, but I would seek out coaching during 1:1s. I remember when I had to have a talk with one of my directs, and he challenged me to take it on even though it meant stepping outside my comfort zone.
  4. Work was naturally pushed down because he was involved with planning our sprints.

At the end of 2018 I got a new boss: a remote boss, an overworked boss, a boss I had no prior history of working with. At the time I didn’t reflect on these behaviors, but looking back I notice something scary:

  1. He was not consistent with 1:1s every week. It took over a month to get them started, and only because I kept pushing for them. He also frequently canceled 1:1s if he had a meeting with me at some other point during the week, figuring that was a valid substitution. (Narrator: it was not.)
  2. Outside of spotty 1:1s, he was not involved with our sprints and therefore had little context for our sprint retros. Our quarterly reviews were based on a list of accomplishments he had each of us write up and send to him; these would then be hastily assembled and discussed right before they were due to HR.
  3. This sometimes happened, but instructions were vague and expectations were never set.
  4. Work was only pushed down when someone was breathing down his neck. And it never aligned with what our team was doing or with what product wanted, so getting it into a sprint was a mess.

Going back to his talk, Mark spells out the two most important responsibilities of a manager: results and retention. My first boss did both of these things; my second boss did neither. Which brings me to my next reflection: how well I did as a manager. Without going into another breakdown of the four behaviors for each of my directs (relationships are with people, not teams!), I can tell you that I was so focused on retention that I papered over a lot of the damage my own boss was doing, damage that made my job nearly impossible.

Retention was most important in my eyes because we were bleeding a lot of people in 2018 (both those let go and those looking for greener pastures), and HR wasn’t doing a damn thing to help us hire good candidates. Mark even mentions it: hiring the wrong person is worse than hiring no one at all. But we as a company weren’t trying to do more with less; we were doing less with less, with the expectation of more. It was devastating.

Towards the end of my time at that company, I started to focus more on results when I realized that retention of my people wasn’t enough to combat the lack of retention in the rest of the company. That first boss I mentioned — the one who did everything right — was eventually stretched thin and chose to leave, along with most of the other architects in the company. And I soon learned that other teams were bleeding people as well. We got a new CEO halfway through 2019 but by then we had lost a lot of blood and it was going to take time to close the wound and begin to heal. And that’s when I decided to leave.


I don’t really have a happy ending or anything here. I’m still working through the details and reflecting on where I was and where I am now. I should probably get my copy of The Effective Manager back and maybe even purchase a copy of The Effective Hiring Manager.

I also know it’s two in the morning, so I’m not going to make good decisions right now, especially purchasing decisions. And I’m not currently a manager — I should be focusing on being an effective IC. Which probably means getting better sleep, eating healthier, making sense of my Todoist tasks and sticking to a budget. 🙂

Therapy

2020 achievement unlocked — I filed a claim for my most recent therapy session.

A few things made this possible. One was just having a better job: not only do I have energy I didn’t have before, but I also have benefits that cover therapy. I had actually forgotten about that until I went back to check on old claims with my old insurance and remembered why I opted for an HSA in the first place. It also helped that my new insurer’s site is a lot easier to use than my old one’s.

Is this what it feels like when your employer takes care of you? Is this the fabled workplace of yesteryear that my parents’ generation used to depend on?

I was trying to figure out if I should go back to therapy. If I do, I want to have a plan in place. Something to work on. I haven’t been in a position before where therapy was something that I could regularly have — I mostly just saved up money for when I really needed it, such as going through rough relationships or enduring shitty jobs.

My partner and I are learning what it’s like to work at companies that care in substantive ways. Sometimes my reaction to that surprises me: I was stressed out this weekend after our company party, and when I talked to my mom about it, she suggested I was feeling guilty. I initially balked at the insinuation, but now that I’ve calmed down a bit, it doesn’t sound so far-fetched. Either way, she suggested I go back to therapy to talk about it, and that might be a good starting topic.

Upgrading PHP

WordPress now has a health check you can run against your site to make sure it has all the latest bells and whistles. You can find it under Tools > Site Health. It was new to me, but then again, I haven’t used WordPress in years.

My health check had some straightforward items, including removing themes I wasn’t using. But there were two I knew I’d have to get my hands dirty for: upgrading PHP to at least 7.3, and installing the imagick PHP extension. I had to bumble around the Internet to figure out all the steps, so I figured I’d detail my findings here to make sure I understood what I did.

Installing PHP

I decided to go straight to PHP 7.4 since it’s the latest and this WordPress blog doesn’t have a lot of customization. My host is DigitalOcean, and its Ubuntu 18.04 image comes with PHP 7.2 by default. So the first thing was to SSH into the box and start pulling in the relevant packages.

$ apt-get install software-properties-common
$ add-apt-repository ppa:ondrej/php
$ apt-get update
$ apt-get install php7.4

The software-properties-common package was already installed, but it’s what provides the add-apt-repository command I used to add the Personal Package Archive (PPA) on the next line. It looks like Ondřej Surý maintains the PPA for PHP; relying on one person’s repo seemed odd, but I saw multiple sources cite it, so I went ahead with it. Then I ran a standard apt-get update and installed PHP 7.4.

For a sanity check, I ran php --version and was surprised it was on 7.4! But alas, this wasn’t enough for WordPress to start using it. So next I had to figure out how to get off of PHP 7.2.

Loading PHP Via Apache

This part was cool because I learned more about how Apache works! The /etc/apache2/mods-available directory holds the list of mods available for Apache to load, including php7.2.load and the newly installed php7.4.load. My gut told me I had to enable PHP 7.4 and disable 7.2, so that’s exactly what I did.

$ a2dismod php7.2
$ a2enmod php7.4
$ systemctl restart apache2

Loading Remaining WordPress Libraries

There was a DigitalOcean tutorial that suggested I install the following commonly used WordPress PHP extensions.

$ apt-get install php-curl php-gd php-mbstring php-xml php-xmlrpc php-soap php-intl php-zip

Of course that wasn’t enough. After making the Apache configuration changes above and restarting, I was told I needed to install the MySQL extension.

$ apt-get install php-mysql

This worked! Now that I had WordPress running on 7.4, I went ahead and installed the last remaining item, the imagick extension.

$ apt-get install php-imagick

That’s it!

Execute Program

I recently started getting into Execute Program. It comes free with a subscription to Destroy All Software, and I was feeling a bit down about work. (More on that later.) I’m making my way through the JavaScript Arrays course as a mix of refresher and new material. Once I’m done (or close to done), I’ll attempt TypeScript, since there are some things I want to build in React with it and it’ll be useful at work.

Speaking of work and why I was feeling down: I had a pretty bad week last week. I was really struggling with the codebase I was working in and wasn’t quickly learning from my mistakes. This week has started off better; I attribute that both to a better attitude and my amazing co-workers. Everyone is really supportive of each other and that goes a long way towards getting myself out of a rut. On top of that, over the weekend I reflected on where I was mentally and I think that’s helping me figure out how to ground my problem-solving.

So that’s why I’m continuing to do the JavaScript learning. It’s useful. It’s fun. It’s productive. Of course I’d love to create my fantasy football API or build a React project in TypeScript but those projects take a lot of energy and right now I only have so many spoons to give.

I’ll catch y’all later. Hoping to make this a more regular thing.

Another First Post

I had a long day at work, but because I made a large batch of congee yesterday, I was able to put on Snoopy in Space as soon as I got home and work on getting this blog set up. It runs on Ghost, a version of the blogging software that is much more advanced than the last time I used it. And yes, once again, I’ll eventually move all of my old posts over to this blog.

Edit: I moved to WordPress instead. Something about not being able to do indented Markdown lists and also no POST endpoints for API fun.


I’m not entirely sure what I’ll write about. I really just know I want to write again. A lot has happened since my last post, especially professionally — I now have a much healthier relationship with my job! And I know that over time, I’ll find my voice again. In the meantime, you’ll just have to get used to me figuring out how to use words again.

2019, here we are

Today, In Summary

Today I accomplished:

More about that fried rice…

The other day I purposefully made more rice than I could eat so I’d have leftovers for fried rice. My mom also made a ham for Christmas and gave me some of it, so I used that for my protein and practiced my knife skills cutting it up. The veggies were frozen. I used soy sauce and oyster sauce and even threw in some butter as advised on “Basics with Babish” (still used a wok, though!). I even thought about how the salts and fats came together in this dish — that’s about all I know since I’m still in the middle of Salt, Fat, Acid, Heat.

More Generally

It’s 2019. I was thinking about how the year change is arbitrary, but having the time to reflect is still real: because of Christmas, we’re encouraged to take time off and regroup. I was also reading about the history of the weekend, and in summary I just want to say that even though the year change is arbitrary, the desire to improve is continuous; the act of talking about it now is just coincidental.

During my senior year, I set some goals for myself. I remember how useful they were for framing that year’s projects and reflecting on what I was doing. I hope to do something similar this year with completing my thesis, working as a developer, and being healthier in what I eat and how active I am.

Tool Boxes

Photo by s w on Unsplash

I like to tell people to develop skills like they would add tools to a tool box. I think I first came across this metaphor as a musician and it made perfect sense: learn scales, rhythmic exercises, excerpts from different composers — combine in different ways later on when you encounter something new. Rinse and repeat.

And it applies to what I do now with software too. Except… I’m just not applying it. I’m not thinking about it in the same way that I would with music, which I approach very theoretically and like a puzzle. You would think that kind of thinking would be obvious with programming for machines, but I’ve been a bit stressed out. And when I’m stressed out, I stop trying to add new tools — I instead rely on the ones I have, and even then only the very familiar ones.

I thought of this again today with cooking. I was using my rice cooker and only now decided to measure out the amount of rice I was going to use. I used to do this all the time when I lived with my family! But then I lost my dad, and with that stress I forgot how to cook. It wasn’t until today that I realized I could gain that tool back: measuring rice.

Going back to music, I want to reflect on how I managed to succeed reasonably well even without practicing. It’s because I had built up a bunch of tools and become intimate with how they worked. Then, when I was stressed out playing (which was, admittedly, most of the time), I still knew how to use those tools.

It’s also worth noting that because I had those tools, I was also less stressed out about what I was playing. The same goes for cooking. The same goes for coding. And the same goes for whatever it is you’re trying to do. If you can develop an intimacy and familiarity with the tools of your craft, you’ll be able to wield them that much better when you’re under a lot of pressure and stress.