
Embrace the Challenge: Thom Lamb’s Transition from Strength to Software Engineering

Thom Lamb is many things: a former combat engineer in the Canadian Armed Forces, a former strongman, and an experienced software engineering lead. In this episode he talks about those experiences and how they have shaped his career today.

Host

Welcome to the RH podcast. This is Iheatu. I'm sitting here with a former colleague of mine and a fantastic software engineering lead, Thom Lamb. He's had quite a lot of experience in this industry and in several others. He's, of course, a software engineering lead, but he's also a veteran and a former strongman. Thom, would you talk a little bit about the beginning of your career, how you started off?

Thom Lamb

Okay, um, yeah, hey, I'm Thom Lamb. I'm a software engineer, and yeah, so the first part of my career... I'm in my 40s, so I've had a fairly long career. The first 10 years were spent in the military. So after I graduated from high school, I went into the Royal Military College and did the regular officer training program.

Host

Where is that, by the way?

Thom Lamb

Kingston, Ontario.

Host

Okay, yeah.

Thom Lamb

So, we like to joke that Queen's is the second-best engineering school in Kingston. Um, so, yeah, I went to RMC and I was a combat engineer. I did a civil engineering degree, and when I graduated, I was trained as a military combat engineer. I served for another five years after that and got out in 2003.

Host

So, how was that experience? Obviously, there's the civil engineering part, which I'm sure is no joke, but how is it to do that in that kind of environment, where there are obviously high requirements?

Thom Lamb

Yeah, um, one thing I guess I would kind of put it in the context of how it relates to software—stuff that I've learned in the past that helps me as a lead, or as a developer, or as a manager of developers. So, combat engineering is obviously... what you're doing is you're bringing engineering support to the battlefield. Whether you're purifying water, building fortifications, crossing water with bridges or rafting, or removing mines. So, anytime it gets more complicated than digging a trench, you usually want to get engineering involved.

Host

Okay, and is there an expectation that engineering would also participate in combat?

Thom Lamb

Yeah, combat engineers are trained as infantrymen second. So, we're expected to fight and defend ourselves and also occasionally to be on the offense.

Thom Lamb

Yeah, and actually, the motto of the engineers is "first in, last out." So, normally, if you're going to attack an enemy, they're going to have a bunch of defenses like minefields or obstacles, or natural obstacles like bodies of water, and you're going to have to cross those. Before your soldiers can actually engage the enemy, you need to get them in front of the enemy, and that involves engineering. So, engineers are very often fighting while they're actually conducting engineering operations.

Host

That must be high pressure.

Thom Lamb

Yeah, it is, and usually you need to coordinate other assets like helicopters or armored support or artillery to defend you while you're literally doing your job.

Host

So you're basically doing math while you're getting shot at.

Well, with all that then, how does that affect your ability to lead when you're coming out of that? How did that sort of buttress your leadership skills?

Thom Lamb

So, one thing that was good... sorry, I'm just going to turn my notifications off. One thing that was a really good learning experience for me, being like 23, having just graduated, and then being put in charge of 42 sappers (that's what you call an engineer soldier), was embracing the idea that you don't know everything. You're not really going to accomplish a complex engineering project in combat if you don't develop subordinates and empower them with what we call mission command, which is just the idea that we know the goal, like the goal is to get to the hill. We don't exactly know how things are going to break down, so we'll have a plan, but we like to say no plan survives contact with the enemy, because the enemy is going to do everything in its power to mess your plan up. So usually you have three or four subcommanders that are non-coms, and you also have a very senior NCO, your warrant officer, and you trust those people to understand what we're all trying to accomplish and how they can contribute. So, very quickly you learn it's not... it's funny, because people think the military is very, like, I tell you to turn left, I tell you to salute, I tell you to tie your shoes. But when you actually get into operations, it's very distributed, kind of like a distributed system, if you will, and there are a lot of microservices inside the troop. And I kind of think of mission command like API contracts: look, don't tell me how you're going to do something, just tell me what you're going to accomplish, what's the output that I can expect.
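The "API contract" analogy Thom draws can be sketched in code. This is an illustrative TypeScript sketch only; the names (`CrossingPlan`, `CrossingResult`, the two plans) are invented for the example and not from any real system.

```typescript
// Mission command as an interface contract: the commander specifies WHAT
// must be true afterward, not HOW it gets done.
interface CrossingResult {
  vehiclesAcross: number;
  routeMarked: boolean;
}

// Any subordinate may satisfy the contract their own way.
interface CrossingPlan {
  execute(): CrossingResult;
}

// One section commander chooses a bridge...
const bridgePlan: CrossingPlan = {
  execute: () => ({ vehiclesAcross: 40, routeMarked: true }),
};

// ...another chooses rafts. The caller only depends on the contract.
const raftPlan: CrossingPlan = {
  execute: () => ({ vehiclesAcross: 12, routeMarked: true }),
};

// The commander checks outputs, not methods.
function missionAccomplished(plan: CrossingPlan): boolean {
  const result = plan.execute();
  return result.vehiclesAcross > 0 && result.routeMarked;
}
```

Either plan satisfies `missionAccomplished`; the caller never learns, or cares, which approach was taken.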

Host

Sorry, so my next question is: how do you build that trust? Because it seems the margin for experimenting on an individual's ability is very low. Obviously the training can be trusted, but how do you really know this is the person who should take responsibility for what's going to be a critical part of an engagement? Is it all training?

Thom Lamb

Um, so, the Canadian Army has some of the best-trained soldiers in the world. Obviously, you don't start out by saying, you know, go put that bridge in. You teach the person and you develop them, and trust goes both ways too. So it's training, and it's practice, and it's running the same thing over and over and over again. And then it's also, instead of being very imperative and giving people directions like do this, then do this, then do this, it's more teaching them fundamentals and teaching them how to think. So it's very analogous to developers, where you might want to be in control of everything, but the reality of combat is that you can't be. So all you can do is give your people strong fundamental skills and then give them the freedom and the support to actually do their job.

Host

Yeah, it sounds like clearly explaining what you expect so they know what to work towards, and being very good at setting up the right requirements so they can achieve your results.

Thom Lamb

Yeah, it's just like writing good tickets, you know. You're going to take a big, complicated problem and break it down into a bunch of more granular tasks, but each one of those tasks is going to be framed around how it gets put back together, not necessarily how one specific piece gets built, like every single line of code.

Host

Okay, that's pretty cool. So, after the military, you decided to switch into sort of an adjacent variation, but, you know, I guess we can call it being a strongman. So how was that experience?

Thom Lamb

Um, yeah, so after I got out of the military, I basically had a lot of opportunities to consider. I did go back to school; I went back to Queen's and studied cognitive science for a couple of years, going after a master's in cognitive science, which was basically an interdisciplinary program in the Faculty of Computing. It focused on how humans think and how we make computers more supportive of that; it was more human-computer interaction than what cognitive science is today. Um, so I spent some time in academics, and while I was doing that... I'm a meathead, so I like to lift weights, and there have been more than a few times where that's been held against me as an engineer, because there's a perception that engineers shouldn't be athletic, that they should be bookish, whatever. Engineers don't fit any particular mold. They come in all different varieties.

Thom Lamb

So I guess after being in the military and having a really controlled life up to about the age of 30, 31, I just wanted to have some freedom and explore other things. I love training and I was very strong, and I just wanted to see how far I could go in that career. I really didn't make a lot of money, but it was a lot of fun. I spent a few years doing that. I was also a trainer and a strength coach for teams and all sorts of stuff, and it didn't really have anything to do with software at all. It was just another interest that I wanted to pursue, and I wanted to see how far I could go with it when I was completely focused on it.

Host

And what kind of life lessons did you take from that when you moved into software engineering as a career?

Thom Lamb

Well, um, the way I got from being a strongman to being a software engineering professional was that I started to train a lot of clients. From the military I had a passion for process and a passion for coaching and leadership, and that helped me succeed with clients. At one point I think I had over a hundred online clients. The other thing happening at that time was that it was the first time people were training people online. This was way before COVID, but it was the first time people embraced the idea that you could actually get training advice from somebody in a completely different area, using video conferencing or whatever. So I kind of migrated, because I wasn't going to be Hafthor; being the world's greatest strongman just wasn't in the cards for me genetically. And I really enjoyed coaching other people; I actually enjoyed coaching more than competing. I realized that, much like software engineering, coaching required the same dedication to process, structure, and continuous improvement, which naturally led me to dive deeper into professional coaching. So I started to coach regular people, and I started to figure out business systems so that I could have coaches working underneath me, and started to make a really viable business from this. And then, what you see a lot of times in startups: I was a subject matter expert going, "I really wish I had a software application that did this," and it didn't exist, so I built it, and I launched a startup. And, yeah, it was funny: I was in a pitch and they were talking about how I, being a meathead, was going to have a hard time being in a tech startup and getting an application built. And I was like, I've actually already built it, it's right here. And by the way, I'm an engineer. Thanks for judging me and writing me off as a meathead, but here you go: here's your Angular app with your Node microservices and your MySQL backend (actually it was MongoDB). Yeah, I know how to do that too. So I think the most important lesson I got from that whole transition is just being comfortable with... as Robert Heinlein said, "specialization is for insects." Human beings, at least some of the interesting ones, have a lot of different capabilities, and just because a person is good at one thing doesn't preclude them from being good at something else. And those are interesting people to work with, because usually your applications need human beings to empathize with them, look at them, and figure out how users are going to use them in order to make them better. If you're completely laser-focused on code, you may not have the product context that gives you the ability to look at it and go, "Wait a minute, why does this form work the way it does? Is this what we're actually trying to accomplish?" So I guess that's one thing I got out of it: life experience in general can be useful.

Host

Yeah, I mean, I think the way I've been able to look at it is that a lot of people expect engineers to be T-shaped intellects, meaning they're shallow on lots of other things but very, very deep on one thing. And I think what ends up happening is, as you continue to grow, it's very important to be deep on something, right? It's very important to really, really know something, because no one can take that away from you. But it's good to be something more like M-shaped, meaning you're deep in as many things as you could possibly be, and it helps you grow. Life experience helps with that. And obviously, with your experience in the military, being a strongman, applying your tactics to growing the business, and then finding a solution to a need through software engineering, that's very, very M-shaped.

Thom Lamb

I think the problem with the T-shaped intellect is complacency. Like, I'm never satisfied. If somebody says to me, "You're a software engineering expert," I'm like, "Well, I don't know everything; there's still so much to learn." But when something is outside one of their verticals, a lot of people that are T-shaped will be like, "Well, that's just the way it is; okay, whatever." And I've noticed this, actually. I was working with a developer when I was building FitPath (that was the app we made), and I would keep asking these classic startup questions, challenging the sacred cows, like, "Wait, why do we do this?" And there was this kind of complacency: "No, no, that's just the way it is. That's just the way SQL works. Why would we need a NoSQL database?"

Even in your vertical, if you're not capable of that kind of meta-thinking, of saying "nothing is sacred" and always challenging your assumptions... I think when you have that M-shaped intellect, you realize, "Oh, in other places I've seen this kind of problem, and they came up with this other idea." That creativity, I think, you only really get if you've embraced the idea that you want to be great at things, not just saying, "This is what I do for a living, so it's okay if I suck at everything else." If you're going to do something, you might as well do it as well as you can, and you might as well challenge it as much as you can.

Host:
Makes sense. No, absolutely. Being great has a lot to do with liking what you like, but also knowing that thing affects others, right? And being willing to experiment and find the related things around it, which will only enrich your understanding of your core competency. Because you want to be able to have that conversation, to communicate, and to have a better internal conversation, and that requires being as broad as you can. I hope everyone can try to do something like that. So, great, that's really cool. Let's get into a little bit of software stuff. One of the questions I have for you is: how do you assess the quality of software at first glance?

Thom Lamb:
Um, okay, so I guess in what context? Like, how am I being asked to assess it? Am I looking at an existing piece of software and trying to, like, bid on it: "Oh, we're going to add some new features, but we want to know..."?

Host:
That's actually a really good question. So, one of the things I've recently been asked to do is actually assess the quality of software, but I didn't have any code whatsoever. They just gave me a website and told me to click around. There were some internal reasons why I didn't have access, and it definitely wasn't on purpose. But, given the situation, I needed to come up with something reasonable that says, "Hey, I have some experience," and I was able to do that, and I'm happy about that. But I'm sure everyone has their own process. So, with almost no code, how do you say, "Okay, this isn't very good"?

Thom Lamb:
Um, yeah. Okay, so I mean, the first thing is I would actually just use it, um, and play around with it and see what it's like from a user's perspective.

Thom Lamb:
Um, so, yeah, I did that recently, where basically I just downloaded an app, went through and used every single feature, and then kind of reverse engineered how I thought it was built, based on how it was responding. And then I just opened up dev tools and watched for warnings and errors, and also how long queries were taking to resolve. So there's the product side of it, which is actually kind of important. That's one way: just looking at what the product is like to work with.
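The black-box check Thom describes, watching how long requests take in dev tools, could be sketched roughly as follows. The endpoint names and the 500 ms threshold are made-up assumptions for illustration, not from any real app.

```typescript
// Given request timings observed in the browser's Network tab,
// flag endpoints whose queries "take too long to resolve".
interface RequestSample {
  url: string;
  durationMs: number;
}

function flagSlowRequests(samples: RequestSample[], thresholdMs = 500): string[] {
  return samples
    .filter((s) => s.durationMs > thresholdMs)
    .map((s) => s.url);
}

// Hypothetical observations from clicking around the app.
const observed: RequestSample[] = [
  { url: "/api/login", durationMs: 120 },
  { url: "/api/catalog", durationMs: 1800 }, // the suspicious one
  { url: "/api/profile", durationMs: 300 },
];

// flagSlowRequests(observed) evaluates to ["/api/catalog"]
```

Nothing here requires access to the code; it formalizes the same signal a reviewer collects by hand from the network panel.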

And then, I mean, if you actually can see the code, then there's a lot more that you could talk about.

Host:
Yeah, please go ahead.

Thom Lamb:
Okay, so, um, well on a real, like, high level, uh, one thing I tend to do is I'd actually take the app and I'd actually look at how complicated the app is. Like, okay, so, um, do we have, like, uh, do you have users? What kind of, like, state do we have? How complicated is the app? Like, is there routing and—there's literally, like, a laundry list of things that an app is going to do in 2020. Um, you know, and a lot of them—about half of the functions are pretty common across a lot of apps: like you need to sign in, you need to maybe message, uh, maybe you store some information, um, you know, maybe you do some sort of transactions.

So if it has X features, one thing I do on a real macro scale is I go, okay, how many chunks of code and how many lines of code are actually solving this complexity? Because for this complexity, there should be this much code. If there's more code, that's not a good sign. Um, the smallest app that solves the biggest complexity is the best app, in most cases.
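That macro heuristic, code size relative to the complexity it solves, might be sketched like this. All of the numbers are invented inputs for the example.

```typescript
// "The smallest app that solves the biggest complexity is the best app":
// compare lines of code to the number of features they implement.
interface AppStats {
  features: number;
  linesOfCode: number;
}

// Lower is better: fewer lines spent per unit of solved complexity.
function linesPerFeature(stats: AppStats): number {
  return stats.linesOfCode / stats.features;
}

// Two hypothetical apps solving the same ten features.
const appA: AppStats = { features: 10, linesOfCode: 8000 };
const appB: AppStats = { features: 10, linesOfCode: 30000 };

// appA spends 800 lines per feature, appB 3000;
// by this heuristic appA is the healthier code base.
```

It is a blunt instrument, but as a first-glance signal it matches the macro pass described above: same feature count, more code, worse sign.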

So if I go and look at the code base, that's the second thing: how well factored is the actual code? Has somebody thought about where things go, where routing lives? How many sources of truth are there for the different kinds of state in the application? Is there a bunch of different paradigms? If you're talking about React: is there Redux, but they're also partially using local state, and then they're also using GraphQL with some state held over there? Well, now you've got three different paradigms that you need to understand, so the developer needs to know three more things to get the job done.
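The "three paradigms of state" problem can be illustrated with a contrived sketch. The three plain objects below stand in for Redux, local component state, and a GraphQL cache; they are invented for the example and are not real library APIs.

```typescript
// The same user's data scattered across three different stores.
const reduxStore = { userName: "Ada" };
const localComponentState = { cartCount: 2 };
const graphqlCache = { unreadMessages: 5 };

// To render one header, the developer must know all three systems exist.
function headerScattered(): string {
  return `${reduxStore.userName} | cart: ${localComponentState.cartCount} | mail: ${graphqlCache.unreadMessages}`;
}

// With a single source of truth, there is exactly one place to look.
const appState = { userName: "Ada", cartCount: 2, unreadMessages: 5 };

function headerUnified(): string {
  return `${appState.userName} | cart: ${appState.cartCount} | mail: ${appState.unreadMessages}`;
}
```

Both functions produce the same header, but only the second one can be understood by a new developer who has learned a single state paradigm.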

So one, I just look at how big the code base is; two, I look at how clean it is; three, I look at whether common things are refactored and stored in a generic place, given how much state is being handled, and where that state lives. And then I start going down into the backend: okay, what kind of persistence do we need in this app? How's the database organized? If it's a product catalog, how easy is that to reason about? And I basically just look at how easy it is for a new developer, one who only knows the framework or frameworks it's built on, to actually start landing PRs. How long would that take? And the quality of the code is literally the inverse of that timeline.

Host:
Okay, so now that you've seen the code base, and maybe it's a little bit difficult to go through, and you've identified it's going to be difficult for your juniors, how do you take that leadership experience and navigate those juniors through that code base? How do you guide them and say, "Hey, here's a piece that's only a little bit more difficult than you can handle"? How do you lead them?

Thom Lamb:
So, um, you take that same process where you go, okay here's the things that are going to be stumbling blocks and then you explain—you try to find them, like say, "Oh, in these screens here, in order to update state, you're going to need to use Redux. Over here, this is actually being held in GraphQL." You navigate the app for them.

Just one thing to back up. One thing that people don't talk about very often, but I think they should, and this is kind of controversial: there's basically this default assumption that you get the app, you're going to get paid, you're going to add new functionality to the app. If the app's more than two years old, I would just say, in most cases, rewrite the whole thing. Because compared with the time you're going to spend trying to figure out how somebody wrote this two years ago, at least a third of the time a net new build is more economical.

And I'm just talking about the client-side app, though maybe the backend too. With the backend nowadays, you get a lot of people saying, "Oh, well, it's Postgres. It's got a business logic layer; it's got an ORM." It's like, okay, so why don't we just put it onto a third-party product API, like Shopify or Contentful? Why don't we just migrate all that stuff? And they're like, "Well, we've paid for all this." It's like, yeah, that's the sunk cost fallacy; it's not valid.

So there is a discussion to be had there. And you can also refactor a portion of the app. You say, maybe you could use a micro-app, drop one app inside of another app, and be like, "Look, we're going to do net new here, because you guys are saying we're going to want to work on this for the next year. So it's worth two months of refactoring so that the rest of the year is productive."
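The micro-app idea described here resembles what is often called the strangler-fig pattern: carve one slice of the legacy app out and serve it from net-new code while everything else stays put. A minimal sketch, with invented route names and a toy request handler, might look like this.

```typescript
// A request handler returns the page for a path.
type Handler = (path: string) => string;

// The existing legacy app keeps serving most routes untouched.
const legacyApp: Handler = (path) => `legacy page for ${path}`;

// The net-new micro-app serves only the slice being rewritten.
const newCheckoutApp: Handler = (path) => `net-new page for ${path}`;

// A thin shell decides which implementation serves each route.
function shell(path: string): string {
  // Only the part being rewritten is carved out; the rest is untouched.
  if (path.startsWith("/checkout")) {
    return newCheckoutApp(path);
  }
  return legacyApp(path);
}
```

As more routes move to the new side, the `startsWith` check grows until the legacy branch can be deleted entirely, which is the "two months of refactoring so the rest of the year is productive" trade.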

So that's not necessarily leadership. Leadership is providing the environment so that your people can get the job done. So the first job of a leader is to look at their commander and say, "Hey, have you ever thought about maybe this? Because then we can get more work done."

But assuming they go, "There's no way. This is a massive app. It's Facebook; you're not going to..." Oh, by the way, they did rebuild Facebook. So if Facebook could be rebuilt, I'm pretty sure most apps could be. But anyway, I'm sure there's still a bunch of legacy in there.

But once you've had that discussion, and the answer is still, okay, we have to keep working on this, then it's navigating the app, predicting what problems they're going to run into, and giving them the information, like saying, "Oh, you're going to have to learn how to use Apollo, because you're going to need to know how to use GraphQL."

So just giving them the training they need to get the job done, breaking down the problem for them, and telling them where they get the resources to make it happen.

Host:
And you also touched on something else that I think interests a lot of people: you have an app that's two years old, and you kind of feel like you want to rebuild it. That's controversial to say, for many reasons. But how do you start having that conversation? How do you start validating it? I mean, on the technical side it's possible. But how do you start having that conversation with the business side, especially from a leadership point of view? Because those are rooms that only people like you get into, you know?

Thom Lamb:

It's funny because, since it's engineering, we tend to treat it as... it's software, it's... ephem... eh...

Host

Ephemeral?

Thom Lamb

Ephemeral, okay, yeah. So like, the idea that you’d buy a building and then you’d rip the whole building down and rebuild it—which, by the way, they do that all the time too, for the same exact reasons—because the new building needs a different infrastructure to support the fact they want to put a condo where there used to be a shoe company or restaurant. They don’t try to put a condo on top of the shoe company.

Actually, in software, we do. We go, "Oh, we've got this real estate," when the first thing most smart developers, and I mean real estate developers, do is rip the old infrastructure out to make room for the brand new.

In software, for some reason, we think there's value there, and that if we agree to rewrite it, we'll lose that value.

Host

Mhmm.

Thom Lamb

So, in any conversation, it's getting the person to admit what they're thinking before you can get them to change what they're thinking. So you acknowledge it: "Okay, you think this app's worth $2 million right now because it does this. But if I wave a magic wand and tomorrow we have a brand new app that's easy to work on, and it still does this, and the old app is completely gone, has your company actually lost any value?" And when they think about that for a second, they're like, "Well, no," and then they have to kind of go, "Oh, well, it sucks." Yeah, it sucks, because software requires maintenance; it requires time.

But so there's that conversation, and there's also data. So if you say, "Okay, we added a new view to the React Native app, but because it was a legacy app, the backend needed us to spend three developer days figuring out how to query the old backend. The net new code took us half a day to build; the backend interaction with the legacy app took us three days. So from now on, when you want more features like that, and you've said you want 20 more features, we could do it this way or that way. Here are the costs; which one do you want?" And now it's simply economics; there's no emotion, there's no sunk cost.
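The economics of that anecdote can be worked through directly, using the numbers quoted above: half a day of net-new work plus three extra days of legacy integration per feature, over the 20 requested features. The feature count and day rates come from the conversation; everything else is simple arithmetic.

```typescript
// Back-of-envelope cost comparison for 20 features.
const featuresPlanned = 20;
const daysPerFeatureNew = 0.5;        // net new code only
const daysPerFeatureLegacy = 0.5 + 3; // same code plus legacy backend integration

const costOnLegacy = featuresPlanned * daysPerFeatureLegacy; // 70 developer-days
const costNetNew = featuresPlanned * daysPerFeatureNew;      // 10 developer-days

// Any rewrite cheaper than the difference pays for itself
// over just this batch of features.
const breakEvenRewriteDays = costOnLegacy - costNetNew;      // 60 developer-days
```

Framed this way, "should we rewrite?" stops being an emotional question and becomes a comparison between the rewrite estimate and the 60-day break-even figure.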

So it's being able to get data by writing out the issue very clearly and keeping track of the developer as they work through it. This is really freaking hard to do, and there's hardly ever enough time given to do it. But if you actually allocate the time to do that at the start, you save a lot of money.

But most companies are just like, "We need to get this done. We don't have time to think about it; we know exactly what we want to build; just build it on the old, it'll be fine." And that's where projects run into problems.

Host: 

Yeah, I agree with you; I've had personal experience with that for sure. And one question that came to my head as you were saying this: there are certain kinds of conversations most engineers don't implicitly get to hear. For the engineers listening to this, can you talk about that? You know what the engineering conversations are, but what should an engineer start looking for when they want to get into leadership? When they get into those other rooms, how do they know what questions to ask, what questions are not being asked, and how to get those answers?

Thom Lamb: 

Okay, so like somebody that's just starting to be a lead or starting to be a person that's client-facing, and they're worried or they want to ask the right questions in the meetings. Is that what you're asking?

Host: 

Yes, they want to ask the right questions in the meeting. If they're asked, "How long is something going to take?" or about technical concerns, those questions they're going to be comfortable with. But you have experience as someone who started a company, a technical person, and we're talking about funding, things of that nature.

Thom Lamb: 

Yeah, yeah. I think one of— if I can just reflect what you said there— I think a lot of times, the business people want to cut the engineer's input off at a certain point. They kind of want to be like, "Look, just tell us how long this feature—" they want to be really granular. They want to say, "Just tell us how long it's going to take to add this new form. We don’t want you to talk to us about the state of our app and the future and all the things you're seeing," because, well, they don't trust you to have that conversation because they think engineers are code monkeys.

So it kind of comes back to that military thing I said about Mission Command. And so there are a few things here, and just help me to keep track of my thoughts because I’m probably going to throw down three different thoughts here.

So, like I said in the beginning, managing engineering operations in combat is really complicated. There's a bunch of complexity that's unpredictable, and usually, the idea you have on day one, once you go into contact— called "contact" when you meet the enemy— the plan falls apart because you discover a bunch of things. A lot like working with legacy code, right?

Host: 

Very much so.

Thom Lamb:

I was sitting in a meeting; they asked me how long the form would take. I said to myself, last time I built a form it took a day, so it should take a day. We write the issue, we go into development, we look at the code, and we go, "Holy [__], this is way more complicated than we thought." And now we're behind because of all the stuff we didn't see. We didn't see it until we drilled into this file. We literally never looked at that file before, until we navigated into it to figure out where to put our form. Okay, so that's one thing.

Thom Lamb

And then, what I said was, in order to actually win in that kind of a situation, you need to give your subordinates the knowledge and the freedom to come up with solutions intelligently. It's called bottom-up problem solving. You don't try to tell them how to build the form, because you're not the one looking at the code, and ideally the person that's looking at that problem has more knowledge about that kind of code than you do. Maybe you're like, "Look, I need you to read up on GraphQL. I need you to read up on Relay, because this app's using Relay and it's using the older version of React, so you can't use hooks. I need you to learn about this stuff." So you set them up with the right training to actually get the job done.

Okay, but by being able to think about that, if you can, and this is where it gets really hard, the business guy is going to be like, "Look, just build the form. Don't tell me about the problems." The problem with getting all of those individual items done has to do with looking at the app as a whole and understanding how much tech debt is inside of it. If you can talk the way I'm talking right now about a specific app, then you can get the business guy to trust you, and then they'll say, "Okay, I get it. Let's spend a little bit more time in the investigation phase before we start knocking out the issues." You're not always going to get that, and it is what it is, but it doesn't mean you can't try, as long as you do it in a respectful manner and you keep track of all the data.

So that as you have more and more conversations and, God forbid, you're behind, you can go, "Remember how I said this would be a problem? It's now a problem. Do you want to revisit this discussion?" You know?

Yeah, and what I found is that once you start bringing in more and more evidence of what you're saying, they start having to tell you more about their needs, because your case becomes more confident, their situation becomes more complicated, and the only solution is to come together. If they didn't want to do it before, they kind of have to do it now, because you've also given them an opportunity to do a little bit of learning. Like all kinds of learning, it's about spaced repetition, so you know it's going to happen.

"It happened, remember, a month ago," and you sort of bring that together. So, at least that's my experience. The other thing I find, and this is a tricky thing, especially when you get into agency land or whatever, is that the nature of the business is you're going to have these conversations where you want to provide people with hope so they'll actually hire you.

So you start off with these really optimistic meetings, and then you bring the engineers in, and then there's this unspoken anxiety where you're like, "So we said this; that's going to be okay, right? Right?" And the only acceptable answer is "Yes."

So, unfortunately, you need to be able to say, "Listen, it's my job to actually make sure that in three months we're in a good state, so I don't think it's responsible of me to hope for the best. I do hope for the best, but let's plan for the worst. Let's think about what could happen. Because if you're telling me that in three months you've got to release this new feature and you absolutely have to have it done, then even if there's only a 5% chance that it won't be done, let's figure that out now. Let's talk about that now, and let's get on the same team today, not try to get on the same team after I promised you something, you're expecting something, and all of a sudden something happens that I can't control and no amount of engineers is going to fix it, because it's going to take time. Let's get on the same team today and figure out what could happen: worst case, best case, or none of that stuff happens and we're done. But you're insulating your risk."

I think the other thing, too, is that if you're going to go from being an individual contributor to someone that's sitting in front of a client, you need to learn how to talk to them in the same terms: cost-benefit analysis, economics, risk, insulating against risk. Use the buzzwords so that they know you've read the same books, or at least the same blog posts about the books.


Thom Lamb: 

Yeah, and getting back to that "M versus the T": as you were learning to be a really great JavaScript developer, you kind of noticed all this project and business stuff, and you hoisted it into your brain because you needed to. Okay, I think it's really great advice, honestly.

Host: 

But from all of that... I mean, we've said a lot of things about leadership, software, and the cross-disciplinary skills that you learn. So I'm going to be optimistic in my next question and ask: what are the things you're most excited about in the future of software? There's been a lot of talk about GPT-3, and obviously there are lots of new things coming out and lots of other sub-sectors of the software industry. What excites you the most and what are you most interested in?

Thom Lamb: 

Um, what excites me the most... well, I think because of COVID, a couple of things happened that are actually a really good thing for the tech industry. One, remote work became the de facto way we do things, which changed things massively. You don't have to pay downtown Toronto rent to be a developer anymore. You could live anywhere with good Wi-Fi, so it democratized things a little bit. Personally, I'm kind of thinking, what am I doing in Toronto? I'd rather be somewhere with a little more green space and less rent. So, if we're smart about that, there's a whole bunch of talent that we've been treating as second-class developers, and that's no longer fair. It was never fair, but now at least the business people have seen that remote-only teams can categorically still get the job done well.

Host: 

Before, I think they had an idea, an artifact from the Industrial Age, that going to work means showing up at 9. So that is really interesting, because there are literally millions of people out there that have access to a computer. There should be more, obviously; that's one thing we need to improve. But the talent pool just massively increased.

Thom Lamb: 

The next thing is, it's imperative for a lot of companies that are focused on physical, brick-and-mortar business to digitally transform, and I know everybody's saying this, but the amount of work that needs to be done has probably just increased in tempo, because things need to get online. Government services, for example: the idea that you need to go to ServiceOntario in person to get your driver's license is absolutely insane. It's dangerous, it's time-consuming, and it just does not make any sense. So, as much as COVID was a horrible thing, at least it's totally annihilated that idea, or at least I hope it has, of "you must come to this physical space." You've got banks that are literally allowing people to create accounts without ever physically leaving their home. Now you've got all of this change.

So, I'm not really interested... I'm a big-picture kind of guy. I'm not terribly interested in the next great front-end framework, or the next way we're going to manage rows and columns, or how we're going to speed up atomic transactions. That stuff will get sorted out. It's really interesting, and I like that stuff too, but I'm more excited about the opportunity, the demand, and the fact that the talent pool just massively changed. That's actually part of what I'm working on now, if you want to talk about what the heck I do for a living and how I pay my bills.

Host:

Well, yeah, let's go into that.

Thom Lamb:

Sorry about that.

Host:

No, no. What's your day-to-day though?

Thom Lamb

Um, yeah, so right now I'm kind of in a reflection period where I'm just figuring out what I want to do. There are a number of things that I'm interested in based on what I just talked about. A big part of it, and I'm not going to say too much because I'm still trying to figure everything out, is the idea that the way we're recruiting talent is really, really broken. Think about it as a signal-to-noise ratio: we've got a number of recruiting agencies, and all they do is increase the amount of noise in the hiring process. They just go and find as many people as they can who are able to say what the difference between truthy and falsy is, and then they fire them at employers, and they don't provide any infrastructure. So, one thing that I've been playing around with is interviewing as a service, basically. You're not Amazon, you don't have a full-time HR department, you're a small company, you want to hire one or two developers every year, but you still want to do a really good job. How do you do that? One thing I've been working on with some companies is helping them develop a pipeline so they can filter out the bad actors that are pretending to be developers, give a really good experience to those that are actually good, well-meaning candidates, and then ultimately find the right one with a really well-thought-out, data-driven interviewing pipeline that involves, you know, actually coding. So that's something I'm really excited about.
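[Editor's note: for readers outside the ecosystem, the "truthy versus falsy" screening question Thom mentions is about how JavaScript coerces values in a boolean context. A minimal sketch:]

```javascript
// JavaScript's falsy values; everything else is truthy,
// including "0", empty arrays, and empty objects.
const falsy = [false, 0, 0n, "", null, undefined, NaN];
const surprisinglyTruthy = ["0", [], {}, "false"];

console.log(falsy.every((v) => !v));               // true
console.log(surprisinglyTruthy.every((v) => !!v)); // true
```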

Host

And then also, how far down the pipeline are you with this? You can answer in as much depth as you like. This sounds like a universal service, or at least like everybody wants this to happen, from developers to business people.

Thom Lamb

So we're actually kind of in alpha, if you will. We're resource-positive: I've got the market validation, I've actually got companies to engage with me to provide this service, and we've actually conducted interviews. And we're getting close to having our first hire, which is one of this project's milestones; having a successful hire is the first milestone. So right now we're just figuring out who our target market is, who our early adopters are, and how we target them. It's probably companies that are startups or newer companies, because they're going to be more open to new ideas, but we don't know yet. And I'm also just talking about it with people and trying to find others to join. I'm not going to say who, but I've got a couple of people that are kind of interested and might join the team. Yeah, it's actually going quite well. It was something that I kind of just stumbled upon. I made a joke; I said, "Well, you know, if you don't know how to hire for this, then you should hire someone to hire for this," and they were like, "Yeah, that makes a lot of sense." And it started as: if you know React, you should be able to hire a good React developer. Actually, that's not true. The skill of interviewing... go back to that M versus T, right? If you're a T-focused person, you're like, "I know this, therefore I'll be able to find someone else that knows this," and we know in JavaScript that trying to find the intersection of two Venn diagrams of JavaScript knowledge is a problem. I could go into that, but I've been talking for a long time. It's that idea of actually understanding interviewing, just like teaching someone else.
Being a good developer does not necessarily make you a good teacher of development, and it doesn't make you a good assessor either. Larger companies actually have whole departments that analyze data and figure out which questions to test in interviews; they do all this advanced data analysis to make sure they have the best pipeline possible to get the best candidates. That's a whole specialization that smaller companies don't have access to. And as I started just helping out some people with this, I realized it's a massive need. And traditional recruiting does absolutely nothing; in fact, they contribute to the problem, not the solution.

Host:
So, how do we get people to contact you on this? Is there an online website or somewhere people can reach out, or anything really?

Thom Lamb:
Yeah, I guess just my LinkedIn for now. I'm suffering from the problem of... I was like, this is a neat idea, and then it just kept getting traction without me even really actively trying to focus on growth, so now I'm like, oh yeah, I should probably stand up a website or something like that. It's only one of a few projects, but it seems to be the one for me; it's the one that's the most exciting. So yeah, LinkedIn would probably be the best way to get a hold of me. The working name of the company is "Too Many Hats." You get it? Because you've got too many hats, you don't have time to interview people, you just want a good candidate. And it's perfect for small companies, because that's where you wear all the hats.

Host:
Yeah, exactly.

Thom Lamb:
So that's how you get a hold of me. But one more thing I just wanted to say about the whole interviewing thing: the other problem is the online marketplace problem. You've got supply and demand with a lot of this interviewing, and the problem is that once a company decides they don't want you anymore, they just say, "We're not interested," and you get no feedback and no development. So, talking about that talent pool and constantly improving people's capabilities, that's an important thing to do too, and there's no better time to learn what gaps you have in your knowledge than when you're getting interviewed.

Thom Lamb:
So one of the things I'm trying to figure out is how do you actually have a positive impact? Because the thing is that the 10 or 20 people that interview for your company are going to have so much more to say about your company than people you're randomly trying to target with advertising. They have the potential to speak well of your company, and I think that's important. So if you can give them a good experience too, that's something we're not doing with traditional interviewing. It's really just like, "Sorry, we're not interested. Next candidate." There's an opportunity there too, but I don't know what it looks like yet, but I just think it's very interesting.

Host:
I do too. Even this whole idea of the Venn diagram of JavaScript is very real, because JavaScript exploded quite quickly and a lot of people made a lot of contributions. The pro is that a lot of people made a lot of contributions; the con is that a lot of people made a lot of contributions.
So you have this situation where someone could be very, very good on the leftmost side of A. They're a fantastic developer with a lot of experience in that, but the overlap doesn't exist for you and your company; it just doesn't make sense for you. But then, because the HR person or the recruiter isn't specialized and doesn't really know the ecosystem, or even just the technology, they're not able to take that and say, "Hey, you're not good for this, but the person that interviewed you knew that you were good; you just weren't good for this. Let me put you somewhere else." Because you did know and you did the interview, you can say, "Look, this company may not need you, but this company definitely does."
Yes. And that just creates a lot more smiling faces, so I think this is fantastic.

Thom Lamb:
Yeah, it's kind of the dream that I had. I was just talking with a friend a couple of days ago, you know him too, uh, Sam, Samir, and I was like, "I know this doesn't sound like something that would get funding, and I don't care," but part of my dream is that idea that people actually look forward to going through this interview process because it's consistent, and even if they don't get this job, they'll come into a pool and they'll get some sort of feedback or support.
One more idea I just want to mention is that we are always trying to treat people as points on a graph. We're always trying to say this person has this capability and they're right here; they're a junior developer, and they know this. But you should actually look at a person as a vector, because they're going to learn. Good people are going to get better, and not-good people are going to stay the same. And there's no way to measure that vector if you don't collect at least two data points. So if you're only doing this interview process with a person once, there's no way for you to actually measure that vector. If you can measure the vector, then you can start seeing people's potential, and you can reward people for their initiative: "You know, I didn't get this interview, but then I went and did a React Native project, and now I've got this." Now you start actually having meaningful data on candidates.
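[Editor's note: Thom's point-versus-vector idea can be sketched in a few lines. This is purely illustrative, not part of any real product; the data shape and the rough month approximation are assumptions made for the example.]

```javascript
// Hypothetical sketch: each assessment is a { date, score } data point.
// One point places a candidate on the graph; two or more let us
// estimate a trajectory (score change per month), i.e. the "vector".
function growthVector(assessments) {
  if (assessments.length < 2) return null; // a single point has no direction
  const sorted = [...assessments].sort((a, b) => a.date - b.date);
  const first = sorted[0];
  const last = sorted[sorted.length - 1];
  const months = (last.date - first.date) / (1000 * 60 * 60 * 24 * 30);
  return {
    level: last.score, // where they are now
    slopePerMonth: (last.score - first.score) / months, // how fast they're improving
  };
}

// A candidate who missed one offer, built a React Native project,
// and re-tested three months later:
const candidate = [
  { date: Date.parse("2021-01-01"), score: 55 },
  { date: Date.parse("2021-04-01"), score: 72 },
];
console.log(growthVector(candidate)); // positive slopePerMonth: improving
```

With only one data point, `growthVector` returns `null`, which is exactly the point being made: a single interview can't show potential.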

Host:
Yeah, you know them much better because you know their trajectory. So, you know...So we're going to sort of round this off, but if you have anything more to say, please go ahead.

Thom Lamb:
But, I think I said a lot. The opinions expressed are strictly mine and mine alone, not representing any party.

Host:
I support Thom L's opinions. But no, really, thank you so much for coming on the podcast and being my first guest. I appreciate it. Contact Thom Lamb on LinkedIn; it's T-H-O-M, and then Lamb. And I'm the first one that shows up, because I have a lot of LinkedIn followers.

Thom Lamb:
Indeed, he's a very popular guy. You may not hear back from him right away, but when he does get back to you, take a picture of that. So thank you so much, really appreciate it.

Host:
All right, so this is the end of the podcast. Uh, please like, uh, please subscribe, and, uh, we'll see you next time.

Thom Lamb:
Thanks for having me.

Recursive House

Recursive House provides consulting and development services to companies looking to integrate AI technology deeply into their company operations. Using our expertise, we teach and build tools for companies to outcompete in marketing, sales, and operations.
