Don’t you just love technical interviews, where someone who saw your resume or CV five minutes ago asks you to write some code on a whiteboard, probably code that has nothing to do with anything you’ve done before or anything you’ll do at the company?

No? Neither does Nathan Aschbacher. So when he started building the team at his company, he decided to do things differently.

Transcript for episode 182 of the Test & Code Podcast

This transcript started as an auto-generated transcript.
PRs are welcome if you want to help fix any errors.

00:00:00 Don’t you just love technical interviews, with someone who just saw your resume, like, five minutes ago, asking you to write some code on a whiteboard? Probably code that has nothing to do with anything you’ve done before or anything you will do at the company? No? Well, neither does Nathan Aschbacher. So when he started to build the team at his company, he decided to do things differently.

00:00:24 Support for Test & Code comes from Sauce Labs, the end-to-end solution provider that helps development teams build digital applications so they work exactly as they should on every browser, OS, and device. Every single time. They offer full stack and full software development lifecycle testing, including low code, mobile app, mobile beta, API, error reporting and monitoring, cross browser, UI and visual, and automation. Learn more at saucelabs.com. Test continuously, test smarter, develop with confidence.

00:01:10 Welcome to Test and Code.

00:01:20 Welcome, Nathan, to the show. This is your first time on Test & Code, so can you tell me a little bit about yourself?

00:01:26 Hello, Brian. Thanks for having me here. So my name is Nathan Aschbacher. I am the co-founder and CEO of a company called Auxon.

00:01:34 We build software for continuous verification and validation of robotic systems, more or less, or I guess cyber physical systems. So software that touches the real world.

00:01:45 Cyber physical. That’s a great word.

00:01:47 Yeah. Well, when we were raising money for the company originally, nobody knew what that meant.

00:01:53 Fortunately, in the last couple of years, it’s become a much more common niche word. I guess it is a cool word. I agree. It has all of the parts that have real meanings, plus all of the nonsense that you want in a tech term.

00:02:07 Okay, a little bit more specific.

00:02:12 How about an example?

00:02:14 Yes, like a lunar rover, or a wave power generation system, or an autonomous car, or an autopilot drone, these kinds of things. So piles of embedded components talking to each other, trying to achieve some larger mission, and the things that they interact with. Right. So sometimes that’s like a command center that’s trying to keep track of them all.

00:02:37 A lot of folks who are dealing with testing are often dealing with testing one program, or they’re dealing with some integration between a handful of different services. And for us, it’s more about a broader, more general sense of a system. So interactions at a control panel, as they filter their way down to: how does that affect the control cycle on a robot that’s doing sensing and deciding and acting? And then what reaction does that cause in something that’s in the environment, which may be instrumented somehow to take some kind of signal from the reactions of the robot?

00:03:12 The focus tends to be primarily on the embedded space, on the integration side, but the use cases tend to be more in the systems of systems.

00:03:22 Systems of systems, that’s kind of the terminology for them. That actually sounds like a ton of fun.

00:03:29 Yeah, it’s great. I mean, sometimes, depending on the audience, we’ll talk about how in the software space chaos engineering has become a kind of brand identity, or a name for a practice, that has gotten popularized somewhat recently.

00:03:44 And so I was a contributing author to, I think, the most recent O’Reilly chaos engineering book, by Casey Rosenthal and Nora Jones. Not the musician, the Jeli co-founder.

00:03:58 And my section was on cyber physical systems. And it is super fun. We’re starting to work with a lunar rover project right now, a team putting together a lunar rover that’s supposed to drive around the lunar South Pole looking for water, kind of starting at system-level testing. And then we’ll transition over time, I think, into the more adversarial, exploratory, chaos engineering type things, because they want to bleed out risk before they launch it to the Moon. Right.

00:04:22 And so being able to say, yes, we’re going to try to get chaos engineering on the Moon in 2023, is kind of a really cool thing to think about. Nerd me, eleven-year-old me, is very excited.

00:04:33 Exactly.

00:04:37 So is a lot of the work in simulated stuff or using physical stuff?

00:04:43 Both. Yeah. So the way that we set up our stuff, it’s kind of additive, kind of bolt-on infrastructure to testing processes or automation that you may already have. So if you have a software-in-the-loop environment, or a hardware-in-the-loop environment, or a simulator-in-the-loop environment, really our platform is about collecting event and signal data about what’s going on. So data from the test harness, data from the simulator, data from the system itself, bringing that stuff into a back end and then essentially evaluating over that. So we’re looking to answer the same kinds of questions you might ask in, like, model checking, except we’re asking them over what we’ve observed rather than over some formal specification. Part of the reason for the exploratory testing bits that we do has to do with trying to make that model more rich.

00:05:33 Right.

00:05:33 So just having an empirical model of the happy path, and maybe a campaign of sad paths that you saw, is not that great of a model if you’re trying to more fully verify or validate the behavior of a complex system. You want to try to drive it into situations that are going to be abnormal or edge cases. And so we have a pile of infrastructure that you kind of bolt on to simulation, or to exposing parameter points that we can fill with, like, stress points in a system, to change those parameters and try to get the system to do something interesting, to enrich that model more.
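
To make the "model checking over observations" idea a bit more concrete, here is a minimal sketch of evaluating a temporal property over a recorded event trace. The property, the event names, and the function are all invented for illustration; this is not Auxon's actual API, just the shape of the technique being described.

```python
# Hypothetical sketch: instead of model checking a formal spec,
# evaluate a property over a trace of events observed from a test
# harness, simulator, or the system itself.

def ack_follows_command(trace, window=5):
    """Check that every 'command' event is followed by an 'ack'
    within `window` subsequent events."""
    for i, event in enumerate(trace):
        if event == "command":
            if "ack" not in trace[i + 1 : i + 1 + window]:
                return False
    return True

# A happy-path trace satisfies the property...
good = ["boot", "command", "sense", "ack", "command", "ack", "idle"]
# ...while a trace where an ack never arrives violates it.
bad = ["boot", "command", "sense", "sense", "sense", "sense", "idle"]
```

In a real system of systems you would evaluate many such properties over event and signal data pulled from the harness, the simulator, and the devices themselves, which is the evaluation-over-observations step described above.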

00:06:02 Wow, that sounds cool.

00:06:05 Casey used to be a Portland person as well. Did you know Casey when he was here?

00:06:09 I did, yes.

00:06:12 Well, I worked for him at one point.

00:06:15 He implicitly hired me at Basho. We both were Basho Riak guys at one point. And so we worked together for a while there, and we’ve known each other ever since. I was sitting next to him at a coffee shop here in Portland when he got his offer to go to Netflix.

00:06:30 So, yeah, we’ve known each other for a while.

00:06:32 And this is all fascinating. I really should have asked you on to talk about testing systems of systems. But instead I want to start and talk about a zoologist.

00:06:45 Yeah, that’s right. That’s how we originally, I guess you virtually met.

00:06:49 Yeah.

00:06:50 So I had an interesting experience. Lorin Hochstein, who’s at Netflix, actually posted a tweet about ZooKeeper, something like: ZooKeeper incidents are rare, but they’re epic, or legendary, or something like that. And I had one. I was like, he’s absolutely right, but not in the way that he thought.

00:07:10 It’s true that when so many things critically depend on ZooKeeper, when it tips over, really bad things tend to happen. In my particular case, though, I was in a situation where I joined a team that was starting to build out a lot of infrastructure, especially payments, like transaction processing stuff. And we were trying to scale the team pretty quickly. And so you get a lot of resumes coming through. You end up getting signed up for interviews that you don’t necessarily know much about before you sit down in the room and start talking to folks. Just typical tech hiring practice, right? Like, throw an engineer in a room with another person who’s unsuspecting and see what happens.

00:07:50 And so I sit down and the CV is in front of me and I’m just looking at it and going over it. I’m scanning it and I’m scanning it, and there’s nothing on it. Literally nothing on it about anything to do with software development or IT infrastructure or any of this. And I’m looking at the person sitting across the table from me, and I go, what is this about? I see a lot of experience. It seems like you’ve worked at zoos. You’ve been an animal caretaker. What is happening? And they just looked at me like, I don’t know, but you called me in here, right?

00:08:26 And finally things started to piece together and I went, oh, I know what happened. And they go, yeah? And I go, there’s a technology called ZooKeeper that’s on the req for this, where we say, like, experience with ZooKeeper a plus. I gather you’re an actual zookeeper.

00:08:49 And I was like, okay. And then, of course, my brain is catching up with myself, right? And as we’re sitting there kind of talking, it’s fascinating. Like, I knew nothing about zookeeping, so I asked a lot of questions about that. And I was really curious and interested.

00:09:03 But I just like.

00:09:04 After a little while, I went, well, I understand kind of what happened. I get it, right? Recruiting tends to look for keywords and they reach out to folks. And depending on how fast hiring has to happen, there’s only so much diligence that can be done. And usually the diligence is on both sides. One side goes: I get a bunch of, here’s a fantastic JavaScript job for you. And it’s like, I’m CEO of a company now, I can’t do that. So there’s the diligence on the other side going, this isn’t for me, no thank you. Right. And so I went, well, why do this? Why are we sitting here? Like, surely this made sense to you.

00:09:40 All the rest of it was about software.

00:09:43 And they went, yeah, but I don’t know, it’s Silicon Valley. Like, people spend all kinds of crazy money on things that make no sense. I don’t know, maybe it was some kind of petting zoo feature. And I was like, okay, I mean, that makes complete sense. If you’ve recently watched the show, that makes complete sense.

00:09:58 That’s the best part. I’ve heard of Google and Microsoft hiring, like, poets and things like that.

00:10:04 Yeah, I was the first interview of the day.

00:10:10 So they decided not to go on and do the rest of the interviews, because, I think, when I notified them, like, hey, this doesn’t make any sense, they went, oh, we should just stop this.

00:10:21 But I really wanted to hire the person. I think Casey might have followed up on Twitter, and he was like, yeah, the funny thing is, he really wanted to hire them. I was like, yes, that’s true. I really wanted to find a reason.

00:10:33 So, for people who have not actually used ZooKeeper: ZooKeeper is an Apache thing, but I’ve not used it. What does it do?

00:10:41 It’s like consensus as a service, basically. It’s been a long time since I had to deploy it for anything either. But back in the day, it was largely a key-value store that gave you distributed consensus, essentially, consistency for key-value things you wanted to store. So it gets used for service discovery and configuration management, things where you need to keep a global sense of something for all the rest of your infrastructure to pull on.
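
For readers who haven't touched ZooKeeper, the "key-value store with watches" idea can be illustrated with a toy, single-process analogue. This is not ZooKeeper's actual API (in Python you would typically use a client library such as kazoo, and the real service is replicated and consensus-backed); this sketch only models the usage pattern described above.

```python
# A toy, in-memory analogue of the ZooKeeper usage pattern:
# a path-keyed store plus one-shot watches on changes.

class ToyRegistry:
    def __init__(self):
        self._data = {}       # path -> value
        self._watches = {}    # path -> list of callbacks

    def set(self, path, value):
        self._data[path] = value
        # ZooKeeper-style watches are one-shot: fire them and clear.
        for cb in self._watches.pop(path, []):
            cb(path, value)

    def get(self, path, watch=None):
        if watch is not None:
            self._watches.setdefault(path, []).append(watch)
        return self._data.get(path)

# Service-discovery flavor: a service registers its address, a client
# reads it and asks to be notified on the next change.
reg = ToyRegistry()
reg.set("/services/api", "10.0.0.5:8080")
changes = []
reg.get("/services/api", watch=lambda p, v: changes.append(v))
reg.set("/services/api", "10.0.0.6:8080")   # the watch fires once
```

The real system's value is that this "global sense of something" survives machine failures and stays consistent across the whole cluster, which is the hard part this toy deliberately skips.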

00:11:07 Yeah.

00:11:08 Okay. So I reached out just because I really wanted to hear the story just from you. And this is hilarious.

But then you said that you do hiring now, because you have a company.

00:11:24 Well, how big is your company?

00:11:25 It’s not that big. We’re still under ten.

00:11:28 We have a couple of reasons. One is we’re still fairly early stage.

It’s a long road for companies like us, because the OT rather than IT space, the operational technology space, just tends to move a little slower.

There are a lot of big logos there, like the Lockheeds and the Volvos, and companies like this. And they tend to move at certain speeds.

00:11:48 Yeah. But even though the projects are longer scale.

00:11:51 Yeah, they’re larger scale, longer scale. But for us, that means we have to stay lean longer.

00:11:55 That’s kind of the thing in terms of scaling up the company. But the other part of it, too, is that, just as a thesis, my co-founder and I tend not to buy into the hyper-growth model. We’ve seen it fail so many times. Even companies we’ve worked with: when we first met teams there, they had ten people, and then a year later they had 200. And every single time, every single time, we talk to the SVP and go, how’s it going? They’re like, it’s a disaster.

00:12:23 We can’t onboard people effectively. You can’t bring people into a culture that fast.

00:12:31 People who are getting paid to do good, expensive work eventually feel an obligation to start doing something. They don’t want to sit there and do nothing. And so then you end up dealing with a lack of coordination and orchestration on the one side bringing them in, plus a very good work-ethic impulse to be producing something. And that creates a different set of challenges, which is: what do we do with all the stuff that we didn’t need to have happen but now exists?

00:12:56 And so it’s just an alignment nightmare. So we’re staying lean on purpose, for that reason.

00:13:05 Just a short pause for a shout-out to everybody out there: if you want to hire a really excellent developer, but you don’t want me, do totally reach out.

I know some companies that I can introduce you to.

00:13:28 You said you have an unorthodox or a different way to onboard and hire people. So let’s dig into that.

00:13:37 How do you bring people on?

So part of this is motivated by the fact that I’ve been on all sides of this for a long time.

00:13:45 I like to say I don’t look it, but I’m pretty old. I’m in my 40s now. I think it’s mostly just my generally immature attitude that creates the perception of plausible youth.

But I’ve been on the end of it where I came out into tech into the bubble bursting. So I left college. Nobody was hiring. The year before, there had been all kinds of signing bonuses.

00:14:11 Every student parking lot at Gonzaga University pre-2002 had a new Land Rover in it that some student in the engineering school had bought, because they had gotten their signing bonus and used it as a down payment for a car they didn’t need. Right. And so by the time I got to my senior year, the market had turned and there was no recruiting happening. I was in Spokane, Washington.

It was like five or six months until my first interview even. And by the time I got there, I was getting interviews,

00:14:40 and I was up against folks who had been at Intel for 20 years, who had lost jobs, had kids, had mortgages. And so I saw that end of it: the lean times, and how sending out literally thousands of resumes was the only game in town back then for that kind of thing. Just sending out these resumes, no response. Just a miserable and saddening experience. Right.

00:15:04 And now it’s the happy times. The job market, I’m told, is just incredibly crazy out there. I wouldn’t know, because I made the mistake of starting a company. But there’s still this kind of trend that has persisted. I don’t really know exactly where it came from, I can’t identify its origin, but a lot of folks talk about the tech interview. There are books about it, like acing the tech interview, this kind of thing.

And it’s always seemed really insane to me, especially as engineers of systems, that we’re so colossally backwards on this sort of thing. Because we see a lot of people kind of mimicking the hiring practices, kind of doing the cargo culting. Like, well, what does Google do? And you go, well, yeah, but you’re an early-stage, four-person company. You have different criteria and different needs and different ambitions and goals in your hiring than Google has. Also, Google has many money-printing machines.

They can’t even tell if they’re doing a bad job at certain things, right? Like, the feedback signals are so much worse and so different. Right. Yeah.

00:16:18 So I just want to interrupt you for a second. Even large companies, well, I work at a relatively large company, but often the hiring is a local thing. We only have a handful of people dealing with the hiring stuff.

Like, even large companies have small hiring teams.

00:16:34 It’s more like a start up.

00:16:36 Yeah, for sure.

00:16:37 Yeah, totally.

00:16:38 Yeah.

These kinds of practices, it’s kind of like that Ned Flanders meme: something like, we’ve tried nothing and we’re all out of ideas, or whatever it is. Right.

And I kind of have felt that way. Around the 2010, 2011, 2012 era, I was out there doing more interviews on the being-hired side, like trying to find a job, and going through these really bizarre interview processes where clearly the person who was doing it didn’t want to be there.

They weren’t prepped or primed to do this. It’s weirdly stilted and adversarial in a way that just kind of dehumanizes the entire process. And often you get asked things where there’s this patina of objectivity: but we ask everybody the same questions!

That’s a faux objectivity that you’re getting out of that. It’s not necessarily a great signal to have somebody stand there at a whiteboard and try to derive in real time something that took three PhDs 20 years to come up with, that you just happen to know about because of this interview, right? Yeah.

00:17:52 It’s that kind of thing.

As I transitioned through my career and got into positions of being able to influence hiring processes, and then, of course, now ultimately deciding what the hiring process is, I try to focus it much more on maximizing what I believe to be the real signals that you’re looking for. There are certain signals that these traditional tech hiring or interview processes look for, which are supposed to filter: can the person write a basic program, these kinds of things. But if you can’t figure out whether a person has written a basic program before they show up to have an interview, then the process is broken someplace else.

And there are a lot of ways to get to that answer that don’t require hazing a person in real time, in a weird office, with a person who doesn’t want to be doing the hazing and a person who doesn’t want to be receiving the hazing.

00:18:48 So let’s just pause there.

What are some of the processes you do? Do you have them do take-home assignments?

No. That seems also a little bit unfair. So what we do, the way I kind of phrase it is: if the goal is to try to figure out if this person can do the work with your team, and you’re trying to build a team that you’re adding this person to, it’s a two-way street. They need to understand what they’re going to be working on and what their team is going to be like. And the other end needs to understand what this person is like and whether they feel like this person could be added to the team. So instead of constructing some kind of weird artifice as a workplace simulator, I just go, let’s bring the workplace to them. So basically, I do screening of resumes, CVs, just looking for some characteristics that I tend to find fit the kind of weird corners of problems that we work on: people who have picked something strange to align themselves to. Like, they’re a Clojure expert, or they’re an Erlang developer. So there’s a certain amount of pre-screening that I look for there. This person had some motivation to take on something that nobody told them they should do.

00:20:12 Absolutely nobody told them they should do this. Their only incentive was internal motivation. They thought this was interesting and they wanted to pursue it and become an expert at a weird thing. Right. And so I’ll do a little bit of screening for stuff like that, because there’s a lot of self-determination and agency that I look for in people: when they run into a problem, they don’t stop and just roll over. They dig at it. They pull on people to help them, that kind of thing. And then we just do a basic phone screening where we talk, and it’s really informal. I set it up that way, too. I just want to have a conversation where it’s two humans trying to figure out if any of this makes any sense.

00:20:51 And then the next step after that is just to set them up with a couple, maybe two or three hours, if possible, of direct pairing on a real thing that somebody on the team is doing. So whatever thing they’re doing that day. It could be design review. It could be digging around in some bug that’s creating a problem.

00:21:10 Whatever it is, even if it’s creating some strife internally, the point is to create a real experience for this person: this is the kind of thing that’s going to happen here, and you’re going to see this engineer, when they encounter this kind of problem, handle it this way. To make it be a real workplace kind of scenario. And then to have a debriefing with both sides afterwards, where I talk to the candidate and go: well, what was that like? What challenges did you run into? Which parts of it did you feel were a struggle in terms of what it was like to interact with the other person? Where did you feel like you left something on the table, that you wish you had been able to communicate or convey, that you don’t feel was effectively evaluated? That way they feel like they have an opportunity to follow up on how they’re viewed, or what the perception is. Give them a second-impression opportunity. Right.

00:22:09 And then do the same thing on the internal team side. Go, okay, what was this like?

What kinds of things did you do? So I have a sense of what you worked on, and I can follow up with some additional questions. And then really just try to make it be more about finding good alignment to build a high-functioning, highly effective team, and not about whether they know some esoteric thing that they bought a book to tell them the answer to, and they got it more right than the person before.

00:22:38 Okay, that’s pretty cool. I really like this idea of an opportunity right after. I assume it’s the same day you’re talking with both?

00:22:49 It depends, but sometimes it’s like a couple of days, but yeah, it’s soon after.

00:22:52 Soon after, to be able to say how it went. And actually, it might even be better the next day, or a couple of days later, so somebody could think about it.

00:22:59 Yeah. Time to simmer on it. Think about it.

00:23:04 How was that experience? Is there something you left on the table? I liked that part. I remember an interview I had where part of the interview was that I was supposed to discuss something complicated, or something like that.

00:23:20 Most of my career I’ve been in embedded systems, but they’re not like tiny embedded systems.

These are test equipment, and usually there’s half a dozen to a dozen different processors in there, speaking different languages, running different operating systems.

00:23:38 Yeah.

00:23:39 It’s like an entire Internet in a box with things coordinating at different clock speeds and all sorts of chaos going on inside and coordinating all of that with APIs and different things.

00:23:53 That is a complex system. It’s also a system where you can’t know the whole thing; you depend on large teams to understand different pieces. And I was describing a lot of that stuff, and I could tell that the interviewer had completely checked out.

They had decided that embedded was simple and they weren’t listening. And it was a frustrating experience. And I never got an opportunity to talk to the hiring manager or somebody else to say, that sucked. That part, having somebody that just wasn’t interested in what I had to say, didn’t feel good.

00:24:34 Yeah.

It’s really unfortunate, too, because I feel like what has happened, obviously not everywhere, but in a lot of cases, is this push to try to hire a lot of people really fast to fill seats, or spend investor money, or whatever it is you’re supposed to be doing. Right? Scale, scale, scale. It turns the process into a meat grinder, which is not what you want. That’s the way that you do canvassing for consumer goods. You go, hey, I got some new widget and I want to spam the universe with as much as I possibly can. But when you’re trying to build a team, you’re never going to have 100 million team members that you filter through. Right? And so the idea is, you want to have as little churn as possible, and you want the team to be as high-output and effective as possible. And so it’s much more like the kind of thing you would do in enterprise sales, where you’re doing very precise targeting. You’re like, look, I’m looking for this. This is the way we qualify this customer. What kinds of criteria do they have? There’s a lot of forethought that has to go into it.

00:25:40 Right.

00:25:40 But in the absence of forethought, we replace it with machinery and scale. Right.

00:25:45 Yeah. And it’s interesting, the comparison to enterprise sales or something, because in test equipment, we’ve got consumer-level stuff, commodity systems like small spectrum analyzers and scopes and stuff like that. And those are often sold through third parties; people just want a scope, so they’ll go get a scope based on specs. Yeah. And then larger systems, these are definitely like the description that you brought up, where both the customer and the supplier have to discuss whether or not this is a good fit beforehand. Just trying it out, that’s not going to happen.

00:26:28 It’s an interesting comparison.

00:26:30 Yeah.

00:26:33 How about onboarding? When you bring somebody on, do you try to treat that differently? Do you do mentoring, for instance?

00:26:39 The team is so small that everybody ends up working really closely together, and so there’s a lot of constant communication. In terms of onboarding, I can say, as an early-stage company, especially in the pandemic, we’re not great at this.

00:26:57 A lot of it is, like, pre-pandemic there was an opportunity, where people were congregating at work, to do a lot of the soft onboarding. The kinds of stuff like when you’re doing enterprise business development, where you go to lunches and you get to know each other, that kind of stuff. And that’s so much harder in this environment.

00:27:18 Right.

00:27:18 We’ve hired a couple of people, like our VP of business development and another engineer, during the pandemic, somewhat recently, and getting them integrated and getting them to feel comfortable, I don’t think I’ve succeeded at it, frankly speaking, as an indictment of myself as CEO. But it’s something I’m aware of, and I’m trying to find ways to improve it. And of course, when restrictions start to wane and stuff, we’ll try to do some offsite kinds of things. But probably we could provide more structure to the onboarding process.

But there’s also a kind of get-people-into-the-thick-of-it-immediately approach, to remove the barriers of stilted team integration, where they feel like an outsider, as quickly as possible. To be like, look, this person’s in the trenches with us, and we’re going to be brainstorming and working on problems. And to try to give them a voice that’s at parity with anybody else who’s already been on the team.

That’s kind of the approach. But in terms of the swag and the onboarding, and the whoops-we-forgot-to-provision-a-YubiKey thing, I’ve got to fix those problems.

00:28:28 I remember my first job out of college, I was at HP, and I totally felt lost. There was a lot of mentoring and stuff, but I felt like I had total imposter syndrome, because I was pure computer science, and most of the people there learned programming on the job; they were, like, physicists and electrical engineers. And one of the guys that had been there for, like, ever, I think 40 years or something like that, was talking to me and asking me questions at a meeting as if I had been there as long as he had. Yeah. And I went and talked to him later, and I said, that’s super cool.

Are you consciously doing that? And he said, well, everybody that works here has been here less time than me.

Also, we hired you because you’ve got skills, and I don’t know what they are, so I’ve got to ask you. Sure. Yeah. Anyway, treating somebody as an equal from the start is incredibly powerful, and I think it goes further than people realize.

00:29:36 Yeah.

I don’t know where that comes from. I tend to be fairly self-deprecating and don’t take myself very seriously, which maybe helps. I imagine it can be a little bit usefully disarming, I think, in a lot of cases.

00:29:52 Right. Yeah.

00:29:55 I don’t really know where that comes from.

00:29:58 Maybe like being raised Catholic and so feeling like everything is my fault all the time.

00:30:04 But also I have struggled.

00:30:09 Even the thing that we’re doing now, it seems fairly novel. Like the kind of stuff that we’re doing.

00:30:14 We kind of try to take the best of property-based testing and the best of model checking, like the value that they’re trying to provide to the development and understanding of systems and systems engineering, and then take them out of the realm of academia and turn them into something that could be productionized or productized. And we don’t find a lot of competitors for this. But by the same token, I had the idea that I would have any kind of novel or unique insight kind of knocked out of me when I thought I had invented the jet engine when I was eleven, and then saw a cutaway of one and went, oh, those are all the ideas that I thought I had, invented decades before.
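
The property-based testing half of that can be sketched by hand: generate lots of random inputs and assert an invariant that should hold for all of them. Libraries like Hypothesis do this far better, with smarter generation and shrinking; the run-length encoder below is just an invented example to show the shape of the technique.

```python
# Hand-rolled property-based test sketch: the round-trip property
# decode(encode(s)) == s should hold for arbitrary strings.

import random

def run_length_encode(s):
    """Encode 'aaab' as [('a', 3), ('b', 1)]."""
    out = []
    for ch in s:
        if out and out[-1][0] == ch:
            out[-1] = (ch, out[-1][1] + 1)
        else:
            out.append((ch, 1))
    return out

def run_length_decode(pairs):
    return "".join(ch * n for ch, n in pairs)

# Generate many random inputs and check the invariant on each one.
rng = random.Random(0)
for _ in range(200):
    s = "".join(rng.choice("ab") for _ in range(rng.randrange(0, 12)))
    assert run_length_decode(run_length_encode(s)) == s
```

The model checking half asks the same kind of "for all inputs, does this property hold" question, but exhaustively over a model's state space rather than by sampling.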

00:30:54 Okay. I’m never going to have an original idea. And so for me, what that means is there’s no sacred sort of like idea. Everything is up for grabs to be able to go like, okay, let’s indict this thing.

Having new voices around the table to perform an indictment of a thing that we all kind of have groupthink about, this is the way to go, is one of the most valuable things in bringing new people on initially. Right. Actually leveraging that outsider perspective, because you’re only going to get it for a minute, until they’re indoctrinated and have become useless in that regard. Right. And so, being able to leverage that, I hope, helps people feel comfortable and able to have that voice, because it’s actually valued. I think it’s the most important thing to get out of somebody in the first month, month and a half that they’re on board, before they’re so embedded in it that they have a curse of knowledge at that point.

00:31:48 Right.

00:31:49 And they don’t know anymore, like what it was like when they first joined.

00:31:53 Yeah.

00:31:56 I’m still tripped up at eleven. You independently invented a jet engine?

00:32:02 Well, I thought I did.

00:32:03 My grandfather was a pilot, and there were pictures of him up there above my Silicon Graphics workstation.

00:32:11 And so I got really into materials science and aerodynamics at a really young age. I had all these books because I just thought it was all fascinating. And so I remember having the idea.

00:32:23 I was like, oh.

00:32:24 Well, this is how: if you’re combusting the fuel and mixing it with oxygen, and that’s creating some amount of pressure pushing out this way, what if you had a bunch of fan blades that were kind of increasing the amount of pressure at the same time as igniting different stages of this thing? And I had this whole mental model for it. And then I told my grandfather about it.

00:32:46 Look at what I thought up. And I had these awful childish drawings, like green marker pen on a piece of paper. And he went, that’s really neat. And then we got into his airplane and flew to the air and space museum in Seattle. And he showed me, and there was a cutaway of such a thing. It just had all the parts, obviously much better than my imaginary parts, because somebody who knew what they were doing had tried really hard to do this very well. But the premise was in there. And I just went, oh, okay, that’s it.

00:33:17 That’s incredible, though. I mean, at eleven, I think I was attaching marbles and Play-Doh to mousetraps to be able to drive bigger holes in the wall from a distance.

00:33:29 And you’re doing stuff like this. It’s incredible.

00:33:33 That’s pretty cool.

00:33:35 I refuse to take the compliment, but my mother will be very happy to hear that.

00:33:41 I think I had recently gone to a Universal Studios, so I thought I was going to be a stuntman, and I was jumping off roofs and things like that and seeing if I could safely roll down the stairs, until I got told to not do that anymore.

00:33:56 Could you do it?

00:33:58 No, but let’s see… we did manage to figure out, while we were doing something stupid, that hard-sided suitcases make a decent sled down stairs. But it’s dangerous with the top open.

00:34:15 Well, yeah, the top you kind of hold, because you have to pull it up. If you don’t pull it up, the bottom hits a stair and you’re done. Yeah, but don’t try this at home. It’s way too fast.

00:34:29 No, try it at work instead.

00:34:32 No, we’ve got like cement stairs at work.

00:34:35 Never mind.

00:34:40 Coming back to pair programming, one of the things you said is you spend two or three hours with a candidate working on a problem or something like that.

00:34:49 Is pairing or group coding part of your daily routine at the company, or is it something you only do during the interview process?

00:34:58 No. I mean, I’d have to ask the team, but I try to encourage it as much as possible. Creating that sense of, I don’t know, camaraderie and willingness to pull on each other when help is needed, I think, produces it largely organically. There are certainly times where folks go off and they’re kind of heads down working on a problem, and then bring it back out when it’s far enough along some path to go, does this make sense? Right. But there’s a whole dev channel in Slack that I’m not even in, just for the developers to have a place, without the prying eyes of management, to communicate and pull on each other and sit down and pair all day long. There have been a couple of times, and I don’t think they actually did this, because it would probably drive them insane, where I was like, look, just pair all together.

00:35:55 A few of you, just the whole day, at least kind of as though you’re in the same room working through this problem, so that you can have banter and chat about ideas. But not all engineers want to be constantly on.

00:36:13 One of the challenges I would find is that I very much honor flex hours. So in order for people to actually be pairing for eight hours a day, they’d have to have the same schedule. Right.

00:36:29 But anyway, it’s interesting. Have you had any feedback from people that have come on, saying, hey, interviewing with you was less freaky than other places?

00:36:38 We had a guy that we ended up not being able to hire that I really wanted to hire, because of just kind of funding timing for us. We had gotten a candidate, and we were kind of going, okay, what’s going to happen here with some of the revenue stuff and some government contract things, not knowing quite if it was going to work out. But I was very upfront with them about it, too. I said, hey, look, it seems like you’d be a great candidate; that’s why we’re even doing this. But I want to let you know that you showed up and I didn’t really expect it, per se.

00:37:11 And so at a time when this wasn’t quite… we were trying to get ahead of this eight ball a little bit. Right. And so we didn’t want to be stuck with no options, depending on how things worked out. So I was very upfront and very direct and honest about it, set up the premise. I really like that before-the-show write-up of yours. I don’t have mine published on the website, but it’s a similar thing I kind of send: hey, look, here’s what the process is going to look like.

00:37:37 I’m trying to… I want it to be fairly informal. Like, here’s what I’m looking for, trying to be more clear about what we’re assessing and where I have question marks. Right.

00:37:46 And kind of putting that all out ahead of time. And even though we ended up not hiring them yet (I’m hopeful that as things progress… they took a different opportunity, because reasons, and we’re getting more financing and all that stuff, so someday I’d still like to hire them), they came back and said, even though this didn’t work out, and I actually wanted to work here more than the other places I was looking at, this process was so much more positive. It became like a peer relationship. Right. I was being fairly transparent, in the ways I could be, around what we were facing, letting them know, just as a person who I respected as a professional: I have my own professional challenges, and here’s what they are, and here’s where they’re creating conflict for our mutual goal of you working here. Right.

00:38:38 So it pays off like that. That’s not the first time that’s happened. That’s just the most recent experience that’s occurred.

00:38:48 People like being treated like humans. I don’t know, it sounds ridiculous to say that, but we create so many systems, for reasons that are still unclear to me, where we just fully dehumanize these people, professionals or otherwise.

00:39:05 And it feels bad. Going and looking for a new job is a full time job that mostly feels bad.

00:39:10 Mostly feels bad. And also, we were talking earlier about some of the traditional systems, like the Googles or other big companies, and I don’t even know if they hire like this anymore, but at least people think they do: the trick questions and whiteboard interviews and everything.

00:39:30 It’s difficult on both sides, for the company and the individual. It’s also difficult to find people to do the interviews.

00:39:40 But also, I’ve had coworkers and team members that have been great engineers and that run the gamut of different personalities.

00:39:50 And some of the best people I know are quiet, contemplative, just hard working people. They get a hold of a problem and they don’t let go until it’s finished.

00:40:01 Those kinds of people, who have to think about something for a day before they really jump in, are not going to do great in the traditional interview.

00:40:09 So we have to be thinking about different ways to make the interview process and the hiring process more humane.

00:40:16 Yeah.

00:40:18 That’s why I said it’s such an incredibly high-noise, low-signal approach to doing what is probably the most critical function of building a team and creating a company. Right.

00:40:37 Who’s there doing it?

00:40:39 Ideas are really cheap. Making them exist is where all the hard parts are.

00:40:46 And it’s just such a bizarre kind of backwards way that we handle it.

00:40:56 One of the things that I hadn’t really thought about until you relayed that was… you might say, oh, well, somebody having to pair with somebody they don’t know is also going to be potentially nerve-racking, which is true. It’s a person you don’t know, and not everybody is as flippant and fake-extroverted as I am.

00:41:25 I’m sure that can be a little bit nerve-racking. But what does happen, though, is, I guess, like you said, you start the ball rolling. Right. I trust my team. Even though they’re not all like me (we’re all a little bit curmudgeonly, but they’re not all like me), they are fastidious and conscientious. Right. And so if they see a person and latch onto something, go, oh, well, they were getting it, they were asking the right questions, but they seemed like they might be a little bit apprehensive or a little bit socially distant, then that’s fine. It becomes a thing I can talk to the person about in the follow-up, to understand why. Then I get to learn more about their working style and what was uncomfortable and what was comfortable. Right. But if you start at the top with trying to be conscientious about the process, you will bring in conscientious people, and they will then help you cascade that as you need to hire more people. But if your whole system is turning a meat grinder, bringing in people who don’t want to be there to ask questions they don’t want to ask of people who don’t want to answer them, well, then it shouldn’t be a surprise when it doesn’t work out great, when the culture is a bunch of disconnected, weird things that don’t align very well.

00:42:37 Yeah.

00:42:38 I also like that.

00:42:41 I’ve been on different types of hiring teams. And when we tried to grow a team quickly once, where we brought in lots of people from different groups to help with the hiring part, the interviewing part, that was tough, because the people holding the interviews don’t really care who you hire. So I don’t think they’re really good at picking somebody if they don’t have a stake in it.

00:43:06 Yeah, I agree with that a lot, actually. That was one of the things that I… what’s that thing that they say? There are a lot of idioms, or colloquialisms; I never can keep track of which one it is.

00:43:22 The most satisfied employees are people who have a boss that they think could do their job, so they feel like there’s somebody there that understands the work they’re doing.

00:43:33 And I feel like that is missing in a lot of cases, whether it’s structural, in terms of process, or just in terms of who gets promoted to run things and become hiring managers.

00:43:47 When that principal-agent alignment breaks down, then it seems like you can only exacerbate things, because the person doesn’t…

00:43:58 You kind of have to have a system of… weird.

00:44:02 There’s a thing that I refused to use at a place that I worked, where they tried to make me use some ranking thing, coder rank or some insane… what’s it called? HackerRank.

00:44:10 HackerRank. Yes.

00:44:13 And I said, you want to distill all these people down to this ridiculous number? This seems really terrible.

00:44:20 No offense to the HackerRank people; I’m sure they’ve been very successful. But on the other side of it, I’m just like, I can’t imagine putting somebody through this and then wanting to have them around. You’re punting responsibility. Right. You’re not taking ownership of your role. If you’re the hiring manager, you need to build a team. You’ve got to understand what that team’s doing. You’ve got to understand how the team works together. You’ve got to understand their different personalities and what kinds of folks are good at different kinds of things. I think one of the biggest challenges I run into, and I imagine my co-founder would probably agree with this, is that I’ll sometimes get into a situation where I’m unwittingly leaning on the weaknesses instead of the strengths of different team members. Not because I’m trying to, or trying to make them feel bad, but because I really need this thing to happen, and this is the person who, principally (imagine, like, a spherical cow of a human), could do it. But the type of work, what needs to be done, is some design work or something like that, and they’re much better at implementation, at being very prolific at coding, though they’re the subject matter expert for that area of the product. Then I end up leaning on a part of that person that isn’t good for either of us. Right.

00:45:44 They don’t feel like they’re being utilized and leveraged and interacted with in a way that draws on their strengths, and we’re not getting, ultimately, what we want, which is the best output from that person. And so I’ve had to develop some mechanisms for myself to go, oh, this is just me having stupid expectations, and to reposition and reprioritize who gets pulled into what, and that kind of thing.

00:46:15 Well, I mean, sometimes you got no choice. Sometimes the person is the only one available.

00:46:19 Yeah.

00:46:20 Especially like a small team. Like, everybody hunkers down.

00:46:23 It is also interesting that even on a team, you don’t need everybody to have the same skills.

00:46:29 You need to have some people that are really good at finishing and wrapping up and crossing all the t’s and dotting all the i’s, and some people that are really good at starting brand new things and getting rolling.

00:46:40 Like getting the first prototypes.

00:46:42 And it’s all about getting good information and then using that good information to get good results.

00:46:46 Right.

00:46:47 Anyway, we’re kind of going long, but thanks, Nathan, for showing up and talking, taking some time out of your day.

00:46:53 Yeah, that’s great. Thanks for having me. I appreciate it. I’d be happy to talk, in a whole separate episode, about those complicated systems you’re referencing, because that’s right in our wheelhouse. So I’m very curious about all that.

00:47:04 Yeah. We need to schedule something else for some system testing and system evaluation stuff. Cool. Awesome. Thank you, sir.

00:47:17 Thank you, Nathan, for your thoughts on tech hiring. Thank you, Sauce Labs, for sponsoring. SauceLabs: test continuously, test smarter, develop with confidence. Thank you, Patreon supporters; join them through the support link. Those links are in the show notes for episode 182. That’s all for now. Go out and test something.