Interviewing with an AI assistant

Let’s chat about that AI recorder in job interviews. Since I’ve been asked about it a few times in interviews, it can’t be that common yet. I’ve been using it a lot while on the hiring loop for various roles and have come up with some working patterns to make it halfway helpful.

tl;dr - I’m the one getting coached on the recording.

😰 Including it made me pretty uncomfortable at first. This sort of thing unnerved me as a candidate too, back when I was interviewing around for the role I’m in now. It seemed to be less common a year ago.

🏢 It’s not mine. My company pays for it as a service and the applicant tracking system adds it to our calendar invites. The company controls the data, owns the recordings, and all that jazz. I’m assuming all i’s are dotted and t’s are crossed here. I cannot see any recordings that I wasn’t part of.

Individually

🤖 On any individual interview, I’ve found the “AI” features not useful. It’s a mediocre transcription service, and it takes a decent chunk of time to edit what it wrote into something helpful for the hiring manager who’s looking over my (and other folks’) notes. I simply place a couple of dashes and let it append away below my own notes. Here are some funny ways it’s goofed up the transcription.

  • My “one rep” became “one wreck” 😆
  • Apparently there’s “feature parody” between commercial and government regions … 🤣
  • Various struggles with abbreviations and acronyms - “aisle” for “IL” (impact levels), “essay” for “SA” (solution architect), “eye tar” for “ITAR”, etc.

🚩 It’s halfway decent at flagging unsafe questions or conversations in the recording. But it isn’t consistently good here either.

I’d said something to the effect of “I have no problem saying ‘bugger off’ to unqualified demo requests because it’s a waste of everyone’s time” and “I typically don’t check my email over the weekend as I’m with family.” It flagged the phrase “bugger off” as impolite, and the remark about family time could be read as discussion of life outside work. In neither case did the conversation get into unsafe territory, but I’m sure folks would feel better having this sort of thing on record if it had.

However, it did not catch a candidate who said “we want them to open the kimono a bit over some drinks.” I flagged that one myself, as a human being. It’s still not an uncommon phrase to hear in sales, and here’s why that’s a problem. 🤦‍♀️

📝 To counter this inconsistency, I take notes myself. They aren’t detailed, just a couple of words to jog my memory when writing up feedback right afterward, so you have my full attention.

👩‍🏫 Having a recording is nifty in some cases. As someone on the loop for their roles, I use it to ask the hiring manager directly for feedback. In one case, it was “did I sound defensive?” after a candidate asked sharply how many years of experience I bring. In another, I wasn’t sure if we were talking past each other, so I asked for another opinion.

Where it’s truly helpful is looking at my interview trends more broadly. It’s probably even more insightful at finding trends among interviewers as a bigger group.

👩‍⚖️ Talk ratio is something I’ve been a bit obsessed with since an episode of Radiolab about the percentage of time each Supreme Court justice gets interrupted, broken down by gender (link). Unsurprisingly, the largest percentage of interruptions is men interrupting women. A data-packed op-ed piece in the Washington Post shows the same trend of “talking too much” being perceived very unequally along gendered lines (link). It’s one of those fascinating numbers that is much easier to find thanks to automated analysis.

Interviews where I left feeling “talked at” are backed up by the recording stats. Some hit 85% candidate talk time, and a 4-minute monologue isn’t a great conversation for either of us. Chances are, if we’re not having an active conversation, I’m not hearing things I need to know about how someone fits into this role.

The interviews I’ve been most positive about are about 2/3 candidate talking and 1/3 interviewer talking. It’s mostly me kicking things off with easy questions and providing some structure to the conversation - how are you, here’s how this round works, got a couple of tech questions for you, then about halfway through, flip the table to let you ask questions of me. When candidates don’t have questions for me or don’t leave me space to ask a clarifying question, it shows up very visibly thanks to the recording.
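I don’t know what the tool computes under the hood, but the stats it reports (talk ratio, longest monologue) are straightforward to derive from a diarized transcript. Here’s a minimal sketch, assuming made-up speaker-labeled segments with start/end times in seconds:

```python
from dataclasses import dataclass

@dataclass
class Segment:
    speaker: str  # e.g. "candidate" or "interviewer"
    start: float  # seconds
    end: float    # seconds

def talk_stats(segments):
    """Return per-speaker talk ratio and longest single turn (seconds)."""
    totals, longest = {}, {}
    for seg in segments:
        dur = seg.end - seg.start
        totals[seg.speaker] = totals.get(seg.speaker, 0.0) + dur
        longest[seg.speaker] = max(longest.get(seg.speaker, 0.0), dur)
    total = sum(totals.values()) or 1.0
    ratios = {spk: t / total for spk, t in totals.items()}
    return ratios, longest

# Toy example: the candidate dominates, including a 4-minute monologue
segments = [
    Segment("interviewer", 0, 60),
    Segment("candidate", 60, 300),
    Segment("interviewer", 300, 330),
    Segment("candidate", 330, 480),
]
ratios, longest = talk_stats(segments)  # candidate ≈ 81%, longest turn 240s
```

Even this naive version surfaces the two numbers that matter to me: who’s carrying the conversation, and whether anyone is monologuing.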

🏃‍♀️ How fast am I talking? How has that changed over time? Slowing down and being more deliberate with my words is something I’ve been working on, so it’s nice to see that reflected in the data.

(screenshot: talking-speed trend over time)
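Speaking pace is another stat that falls straight out of a timestamped transcript. A hypothetical words-per-minute calculation, assuming segments as (text, start_sec, end_sec) tuples:

```python
def words_per_minute(transcript_segments):
    """Average speaking pace from (text, start_sec, end_sec) tuples."""
    words = sum(len(text.split()) for text, _, _ in transcript_segments)
    minutes = sum(end - start for _, start, end in transcript_segments) / 60
    return words / minutes if minutes else 0.0

# Made-up snippet: 20 words spoken over 9 seconds
segments = [
    ("Thanks for joining today let me explain how this round works", 0, 5),
    ("We will start with a couple of technical questions", 5, 9),
]
pace = words_per_minute(segments)  # ≈ 133 wpm
```

Tracked per interview, that single number is enough to see whether the “slow down” habit is actually sticking.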

🔂 It’s really bad at telling how often I “follow the script”. I do have a set list of questions to work from, but it’s more about hitting the high notes than following a checklist. The tool can’t pick up questions asked out of order as the conversation flows, and it’s also bad at recognizing alternate phrasings of a question. As an example, “tell me about a time you had to handle a difficult customer or prospect” can be phrased a lot of different ways, and it tends to come up naturally in conversation a good percentage of the time. My notes always catch this, but the AI usually doesn’t unless I use some string of keywords.
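I can only guess at why rephrasings slip through, but naive keyword matching would explain it: a scripted question only “counts” if enough of its keywords appear in the transcript. A hypothetical sketch of that failure mode:

```python
# Hypothetical keyword matching: a scripted question is "covered" only if
# enough of its keywords literally appear in the transcript text.
QUESTION_KEYWORDS = {
    "difficult_customer": {"difficult", "customer", "handle"},
}

def question_covered(transcript, keywords, threshold=0.6):
    """True if at least `threshold` of the keywords show up verbatim."""
    words = set(transcript.lower().split())
    return len(keywords & words) / len(keywords) >= threshold

# A natural rephrasing shares none of the keywords, so it goes undetected
rephrased = "walk me through a tricky prospect situation you smoothed over"
covered = question_covered(rephrased, QUESTION_KEYWORDS["difficult_customer"])
# covered is False, even though the same question was clearly asked
```

If the real tool does anything like this, it would explain why only the exact string of keywords gets credit while a human note-taker catches the question every time.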

💖 Interviewing is emotionally intense, time-consuming work. It’s true when you’re the candidate and it’s true when you’re the interviewer. I hope this helps answer some questions about what that “bot” is and how it’s used.

This post is licensed under CC BY 4.0 by the author.