Three studies, one clear result: structured interviews predict new-hire performance twice as well as unstructured. Yet most HR teams still run unstructured. Here's why - and how to switch.

Structured interviews feel awkwardly stiff at first. Three roles later you'll wonder how you ever hired without them.
A structured interview has four properties: the same questions for every applicant to the same role, a pre-defined rating scale, scoring immediately after the conversation (not two days later), and at least two independent voices per application. Drop any one of the four and you slide back into unstructured-interview land - which research famously labels 'not much better than a coin flip'.
The research here is unusually clear. Schmidt + Hunter (1998), updated in 2016 by Sackett + Lievens, find a predictive validity of ~0.51 for structured vs ~0.20 for unstructured interviews on later job performance. A twofold difference in predictive validity is rare in social research.
So why do most teams still run unstructured? Three reasons, all understandable. First - it feels cold. 'We want to get to know the person' sounds warm; 'we'll go through a questionnaire' sounds bureaucratic. The truth: good structure leaves room for warmth; it only constrains the scoring. Second - it requires prep. Teams run structured for three roles, then drift back because prep costs time. Third - the tool makes it hard. If your ATS doesn't ship scorecards, structure dies on contact.
Template to steal
We've got a scorecard template with anchor levels for the four standard dimensions. One click to seed - then adapt to your requirements profile.
One - a prepared question set per role. Three main questions per scoring dimension (skills / communication / culture), plus one practical task. We recommend writing it three weeks before the role goes live and having every interviewer review it.
Two - a 1-5 scale per dimension with clearly worded anchor levels. '3 = solid, would function on the team, but not a standout.' Without anchors, interviewers drift by a full level depending on mood.
Three - scoring before discussion. Each interviewer records their own scorecard before the team talks. Otherwise the loudest voice dominates, which is almost never the most accurate one.
Four - a consensus note per application. Don't average the scores; write: 'We recommend hire because … ; note that …' A two-sentence consensus note beats an 87-point score in any later discussion.
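The four components can be sketched as a tiny data model. This is a hypothetical Python sketch for illustration only, not KI BMS's actual schema - all names (Scorecard, Application, DIMENSIONS) are invented:

```python
from dataclasses import dataclass, field

# Illustrative scoring dimensions and one anchored level, per the
# article's example ('3 = solid, would function on the team').
DIMENSIONS = ("skills", "communication", "culture")
ANCHORS = {3: "solid, would function on the team, but not a standout"}

@dataclass
class Scorecard:
    interviewer: str
    scores: dict   # dimension -> 1..5, recorded BEFORE any team discussion
    notes: str = ""

    def __post_init__(self):
        # Enforce the pre-defined scale: known dimensions, 1-5 range.
        for dim, score in self.scores.items():
            assert dim in DIMENSIONS, f"unknown dimension: {dim}"
            assert 1 <= score <= 5, f"score out of 1-5 range: {score}"

@dataclass
class Application:
    candidate: str
    scorecards: list = field(default_factory=list)
    consensus_note: str = ""   # two sentences after discussion; never an average

    def add_scorecard(self, card: Scorecard):
        # Every scorecard is appended and stays visible; nothing is overwritten.
        self.scorecards.append(card)

    def ready_for_discussion(self) -> bool:
        # At least two independent voices before the team talks.
        return len(self.scorecards) >= 2
```

The point of the sketch: independence and the anchored scale are properties of the data model, not of interviewer goodwill - a scorecard exists per interviewer, and the consensus note is a separate field rather than a computed average.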
Scorecards are first-class in KI BMS - every application has an Evaluations tab, each interviewer enters scores independently, and all scorecards stay visible (no one overwrites anyone). Recommendation fields from 'strong yes' to 'strong no' leave no meaningless midpoint. The consensus note goes in the timeline.
Written by
Co-Founder + CEO
Julia is one of the Co-Founders. She handles design, product direction, and most of the support replies that arrive in the morning.