Platform · comparisons

The compliance record speaks for itself.

Legacy AI hiring vendors run on shared cloud infrastructure, train on pooled data, and score candidates on video cues. Nodes runs inside your VPC, trains on your own top-performer trace, and scores against production outcomes — not interview polish. Here's the side-by-side your procurement team is asking for.

  • Vendors compared: 3 · HireVue live
  • Attributes: 11 · filterable
  • Egress: your VPC · zero
  • Source: public record
Side-by-side · 11 attributes

The comparison procurement actually asks for.

Every row below comes from a published source or a verifiable buyer-side measurement. Green markers indicate a Nodes-held posture; amber markers flag a vendor-side regulatory or methodological exposure. Filter by theme to focus the review.

comparison matrix · nodes vs hirevue · 2026.q2 · 11 attributes · source: public record
attribute · nodes (customer VPC) · hirevue (vendor cloud)

01 · compliance · Compliance record (third-party audit posture & held certifications)
  • Nodes: Clean. No regulatory actions on record. SOC 2 Type I & II held. Per-vertical operating postures available under MNDA. [soc 2 · held]
  • HireVue: FedRAMP authorized, but ongoing regulatory exposure: NYC LL144 mandatory bias audits; public-record BIPA, ACLU, CVS, and FTC filings documented in industry press (2019–2025). [public-record filings]

02 · data · Where candidate data goes (data-flow path for PII, video, transcripts)
  • Nodes: Your VPC. Zero egress by construction. Video, transcripts, scores, and model weights remain inside the customer perimeter; audited against egress firewall logs. [zero egress]
  • HireVue: Vendor cloud. External processing. Video recordings and biometric-adjacent data are processed on HireVue-hosted infrastructure, subject to state biometric regulations. [biometric exposure]

03 · model · What the model learns from (training corpus · calibration source)
  • Nodes: Your CRM, HRIS, and ATS. Multi-system fusion from your top performers; per-customer fine-tune with zero cross-tenant contamination. [single-tenant calibration]
  • HireVue: Generic models on pooled external video. Trained on millions of external interviews across employers; calibration is shared, not customer-specific. [pooled training data]

04 · model · Prediction accuracy (published correlation to on-the-job outcome)
  • Nodes: 80% top-performer prediction. Controlled study: 14.0% → 27.7% hire-to-top-performer conversion across 6,053 agents; out-of-sample, quarterly refit. [on-job correlation · 0.618 AUC]
  • HireVue: Employability score from video analysis. No published correlation to on-the-job performance; facial analysis removed from the methodology in Jan 2020. [unpublished correlation]

05 · compliance · Bias and audit (what the customer can audit · where audits sit)
  • Nodes: Full audit trail in your VPC. Data sources, model weights, confidence intervals, and bias checks are all exposed to customer audit. You own the chain. [customer-owned audit]
  • HireVue: Third-party bias audits required. NYC Local Law 144 mandates external bias audits annually; methodology revisions followed public pressure (e.g. the 2020 facial-analysis removal). [third-party audit required]

06 · compliance · Legal approval (time through customer-side procurement review)
  • Nodes: 17 days at a Fortune 500 carrier. VPC deployment eliminates biometric-transmission concerns before they surface in review; no external data movement to negotiate. [procurement · fast path]
  • HireVue: Extended procurement review. Biometric collection, external-cloud processing, and ongoing litigation each extend the data-flow-diagram review cycle. [extended review]

07 · model · AI interview approach (signal source · what the model reads)
  • Nodes: Behavioral signal against production outcomes. Full-transcript analysis; no video analysis, no biometric inference. Scores map to written-premium, retention, and role-specific production traces. [transcript-only]
  • HireVue: Video analysis of verbal & visual cues. Historical methodology under regulatory challenge; facial analysis discontinued in 2020 following external complaints. [methodology revised 2020]

08 · business · Time to production (contract signed → first req live)
  • Nodes: 34 days, contract to live. 44+ HRIS and ATS integrations ship with the deployment; the first calibration pass runs on customer-side historical data in week two. [34-day deploy]
  • HireVue: Not publicly disclosed. Implementation timelines are not published in vendor materials and vary by assessment library and customer ATS. [timeline undisclosed]

09 · business · Pricing (public list · unit economics)
  • Nodes: From $10K/month. Credits, unlimited seats, published starter price. Usage is metered via candidate credits; no per-seat license math. [transparent pricing]
  • HireVue: Not publicly disclosed. Industry press reports package-based annual contracts; pricing is not published on the vendor site. [private quote]

10 · model · Post-hire intelligence (coverage beyond screening)
  • Nodes: 13 agents · full talent lifecycle. Sourcing, screening, calibration, onboarding, ramp, retention, succession; the same substrate, re-targeted per stage. [13-agent roster]
  • HireVue: Primarily screening & assessment. Pre-hire assessment is the published core surface area; post-hire signal is limited. [pre-hire only]

11 · business · Integrations (ATS / HRIS / CRM coverage)
  • Nodes: 44+ ATS and HRIS connectors, including Workday, Epic (for clinical retention signal), SuccessFactors, Greenhouse, iCIMS, Salesforce, and custom REST endpoints. [44+ connectors]
  • HireVue: Standard ATS catalog. Published integrations with mainstream ATS platforms; customer data is routed through the vendor cloud before assessment. [vendor-cloud routing]
Legend: nodes posture · verifiable on customer side; public-record flag · vendor-side exposure. Sources: vendor sites, industry press · 2026-04
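The row-04 conversion claim can be sanity-checked from the published figures alone. The rates and cohort size below come from the matrix above; the variable names are illustrative:

```python
# Sanity-check of the row-04 conversion claim using only the published figures.
# Rates and cohort size are from the matrix above; names are illustrative.

baseline_rate = 0.140   # hire-to-top-performer conversion before calibration
treated_rate = 0.277    # conversion reported in the controlled study
cohort = 6053           # agents in the study

lift = treated_rate / baseline_rate                               # relative improvement
extra_top_performers = round((treated_rate - baseline_rate) * cohort)

print(f"lift = {lift:.2f}x")                                      # 1.98x
print(f"additional top performers = {extra_top_performers}")      # 829
```

In other words, the study claims the calibrated score roughly doubles the hit rate, worth on the order of 829 additional top performers across that cohort.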
The buyer's question

Ask the question legal is already asking.

Procurement reviews for AI hiring don't stall on the model. They stall on the data-flow diagram. This is the question a Fortune 500 legal team drafted for their vendor re-review — paste it directly into your next RFP.

Show me the correlation between the score you produce and the on-the-job performance of the candidates we hired on it — matched on req, location, and tenure. And show me the data-flow diagram our legal team will sign. — AI vendor re-review · F500 procurement · 2026
On your next RFP, ask for all three:
  • Published on-job correlation. Not an employability score; a measured correlation to a production outcome, on your data.
  • A data-flow diagram. Every hop for PII, transcripts, video, weights, and logs — with a single line a lawyer can sign.
  • A replayable audit trail. Any decision in the system, replayable on the customer side, with sources and weights exposed.
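The "published on-job correlation" in the first bullet is typically reported as an AUC against a binary production outcome (top performer on the job vs. not). A minimal, dependency-free sketch of that check on your own hires might look like this; the function and toy data are illustrative, not any vendor's API:

```python
# Rank-based AUC: the probability that a randomly chosen top performer
# outscored a randomly chosen non-top performer at hire time.
# Ties between scores are ignored in this sketch.

def auc(scores, labels):
    """scores: model scores at hire time; labels: 1 = top performer on the job."""
    ranked = sorted(zip(scores, labels))                      # ascending by score
    pos_ranks = [i + 1 for i, (_, y) in enumerate(ranked) if y == 1]
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    return (sum(pos_ranks) - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Toy data: four hires, two of whom became top performers.
print(auc([0.1, 0.4, 0.35, 0.8], [0, 0, 1, 1]))  # 0.75
```

An AUC of 0.5 is chance; the matrix above cites 0.618 for the Nodes score, which is the kind of number this check would either confirm or refute on your matched cohort.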
What to take into the room

A clean record. A correlated score. A trail you can audit.

Two artifacts your compliance team can request before any demo — independent of the comparison above. Bring them to procurement; they do the work.

artifact · audit trail inside your vpc

Every decision, replayable on your side.

The trace object persists inside the customer VPC. A compliance officer can pull any decision and walk it end-to-end without a vendor call.

  • sources: ATS req · HRIS trace · CRM activity · transcript hash
  • weights: per-signal weight · calibration version · refit date
  • confidence: interval · out-of-sample holdout class
  • bias check: disparate-impact ratio · protected-class pass/fail
soc 2 type i + ii · see architecture →
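The "bias check" field in the trace refers to a disparate-impact ratio. Under the EEOC's four-fifths rule of thumb, a group whose selection rate falls below 80% of the highest group's rate flags potential adverse impact. A compliance officer replaying that check could do it in a few lines; the group labels and rates here are made up for illustration:

```python
# Four-fifths rule check on per-group selection rates.
# Group labels and rates below are illustrative, not real audit data.

def impact_ratios(selection_rates):
    """Ratio of each group's selection rate to the highest group's rate."""
    top = max(selection_rates.values())
    return {group: rate / top for group, rate in selection_rates.items()}

def passes_four_fifths(selection_rates, threshold=0.8):
    """True when every group clears the 80% threshold."""
    return all(r >= threshold for r in impact_ratios(selection_rates).values())

rates = {"group_a": 0.50, "group_b": 0.45, "group_c": 0.30}
print(impact_ratios(rates))        # group_c sits at 0.6 of the top rate
print(passes_four_fifths(rates))   # False
```

The point of the customer-side trace is that this computation runs on data that never left your VPC, so the audit needs no vendor call.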
artifact · 13-agent roster · full lifecycle

Post-hire, not just pre-hire.

The same calibrated substrate drives agents across the full talent lifecycle. Pre-hire screening is one of thirteen surfaces, not the whole product.

  • pre-hire: source · screen · calibrate · interview
  • hire: score · offer · negotiate
  • onboarding: ramp · cohort fit · mentorship match
  • retention: retention risk · succession · internal mobility
one substrate · 13 surfaces · see agent roster →

Concerned about your current vendor's legal exposure?

A 30-minute architecture walk with your compliance team. We bring the data-flow diagram, the audit-trail sample, and the on-job correlation numbers — printed, redlined against your requirements.

What your compliance team leaves with.

  • A data-flow diagram pre-aligned to your VPC
  • A sample audit trail for a scored candidate
  • The on-job correlation pack for your vertical
  • SOC 2 Type I + II report under MNDA