The New Definition of Visibility & the Evolving Role of IOCs: Detection Engineering Through a UFO Lens with David Burkett

February 3, 2026


Detection engineering has the same problem as UFO sightings: sometimes we think we’re seeing something, but we’re not sure what.

In this UFO-themed special, Alex Hurtado and David Burkett break down the new definition of visibility, the evolving role of IOCs, and the rise of EDR evasion exploiting blind spots in our tools, data, and assumptions. 🛸

Shownote references:

  • https://www.liesabove.com/
  • https://www.magonia.io/
    • Signal Detection Theory: https://www.magonia.io/blog/vintage-detection-radar-research-cyber-threats/
    • The Evolving Role of IOCs: https://www.magonia.io/blog/maximizing-the-value-of-threat-indicators-and-reimagining-their-role-in-modern-detection/
    • The New Definition of Visibility: https://www.magonia.io/blog/what-is-cybersecurity-visibility/
    • Decoding Fuzzy Hashes: https://www.magonia.io/blog/what-is-cybersecurity-visibility/
Alex Hurtado
Host
David Burkett
Cloud Security Researcher and Cogswell Award Winner at Corelight


(00:01–03:30) — Opening: The Unknown, Detection, and Why This Episode Exists

Key concepts covered

  • Detection engineering as “finding truth in uncertainty”
  • Why UFO detection/radar thinking maps to SOC detection work
  • David’s background + why this crossover isn’t just for vibes


Notable soundbites

  • (00:12–00:33) Alex: “...Back to hunting the strange, the stealthy, and the signals that may very well be hiding in plain sight. On this episode we are diving into the interesting parallels detection engineering has to UFO detection.”

(03:30–10:30) — Segment: Signal Detection Theory (Radar Math → SOC Reality)

Key concepts covered

  • Signal Detection Theory origins (radar operators distinguishing false vs true alarms)
  • SOC analysts = modern radar operators (alerts as “pings”)
  • Sensitivity vs specificity parallels to detection tuning
  • Gaussian noise and the idea of baseline “static”

Notable soundbites

  • (03:45–04:40) David: “...signal detection theory… how radar operators would distinguish false alarms… versus true alarms.”
  • (04:14–04:55) David: “...a security analyst working in a SOC is kind of the same as someone working as an air traffic controller…”
  • (06:23–06:55) David: “Your SIEM is going to be like your radar array. The alerts are going to be the pings…”
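The sensitivity idea behind the radar-operator research can be sketched in a few lines. This is an illustrative calculation, not something from the episode: d′ (d-prime), the classic Signal Detection Theory sensitivity index, compares the z-scores of the hit rate and the false-alarm rate, and the rates below are hypothetical.

```python
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Signal Detection Theory sensitivity index:
    d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical detections: both catch 90% of true attacks,
# but one fires on 5% of benign events and the other on 40%.
tuned = d_prime(0.90, 0.05)
noisy = d_prime(0.90, 0.40)
```

A higher d′ means the detection separates signal from the baseline “static” more cleanly; moving the alert threshold only trades hit rate against false-alarm rate, while d′ captures the underlying separability that threshold-shifting alone cannot improve.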

(10:30–13:20) — Segment: Likelihood Ratios + Turning “Gut Feel” Into Something Defensible

Key concepts covered

  • Likelihood ratio as a way to combine context → probability
  • Using multiple context features (role, behavior, environment) to estimate true vs false positives
  • Using mature research from other domains to level up SOC decisions

Notable soundbites

  • (07:31–08:25) David: “...introduces a concept called the likelihood ratio… determining how likely something is to be a false positive…”
  • (08:00–08:20) David: “...adding various different pieces of context until you get that overall likelihood ratio…”
  • (09:00–09:30) David: “...lessons learned… why they changed… could be applicable to some of our detection use cases today.”
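The likelihood-ratio stacking David describes can be sketched as prior odds multiplied by one ratio per context feature. Everything numeric here is invented for illustration (the prior, the per-feature ratios, and the feature names), and the independence assumption between features is a simplification.

```python
def posterior_probability(prior: float, likelihood_ratios: list[float]) -> float:
    """Combine context features (treated as independent) via likelihood ratios:
    posterior odds = prior odds * product of per-feature LRs."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical: base rate of true positives for this alert type is 1%.
# Context features: service account spawning an interactive shell (LR=8),
# first-seen binary (LR=5), off-hours activity (LR=2).
p = posterior_probability(0.01, [8, 5, 2])  # roughly 0.45
```

Each added piece of context nudges the odds, which is exactly the “adding various different pieces of context until you get that overall likelihood ratio” framing: a 1% base rate becomes a coin-flip-level suspicion once three weak signals line up.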

(13:20–17:10) — Segment: AI/Agent Behavior vs Human Behavior (Future Detection Angle)

Key concepts covered

  • Emerging challenge: distinguishing human-driven vs agentic/AI-driven behavior
  • “UBA → ABA”: from User Behavior Analytics to AI-Agent Behavior Analytics, i.e., what “normal” looks like when agents act for humans
  • Using bot-detection-style approaches (timing, consistency, persistence) + likelihood ratios

Notable soundbites

  • (09:34–10:30) Alex: “...human entities versus… AI entities… are we monitoring what is OK behavior… versus… malicious?”
  • (11:53–12:55) David: “...you could do a lot of the same stuff that you would use to detect bots… time analysis… how fast commands are executed…”
  • (13:05–13:15) David: “...come up with a percentage or likelihood… bot versus a human.”
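The bot-detection-style timing analysis mentioned above can be sketched as a regularity score over inter-command gaps. The timestamps below are invented, and the 0.05 threshold is an arbitrary illustration, not a tested cutoff.

```python
from statistics import mean, stdev

def timing_regularity(timestamps: list[float]) -> float:
    """Coefficient of variation of the gaps between consecutive commands.
    Near 0 = metronomic cadence (bot-like); humans are far more variable."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return stdev(gaps) / mean(gaps)

bot   = [0.0, 1.0, 2.0, 3.01, 4.0, 5.0]    # near-constant one-second cadence
human = [0.0, 2.1, 9.8, 11.0, 30.5, 31.2]  # bursty, irregular
```

A score like this would be one feature among several (persistence, command ordering, speed of execution) feeding the same likelihood-ratio machinery to land on “a percentage or likelihood of bot versus human.”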

(13:20–32:00) — Segment: The Evolving Role of IOCs + Bottom of the Pyramid of Pain

Key concepts covered

  • The pendulum swing: over-correcting into “TTPs only” and “ghosting” IOCs
  • IOCs are brittle, but still valuable (especially reputationally and defensively)
  • IOCs evolving from “trigger” → “context”
  • Practical fear: being breached by “known bad” that was public forever

Notable soundbites

  • (13:49–14:20) Alex: “...we’ve almost kind of swung the pendulum… IOCs… becoming just a piece of the context…”
  • (14:30–15:15) David: “...catch-22 with IOCs… most brittle… weakest detectors… shelf life…”
  • (15:55–16:25) David: “No one wants to be that company that was breached by an IOC posted on Twitter…”
  • (16:50–17:10) David: “...more contextual versus… binary.”

(17:10–25:45) — Segment: Fuzzy Hashing (TLSH) + Similarity as Context

Key concepts covered

  • Why traditional hashes don’t repeat meaningfully across environments
  • TLSH (Trend Micro Locality Sensitive Hash) basics + “distance” scoring
  • Why similarity is useful: minor changes break normal hashes but not fuzzy similarity
  • Why this is compute-heavy (distance comparisons vs list lookup)
  • Using fuzzy hashing to discover unknown/earlier samples and improve investigations

Notable soundbites

  • (17:17–17:50) David: “...I added a fuzzy hash layer above the hashes…”
  • (18:22–19:10) David: “...rather than ‘is this hash in a list’… you’re saying ‘is it similar to a known malicious hash?’”
  • (20:00–20:35) David: “...compare… find a distance of how similar they are…”
  • (22:00–22:40) David: “...VirusTotal… click the TLSH… shows similar binaries…”
  • (22:40–23:10) David: “...I was able to find three earlier samples… six months earlier…”
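The shift from “is this hash in a list” to “is it similar to a known malicious hash” can be sketched as a distance score. Real TLSH compares fixed-size digests (which is what makes it practical at scale); as a stdlib stand-in, this sketch scores raw bytes with difflib, illustrating the scoring model but not TLSH’s actual algorithm or performance. All sample bytes are fabricated.

```python
import difflib

def similarity_distance(bytes_a: bytes, bytes_b: bytes) -> float:
    """TLSH-style distance: 0.0 = identical, approaching 1.0 = unrelated.
    autojunk=False keeps repeated bytes from being discarded as noise."""
    matcher = difflib.SequenceMatcher(None, bytes_a, bytes_b, autojunk=False)
    return 1.0 - matcher.ratio()

# Fabricated samples: a minor edit (new C2 string) breaks an exact hash
# match entirely, but barely moves a similarity distance.
original  = b"MZ\x90\x00" + b"payload " * 64 + b"c2.example.test"
variant   = b"MZ\x90\x00" + b"payload " * 64 + b"c2.other.example"
unrelated = bytes(range(256)) * 2
```

This is also why the approach is compute-heavy: an exact hash is one set-membership lookup, while a distance score means comparing the new sample against every known digest.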

(25:45–32:10) — Segment: IOC Behaviors (Domains Have “Behavior,” Too)

Key concepts covered

  • IOCs can carry behavioral traits (domain age, TLDs, registrar patterns, infra traits)
  • Pre-attack detection: spotting domains similar to yours before they are weaponized
  • Takedown opportunities during “weaponization phase” (kill chain)
  • Indicators become “enrichment features” that strengthen confidence scoring

Notable soundbites

  • (27:46–28:20) David: “...IOCs themselves can actually have behaviors…”
  • (29:00–29:40) David: “...detect that before it’s ever even used… stopping them at weaponization…”
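The “detect it before it’s ever even used” idea can be sketched as a similarity score between newly observed domain registrations and the domains you protect. The domain names and the 0.7 threshold below are hypothetical; a production version would also weigh the behavioral traits listed above (domain age, TLD, registrar patterns).

```python
import difflib

def lookalike_score(candidate: str, protected: str) -> float:
    """0..1 string similarity between a newly seen domain and a protected one."""
    return difflib.SequenceMatcher(None, candidate, protected).ratio()

PROTECTED = "magonia.io"  # hypothetical brand domain to protect
newly_registered = ["rnagonia.io", "magonia-login.io", "weather.example"]

# Flag registrations similar enough to warrant a takedown request
# during the weaponization phase, before any phishing campaign launches.
suspects = [d for d in newly_registered if lookalike_score(d, PROTECTED) > 0.7]
```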

(34:30–45:30) — Segment: Defining Visibility for Real (Telemetry vs Monitoring vs Observability)

Key concepts covered

  • Visibility means different things depending on role (IR vs Detection vs Leadership)
  • Critical line: visibility isn’t “data exists,” it’s “you can act/detect”
  • The model: telemetry (data) + monitoring (knowns/detections) + observability (unknowns/hunting)
  • Why EDR telemetry projects can mislead if “implemented” ≠ “detectable”

Notable soundbites

  • (36:34–37:05) David: “...if I can’t write detection logic on that event… we don’t have visibility.”
  • (39:55–40:45) David: “Monitoring… Observability… Telemetry…”
  • (40:10–41:05) David: “Visibility… is a combination of all three…”
  • (41:00–41:25) David: “...you can only have full visibility when you have the ability to do all three.”

(46:38–56:40) — Segment: EDR Evasion + What to Do When You Can’t Trust Sensors

Key concepts covered

  • EDR isn’t 100% coverage (agent drift, gaps, unsupported devices)
  • Adversaries exploit places you can’t run EDR (edge devices, appliances)
  • Network telemetry as compensating control (“multi-sensor correlation”)
  • Practical threat hunting: baseline management ports, detect deviations, abnormal outbound behavior
  • Even encrypted traffic yields signal: you can detect anomalies without payload inspection

Notable soundbites

  • (49:12–49:25) Alex: “The network doesn’t lie.”
  • (50:00–50:35) David: “...over-reliance on EDR… network wasn’t being monitored…”
  • (51:05–52:10) David: “...baseline… management SSH… outbound from the firewall… suspicious…”
  • (55:45–56:35) David: “...I don’t need to know what’s in the payload… if it’s communicating on a port it never has before…”
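The baseline-then-deviation hunt described above can be sketched with a per-host port baseline built from network flow records. The flow records and host names below are hypothetical; the point is that no payload inspection is needed, which is why this works on encrypted traffic and on devices that can’t run EDR.

```python
from collections import defaultdict

def build_baseline(flows):
    """Learn which destination ports each source host normally talks on."""
    baseline = defaultdict(set)
    for src, dst_port in flows:
        baseline[src].add(dst_port)
    return baseline

def deviations(baseline, new_flows):
    """Flag flows on ports a host has never used before."""
    return [(src, p) for src, p in new_flows if p not in baseline.get(src, set())]

# Hypothetical history: the firewall normally speaks HTTPS and NTP outbound.
history = [("fw01", 443), ("fw01", 123), ("fw01", 443)]
today   = [("fw01", 443), ("fw01", 22)]  # outbound SSH from the firewall
alerts  = deviations(build_baseline(history), today)
```

A real deployment would baseline over a longer window and score deviations rather than alert outright, but the shape is the same: “if it’s communicating on a port it never has before,” that’s the signal.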

(56:40–01:00:40) — Segment: The EDR Debate (Are EDRs Better or Worse?)

Key concepts covered

  • David: EDRs getting strong → attackers move to unmonitored places
  • Alex: EDR alerts can be noisy/untrusted; some enterprises rely on raw telemetry + own detections
  • Visibility again: “top EDR” might have telemetry but limited detection authoring
  • Design philosophy differences (vendor-as-expert vs build-your-own detection maturity)

Notable soundbites

  • (57:33–58:15) Alex: “...I think the EDRs are getting worse… enterprises… can’t trust their EDR alerts…”
  • (58:35–59:20) David: “...CrowdStrike… you can’t write detections on most of that telemetry…”
  • (01:00:10–01:00:40) David: “...EDRs have gotten so good that adversaries… move off the endpoint…”

(01:00:57–01:07:45) — Segment: The Book + Evidence, Sourcing, and Why This Isn’t Just “Aliens”

Key concepts covered

  • Book intent: explain complexity beyond “aliens or not”
  • Emphasis: implications regardless of origin
  • Heavy sourcing: QR codes + citations to primary evidence
  • “multiple sensors corroboration” as a theme that ties back to detection engineering

Notable soundbites

  • (01:01:35–01:02:20) David: “...read this because it’s way more complicated… than just aliens.”
  • (01:02:10–01:02:45) David: “...implications, regardless of origin…”
  • (01:02:45–01:03:05) David: “...QR code on basically every page…”
  • (01:02:55–01:03:10) David: “...over 800 different citations…”
  • (01:07:15–01:07:40) David: “...corroborated by multiple differing sensors…”

(01:09:59–01:11:05) — Close: Curiosity as the Detection Engine

Key concepts covered

  • The “unknown” as a motivator, not a fear trigger
  • Detection engineering framed as exploration, resilience, iterative learning

Notable soundbites

  • (01:09:59–01:10:40) Alex: “...detection is about curiosity and this calling to explore the unknown.”