tech

January 15, 2026

A single click mounted a covert, multistage attack against Copilot

Exploit exfiltrating data from chat histories worked even after users closed chat windows.

A single click mounted a covert, multistage attack against Copilot

TL;DR

  • A vulnerability in Microsoft's Copilot Personal AI assistant allowed hackers to steal sensitive user data.
  • The exploit, discovered by Varonis researchers and named Reprompt, required only a single click on a legitimate URL.
  • The attack bypassed enterprise endpoint security and continued even after the user closed the Copilot chat.
  • The root cause was the inability to distinguish user prompts from data within untrusted requests, leading to indirect prompt injections.
  • Microsoft has patched the vulnerability, and the exploit only affected Copilot Personal, not Microsoft 365 Copilot.