Microsoft Copilot Vulnerability Exposed: "Reprompt" Attack Bypasses Safety Mechanisms
Researchers uncovered a method to exploit Microsoft Copilot’s URL parameter handling, enabling attackers to hijack user sessions and steal data without detection. Dubbed Reprompt, the attack leverages hidden malicious prompts embedded in seemingly legitimate Copilot links, which execute automatically upon loading requiring only a single click from the victim.
Copilot, Microsoft’s AI assistant integrated into Windows, Edge, and consumer apps, connects to personal accounts, making it a prime target for session-based attacks. The vulnerability stemmed from Copilot’s auto-execution of prompts via the q URL parameter, allowing attackers to inject instructions into an authenticated session. Unlike traditional prompt injection attacks, Reprompt required no user input, plugins, or connectors, evading both user awareness and client-side monitoring tools.
The attack’s simplicity lay in bypassing Copilot’s safeguards by instructing the AI to repeat actions twice. Once the initial prompt executed, attackers’ servers issued follow-up commands based on prior responses, creating an undetected chain of requests. This method obscured the attack’s true intent, making it difficult to trace.
Microsoft patched the flaw in its January Patch Tuesday update, with no evidence of in-the-wild exploitation. However, the incident highlights persistent risks in AI assistants that process untrusted inputs such as URL parameters or external content without robust separation or filtering. While Copilot Personal lacked enterprise-grade protections, Microsoft 365 Copilot offers additional safeguards, including Purview auditing and data loss prevention (DLP) policies to block sensitive data exposure.
The company is also testing policies allowing IT administrators to disable Copilot on managed devices, addressing concerns over unauthorized AI tool usage. The discovery underscores the ongoing challenges of securing AI-driven assistants against evolving exploitation techniques.
Microsoft Copilot cybersecurity rating report: https://www.rankiteo.com/company/microsoftcopilot
"id": "MIC1768486731",
"linkid": "microsoftcopilot",
"type": "Vulnerability",
"date": "1/2026",
"severity": "85",
"impact": "4",
"explanation": "Attack with significant impact with customers data leaks"
{'affected_entities': [{'customers_affected': 'Users of Microsoft Copilot '
'Personal',
'industry': 'Software, AI, Cloud Services',
'location': 'Global',
'name': 'Microsoft',
'size': 'Enterprise',
'type': 'Technology Company'}],
'attack_vector': 'Phishing Link (Malicious URL Parameters)',
'customer_advisories': 'Users should update systems, avoid unsolicited links, '
'and consider disabling Copilot if unnecessary.',
'data_breach': {'data_exfiltration': 'Possible via follow-up instructions in '
'the attack chain',
'personally_identifiable_information': 'Possible',
'sensitivity_of_data': 'High (if PII or financial data was '
'accessed)',
'type_of_data_compromised': 'Session data, potentially PII or '
'sensitive information'},
'date_resolved': '2026-01-14',
'description': 'Researchers found a method to steal data which bypasses '
'Microsoft Copilot’s built-in safety mechanisms. The attack '
'flow, called Reprompt, abuses how Microsoft Copilot handled '
'URL parameters in order to hijack a user’s existing Copilot '
'Personal session. The issue was fixed in Microsoft’s January '
'Patch Tuesday update, and there is no evidence of in-the-wild '
'exploitation so far.',
'impact': {'brand_reputation_impact': 'Moderate (demonstrates risks of AI '
'assistants)',
'data_compromised': 'User session data, potentially sensitive '
'information',
'identity_theft_risk': 'High (if PII was accessed)',
'operational_impact': 'Potential unauthorized actions in '
'authenticated sessions',
'payment_information_risk': 'High (if financial data was accessed)',
'systems_affected': 'Microsoft Copilot Personal, Windows, Edge '
'browser, consumer applications'},
'initial_access_broker': {'entry_point': 'Phishing link with malicious URL '
'parameters'},
'investigation_status': 'Resolved (Patch released)',
'lessons_learned': 'AI assistants with auto-execution of external inputs '
'(e.g., URL parameters) pose significant security risks. '
'Stronger separation and filtering of untrusted inputs are '
'necessary to prevent prompt injection attacks.',
'post_incident_analysis': {'corrective_actions': 'Fixed URL parameter '
'handling, improved input '
'validation, and separation '
'of untrusted inputs.',
'root_causes': 'Auto-execution of URL parameters '
'in Copilot Personal without '
'sufficient safeguards. Bypass of '
'initial request protections via '
'follow-up instructions.'},
'recommendations': ['Install the January 2026 Patch Tuesday updates to '
'mitigate the Reprompt attack.',
'Use Microsoft 365 Copilot for work data, as it includes '
'Purview auditing, tenant-level DLP, and admin '
'restrictions.',
'Avoid clicking on unsolicited links without '
'verification.',
'Disable Copilot on managed devices if not required (via '
'IT admin policies or tools like Malwarebytes).',
'Assume AI assistants driven via links or external '
'content may have similar vulnerabilities and exercise '
'caution.'],
'references': [{'source': 'Research Report (Unnamed)'}],
'response': {'communication_strategy': 'Public disclosure via research '
'reports and advisories',
'containment_measures': 'Patch released in January 2026 Patch '
'Tuesday updates',
'remediation_measures': 'Fixed URL parameter handling in Copilot '
'Personal'},
'stakeholder_advisories': 'IT administrators can now uninstall Copilot on '
'managed devices via new policies.',
'title': 'Reprompt: Microsoft Copilot Data Theft Bypass',
'type': 'Prompt Injection Attack',
'vulnerability_exploited': 'Auto-execution of URL parameters in Microsoft '
'Copilot Personal sessions'}