logoalt Hacker News

binsquaretoday at 6:11 AM0 repliesview on HN

I think there's different conversations happening and I don't think we're having the same conversation.

This is the claim by the article: "Vulnerabilities in the GitHub Copilot CLI expose users to the risk of arbitrary shell command execution via indirect prompt injection without any user approval"

But this is not true, the author gave explicit permission on copilot startup to trust and execute code in the folder.

Here's the exact starting screen on copilot:

│ Confirm folder trust │ │ │ │ ╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮ │ │ │ /Users/me/Documents │ │ │ ╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ │ │ │ │ Copilot may read files in this folder. Reading untrusted files may lead Copilot to behave in unexpected ways. With your permission, Copilot may execute │ │ code or bash commands in this folder. Executing untrusted code is unsafe. │ │ │ │ Do you trust the files in this folder? │ │ │ │ 1. Yes │ │ 2. Yes, and remember this folder for future sessions │ │ 3. No (Esc) │

And `The injection is stored in a README file from the cloned repository, which is an untrusted codebase.`