PARASITE: Conditional System Prompt Poisoning to Hijack LLMs
arXiv:2505.16888v4 Announce Type: replace-cross
Abstract: Large Language Models (LLMs) are increasingly deployed via third-party system prompts downloaded from public marketplaces. We identify a critical supply-chain vulnerability: conditional system …