Nice exploit of LLM confabulation: ask an LLM for coding advice, get a recommendation for a nonexistent package, then register that package name yourself and exploit other coders who try to follow the LLM’s terrible advice
(tags: ai malware coding llms chatgpt hallucination confabulation fail infosec security exploits)
Kottke’s 2023 Father’s Day Gift Guide
There are actually some fantastic ideas in here!
(tags: gifts ideas fathers-day presents stuff)