A way of explaining why security's baseline is so low in places, and why it's so hard to raise
An imperfect analogy for AI that might help someone
Insane Video Deepfakes, Devin Gets Slack Access, New Fabric Patterns, AI Application Interfaces, Let Grow, and more…
What happens to user-facing businesses when most interaction happens through AI?
Google AI Espionage, My macOS UI, Cloudflare AI Firewall, Midnight Blizzard, and more…
AI is already becoming just as uninteresting as books, and that's a problem
A lot of stress about AI is caused by framing it incorrectly
Fabric Threat Models, An AI Worm, GitHub Auto-blocks, Long Covid IQ, and more…
AI enables creators and punishes workers, so it's time to start making things
Reddit selling user data for AI training, Avast caught selling data, CrowdSec Report Analysis, and more…
We're seeing reality through drastically different lenses, and living in different worlds because of it
We've added a new pattern called `analyze_threat_report` that extracts the juicy bits out of cybersecurity threat reports