One of the comments on the “How To Fix The Internet” article that’s circulating raises the notion of a central organization (like the BBB) maintaining a database of site information that would presumably be dished out to users when they request a page for the first time.
Well, how about a standalone security tool to do this?
This tool would have a couple parts:
- A list of attributes with associated risks (think Bayesian adjustment of a risk level), e.g.:
  - Hosted in Russia / China
  - IP address for a URL
  - Immediately redirects you to somewhere else
  - Cert doesn’t match DNS
  - Webserver outdated (versions vulnerable to known exploits)
  - Recent bad activity for this site

  **Assigns a score to the site**

- A client for the user’s system (to pull updates) and a plugin for their browser
  - Pulls down updates, checks visited sites vs. the information
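The attribute-based scoring above could be sketched roughly like this. Note the assumptions: the attribute names and weights are invented for illustration, and since the thresholds later in the post treat lower scores as riskier (1–4 means don’t load the page at all), this sketch subtracts penalties from a safe baseline of 10 rather than adding up a risk total:

```python
# Hypothetical scoring sketch: each risky attribute subtracts from a safe
# baseline of 10. Attribute names and weights are invented for illustration.
RISK_WEIGHTS = {
    "hosted_in_high_risk_country": 2.0,
    "ip_address_as_url": 1.5,
    "immediate_redirect": 1.0,
    "cert_dns_mismatch": 2.5,
    "outdated_webserver": 1.5,
    "recent_bad_activity": 3.0,
}

def risk_score(observed, baseline=10.0):
    """Return a 0-10 score for a site given its observed risky attributes."""
    penalty = sum(RISK_WEIGHTS.get(attr, 0.0) for attr in observed)
    return max(baseline - penalty, 0.0)

print(risk_score(["ip_address_as_url", "cert_dns_mismatch"]))  # 6.0
```

A real version would weight attributes from observed compromise data rather than gut feel, which is where the Bayesian adjustment comes in.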
—
So here’s the idea — we create this system and serve the constantly updated data out to whoever wants it. They could pull it down weekly/daily/hourly or whatever (depending on how often there are updates).
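Client-side, the weekly/daily/hourly pull could be as simple as a staleness check against a chosen interval. This is a hypothetical sketch; the interval constant and function name are made up:

```python
import time

# Hypothetical client-side check: re-fetch the risk feed only when the local
# copy is older than the user's chosen update interval (weekly/daily/hourly).
UPDATE_INTERVAL = 60 * 60  # hourly, in seconds

def feed_is_stale(last_fetched, now=None, interval=UPDATE_INTERVAL):
    """Return True if the cached feed is due for a refresh."""
    now = time.time() if now is None else now
    return (now - last_fetched) >= interval
```

The server would just publish the data; how aggressively each client polls depends on how fresh they need the scores to be.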
From there, when the user visits a website, the plugin checks the current “risk level” of the site against their current security settings. So if the score is between 6 and 7, don’t run ActiveX on the site; if it’s between 1 and 4, don’t even load the page, etc.
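The score-to-settings mapping might look like the sketch below. The 6–7 and 1–4 bands come from the post; the post doesn’t say what happens between 4 and 6, so treating that band as “ActiveX off” too is my assumption, as are the action names:

```python
def policy_for(score):
    """Map a site's score to a local security action.

    Thresholds follow the post (lower scores are riskier); treating the
    unspecified 4-6 band as "no_activex" is an assumption.
    """
    if score <= 4:
        return "block"        # don't even load the page
    if score <= 7:
        return "no_activex"   # load it, but disable active content
    return "allow"
```

For example, `policy_for(3)` blocks the page outright, while `policy_for(6.5)` loads it with ActiveX disabled. Each user could shift these thresholds to match their own paranoia level.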
The key here is customizing the local security settings according to a semi-dynamic repository of information hosted by security enthusiasts. We don’t need the BBB for this.
This is very early thinking, and it could seem utterly lame to me in 10 minutes, but it sounds good in my head right now.
Thoughts?