Idea: a browser extension that automatically scrapes and archives your data as you browse a site. Power-users can submit scrapers for new/modified sites.
Normally scrapers can be detected and blocked, but a scraper running passively in the background on a user's machine, opportunistically grabbing data the user is already viewing, should be nearly impossible to detect: from the site's perspective it is indistinguishable from ordinary browsing.
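A minimal sketch of what the content-script half of such an extension could look like, assuming a Chrome-style extension API. All names here are hypothetical, including the `shouldArchive` helper and the example Craigslist URL pattern standing in for a user-submitted scraper rule:

```javascript
// Sketch of a passive-archiving content script (all names hypothetical).
// The extension never issues its own requests: it only captures pages the
// user has already loaded, so site-side bot detection sees ordinary traffic.

// Pure helper: decide whether a page matches a user-submitted scraper rule.
function shouldArchive(url, patterns) {
  return patterns.some((p) => new RegExp(p).test(url));
}

// Hypothetical rule set a power-user might submit for one site.
const craigslistRules = ["^https://[a-z]+\\.craigslist\\.org/.*\\.html$"];

// In a real extension this runs as a content script; the chrome.* APIs
// only exist in that context, so we guard before touching them.
function captureCurrentPage() {
  if (typeof document === "undefined" || typeof chrome === "undefined") return;
  if (!shouldArchive(location.href, craigslistRules)) return;
  chrome.runtime.sendMessage({
    type: "archive",
    url: location.href,
    capturedAt: Date.now(),
    html: document.documentElement.outerHTML, // the page as the user saw it
  });
}
```

A background script would then receive the `archive` messages and write them to local storage or a personal archive server; the key design choice is that capture is triggered only by pages the user already opened, never by extension-initiated fetches.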
This is a great idea. I've actually wanted to do this before. Browser plugins are one fundamental tool in the war against web tyranny. For instance, crawling Craigslist could be accomplished this way in a form that would be effectively unblockable.
Related: Consider http://Pinboard.in, whose yearly plan offers Personal Archiving. Anything pinned (bookmarked) gets archived, and you can view it from your personal archive.