Per Matt Cutts, new changes to Google's algorithm are coming soon. One of the features is aimed at combating duplicate content by devaluing sites that steal content and have no original content of their own, also known as scraper sites. As webmasters we'd say this is long overdue, but how will it really work? In the past, every time they changed the algo to improve it, the original sites got nailed and the scraper sites moved up.

One theory is age of content: if done right, along the lines of how the Wayback Machine works, it could work. The big fear is that they will base it on age of domain, and we all know anyone can go out and buy an expired domain and put content on it. I guess the next couple of weeks will be very interesting.
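The content-age theory could be sketched as a toy ranking rule: whichever site was seen publishing a given piece of content first is presumed to be the original, and the later copies are the scrapers. Everything below is made up for illustration (the URLs, dates, and the `presumed_original` helper are hypothetical, not anything Google has confirmed):

```python
from datetime import date

def presumed_original(first_seen):
    """Given {url: date the content was first crawled/archived there},
    return the URL with the earliest first-seen date, i.e. the
    presumed original publisher of that content."""
    return min(first_seen, key=first_seen.get)

# Hypothetical first-crawl dates for the same article on three sites
sightings = {
    "original-blog.example/post": date(2010, 3, 14),
    "scraper-one.example/copy": date(2010, 9, 2),
    "scraper-two.example/copy": date(2011, 1, 20),
}

print(presumed_original(sightings))  # original-blog.example/post
```

The appeal of this rule is exactly why the Wayback Machine comparison fits: it keys off when the content first appeared, not when the domain was registered, so buying an aged expired domain wouldn't help a scraper.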