How to protect my site from HTTrack web copy using robots.txt

Does anyone know how to protect a site from the HTTrack web copier using robots.txt, or is there a better way?

The Awesome Link

Another Awesome Link


After a quick Google search, I believe there is really no way to fully protect yourself from site copying.

uziiuzair said:

After a quick Google search, I believe there is really no way to fully protect yourself from site copying.

HTTrack can easily ignore robots.txt rules.
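For reference, the robots.txt rule that asks HTTrack to stay away looks like this. It only stops a polite, default-configured run: HTTrack's default user-agent string contains "HTTrack", but the user can change that string or disable robots.txt handling entirely.

```
# Ask crawlers identifying as HTTrack not to mirror anything.
# Only honored if the user leaves HTTrack's robots.txt
# handling enabled and keeps the default user agent.
User-agent: HTTrack
Disallow: /
```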

You can’t.
I have always found a way to download a site.

I'm not sure if this is possible, but I was thinking of using JavaScript to append or prepend content if the domain is different from my domain.

Meaning, in regard to templates: a JavaScript snippet that hides the website content on load and instead displays a "this page is stolen" message (or something similar) in case the website is loaded on domainB.com instead of my domainA.com. This would put most people off using a ripped website. Someone knowledgeable in JavaScript could remove the code and load the website fine, but it would definitely deter most people, even if they ripped the entire site.

Again, I'm not sure if this is possible with JavaScript, but maybe another author can shed some light.
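The idea above can be sketched roughly like this. It is only a deterrent, not protection: `example.com` is a placeholder for the real domain, and anyone who rips the site can simply delete the script before re-hosting it.

```javascript
// Hypothetical anti-rip check: replace the page content unless it
// is served from the expected domain. "example.com" stands in for
// the site's real domain. This deters casual re-hosting only; a
// knowledgeable user can remove the script.
function isAuthorizedHost(hostname) {
  return hostname === "example.com" || hostname.endsWith(".example.com");
}

// Only runs in a browser; a ripped copy loaded from domainB.com
// would fail the check and show the notice instead of the site.
if (typeof document !== "undefined" &&
    !isAuthorizedHost(window.location.hostname)) {
  document.documentElement.innerHTML =
    "<body><h1>This site is an unauthorized copy.</h1></body>";
}
```

Inlining (rather than loading it from a separate .js file) and minifying the snippet makes it slightly harder to strip out, but the fundamental limitation stands: anything delivered to the browser can be edited.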