Hello,
I seem to have the most impossible questions lately :) Sorry about that... :)
But I happen to own a domain called
TheBigArchive.com
and would like to offer a service to the public on it, but I am not sure my plan is legal.
The problem:
A lot of science is published on the net these days, and science works by citing other, previous publications. But the average lifespan of a web page is only about 100 days; after that, the page is often deleted or moved, which leaves a lot of broken links.
My solution:
So my plan is to allow anybody to suggest a site for archiving;
my server would then crawl the site and duplicate it on our server.
Finer details: the owner of the cached site would have control over the content, and could possibly even delete the cached copy. I would also keep several versions of the site, so that updates can be submitted too.
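Just to make the idea concrete, here is a minimal sketch of the archiving step, assuming a plain filesystem store (the function names, the `archive` directory, and the timestamped-file layout are all my own illustration; a real service would also have to fetch images, CSS, and scripts):

```python
import hashlib
import os
import time
import urllib.request

ARCHIVE_ROOT = "archive"  # hypothetical storage directory

def store_snapshot(url: str, body: bytes, ts: int) -> str:
    """Store one timestamped snapshot of a page; returns the snapshot path."""
    # One directory per URL (hashed), one file per version (timestamped),
    # so several versions of the same page can coexist.
    key = hashlib.sha256(url.encode("utf-8")).hexdigest()[:16]
    dirpath = os.path.join(ARCHIVE_ROOT, key)
    os.makedirs(dirpath, exist_ok=True)
    path = os.path.join(dirpath, f"{ts}.html")
    with open(path, "wb") as f:
        f.write(body)
    return path

def archive_page(url: str) -> str:
    """Fetch a page that a visitor submitted and store a new version of it."""
    with urllib.request.urlopen(url) as resp:
        body = resp.read()
    return store_snapshot(url, body, int(time.time()))
```

Each submission just adds another timestamped file, which is what makes the "several versions" part cheap.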
I am not quite sure yet how my site would make money. Possibly a small fee ($1) per submitted site... (not from the owner of the site, but from the person who submitted it for caching; after all, I am giving him a permanent link he can rely on, which is a service).
My question:
Would this business idea be legal? Would I violate copyright by storing these sites on my servers?
Please work on the assumption that I am not just trying to grab the original site's traffic: whenever possible I would of course redirect directly to the site owner's current page. Only if it has been taken down or moved would I serve up my cached pages.
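That "original first, cache only as fallback" rule could look something like this (a sketch under my own assumptions; a real service would also want to treat redirects and 404s more carefully):

```python
import urllib.request
import urllib.error

def resolve(url: str, cached_path: str) -> str:
    """Return the live URL if the page is still up, else our cached copy."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=5) as resp:
            if resp.status == 200:
                # Page is still up: send the visitor to the owner's site,
                # so we never siphon off the original's traffic.
                return url
    except (urllib.error.URLError, OSError):
        pass
    # Taken down, moved, or unreachable: fall back to the archive.
    return cached_path
```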
Other caching providers:
a) Browsers cache sites all the time.
b) Yahoo serves up cached sites on request.
c) There are sites out there that provide a history of the internet.
So is site-caching in this way legal, or does it violate the copyright of the original site-owners?
Thanks a lot for your comments!
Kind regards
CoolDot