Publishers Of Certain Belgian Newspapers Continue Effort To Not Be Found Online
from the why-not-stop-publishing-online dept
Here’s an idea: why don’t the various French- and German-language Belgian newspapers stop publishing their newspapers online? After their ridiculous win against Google for sending traffic their way without first paying them, the group of newspaper publishers is now going after Yahoo for the same thing. There really is an easy solution. Yahoo, Google and everyone else should simply refrain from linking to these newspapers. If they really want to be left alone, to lose all that traffic and to lose all that relevance, that’s their own decision. In the meantime, how long will it be before someone else comes along and figures this is a cash cow and starts suing? At this point, perhaps everyone should just sue Google, Yahoo, Microsoft, Ask and others for daring to link to them.
Comments on “Publishers Of Certain Belgian Newspapers Continue Effort To Not Be Found Online”
yeah
If they aren’t in the indexes, they might as well not exist.
Have the French- and German-language Belgian newspapers also sued libraries for making copies of their newspapers available? How about cafes that have newspapers lying around? Or how about those workplaces that buy one subscription for an office with 20 or so workers? When will companies learn to embrace, use and exploit technology to their benefit and not their detriment?
Re: Re:
Allowing Google to steal your content works to the company’s detriment and to Google’s benefit.
Not the opposite, as you might think.
A lot of people are getting tired of those little Google ads everywhere, or of search results showing eBay at the top… the PageRank algorithm, yeah, sure, but with some “adjustments” of course 😉
Re: Re: Re:
You’ve apparently never heard of robots.txt. The only people who are “allowing” Google to “steal your content” are publishers who are too stupid to read a spec.
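For anyone who hasn’t seen the spec: a minimal robots.txt sketch that asks Google’s crawler (and only Google’s) to stay out of the whole site could look like this — the rules below are illustrative, not taken from any of the newspapers’ actual files:

```
# robots.txt — placed at the site root, e.g. /robots.txt
# Ask Google's crawler to skip the entire site
User-agent: Googlebot
Disallow: /

# All other crawlers remain unrestricted
User-agent: *
Disallow:
```

Well-behaved crawlers fetch this file before indexing anything, so no lawsuit is required to opt out.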
Chris.
it could be worse!!!
Think of it this way: anyone with a website could sue any of the search sites they want…
Why, you ask…
’cause they store cached pages, i.e. “copies” of the site.
It’s technically illegal.
Think about that!
Re: it could be worse!!!
’cause they store cached pages, i.e. “copies” of the site. It’s technically illegal.
Actually, Google has been sued for exactly this, and the cached copies were found not to violate the law.
Re: it could be worse!!!
Not in Japan, in fact. If you want to get technical, you are breaking the law just by viewing a website, because a copy is stored on your computer.
I think that’s the most sensible thing to do: NOT publish online! Too touchy. Are they carrying the best news anyway? Suing over linking has become a habit. Sad.
robots.txt
Is Robots.txt not good enough for people anymore???
Re: robots.txt
I agree. If a company doesn’t want its site, or select pages of its site, indexed by a search engine, it’s as simple as creating a file (robots.txt) to exclude themselves. A lot of BS lawsuits flying around right now. There should be a law to protect companies and people against frivolous lawsuits!
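To illustrate the “select pages” case: a robots.txt along these lines would keep every crawler out of a few chosen directories while leaving the rest of the site indexable (the paths are hypothetical examples, not real newspaper URLs):

```
# Apply to every crawler
User-agent: *
# Keep these hypothetical sections out of search indexes
Disallow: /archives/
Disallow: /subscribers-only/
```

Anything not matched by a Disallow rule stays fair game for indexing.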
Re: robots.txt
Problem is simple: the French and the Germans. Need I say more?
At this time in history, when lascivious promiscuity is a basis around which to fabricate entertainment formats, and truthful fact is waning in journalism, perhaps the courts should clog their dockets with a class action suit against George Bush as the villain who masterminded Hurricane Katrina.
.
ROBOTS.txt
Wtf
Re: .
The problem with robots.txt is that if it does not exist, bots assume they have the right to index and redistribute the copyrighted material.
Instead every site that wants to be indexed should have a robots.txt file that grants access to the bots, not the other way around.
Re: Re: .
Er, the web is a PUBLIC forum. By default, everything is accessible to everything. If you want to stop an indexer like Google or Yahoo, you only need to put one file, with TWO lines in it, in your web root:
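Presumably the two lines the commenter has in mind are the standard blanket exclusion, which tells every compliant crawler to skip the entire site:

```
User-agent: *
Disallow: /
```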
That’s it. If you can’t do that, perhaps you should reconsider whether publishing anything on the web is something you should really be engaging in. It’s sort of like expecting the inventory in your shop to be safe when you don’t put a door on the storefront…
As for server load, those two lines will stop ANY indexer from looking at ANY of your files.
Chris
problem in belgium is simple
Well, Belgium wants to give you good beer and great chocolates, and just wants to be left to do things its own way 🙂
Well, if I make a website and I choose for it not to be indexed, because I want it free of the 35-60% of server traffic that search engines generate, and keep it free for personal access, what’s Google’s problem?
Do you guys have any idea how much work is needed to keep the damn search robots off your server traffic?
It seems the search engines’ great service isn’t free, because someone is paying for the traffic the search bots generate.
So have a beer and sit back relaxed; if it’s good, it must be from Belgium 🙂
German? Or Dutch?
Mike, shouldn’t it be “French- and Dutch-language”?
I think you have maybe 10 native Belgians with German as their mother tongue…
Replying to individual comments broken
BTW, in the last Techdirt update, not only did something change so that articles are only 200 px wide, but replying to individual comments also appears to be broken.
Chris.
Re: Replying to individual comments broken
Chris,
You can set up Techdirt to view in wide mode through the Preferences page — from there, you can also set comments to view in threaded mode as well.
Cheers,
Dennis.