Cracking the Google Code... Under the GoogleScope
jglazer75 writes "From the analysis of the code behind Google's patents: "Google's sweeping changes confirm the search giant has launched a full out assault against artificial link inflation & declared war against search engine spam in a continuing effort to provide the best search service in the world... and if you thought you cracked the Google Code and had Google all figured out ... guess again. ... In addition to evaluating and scoring web page content, the ranking of web pages is admittedly still influenced by the frequency of page or site updates. What's new and interesting is what Google takes into account in determining the freshness of a web page.""
On the minds of all slashdotters, (Score:5, Funny)
Re:On the minds of all slashdotters, (Score:5, Funny)
So will this make it easier or harder to find porn?
Because there's a shortage of porn on the web?
Re:On the minds of all slashdotters, (Score:5, Funny)
Re:On the minds of all slashdotters, (Score:4, Funny)
bingo. therefore, the question he should've asked is: will the pron I find make me harder?
Re:On the minds of all slashdotters, (Score:4, Informative)
Have you already seen DOMAI [domai.com]? (NSFW)
Re:On the minds of all slashdotters, (Score:3, Funny)
eBay.com
Re:On the minds of all slashdotters, (Score:2)
While I hate all the review/price-compare sites when I am looking for a driver, when I try to shop online nowadays I end up on Yahoo...
why?
Google is killing affiliate sites like crazy, but sometimes (when, e.g., I need a price comparison or feature review) I end up shouting "Yahoo"
well, I guess it is good to have a selection
back to the topic: pr0n is NOT permitted by adwords/adsense policies
Great (Score:2, Interesting)
http://www.anologger.com/ [anologger.com]
Re:Great (Score:3, Informative)
Google what is best in life (Score:5, Funny)
it's a war (Score:5, Funny)
resistance is futile (Score:5, Funny)
Borgle.
Re:resistance is futile (Score:3, Funny)
So that's what it sounds like when the Borg have problems digesting the food... and I always thought they just recharged or whatever... well, I guess they can't always adapt fast enough.
("Captain, may I suggest remodulating the food replicator inputs?")
Re:it's a war (Score:2, Funny)
Maybe we could find the cache of the article on Google.
in case of slashdotting, article text (Score:5, Informative)
Google's US Patent confirms information retrieval is based on historical data.
Publication Date: 5/8/2005 9:51:18 PM
Author Name: Lawrence Deon
An Introduction:
Google's sweeping changes confirm the search giant has launched a full out assault against artificial link inflation & declared war against search engine spam in a continuing effort to provide the best search service in the world... and if you thought you cracked the Google Code and had Google all figured out
Google has raised the bar against search engine spam and artificial link inflation to unrivaled heights with the filing of a United States Patent Application 20050071741 on March 31, 2005.
The filing unquestionably provides SEOs with valuable insight into Google's tightly guarded search intelligence and confirms that Google's information retrieval is based on historical data.
What exactly do these changes mean to you?
Your credibility and reputation on-line are going under the Googlescope! Google's patent abstract reads as follows:
"A system identifies a document and obtains one or more types of history data associated with the document. The system may generate a score for the document based, at least in part, on the one or more types of history data."
Google's patent specification reveals a significant amount of information both old and new about the possible ways Google can (and likely does) use your web page updates to determine the ranking of your site in the SERPs.
Unfortunately, the patent filing does not prioritize or conclusively confirm any specific method one way or the other.
Here's how Google scores your web pages.
In addition to evaluating and scoring web page content, the ranking of web pages is admittedly still influenced by the frequency of page or site updates.
What's new and interesting is what Google takes into account in determining the freshness of a web page.
For example, if a stale page continues to attract incoming links, it will still be considered fresh, even if the page's Last-Modified header (which records when the file was most recently changed) hasn't changed and the content itself has not been updated.
According to their patent filing, Google records and scores the following web page changes to determine freshness (a rough sketch of how such signals might be combined appears after this list).
The frequency of all web page changes
The actual amount of the change itself... whether it is a substantial change or merely redundant or superfluous
Changes in keyword distribution or density
The actual number of new web pages that link to a web page
The change or update of anchor text (the text that is used to link to a web page)
The number of new links to low-trust web sites (for example, a domain may be considered low trust for having too many affiliate links on one web page).
Although no specific number of links is indicated in the patent, it might be advisable to limit affiliate links on new web pages. Caution should also be used when linking to pages with multiple affiliate links.
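The filing stops short of giving a formula, but you can picture the signals above feeding a weighted score along these lines. This is only a sketch: the field names, weights, and the very idea of a single combined number are invented for illustration, not taken from the patent.

```typescript
// Hypothetical freshness signals for a single page; the fields and weights
// are illustrative only -- the patent does not disclose any formula.
interface FreshnessSignals {
  updateFrequency: number;          // page changes observed per month
  changeMagnitude: number;          // 0 (cosmetic) .. 1 (substantial rewrite)
  keywordDistributionShift: number; // 0 .. 1, how much term density moved
  newInboundLinks: number;          // new pages linking in since last crawl
  anchorTextChanges: number;        // inbound anchor-text edits observed
  newLowTrustLinks: number;         // new links from low-trust (affiliate-heavy) pages
}

// Toy scoring: positive signals add, low-trust links subtract.
function freshnessScore(s: FreshnessSignals): number {
  return (
    0.3 * Math.log1p(s.updateFrequency) +
    0.25 * s.changeMagnitude +
    0.1 * s.keywordDistributionShift +
    0.25 * Math.log1p(s.newInboundLinks) +
    0.1 * Math.log1p(s.anchorTextChanges) -
    0.2 * Math.log1p(s.newLowTrustLinks)
  );
}

// Example: a page that keeps earning links still scores as "fresh" even if
// its content never changes, matching the Last-Modified observation above.
console.log(freshnessScore({
  updateFrequency: 0,
  changeMagnitude: 0,
  keywordDistributionShift: 0,
  newInboundLinks: 40,
  anchorTextChanges: 5,
  newLowTrustLinks: 2,
}).toFixed(3));
```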
Developing your web pages for freshness.
Now I'm not suggesting that it's always beneficial or advisable to change the content of your web pages regularly, but it is very important to keep your pages fresh, and that may not necessarily mean a content change.
Google states that decayed or stale results might be desirable for information that doesn't necessarily need updating, while fresh content is good for results that require it.
How do you unravel that statement and differentiate between the two types of content?
An excellent example of this methodology is the roller-coaster ride that seasonal results might experience in Google's SERPs based on the actual season of the year.
A page related to winter clothin
Unintended side effects of the Google arms race (Score:5, Interesting)
SEO (Score:2, Interesting)
Re:SEO (Score:3, Informative)
Re:SEO (Score:4, Informative)
Doesn't work in slashdot because:
Re:SEO (Score:2, Insightful)
There is an art to SEO. Some of us employ spamming techniques that will force a website to the top of the list for a short period of time, and then become banned. To some people, this is desirable - such as when you know your product has a short lifespan.
Others like myself try to help businesses retool their websites to be search-engine friendly. A lot of smaller businesses out there have websites that put every bit of info on everything they do on every page; that's bad. We show them how to break it into
Re:Unintended side effects of the Google arms race (Score:5, Insightful)
Re:Unintended side effects of the Google arms race (Score:4, Interesting)
That would be great. Now that I've read TFA, it looks like Google's techniques go a long way toward eliminating the fakery currently done by SEOs.
As an aside, the article looks like it was written by an SEO consultant, as it contains a lot of advice about how to get good rankings under Google's patented approach. Interestingly, the recommended actions are mostly legitimate (offer interesting content, update regularly, don't try to create fake links to your site), but it also recommends some less up-front techniques (make link-exchange deals with other sites and encourage bookmarking, for example).
Re:Unintended side effects of the Google arms race (Score:5, Interesting)
Companies need to start realizing that making money is about providing what customers want. Advertising is a great way of getting your name out, but only a good product or service will actually carry through. So in that frame of thinking, I highly recommend that companies:
Re:Unintended side effects of the Google arms race (Score:5, Funny)
I'm unhappy because I was grabbed off the street. May I go now?
Please?
Re:Unintended side effects of the Google arms race (Score:2, Insightful)
Uhh, which world are you living in? Most companies have found that bigger profits can be made by convincing people that they want what they have. And most customers find it easier to buy what they are told to buy.
I like your world,
Re:Unintended side effects of the Google arms race (Score:3, Informative)
Sometimes search engine optimization isn't about making a hack site rank well. Sometimes it is about getting the traffi
Re:Unintended side effects of the Google arms race (Score:5, Insightful)
Actually, pretty much everything you list falls under the issue of usability. Many of those options have lower usability for the user, and thus the search engine by extension.
These companies don't need an SEO, they need to find a web designer that doesn't use Macromedia "tools".
Re:Unintended side effects of the Google arms race (Score:4, Insightful)
I will not say anything at all about Flash because two camps who BOTH don't get it will start the usual pointless discussion. Flash is rarely used for what it's great at, visualizing data, and plagues us with wildly unnecessary and annoying l33t-masturbation stuff instead.
Dreamweaver itself is indeed a powerful timesaver in the hands of an experienced XHTML/CSS guy. If you look at it closely, you'll find that it is a very nice graphical frontend to HTML itself, with a great set of shortcuts so that you almost don't have to touch the mouse at all. The palettes just provide access to the most commonly needed attributes of the element you're working on. If you leave all those nasty "behaviours", "timelines" and whatnot alone, it produces nicely readable and well-formed code. I've been using Dreamweaver since the early betas, and even back then this was the case. I tend to think that this was an initial design goal behind DW.
The bad comes from the 'designers' who are taught print design at the universities and apply it to the Web, using all the nutty clicky-pointy tools that produce a JS-laden horror cabinet of non-standards-compliance they dare to call "HTML". It's a classic PEBKAC. Look at it this way - if DW didn't have those features, GoLive would've taken over long ago, and we don't want THIS to happen. IMNSHO the only thing worse would be Frontpage. At least the guys at Macromedia didn't invent bogus HTML extensions because they were incapable of providing a proper metadata infrastructure, like Adobe did.
(I'm not a fanboy though, I just use what works best at the moment for the things I do. If someone shows me how to reproduce this "Apply Source Formatting" feature from DW in Kate/KDevelop and how to synchronize sites like in DW, I'm switching my machine at work from Win2K with DW to KDevelop/nvu on FreeBSD tomorrow, because it better fits the things I do nowadays. It will then match my setup at home.)
While we're at it, SEO is, was and always will be BS, just like the whole Internet Advertising Myth which after nearly a decade of documented failure still isn't debunked. Duh.
Re:Unintended side effects of the Google arms race (Score:2)
Nonsense. My very point is that traditional methods do that for you. For example, if I list my site on DMOZ, it will get picked up and linked by hundreds (if not thousands!) of sites. If I use a pay-per-click service like AdClick, then I can have my site temporarily higher on search results as well as in
Re:Unintended side effects of the Google arms race (Score:2)
That was intentional. The last thing I need is Slashdot pouring down on my poor server.
but i guess that your site is not there to make money
Correct. However, I have a commercial site that also gets top billing on Google without tricks. See if you can spot it. [google.com] The sad part is that I have done very few updates to the site in quite a long time, and yet Google *still* gives it top billing. (It's long overdue for an overhaul.)
If th
Re:Unintended side effects of the Google arms race (Score:2, Insightful)
Re:Unintended side effects of the Google arms race (Score:2)
After link analysis (Score:5, Interesting)
Re:After link analysis (Score:4, Insightful)
Re:After link analysis (Score:2)
Eventually Google should be able to start aggregating those ratings to find out how the public perceives a site.
Yes (Score:5, Funny)
Re:Yes (Score:3, Interesting)
Interestingly enough, the top results for "tiger" are a page about tigers, Tiger Direct, and the Apple page. These seem pretty reasonable to me. The OS is obviously something a lot of people are going to be looking for, but I'd still find it weird if real tigers were not the first link. For "panther" the results are Apple's page, then some pages on real panthers. For "jaguar" you get the car manufacturer, Apple, then real jaguars. I wonder what will happen if you do a search on "tiger" a year from no
Re:Yes (Score:5, Insightful)
Re:Yes (Score:2, Interesting)
If there are more copies of Tiger than there are Tigers, then I'd say Google's just doing its job.
Not to be a Google apologist. I think this filing is too obvious to be patented. It's just taking the obvious things you would look at to rank a page, and looking at it one level removed. Instead of asking how many links there are to my page, we're asking how many and
Re:Yes (Score:2, Offtopic)
The minimum number of tigers in 1993 was 4400 and the maximum was 7700, whereas in 2000/2001 the minimum was 5700 and the maximum was 7000.
From: http://www.globaltiger.org/population.htm [globaltiger.org]
I can't find any stats for installations of Tiger, but I wouldn't be the least bit surprised if there were more than 5700 copies out there.
So the operating system called Tiger might be more prevalent tha
Re:Yes (Score:2)
Hey, leave Wayne Newton out of this.
Re:Yes (Score:2)
Re:Wrong. (Score:2)
No, that's the tack that Google has taken (i.e. groupthink) because it works better than any other method. That doesn't mean that it's the "correct" design. The "correct" design should give you results relevant to what you searched for. i.e. "Tiger" or "Tigers" should give you info about furry creatures (and potentially furry creatures in advertising and sports), because that's what you searched
Is it the general opinion of the public... (Score:3, Interesting)
Re:Is it the general opinion of the public... (Score:3, Insightful)
Re:Is it the general opinion of the public... (Score:2, Insightful)
Re:Is it the general opinion of the public... (Score:2)
Re:Is it the general opinion of the public... (Score:2)
Re:Is it the general opinion of the public... (Score:2)
Take the article with a grain of salt... (Score:5, Insightful)
The article is not written by a Google employee, nor did the author speak with anyone at Google. It's simply his analysis of the patent document filed by Google.
Also, at the bottom of the article after the author's name, there's a link to some search optimization service's website.
Non-subscribers! Damn you all! (Score:2)
Six weeks to fix? (Score:2, Informative)
If this claim is true, I guess we'll have to wait the typical "four to six weeks for delivery."
Re:Six weeks to fix? (Score:2)
GoogleBombs Away (Score:5, Funny)
Re:GoogleBombs Away (Score:2)
Look at definition #2.
effect on search engine optimizers (Score:5, Informative)
Article text and Google cache link (Score:3, Informative)
Google United - Google Patent Examined
Google's newest patent application is lengthy. It is interesting in some places and enigmatic in others. Less colourful than most end user license agreements, the patent covers an enormous range of ranking analysis techniques Google wants to ensure are kept under their control.
Publication Date: 4/7/2005 7:41:24 AM
By Jim Hedger, StepForth News Editor, StepForth Placement Inc.
Thoughts on Google's patent... "Information retrieval based on historical data."
Google's newest patent application is lengthy. It is interesting in some places and enigmatic in others. Less colourful than most end user license agreements, the patent covers an enormous range of ranking analysis techniques Google wants to ensure are kept under their control. Some of the ideas and concepts covered in the document are almost certainly worked into the current algorithm running Google. Some are being worked in as this article is being written. Some may never see the blue light of electrons but are pretty good ideas, so it might have been considered wise to patent them. Google's not saying which is which. While not exactly War and Peace, it's a pretty complex document that gives readers a glimpse inside the minds of Google engineers. What it doesn't give is a 100% clear overview of how Google operates now and how the various ideas covered in the patent application will be integrated into Google's algorithms. One interesting section seems to confirm what SEOs have been saying for almost a year: Google does have a "sandbox" where it stores new links or sites for about a month before evaluation.
Google is in the midst of sweeping changes to the way it operates as a search engine. As a matter of fact, it isn't really a search engine in the fine sense of the word anymore. It isn't really a portal either. It is more of an institution, the ultimate private-public partnership. Calling itself a media-company, Google is now a multi-faceted information and multi-media delivery system that is accessed primarily through its well-known interface found at www.google.com.
Google is known for its from-the-hip style of innovation. While the face is familiar, the brains behind it are growing and changing rapidly. Four major factors (technology, revenue, user demand and competition) influence and drive these changes. Where Microsoft dithers and .dll's over its software for years before introduction, Google encourages its staff to spend up to 20% of their time tripping their way up the stairs of invention. Sometimes they produce ideas that didn't work out as they expected, as was the case with Orkut, and sometimes they produce spectacular results as with Google News. The sum total of what works and what doesn't work has served to inform Google what its users want in a search engine. After all, where the users go, the advertising dollars must follow. Such is the way of the Internet.
In its recent SEC filing, the first it has produced since going public in August 2004, Google said it was going to spend a lot of money to continue outpacing its rivals. This year they figure they will spend about $500 million to develop or enhance newer technologies. In 2004 and 2003, Google spent $319 million and $177 million respectively. The increase in innovation-spending corresponds with a doubling of Google's staff headcount which has jumped from 1628 employees in 2003 to 3021 by the end of 2004.
Over the past five years Google has produced a number of features that have proven popular enough to be included among its public-search offerings. On their front page, these features include Image Search, Google Groups, Google News, Froogle, Google Local, and Google Desktop. There are dozens of other features which can be accessed by cli
Frequency of changes (Score:4, Insightful)
Also, a page with frames might get penalized since its content doesn't change, although the content of the frames may change frequently.
Re:Frequency of changes (Score:2, Interesting)
I thought of that in less than 30 seconds; what are the odds Google has already thought about it?
Re:Frequency of changes (Score:3, Informative)
Puh-leeeeze! That trick became ineffective last century. It's very easy for the search engine to check background colors and FONT tags and penalize the page that uses text that is too close to the background color.
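For the curious, that particular check is mostly arithmetic: compare the text color to the background color and flag anything too close. A rough sketch follows; the threshold and color handling are invented, not anything the engines have published.

```typescript
// Toy hidden-text check: flag text whose color is nearly identical to the
// page background. Threshold and representation are illustrative only.
type RGB = { r: number; g: number; b: number };

function parseHex(hex: string): RGB {
  const h = hex.replace("#", "");
  return {
    r: parseInt(h.slice(0, 2), 16),
    g: parseInt(h.slice(2, 4), 16),
    b: parseInt(h.slice(4, 6), 16),
  };
}

// Euclidean distance in RGB space; a real engine could use a perceptual
// color space, but this is enough to catch white-on-white keyword stuffing.
function colorDistance(a: RGB, b: RGB): number {
  return Math.hypot(a.r - b.r, a.g - b.g, a.b - b.b);
}

function looksHidden(textColor: string, backgroundColor: string, threshold = 30): boolean {
  return colorDistance(parseHex(textColor), parseHex(backgroundColor)) < threshold;
}

console.log(looksHidden("#fefefe", "#ffffff")); // true  -- near-white text on white
console.log(looksHidden("#000000", "#ffffff")); // false -- ordinary body text
```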
Coral cache link (Score:3, Funny)
FAQs (Score:2)
Since the story submission didn't end the post with a question, I feel compelled to add one:
How will this affect the ranking of insightful FAQs, which by nature may not change frequently?
Another shout-out poll to my homeboy Slashdotters: Do you pro
Re:FAQs (Score:2, Insightful)
Re:FAQs (Score:2)
It's nice that you care about intelligible pronunciation, but I'd like to inform you that an acronym is not what you think it is. If you look the term up in a dictionary, you should find out that acronyms, like LASER (actually I'd spell that Laser according to my style guide but in this particular case that point is moot anyway, because it has turned into the word laser) are meant to be pronounced as words. SQL is not an
Google's crackdown is coming (Score:4, Insightful)
Note that Google is now looking at domain ownership information. This may result in a much lower level of bogus information in domain registrations. It's probably a good idea to make sure that your domain registration information, business license, D&B rating, on-site contact info, and SSL certificates all match.
"Domain cloaking" will probably mean that you don't appear anywhere the top in Google. So that's on the way out.
Google's Click History Asset (Score:5, Insightful)
Google has millions upon millions of click-history records from their search results that say what it is people really are looking for, as well as which results looked like good fodder for a first click.
No one else has such a large database of what humans have actually picked.
Such a click history and search term history asset is worth even more if it gets correlated with Evil Direct Marketing information from the cookie traders.
Although, it seems possible that large ISPs could also grab and analyze their members' Google interactions to figure out people's tastes, assuming such interactions remain unencrypted.
I have to wonder how many companies with static IP addresses have, unbeknownst to them, built up extensive history logs at Google showing their search term preferences and click selections. If I were a technology startup with a hot idea to research I'd be a little more paranoid about something like that.
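For illustration, the kind of aggregation being described here is nothing exotic. A toy sketch follows, with an invented log format; none of these field names come from any real Google schema.

```typescript
// Hypothetical click-log record; the fields are invented for illustration.
interface ClickEvent {
  query: string;
  resultUrl: string;
  position: number; // rank of the result that was clicked
}

// Aggregate clicks per (query, url) to estimate which results people
// actually prefer for a given search term.
function clickCounts(log: ClickEvent[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const e of log) {
    const key = `${e.query} -> ${e.resultUrl}`;
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  return counts;
}

const log: ClickEvent[] = [
  { query: "tiger", resultUrl: "http://example.org/tigers", position: 1 },
  { query: "tiger", resultUrl: "http://example.com/osx", position: 2 },
  { query: "tiger", resultUrl: "http://example.com/osx", position: 2 },
];
console.log(clickCounts(log));
```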
Re:Google's Click History Asset (Score:4, Interesting)
Re:Google's Click History Asset (Score:4, Informative)
You sure about that? Try copying and pasting a Google results link.
For example, let's search Google for "elluusive" [google.com]. The first result was your slashdot "homepage", at http://slashdot.org/~eluusive [slashdot.org], which at first glance seems to be a direct link. But if you right-click on the link, copy it, and paste it somewhere, you'll find something along these lines:
http://www.google.com/url?sa=U&start=1&q=http%3A/
Re:Google's Click History Asset (Score:3, Insightful)
A quick return would indicate that the page was not in fact what the user had requested.
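In code terms, that "quick return" signal is just a dwell-time threshold. A toy sketch with an invented cutoff:

```typescript
// Toy "quick return" heuristic: if the user bounces back to the results
// page within a few seconds, treat the click as a bad match.
// The 10-second threshold is invented for illustration.
function clickWasSatisfying(
  clickTimeMs: number,
  returnTimeMs: number | null,
  thresholdMs = 10_000
): boolean {
  if (returnTimeMs === null) return true;            // never came back: likely found it
  return returnTimeMs - clickTimeMs >= thresholdMs;  // stayed a while: probably relevant
}

console.log(clickWasSatisfying(0, 3_000));  // false -- pogo-sticked straight back
console.log(clickWasSatisfying(0, 90_000)); // true  -- read the page
console.log(clickWasSatisfying(0, null));   // true  -- no return observed
```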
Re:Google's Click History Asset (Score:5, Informative)
Each link in the search results on Google has an onmousedown event attached.
If you have JavaScript enabled and click on a result, your browser will also execute that JavaScript, which sends a GET request to Google. They do log each link you click on.
Check the source of any Google search page.
The function that gets called for each onmousedown is called clk().
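(The pasted source didn't survive here, so to be clear: the snippet below is not Google's actual clk() implementation. It is only a generic, browser-side sketch of the technique described above, an onmousedown handler that fires a logging request before the browser follows the link. The endpoint, selector, and parameter names are invented.)

```typescript
// Illustration only: attach a click-logging handler to a result link.
// Setting the src of an Image fires a GET to the logging endpoint before
// navigation happens, so the click can be recorded server-side.
function attachClickLogging(link: HTMLAnchorElement, position: number): void {
  link.onmousedown = () => {
    const beacon = new Image();
    // Hypothetical logging endpoint and parameters.
    beacon.src = "/log-click?url=" + encodeURIComponent(link.href) + "&pos=" + position;
  };
}

// "a.result" is an assumed selector for result links on the page.
document.querySelectorAll<HTMLAnchorElement>("a.result")
  .forEach((a, i) => attachClickLogging(a, i + 1));
```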
Re:Google's Click History Asset (Score:3, Interesting)
I look at this page:
http://www.google.com/search?q=test [google.com]
The first result link looks like this:
at least in IE.
In Opera the JavaScript is missing
try this (remove the space before the ?):
compared to this:
Re:uh uh.... (Score:2)
Two Keys: Data Mining and Delay (Score:5, Interesting)
What does that mean? At the highest level, it means that most of the Google algorithm is constructed by a machine. You give the machine human-constructed examples of how to rank a sample set of pages (notice those want ads where Google is hiring people who can inspect and assess the quality of web pages?) and it then uses essentially brute-force techniques to test every possible combination of your ranking variables to find the simplest formula that ranks pages the same way the human did.
There is no human at Google "twisting dials" to alter individual parameters of a formula. The machine constructs the algorithm, and it can therefore easily be so complex that no human can understand it. Tweaking the algorithm becomes a process of changing or adding to your "training set" of human-ranked pages, and letting the data mining process come up with a revised algorithm.
For example, Google could invent a new variable called "category", and identify each page as belonging to category Astronomy, Botulism, Country, [...] and Other. Once that variable is thrown into the mix, the Google "algorithm" is essentially free to vary wildly from one type of subject matter to the next. For example, you might see someone with a Real Estate site swearing up and down that inbound links are no longer as important, while someone with an Astronomy site might swear that, no, inbound links are more important than ever. You can see exactly this kind of bickering in most of the forums that people who hope to do Search Engine Optimization frequent.
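To make the training-set idea concrete, here is a deliberately tiny sketch: two invented features, a handful of human-ranked pages, and a brute-force search for the weights that best reproduce the human ordering. None of the feature names or numbers come from Google; it only shows the shape of the process.

```typescript
// Toy version of "train on human-ranked pages": brute-force a weight for
// each feature so the scored order best matches the human order.
interface PageFeatures { inboundLinks: number; contentQuality: number }

// Pages listed in the order a human judged best-to-worst (invented data).
const humanRanking: PageFeatures[] = [
  { inboundLinks: 120, contentQuality: 0.9 },
  { inboundLinks: 400, contentQuality: 0.3 },
  { inboundLinks: 15,  contentQuality: 0.7 },
  { inboundLinks: 5,   contentQuality: 0.1 },
];

const score = (p: PageFeatures, wLinks: number, wQuality: number) =>
  wLinks * Math.log1p(p.inboundLinks) + wQuality * p.contentQuality;

// Count pairwise agreements between the machine's order and the human's.
function agreement(wLinks: number, wQuality: number): number {
  let agree = 0;
  for (let i = 0; i < humanRanking.length; i++)
    for (let j = i + 1; j < humanRanking.length; j++)
      if (score(humanRanking[i], wLinks, wQuality) > score(humanRanking[j], wLinks, wQuality))
        agree++;
  return agree;
}

// Exhaustively test weight combinations and keep the best: the "brute
// force" step described above, on a laughably small scale.
let best = { wLinks: 0, wQuality: 0, agree: -1 };
for (let wLinks = 0; wLinks <= 1; wLinks += 0.1)
  for (let wQuality = 0; wQuality <= 1; wQuality += 0.1)
    if (agreement(wLinks, wQuality) > best.agree)
      best = { wLinks, wQuality, agree: agreement(wLinks, wQuality) };

console.log(best);
```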
The other big mistake people make in trying to see how to game the Google algorithm is "delay". In studying how people manage (or fail to manage) complex systems, psychologists learned that people generally would fail if a delay was introduced between their actions and the results of their actions.
In one very simple test, people were charged with trying to stabilize the temperature in a virtual refrigerator. They had one dial, and there was exactly one piece of feedback: the current temperature in the fridge. However, they were not explicitly told that there was a delay between moving the dial and when the results of that action would stabilize.
The responses of those test subjects were eerily similar to what we see in Google-gaming webmasters these days. Some people swore up and down that some human behind the scenes was directly tweaking the results to thwart whatever they did. Others became frustrated and decided that nothing they did really mattered, so they would just swing the dial back and forth between its minimum and maximum settings.
What does this have to do with Google? These days, Google can change their algorithm relatively frequently, and the algorithm can vary by the relative date of various things. The net sum is, there's a delay between when your page is first ranked and when it is likely to arrive at a relatively stable ranking. This can drive webmasters nuts as they think they've done something clever to rank their page high, but then it drops a week later. Although it doesn't occur to them, the important question is: did the change cause the high ranking or did it cause the sudden decline?
The few people who did master the simple refrigerator system? Well, they sounded more like some of the people who are more successful at gaming Google. Those folks tend to say things like: "just make one change and then leave it alone for a while to see what happens."
Can you still game the Google algorithm? Undoubtedly in specific cases. But it's getting harder. The Google algorithm was always complex, but what's changing is that the days when a few variables (such as inbound link count) generally swamped the effects of all the others are drawing to a close. We are approaching the day when the best technique to rank highly with Google will be: sit down at your keyboard and make more good content every day.
Web page "freshness?" A good thing... (Score:3, Informative)
The site is mostly static but is rich with cultural value. It's currently the number one hit on Google. I'm hoping that Google's emphasis on "freshness" won't make his site fall in ranking.
seo gets more difficult (Score:2)
Wait a second. (Score:2, Insightful)
What about harmful link spam? (Score:3, Insightful)
Google rhymes with "GOD" (Score:2)
It's not nice to fool Mother Nature.
Comment removed (Score:4, Insightful)
Re:This is under YRO? (Score:3, Funny)
That's an interesting interpretation. Here's a review of today's submissions, translated to your perspective:
Broadway Awards Spam is about your rights to watch Spamalot, nominated for 14 Tony awards.
IT: More on Last Years Cisco Source Code Theft is about your rights to read about a theft of proprietary source code.
IT: What Does a Spreading Worm Look Like? is about your rights to visualize what a spreading worm looks like.
G
Re: (Score:2)
Re:This is under YRO? (Score:5, Funny)
Re:A reason why *not* to use .NET? (Score:2)
Re:A reason why *not* to use .NET? (Score:2)
(typical slashdotter)
Re:A reason why *not* to use .NET? (Score:3, Insightful)
another reason why to cache (Score:2)
Disclaimer: I'm a J2EE dev so my opinion may not count.
It appears that they are either not properly implementing connection pooling and running out of connections, or the database is being overloaded because they failed to cache non-changing data.
Maybe they just aren't used to developing for a high traffic web site.
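For what it's worth, caching rarely-changing data is only a few lines in most languages. A framework-agnostic sketch (the names and the one-minute TTL are arbitrary, not tied to any particular stack):

```typescript
// Minimal in-memory cache so an article page doesn't hit the database on
// every request during a traffic spike. Entries expire after a short TTL.
const cache = new Map<string, { value: string; expiresAt: number }>();

async function getArticle(
  id: string,
  loadFromDb: (id: string) => Promise<string> // caller supplies the real DB lookup
): Promise<string> {
  const hit = cache.get(id);
  if (hit && hit.expiresAt > Date.now()) return hit.value;  // serve from memory
  const value = await loadFromDb(id);                       // only a miss goes to the DB
  cache.set(id, { value, expiresAt: Date.now() + 60_000 }); // cache for one minute
  return value;
}
```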
Re:A reason why *not* to use .NET? (Score:2)
Re:A reason why *not* to use .NET? (Score:2)
Apache should be OK though, no?
It was interesting that this site was down in under 10 comments.
Glad some saw this not as flamebait but as a question of understanding.
Re:A reason why *not* to use .NET? (Score:2)
Re:A reason why *not* to use .NET? (Score:2)
As opposed to the server breaking due to limitations in its configuration.
My point being, if you place the same server on a larger bandwidth, it would survive; the bottleneck would be the bandwidth, *not* the server.
Re:A reason why *not* to use .NET? (Score:2)
Re:A reason why *not* to use .NET? (Score:2)
Bandwidth aside, what are the ideal parameters to harden a server for the slashdot effect?
No doubt, I've seen enough PHP/SQL errors to know that they certainly are not up to par.
It would be nice to have a list of URLs that can survive, with accompanying analysis for the community of what recipe worked.
Re:A reason why *not* to use .NET? (Score:2)
Re:Is it the case.. (Score:2, Insightful)
Their search dominance is a direct result of PageRank. That they have a patent on it prevents other companies from copying the idea or hiring their employees away (Microsoft is notorious for doing both of these things). So yes, the patent is important.
Sorry kids, but patents and "Do no evil" are mutually incompatible concepts.
You're retarded if you think that.
Re:Is it the case.. (Score:2)
Re:Is it the case.. (Score:3, Insightful)
So they have a monopoly. What's your point?
When did a monopoly by google become ok?
Sometime around the 1790s, when the patent system was created in the US to give inventors a temporary and artificial monopoly on their inventions so as to encourage them to innovate. Google has not violated their policy of "do no evil" by properly utilizing the patent system, and it has had the in
How a company *uses* patents determines evil (Score:2)
If I use a patent to economically enrich myself but as a result impede the use of information - possibly evil.
If I patent something then create a free license for it so that no one can restrict its use through commercial monopoly - good.
I'm not saying that Google is using their
Re:Is it the case.. (Score:2)
Re:10 Comments and the site is down (Score:2)
Seems most of the pages returned from Google for this error code (0x80004005) are about people using MS Access behind the web server. Now I like to razz on MS as much as the next guy, but this may just be a case of the wrong tool for the job. Access on a public web site is just a plain bad idea.
OTOH, even if it's not Access, the nature of the article suggests that any reasonable site would employ
Re:Search Engine Spam (Score:2)