Cracking the Google Code... Under the GoogleScope

jglazer75 writes "From the analysis of the code behind Google's patents: "Google's sweeping changes confirm the search giant has launched an all-out assault against artificial link inflation and declared war against search engine spam in a continuing effort to provide the best search service in the world... and if you thought you cracked the Google Code and had Google all figured out ... guess again. ... In addition to evaluating and scoring web page content, the ranking of web pages is admittedly still influenced by the frequency of page or site updates. What's new and interesting is what Google takes into account in determining the freshness of a web page.""
  • by uberjoe ( 726765 ) on Tuesday May 10, 2005 @11:27AM (#12489453)
    So will this make it easier or harder to find porn?
  • Great (Score:2, Interesting)

    Now I'll see more "Get ranked #1 in search engines" spam.

    http://www.anologger.com/ [anologger.com]
  • by kensai ( 139597 ) on Tuesday May 10, 2005 @11:27AM (#12489456) Homepage
    To crush artificial link inflation and hear the lamentations of search engine spam
  • it's a war (Score:5, Funny)

    by roman_mir ( 125474 ) on Tuesday May 10, 2005 @11:28AM (#12489458) Homepage Journal
    The linked article is slashgoogled. It's a googlewar. Googlers are all googling.

  • by Anonymous Coward on Tuesday May 10, 2005 @11:28AM (#12489464)
    Cracking the Google Code... Under the GoogleScope
    Google's US Patent confirms information retrieval is based on historical data.

    Publication Date: 5/8/2005 9:51:18 PM

    Author Name: Lawrence Deon

    An Introduction: ...if you thought you cracked the Google Code and had Google all figured out ... guess again.

    Google's sweeping changes confirm the search giant has launched an all-out assault against artificial link inflation and declared war against search engine spam in a continuing effort to provide the best search service in the world... and if you thought you cracked the Google Code and had Google all figured out ... guess again.

    Google has raised the bar against search engine spam and artificial link inflation to unrivaled heights with the filing of a United States Patent Application 20050071741 on March 31, 2005.

    The filing unquestionably provides SEOs with valuable insight into Google's tightly guarded search intelligence and confirms that Google's information retrieval is based on historical data.

    What exactly do these changes mean to you?
    Your credibility and reputation on-line are going under the Googlescope! Google has defined their patent abstract as follows:

    "A system identifies a document and obtains one or more types of history data associated with the document. The system may generate a score for the document based, at least in part, on the one or more types of history data."

    Google's patent specification reveals a significant amount of information both old and new about the possible ways Google can (and likely does) use your web page updates to determine the ranking of your site in the SERPs.

    Unfortunately, the patent filing does not prioritize or conclusively confirm any specific method one way or the other.

    Here's how Google scores your web pages.

    In addition to evaluating and scoring web page content, the ranking of web pages is admittedly still influenced by the frequency of page or site updates.
    What's new and interesting is what Google takes into account in determining the freshness of a web page.

    For example, if a stale page continues to procure incoming links, it will still be considered fresh, even if the page's Last-Modified header (which tells when the file was most recently modified) hasn't changed and the content has not been updated, i.e. is 'stale'.
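
    For the curious, here is a quick way to inspect that Last-Modified header yourself. A minimal sketch using the third-party requests library; the URL is just a placeholder, and not every server sends the header:

    ```python
    # Fetch only the response headers and print Last-Modified, if present.
    import requests

    resp = requests.head("http://www.example.com/", timeout=10)
    print(resp.headers.get("Last-Modified", "header not sent"))
    ```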

    According to their patent filing, Google records and scores the following web page changes to determine freshness:
    • The frequency of all web page changes
    • The actual amount of the change itself... whether it is a substantial change or merely redundant or superfluous
    • Changes in keyword distribution or density
    • The actual number of new web pages that link to a web page
    • The change or update of anchor text (the text that is used to link to a web page)
    • The number of new links to low-trust web sites (for example, a domain may be considered low trust for having too many affiliate links on one web page)
    Although no specific number of links is indicated in the patent, it might be advisable to limit affiliate links on new web pages. Caution should also be used in linking to pages with multiple affiliate links.
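
    To make that list concrete, here is a minimal sketch of how such signals might be combined into a single freshness score. The signal names and weights are invented for illustration; the patent enumerates signals but does not say how, or even whether, Google weights them:

    ```python
    # Illustrative history-based freshness scoring; all weights are guesses.
    from dataclasses import dataclass

    @dataclass
    class PageHistory:
        update_frequency: float    # content updates per month
        change_magnitude: float    # 0.0 (cosmetic) .. 1.0 (substantial rewrite)
        keyword_drift: float       # shift in keyword distribution/density, 0..1
        new_inbound_links: int     # new pages linking here since the last crawl
        anchor_text_changes: int   # inbound anchor-text updates observed
        low_trust_links: int       # new links to low-trust (e.g. affiliate-heavy) sites

    def freshness_score(h: PageHistory) -> float:
        score = 0.0
        score += 1.5 * h.update_frequency
        score += 2.0 * h.change_magnitude    # substantial edits count more than cosmetic ones
        score += 1.0 * h.keyword_drift
        score += 0.5 * h.new_inbound_links   # a "stale" page gaining links still looks fresh
        score += 0.3 * h.anchor_text_changes
        score -= 2.0 * h.low_trust_links     # links to low-trust sites are penalized
        return score

    print(freshness_score(PageHistory(2.0, 0.4, 0.1, 12, 3, 0)))  # -> 10.8
    ```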

    Developing your web pages for page freshness.

    Now, I'm not suggesting that it's always beneficial or advisable to change the content of your web pages regularly, but it is very important to keep your pages fresh, and that may not necessarily mean changing their content.

    Google states that decayed or stale results might be desirable for information that doesn't necessarily need updating, while fresh content is good for results that require it.

    How do you unravel that statement and differentiate between the two types of content?

    An excellent example of this methodology is the roller coaster ride seasonal results might experience in Google's SERPs based on the actual season of the year.

    A page related to winter clothing
  • It just occurred to me that, as Google changes its algorithms, it'll just create more business for the Search Engine Optimization consultant. When web sites drop in the Google rankings, they'll want to make changes to move back up, and will hire the SEO again to do so.
    • SEO (Score:2, Interesting)

      by Anonymous Coward
      What do those guys actually *do* in any case? I mean, legitimately. I guess you can tweak things a bit, but... how much does that actually get you if you simply aren't a popular site?
      • Re:SEO (Score:3, Informative)

        by Intron ( 870560 )
        There's a whole range. Some will tell you how to rewrite your web page so that search engines will classify it better. That seems legit. Others will try to sell you on "link farms" and other hacks to improve your ratings - not so legit. I've also seen spamming websites that have google-accessible logs with fake referrers, or spamming blogs like /. with links in your sig [place link here].
        • Re:SEO (Score:4, Informative)

          by hankwang ( 413283 ) * on Tuesday May 10, 2005 @04:38PM (#12492883) Homepage
          spamming blogs like /. with links in your sig [place link here].

          Doesn't work on Slashdot because:

          • Sigs are only visible for logged-in users (i.e. not for robots)
          • Posts without a karma bonus have the rel="nofollow" attribute on their links, so they don't count for Google.
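
          For illustration, a standard-library sketch (not Slashdot's or Google's actual code) of how a crawler might skip rel="nofollow" links when tallying them for ranking:

          ```python
          from html.parser import HTMLParser

          class LinkCounter(HTMLParser):
              def __init__(self):
                  super().__init__()
                  self.counted = []

              def handle_starttag(self, tag, attrs):
                  if tag != "a":
                      return
                  attrs = dict(attrs)
                  rel = (attrs.get("rel") or "").lower()
                  if "nofollow" in rel.split():
                      return                   # carries no ranking credit
                  if attrs.get("href"):
                      self.counted.append(attrs["href"])

          parser = LinkCounter()
          parser.feed('<a href="http://a.example/">counted</a>'
                      '<a rel="nofollow" href="http://b.example/">ignored</a>')
          print(parser.counted)                # ['http://a.example/']
          ```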
      • Re:SEO (Score:2, Insightful)

        by Anonymous Coward

        There is an art to SEO. Some of us employ spamming techniques that will force a website to the top of the list for a short period of time, after which it gets banned. To some people, this is desirable, such as when you know your product has a short lifespan.

        Others like myself try to help businesses retool their websites to be search engine friendly. A lot of smaller businesses out there have websites that have every bit of info on everything they do on every page, and that's bad. We show them how to break it into

    • by rm999 ( 775449 ) on Tuesday May 10, 2005 @11:37AM (#12489574)
      Perhaps, or perhaps if Google changes its rankings enough, the SEOs' credibility will be destroyed (they will be seen as temporary and overpriced fixes)
      • Perhaps, or perhaps if Google changes its rankings enough, the SEOs' credibilities will be destroyed

        That would be great. Now that I've read TFA, it looks like Google's techniques go a long way toward eliminating the fakery currently done by SEOs.

        As an aside, the article looks like it was written by an SEO consultant, as it contains a lot of advice about how to get good rankings under Google's patented approach. Interestingly, the recommended actions are mostly legitimate (offer interesting content, update regularly, don't try to create fake links to your site), but there are also some less-upfront techniques (make link-exchange deals with other sites and encourage bookmarking, for example).

    • Here's a thought: How about companies try to offer useful services rather than "optimize" their search engine results? I've gotten several top hits on Google by the complete accident of providing useful services or information in the past. Traditional advertising such as adclicks and dmoz listings also help. Not once have I wasted my time trying to game the system.

      Companies need to start realizing that making money is about providing what customers want. Advertising is a great way of getting your name out, but only a good product or service will actually carry through. So in that frame of thinking, I highly recommend that companies:

      • Stop looking at "cost cutting" by reduction, and start looking at "using existing resources to provide relevant products"
      • Start hiring employees who know what they're doing and listen to them
      • Stop wasting your money on search engine optimization.
      • Be good to the customer, and the customer will be good to you. If you don't know why people are upset or unhappy, grab a couple off the street and ask.
      • by MrNiceguy_KS ( 800771 ) on Tuesday May 10, 2005 @12:05PM (#12489884)
        If you don't know why people are upset or unhappy, grab a couple off the street and ask.

        I'm unhappy because I was grabbed off the street. May I go now?

        Please?

      • Companies need to start realizing that making money is about providing what customers want. Advertising is a great way of getting your name out, but only a good product or service will actually carry through. So in that frame of thinking, I highly recommend that companies:

        Uhh, which world are you living in? Most companies have found that bigger profits can be made by convincing people that they want what the company has. And most customers find it easier to buy what they are told to buy.
        I like your world,
      • How about sites that already provide a useful service and want to get as much exposure as possible? I can't count the number of useful sites I've visited that are not ranked as well on Google as I would like (so I can find them more easily) because they do non-Google-friendly things like:
        • Session IDs in urls
        • Doorway pages
        • Content that expires or changes urls
        • Javascript navigation

        Sometimes search engine optimization isn't about making a hack site rank well. Sometimes it is about getting the traffic that a really nifty site deserves.

        • Sometimes search engine optimization isn't about making a hack site rank well. Sometimes it is about getting the traffic that a really nifty site deserves.

          Actually, pretty much everything you list falls under the issue of usability. Many of those options have lower usability for the user, and thus for the search engine by extension.

          These companies don't need an SEO, they need to find a web designer that doesn't use Macromedia "tools".
          • by Doctor O ( 549663 ) on Tuesday May 10, 2005 @04:47PM (#12492966) Homepage Journal
            Having been a professional web worker for more than 8 years now, I agree with you from experience, but I actually don't think you can blame Macromedia.

            I will not say anything at all about Flash because two camps who BOTH don't get it will start the usual pointless discussion. Flash is rarely used for what it's great at, visualizing data, and plagues us with wildly unnecessary and annoying l33t-masturbation stuff instead.

            Dreamweaver itself is indeed a powerful timesaver in the hands of an experienced XHTML/CSS guy. If you look at it closely, you'll find that it is a very nice graphical frontend to HTML itself, with a great set of shortcuts, so that you almost don't have to touch the mouse at all. The palettes just provide access to the most commonly needed attributes of the element you're working on. If you leave all those nasty "behaviours", "timelines" and whatnot alone, it produces nicely readable and well-formed code. I've been using Dreamweaver since the early betas, and even back then this was the case. I tend to think that this was an initial design goal behind DW.

            The bad comes from the 'designers' who are taught print design at university and apply it to the Web, using all the nutty clicky-pointy tools that produce the JS-laden horror cabinet of non-standards-compliance they dare to call "HTML". It's classic PEBKAC. Look at it this way: if DW didn't have those features, GoLive would've taken over long ago, and we don't want THAT to happen. IMNSHO the only thing worse would be FrontPage. At least the guys at Macromedia didn't invent bogus HTML extensions because they were incapable of providing a proper metadata infrastructure, like Adobe did.

            (I'm not a fanboy though, I just use what works best at the moment for the things I do. If someone shows me how to reproduce this "Apply Source Formatting" feature from DW in Kate/KDevelop and how to synchronize sites like in DW, I'm switching my machine at work from Win2K with DW to KDevelop/nvu on FreeBSD tomorrow, because it better fits the things I do nowadays. It will then match my setup at home.)

            While we're at it, SEO is, was and always will be BS, just like the whole Internet Advertising Myth which after nearly a decade of documented failure still isn't debunked. Duh.
    • by Anonymous Coward
      There will always be >=11 sites wanting to be in the Top10
    • Yeah, but hopefully that will be more expensive than filling their site with content that's actually relevant to the keywords they want to get to the top of, and interesting to anyone searching for them.
  • After link analysis (Score:5, Interesting)

    by Ars-Fartsica ( 166957 ) on Tuesday May 10, 2005 @11:30AM (#12489482)
    It's obvious Google and Yahoo are moving on to trust-based (or perceived-trust) ranking for sites, based on what they see users clicking on through the web accelerator, Yahoo's MyWeb, etc. Hopefully this will help downgrade the obvious spam... although you only find out it's spam by going to the page... we'll see.
    • by JVert ( 578547 ) <corganbilly@hotmail. c o m> on Tuesday May 10, 2005 @11:34AM (#12489549) Journal
      Doesn't seem like the best solution. This would work if you started from a clean slate, but spam pages are still out there and are being clicked on. Not much you can do about that. I just hope it's not something silly like how much time you spend on a page. If I find a page that quickly answers my question, or at least answers part of it, and I click back for other links, I'd hate to think that that site would be marked as "spam".
      • I recently sent Google a suggestion on their Search History feature. I would like to be able to give each site in my history a thumbs-up or thumbs-down rating. This rating should then appear in future searches. That way, at least I can see when a crap site comes up in a search again.

        Eventually Google should be able to start aggregating those ratings to find out how the public perceives a site.
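
        Purely as illustration, that aggregation could be as simple as this sketch. The thumbs-rating feature and the data are hypothetical; nothing like this existed in Google's Search History at the time:

        ```python
        from collections import defaultdict

        votes = [("alice", "example.com", +1), ("bob", "example.com", -1),
                 ("carol", "example.com", +1), ("alice", "crapsite.example", -1)]

        totals = defaultdict(lambda: [0, 0])       # site -> [thumbs up, thumbs down]
        for user, site, thumb in votes:
            totals[site][0 if thumb > 0 else 1] += 1

        for site, (ups, downs) in totals.items():
            print(f"{site}: net {ups - downs:+d} ({ups} up / {downs} down)")
        ```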
  • Yes (Score:5, Funny)

    by Anonymous Coward on Tuesday May 10, 2005 @11:30AM (#12489494)
    But when I search on Tiger, a mail-order company's site still comes up above Apple's. Is anyone at Google listening?
    • Re:Yes (Score:3, Interesting)

      Interestingly enough, the top results for "tiger" are a page about tigers, Tiger Direct, and the Apple page. These seem pretty reasonable to me. The OS is obviously something a lot of people are going to be looking for, but I'd still find it weird if real tigers were not the first link. For "panther" the results are Apple's page, then some pages on real panthers. For "jaguar" you get the car manufacturer, Apple, then real jaguars. I wonder what will happen if you do a search on "tiger" a year from now.

  • by wcitech ( 798381 ) on Tuesday May 10, 2005 @11:31AM (#12489500)
    ...that google is still a "not evil" company? This proxy "web-accelerator" thing really still has me freaked out. Am I just paranoid or is there legitimate reason for concern?
  • by nganju ( 821034 ) on Tuesday May 10, 2005 @11:31AM (#12489506)

    The article is not written by a Google employee, nor did the author speak with anyone at Google. It's simply his analysis of the patent document filed by Google.

    Also, at the bottom of the article after the author's name, there's a link to some search optimization service's website.
  • While the article was in the "mysterious future", I clicked on it, skimmed it, then clicked "printer friendly version" and closed the window with the original browser-friendly page. The printer-friendly version never came up, and the original page was no longer accessible, because in those few seconds the article went live on Slashdot and the server got knocked out. I guess I'll just have to search my cache or find a mirror.
  • Six weeks to fix? (Score:2, Informative)

    by Anonymous Coward
    I use Google quite a bit to check on recent spyware/malware (used it this morning), and with all due respect, the first few links are typically for spyware products that don't work or for domain-parking sites (search engines themselves), so it requires some amount of diligence to get to the "real" sites that have information.

    If this claim is true, I guess we'll have to wait the typical "four to six weeks for delivery."
  • by Doc Ruby ( 173196 ) on Tuesday May 10, 2005 @11:35AM (#12489558) Homepage Journal
    The "war" metaphor really is cute. Geeky competition in search relevance is really a lot like bombing cities, shooting ranks of soldiers, and destroying bridges and railways. Burnt, bloody bodies everywhere! And clean datacenters with mathematical algorithms.
  • by nemexi ( 786227 ) on Tuesday May 10, 2005 @11:36AM (#12489564)
    One of the most interesting (and obvious) effects of Google's changes: The company that once ranked first for the phrase "search engine optimization", SEOinc, is now nowhere to be found; even a search for the company's name doesn't bring up its website. SEOinc's response has been a somewhat ineffective attempt to get those reporting [outer-court.com] on its fall [battellemedia.com] to "cease and desist".
  • by RealProgrammer ( 723725 ) on Tuesday May 10, 2005 @11:37AM (#12489568) Homepage Journal
    I think this is the same article: google:www.coder.com [64.233.167.104]
    Google United - Google Patent Examined


    Publication Date: 4/7/2005 7:41:24 AM

    By Jim Hedger, StepForth News Editor, StepForth Placement Inc.

    Thoughts on Google's patent... "Information retrieval based on historical data."

    Google's newest patent application is lengthy. It is interesting in some places and enigmatic in others. Less colourful than most end-user license agreements, the patent covers an enormous range of ranking analysis techniques Google wants to ensure are kept under their control. Some of the ideas and concepts covered in the document are almost certainly worked into the current algorithm running Google. Some are being worked in as this article is being written. Some may never see the blue light of electrons but are pretty good ideas, so it might have been considered wise to patent them. Google's not saying which is which.

    While not exactly War and Peace, it's a pretty complex document that gives readers a glimpse inside the minds of Google engineers. What it doesn't give is a 100% clear overview of how Google operates now and how the various ideas covered in the patent application will be integrated into Google's algorithms. One interesting section seems to confirm what SEOs have been saying for almost a year: Google does have a "sandbox" where it stores new links or sites for about a month before evaluation.

    Google is in the midst of sweeping changes to the way it operates as a search engine. As a matter of fact, it isn't really a search engine in the fine sense of the word anymore. It isn't really a portal either. It is more of an institution, the ultimate private-public partnership. Calling itself a media-company, Google is now a multi-faceted information and multi-media delivery system that is accessed primarily through its well-known interface found at www.google.com.

    Google is known for its from-the-hip style of innovation. While the face is familiar, the brains behind it are growing and changing rapidly. Four major factors (technology, revenue, user demand and competition) influence and drive these changes. Where Microsoft dithers and .dll's over its software for years before introduction, Google encourages its staff to spend up to 20% of their time tripping their way up the stairs of invention. Sometimes they produce ideas that don't work out as expected, as was the case with Orkut, and sometimes they produce spectacular results, as with Google News. The sum total of what works and what doesn't work has served to inform Google what its users want in a search engine. After all, where the users go, the advertising dollars must follow. Such is the way of the Internet.

    In its recent SEC filing, the first it has produced since going public in August 2004, Google said it was going to spend a lot of money to continue outpacing its rivals. This year they figure they will spend about $500 million to develop or enhance newer technologies. In 2004 and 2003, Google spent $319 million and $177 million respectively. The increase in innovation-spending corresponds with a doubling of Google's staff headcount which has jumped from 1628 employees in 2003 to 3021 by the end of 2004.

    Over the past five years Google has produced a number of features that have proven popular enough to be included among its public-search offerings. On their front page, these features include Image Search, Google Groups, Google News, Froogle, Google Local, and Google Desktop. There are dozens of other features which can be accessed by clicking

  • by Veinor ( 871770 ) <veinor.gmail@com> on Tuesday May 10, 2005 @11:41AM (#12489615)
    Almost any algorithm can be spoofed fairly easily: insert very small text that's the same color as the background, then whenever they want Google to think the page has been updated, change that text. The viewer can't tell the difference, but the source code changes. Or they could just use comments in JavaScript, or create JavaScript that never gets executed.

    Also, a page with frames might get penalized, since its own content doesn't change even though the content of the frames may change frequently.
    • What if Google starts to use a filter designed to eliminate the effect of text that is deemed 'unviewable'? Just check to see if the text color is the same as the background; if it is, ignore it.

      I thought of that in less than 30 seconds. What are the odds Google has already thought about it?
    • "inserting very small text that's the same color as the background"

      Puh-leeeeze! That trick became ineffective last century. It's very easy for the search engine to check background colors and FONT tags and penalize pages that use text that is too close to the background color.
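
      A toy version of exactly that check, assuming the engine has already resolved each text run's foreground and background to hex colors (a real implementation would have to resolve CSS and nested FONT tags first):

      ```python
      def hex_to_rgb(color: str):
          color = color.lstrip("#")
          return tuple(int(color[i:i + 2], 16) for i in (0, 2, 4))

      def is_hidden_text(fg: str, bg: str, tolerance: int = 48) -> bool:
          """True if the text color is within `tolerance` of the background
          on every RGB channel, i.e. effectively invisible to a reader."""
          return all(abs(f - b) <= tolerance
                     for f, b in zip(hex_to_rgb(fg), hex_to_rgb(bg)))

      print(is_hidden_text("#fefefe", "#ffffff"))   # True: near-white on white
      print(is_hidden_text("#000000", "#ffffff"))   # False: ordinary black on white
      ```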

  • by dagnabit ( 89294 ) on Tuesday May 10, 2005 @11:43AM (#12489639)
    Original server is /.ed. Coral cache link [nyud.net] here.
  • In addition to evaluating and scoring web page content, the ranking of web pages is admittedly still influenced by the frequency of page or site updates. What's new and interesting is what Google takes into account in determining the freshness of a web page.

    Since the story submission didn't end the post with a question, I feel compelled to add one:

    How will this affect the ranking of insightful FAQs, which by nature may not change frequently?

    Another shout-out poll to my homeboy Slashdotters: Do you pro

    • Re:FAQs (Score:2, Insightful)

      by eluusive ( 642298 )
      I say F-A-Q not FAQ. I pronounce IRC I-R-C not Irck. It makes me go irck when somebody says erck for IRC. I pronounce MySQL as My-S-Q-L not My Sequel. #$*#@$%&)(@#&%()*#@&%)(*#@% However, I do pronounce LASER as laser the word. Laser is no longer just an acronym.
      • However, I do pronounce LASER as laser the word. Laser is no longer just an acronym.

        It's nice that you care about intelligible pronunciation, but I'd like to inform you that an acronym is not what you think it is. If you look the term up in a dictionary, you'll find that acronyms, like LASER (actually I'd spell that Laser according to my style guide, but in this particular case the point is moot anyway, because it has turned into the word laser), are meant to be pronounced as words. SQL is not an acronym.

  • by Animats ( 122034 ) on Tuesday May 10, 2005 @11:53AM (#12489767) Homepage
    It's clear that Google is gearing up for a crackdown on search engine spamming. They've already started to kill off "link farms". They're checking spam blacklists. And they're not stopping there.

    Note that Google is now looking at domain ownership information. This may result in a much lower level of bogus information in domain registrations. It's probably a good idea to make sure that your domain registration information, business license, D&B rating, on-site contact info, and SSL certificates all match.

    "Domain cloaking" will probably mean that you don't appear anywhere the top in Google. So that's on the way out.

  • by 4of12 ( 97621 ) on Tuesday May 10, 2005 @12:07PM (#12489904) Homepage Journal

    Google has millions upon millions of click-history records on their search results that say what it is people really are looking for, as well as which results appeared to be good fodder for a first click.

    No one else has such a large database of what humans have actually picked.

    Such a click history and search term history asset is worth even more if it gets correlated with Evil Direct Marketing information from the cookie traders.

    Although, it seems possible that large ISPs could also grab and analyze their members' Google interactions to figure out people's tastes, assuming such interactions remain unencrypted.

    I have to wonder how many companies with static IP addresses have, unbeknownst to them, built up extensive history logs at Google showing their search term preferences and click selections. If I were a technology startup with a hot idea to research I'd be a little more paranoid about something like that.

  • by RonBurk ( 543988 ) on Tuesday May 10, 2005 @12:17PM (#12489995) Homepage Journal
    The first big mistake webmasters make when trying to understand how Google ranks search results is failing to grasp the idea of data mining. The Google folks come from a data mining background and constantly write about data mining algorithms; it would be highly surprising if the bulk of the Google algorithm were not constructed via data mining.

    What does that mean? At the highest level, it means that most of the Google algorithm is constructed by a machine. You give the machine human-constructed examples of how to rank a sample set of pages (notice those want ads where Google is hiring people who can inspect and assess the quality of web pages?) and it then uses essentially brute-force techniques to test every possible combination of your ranking variables to find the simplest formula that ranks pages the same way the human did.

    There is no human at Google "twisting dials" to alter individual parameters of a formula. The machine constructs the algorithm, and it can therefore easily be so complex that no human can understand it. Tweaking the algorithm becomes a process of changing or adding to your "training set" of human-ranked pages, and letting the data mining process come up with a revised algorithm.
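
    A caricature of that process in a few lines, with made-up pages, variables, and a made-up human ranking; the only point is that the formula is found by searching weight combinations, not by a human twisting dials:

    ```python
    from itertools import product

    # (inbound_links, content_quality, freshness) per page, plus the order
    # a human rater said these pages should appear in.
    pages = {
        "astronomy-blog":   (120, 0.90, 0.4),
        "link-farm":        (900, 0.10, 0.8),
        "university-paper": ( 40, 0.95, 0.1),
    }
    human_order = ["university-paper", "astronomy-blog", "link-farm"]

    def machine_order(weights):
        score = lambda feats: sum(w * x for w, x in zip(weights, feats))
        return sorted(pages, key=lambda p: score(pages[p]), reverse=True)

    # Brute-force every coarse weight combination; keep the ones that
    # reproduce the human ranking.
    grid = [0.001, 0.01, 0.1, 1.0, 10.0]
    matches = [w for w in product(grid, repeat=3) if machine_order(w) == human_order]
    print(matches[0] if matches else "no matching formula at this granularity")
    ```

    At Google's scale the search is vastly more sophisticated than a grid scan, but the division of labor is the same: humans label examples, and the machine fits the formula.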

    For example, Google could invent a new variable called "category", and identify each page as belonging to category Astronomy, Botulism, Country, [...] and Other. Once that variable is thrown into the mix, the Google algorithm is essentially free to vary wildly from one type of subject matter to the next. For example, you might see someone with a real estate site swearing up and down that inbound links are no longer as important, while someone with an astronomy site might swear that, no, inbound links are more important than ever. You can see exactly this kind of bickering in most of the forums frequented by people who hope to do Search Engine Optimization.

    The other big mistake people make in trying to see how to game the Google algorithm is "delay". In studying how people manage (or fail to manage) complex systems, psychologists learned that people generally would fail if a delay was introduced between their actions and the results of their actions.

    In one very simple test, people were charged with trying to stabilize the temperature in a virtual refrigerator. They had one dial, and there was exactly one piece of feedback: the current temperature in the fridge. However, they were not explicitly told that there was a delay between moving the dial and when the results of that action would stabilize.
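
    A toy simulation of that experiment (the dynamics and numbers are my own invention): a bang-bang controller reacts to the temperature it sees now, but its dial setting only takes effect a few steps later.

    ```python
    from collections import deque

    def run(delay, steps=60):
        temp = 10.0
        pending = deque([0.0] * delay)          # dial settings still "in the pipe"
        history = []
        for _ in range(steps):
            temp += pending.popleft() + 0.5     # delayed cooling plus constant heat leak
            dial = -2.0 if temp > 4.0 else 1.0  # react instantly to the current reading
            pending.append(dial)
            history.append(temp)
        return min(history[20:]), max(history[20:])   # swing after settling in

    for delay in (1, 4):
        lo, hi = run(delay)
        print(f"delay={delay}: temperature swings between {lo:+.1f} and {hi:+.1f}")
    ```

    Same strategy, three extra steps of delay, and the oscillations widen dramatically; the controller (or webmaster) ends up fighting chaos of its own making.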

    The responses of those test subjects were eerily similar to what we see in Google-gaming webmasters these days. Some people swore up and down that some human behind the scenes was directly tweaking the results to thwart whatever they did. Others became frustrated and decided that nothing they did really mattered, so they would just swing the dial back and forth between its minimum and maximum settings.

    What does this have to do with Google? These days, Google can change their algorithm relatively frequently, and the algorithm can vary by the relative date of various things. The net result is that there's a delay between when your page is first ranked and when it is likely to arrive at a relatively stable ranking. This can drive webmasters nuts as they think they've done something clever to rank their page high, but then it drops a week later. Although it doesn't occur to them, the important question is: did the change cause the high ranking, or did it cause the sudden decline?

    The few people who did master the simple refrigerator system? Well, they sounded more like some of the people who are more successful at gaming Google. Those folks tend to say things like: "just make one change and then leave it alone for a while to see what happens."

    Can you still game the Google algorithm? Undoubtedly, in specific cases. But it's getting harder. The Google algorithm was always complex, but what's changing is that the days when a few variables (such as inbound link count) generally swamped the effects of all the others are drawing to a close. We are approaching the day when the best technique for ranking highly with Google will be: sit down at your keyboard and make more good content every day.

  • by Eric Damron ( 553630 ) on Tuesday May 10, 2005 @12:23PM (#12490068)
    There seems to be a lot of weight put on web page freshness. I host a friend's site containing the collection of poems by Ella Wheeler Wilcox. She lived in the 1800s so one cannot expect to see any new material from her.

    The site is mostly static but is rich with cultural value. It's currently the number one hit on Google. I'm hoping that Google's emphasis on "freshness" won't make his site fall in ranking.
  • People expect SEO to get more complex as time goes on. This isn't news, and SEO is not going to disappear. What will happen is that people with little motivation or few resources will be further discouraged from doing SEO as competition increases. That's it. TrustRank will take over from PageRank. Link history will become more important than simply having links. Easily created SEO tools such as link farms and blog spammers will decrease in value. Everyone expects these things to happen. SEO will always exist largely b
  • Wait a second. (Score:2, Insightful)

    Isn't this "page update frequency" hullaballoo a bit premature? If Google wants relevant results I can only see update frequency being but a minor factor in any page rank determination algorithms. For example: Informations sites (historical information, dictionaries, encyclopedias, collections, etc...) are often at once the most relevant (if info is what you're looking for) and the least updated sites. I can't really imagine the Oxford Faculty meeting every week to decide new words for their dictionary to
  • by mejesster ( 813444 ) on Tuesday May 10, 2005 @12:55PM (#12490463)
    It seems nobody has asked the question: what if a spammer wants to lower the rank of more reputable companies? If a spammer link spams a site that is already fairly popular, couldn't it harm the page rank of a company that has nothing to do with the spam?
  • Or at least they start out spelling the same way.
    It's not nice to fool Mother Nature.
