Akamai Wins Lawsuit to Protect Obvious Patent 173

brandaman writes "Akamai, the largest content delivery network (CDN) with about 70% market share, recently won its lawsuit against the second-largest CDN, Limelight Networks. The suit asserted that Limelight was infringing on Akamai's patent, which, upon examination, seems to be somewhat on the obvious side. 'In accordance with the invention, however, a base HTML document portion of a Web page is served from the Content Provider's site while one or more embedded objects for the page are served from the hosting servers, preferably, those hosting servers near the client machine. By serving the base HTML document from the Content Provider's site, the Content Provider maintains control over the content.' Limelight is obviously not pleased, and this is not the first lawsuit Akamai has won regarding its patents."
  • I guess I'd better shut down BlogPuzzles.net [blogpuzzles.net] immediately, since it obviously infringes on Akamai's patent. My site allows people to host a base HTML document, with embedded content (puzzles) being hosted on my servers. This is clearly unlicensed use of Akamai's intellectual property. While I'm at it, I'd better warn Google [google.com] before they get involved in a real financial nightmare over content hosted on their servers and integrated into other people's websites. Now, where did I stick that attorney's phone number?
    • Actually, Google has used Akamai technology and services. Google.com DNS was hosted by Akamai, and some of their other services use Akamai for content delivery such as YouTube. As Google has grown, they have become less reliant on Akamai.
    • by Iphtashu Fitz ( 263795 ) on Sunday March 02, 2008 @08:13PM (#22618824)
      Sorry, but you're not even close.

      The way Akamai works is it distributes the "heavy duty" content like images and scripts to its own servers all around the world. It then lets its customers (like E*Trade, to pick one actual example) modify their static HTML content to refer to those images in a special way. For example, the E*Trade home page has the following link in it for one of its images:

      https://a248.e.akamai.net/n/248/1777/20080228.0/www.etrade.com/images/prospect/topGrad.gif [akamai.net]

      The URL is specially encoded in such a way that when your local DNS server queries a248.e.akamai.net, the DNS server returns a server located physically near you. So if you're in England a248.e.akamai.net might resolve to an IP located in London, but in New York City it would resolve to an IP somewhere in New York. Then when the HTTP request is sent, Akamai's servers decode that annoyingly long URL to determine which customer of theirs it is and serve up the correct image. It's actually a fairly complex and fast process. If the server that you're directed to doesn't actually have the image locally then that Akamai server will query another nearby Akamai server. If that server also doesn't have it then it'll actually pull the image down from a master server that E*Trade uploaded the image to.

      You can test this out yourself by looking up the IP address of a248.e.akamai.net yourself. Locally you'll get one IP. If you do a google search for dns lookup tools you can submit that domain name to other sites to look it up and you'll get totally different IPs that are physically close to wherever that domain lookup tool runs from.

      The bottom line is that it's a pretty complex process that involves both the use of DNS to ensure you download large chunks of content from physically nearby servers as well as some pretty sophisticated caching in the background to make sure static content is delivered rapidly no matter where in the world you are.
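      The geo-aware DNS behavior described above can be sketched in a few lines of Python. This is a toy illustration only, not Akamai's actual algorithm: the region table, IP prefixes, and edge addresses are all invented, and a real CDN would use BGP data, latency measurements, and load feedback rather than a static table.

```python
# Toy sketch of a geo-aware authoritative DNS server: the same
# hostname resolves to different edge IPs depending on where the
# querying resolver sits. All prefixes and addresses are invented.

EDGE_SERVERS = {
    "europe": "203.0.113.10",    # hypothetical London-area edge node
    "us-east": "198.51.100.20",  # hypothetical New York-area edge node
}

# Crude mapping from a resolver's IP prefix to a region.
RESOLVER_REGIONS = {
    "81.": "europe",
    "66.": "us-east",
}

def resolve(hostname, resolver_ip):
    """Return the edge IP 'nearest' the resolver asking about hostname."""
    for prefix, region in RESOLVER_REGIONS.items():
        if resolver_ip.startswith(prefix):
            return EDGE_SERVERS[region]
    return EDGE_SERVERS["us-east"]  # arbitrary fallback region

print(resolve("a248.e.akamai.net", "81.2.3.4"))   # resolver in Europe
print(resolve("a248.e.akamai.net", "66.5.6.7"))   # resolver in the US
```

      The same query gets a different answer purely as a function of who is asking, which is the piece that ordinary DNS deployments of the time did not do.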

      I used to work at Akamai so I have pretty good firsthand knowledge of how their stuff works. I doubt a lot of the algorithms they use would pass the "obviousness" test...
      • by MobyDisk ( 75490 ) on Sunday March 02, 2008 @09:52PM (#22619426) Homepage

        I used to work at Akamai so I have pretty good firsthand knowledge of how their stuff works. I doubt a lot of the algorithms they use would pass the "obviousness" test...
        I'm reading the linked patent now, and I think the problem is that what is patented is not an algorithm, but a network architecture. This is furthermore a mucky issue because according to patent law, algorithms are not patentable. In the US "mental processes" are not patentable either. But the patent office grants "algorithm" patents so long as the submitter is implementing it in hardware or software. Oddly enough, even things like RLE are patented even though they can easily be done in your head.

        I am not familiar with this particular case, but the big issue here is that Akamai might be trying to patent the general concept of distributing cache servers around the world. This is the kind of thing that the patent office should not allow. If I have a better way to do this, or even the same way, I should be allowed to do it. Akamai is the leader in this industry; they are well established, and nobody is going to knock them off the map suddenly one day by copying them. They don't need patent protection. Furthermore, this is the kind of thing any group of competent developers can create, and 10 different groups would have 10 different ways of doing it. Even if a patent is appropriate here, it should not be used to squash similar competing services.
        • Re: (Score:3, Informative)

          by Tablizer ( 95088 )
          according to patent law, algorithms are not patentable

          They are now under the "business process" umbrella. The courts are accepting these so far.
               
          • by Alsee ( 515537 ) on Monday March 03, 2008 @01:12AM (#22620618) Homepage
            The courts are accepting these so far.

            Yeah, lower US courts decided to start allowing software patents.

            However in the current Microsoft-AT&T case before the US Supreme Court the multiple justices were clearly skeptical of that behavior. In particular:

            JUSTICE BREYER: I take it that we are operating under the assumption that software is patentable? We have never held that in this Court, have we?

            MR. JOSEFFER [DOJ Atty]: No, but as I was saying before -

            JUSTICE BREYER: So what should we do here? Should, if we are writing this, since it's never been held that it's patentable in this Court

             I have read the Supreme Court rulings relating to software patents. The rulings were back in the early 80's or so, back before the lower courts went off on their software patent kick. It seems clear to me that the lower courts have ignored or directly violated several points of Supreme Court law in those rulings. It appears that the Supreme Court is looking to directly rule on the subject and rein in the aberrant lower courts.

            -
            • by Tablizer ( 95088 )

               JUSTICE BREYER: So what should we do here? Should, if we are writing this, since it's never been held that it's patentable in this Court [end Breyer quote]... It seems clear to me that the lower courts have ignored or directly violated several points of Supreme Court law in those rulings. It appears that the Supreme Court is looking to directly rule on the subject and rein in the aberrant lower courts.

              Chemistry and genes didn't used to be patentable, but they ended up that way anyhow after long arguments.

              • by Alsee ( 515537 ) on Monday March 03, 2008 @09:16AM (#22622488) Homepage
                Think about it

                I've spent plenty of time; as I said in my post, I've actually read all of the Supreme Court rulings on the subject.

                A number is not an "invention". An equation is not an "invention". A calculation is not an "invention". Mental information processing is not an invention. Mental information processing does not magically become a patentable invention when you OBVIOUSLY use a calculator to accelerate/automate it. Mental information processing does not magically become a patentable invention when you OBVIOUSLY use a computer to accelerate/automate it. Mental information processing does not magically become a patentable invention when you OBVIOUSLY do it on the internet to accelerate/automate it.

                The Supreme Court has explicitly ruled that no possible algorithm can ever qualify as "novel" or "non-obvious" for patent purposes. Therefore no possible software can ever qualify as novel, no possible software can ever qualify as non-obvious, no possible software can ever be an invention. There is nothing novel or non-obvious in blatantly using an ordinary computer to carry out that "non-novel" "obvious" calculation. Sticking the words "on a computer" at the end of a mathematical information manipulation does not magically turn it into a patentable invention.

                It doesn't matter if you are the first person to write down some particular number, it cannot be "novel". It doesn't matter how many digits long your number is, it cannot be "non-obvious". No possible math, no possible information processing, no possible mental process, no possible algorithm, no possible software, can ever be an invention. As the Supreme Court said, they can never qualify as novel or non-obvious for the same reason that laws of physics are never treated as novel or non-obvious for patent purposes. An invention may make use of gravity, but F = G*M1*M2/R^2 is not an invention, and it is treated as non-novel and as obvious, even if you are the first person to figure it out.

                Chemistry and software are not really that different philosophically

                No matter how long I *think* about a chemical reaction I will never actually make any molecules.

                Physical objects and physical processes are philosophically different than math/calculations/mental-processes.
                Physical objects and physical processes are concretely different than math/calculations/mental-processes.

                -
                • Re: (Score:3, Informative)

                  by udippel ( 562132 )
          I've spent plenty of time; as I said in my post, I've actually read all of the Supreme Court rulings on the subject.

                  Okay, so then you can enlighten me, the half-knowing. To my knowledge the whole mess started with the Supreme Court rebuking the USPTO in the Diamond vs. Diehr case, where the USPTO was kind of ordered to grant a patent on essentially software. Yes, I read the patent and some resources around it. Yes, the Supreme Court held the earlier appeals for non-patentable. Though in Diamond vs. Diehr th
        • This is furthermore a mucky issue because according to patent law, algorithms are not patentable.

          What about RSA, LZW, LZS and MP3?
          • by reebmmm ( 939463 )
            Algorithms are not patentable. Particular implementations of algorithms are definitely patentable. Always have been. Machines that implement well-known equations from physics are definitely patentable.

            In your case, implementing particular equations for the purpose of compression and encryption would definitely be patentable IF (that's a big IF) they also meet the other criteria for patentability.
      • by glwtta ( 532858 )
        So if you're in England a248.e.akamai.net might resolve to an IP located in London, but in New York City it would resolve to an IP somewhere in New York.

        Great. I wonder if the fact that DNS was specifically designed with this sort of thing in mind undermines Akamai's inventiveness here?

        Of course it's pretty pointless speculating here, since TFA has exactly zero information about what's specifically being infringed here: could be something of substance, or could be the old "use a more or less complex
    • A company I contracted to was building an Internet gambling site (football spreads). Since the site was technically not legal in the US, the site was run off-shore (Haiti and the DR). The problem was that bandwidth (and latency) was not good to either country. So, we served the images for the site (perfectly legal) off a domestic server. Fast response time, but the actual site logic was safely out of jurisdiction.

      The site was called "Wager Web", but I don't think it exists anymore. After the off-shore clients st
  • by Gonoff ( 88518 ) on Sunday March 02, 2008 @07:31PM (#22618562)

    As I am not a lawyer, it was not obvious to me what they were patenting.

    Is this patenting having the html on one server and the rest (pictures etc) on other ones?

    If it is that, I think there should be some prior art in the original stuff from Tim Berners-Lee.

    • Re: (Score:3, Informative)

      by Anonymous Coward
      Is this patenting having the html on one server and the rest (pictures etc) on other ones?

      Apparently only in the specific case of having the "other ones" be distributed across the network and with the "closest" server to the client chosen to download the content from.

      I suppose that things like mirrors, etc. don't count because in that case the user typically chooses what they believe to be the closest server rather than the host or akamai.
      • Apparently only in the specific case of having the "other ones" be distributed across the network and with the "closest" server to the client chosen to download the content from.

        Without having read the patent (mea culpa) your description sounds damning in terms of obviousness and probability of prior art.

        I have two issues with Akamai now: first, sitting front and centre in the evil patents problem, playing a starring role as a patent troll. Second, being a Linux freeloader. A grep of my lkml mailbox reveals zero Akamai email addresses. Not at all becoming of a multibillion-dollar business that built itself largely on the back of low-cost Linux servers.

      • Genuity, a web hosting company, was doing this via their "hopscotch" routing protocol in 1997. They were bought by GTE at that point but the technology had already been in development for several years. I met the founders at a conference and we exchanged some ideas on improvements based on some work I was doing for another company. Basically, though, they had connections into all of the major NAPs in the US and a dynamic cost-based routing protocol that chose which server to use for which customer. Dynamic updates to the site data (e.g. actually buying stuff) were more complex, obviously, because they had to wait for the transaction to synchronize, but at least they benefited by processing the request through the fastest pipe to the browser. Those updates and associated content came from a different server, matching the patent requirements.

        I found this article ( http://findarticles.com/p/articles/mi_m0EIN/is_1997_Dec_10/ai_20053332 [findarticles.com] ) rather easily, going back to 1997.
    • by gatzke ( 2977 )

      I can't find it, but somebody patented storage in the door of a refrigerator. That seems obvious, but a true improvement. They had 17 years to exploit the monopoly on refrigerators with door storage.

      My wife worked at a paper company. They had patents on how to cut cardboard to make containers (french fry boxes, etc)

      Obvious stuff can be patented.
      • by Anonymous Coward on Sunday March 02, 2008 @07:46PM (#22618688)

        Obvious stuff can be patented.
        In practice. In theory, that's not supposed to happen. But the patent system, like the cake, is a lie. Patent monopolies exist to prevent free markets.

        People perennially confuse the theory of the patent system (reward the poor starving inventors) with its actual empirical effects (allowing corporatist elites to control innovation and the very direction of a technological society).
      • Re: (Score:3, Interesting)

        Cutting boxes to minimize waste and facilitate processing can be solutions to VERY non-obvious problems easily deserving patents.

    • Re: (Score:3, Interesting)

      It's a combination of modified URLs in the static HTML, DNS trickery that causes those URLs to be downloaded from servers physically close to you, and smart caching of that content. It basically provides a way of ensuring that static content like images, which take up a lot of bandwidth compared to HTML documents, is downloaded from servers physically near you and not from the company's primary server. It dramatically speeds up the loading of web pages no matter where the requests come from, and offload
    • by EmbeddedJanitor ( 597831 ) on Sunday March 02, 2008 @08:18PM (#22618882)
      I am also not a lawyer, but I have written over ten patents and read many.

      As in many of these "obvious patent" trolling articles, the article/summary oversimplifies the patent. The patent does not just claim click-here-fetch-there redirection, which is used by just about every major site, but algorithms for doing the load balancing etc.

      If you read some of the claims, then you'll see that various algorithms are used for load balancing and other purposes. While these might be obvious to some, they are certainly not obvious to all.

      The test of "obvious" is also not that clear cut. IIRC, the test is "reasonably obvious to practitioners of the art". This test should be applied to the state of the art as at the time of the patent, because a patent "teaches" the industry, and therefore after the disclosure the less-than-obvious becomes obvious.

      • The web, along with HTML, has been available since, what, '93?

        They didn't file the patent until '99.

        If it's that obvious, one would assume there's prior art from that six year window.

        If it had been filed in say '95 when there wasn't that much interest in the web yet, that'd be one thing. By '99, a hell of a lot of people were using it and filing IPOs as fast as they could come up with ideas.

        The alternative explanation would be that it wasn't all that obvious and the obviousness either comes from retrospect
        • As I note in a previous post, http://yro.slashdot.org/comments.pl?sid=472974&cid=22621278 [slashdot.org] , Genuity was doing dynamic cost-based routing and smart mirroring in 1997 and the technology had already been in development for several years. The company I was working with was also working on similar technology, which is how I got introduced to the founders of Genuity (about the time they were bought by GTE). I know of at least one other effort to do the same thing during the same period, although it was not as far along as either ours or Genuity's "Hopscotch" protocol, and another company I worked with was doing the same thing with distributed database systems in 1998 (project was over five years old when I worked with it).

          We also had an Internet gambling site at the time which used at least elements of the patent, in that it was an off-shore (for legal reasons) site with static content served domestically for performance through multiple NAP connections with some routing magic. Nowhere near as advanced as either Genuity or the design we were working on, but obviously pointing toward that goal.

          A guy, possibly by the name of Alex Yuriev, was talking about distributed sites and dynamic routing in Philadelphia in 1996-1997. He may have worked for NetAccess at that point and was a bit of a genius with BGP and routing in general. My business partner at the time talked back and forth with him about some of the similar things we were working on.

          The base concept is just not that hard, and the most difficult part of the implementation is physical and logistical, not technical. The hard technical part is doing dynamic updates to the distributed systems and synchronizing transactions, but even that can be fudged decently if you are willing to go with the 90% solution that gives you most of the benefit.

          So basically, there was a lot of activity on this sort of thing in the 90's, the technology was clearly driving in that direction, and it becomes easy once the underlying tools are in place.
    • I would assume they are talking about Edge Side Includes [akamai.com] and not simply about the serving of images.

      ESI is like Server Side Includes, except that the included part resides on the Edge servers. So your server would serve a page with only the content personalized to you specifically (like the fact that you are logged in) but a box full of news headlines that everyone sees would be included by the edge server.
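      For illustration, a minimal ESI fragment might look like this. The page and URL are hypothetical; the esi:include element itself is part of the ESI 1.0 language that Akamai's edge servers process.

```html
<!-- Served by the origin: only the user-specific greeting is
     generated there. The headline box is stitched in by the edge
     server when it expands the esi:include tag. URL is invented. -->
<html>
  <body>
    <p>Welcome back, Alice.</p>
    <esi:include src="http://example.com/fragments/headlines.html" />
  </body>
</html>
```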

      Not entirely obvious, but I am not so sure it warrants patent protection in any case.
      • I don't think so. Akamai's servers don't store the client's content statically, there's much, much, much too much content for that, especially for streaming media. It would be hideously expensive in disk space for their servers. Instead, they use a web proxy to rewrite the URL's, grab the content from their customer's designated servers, and keep it cached locally on that Akamai server or set of servers as it's needed. So the first hit on an image takes a while to grab from the customer, but the rest go aga
        • by daBass ( 56811 )
          Who said anything about static? :)

          But read the link I pointed to about ESI. Akamai does more than just serve dependencies like images, video, css, js, etc. They actually have an engine that assembles HTML also.

          That said, I don't think any more this is specifically about ESI; the patent doesn't really mention it. It just seems to cover the good old image includes.

          No matter, back when this was patented, this was far from obvious and certainly a novel idea. I am not the biggest fan of software patents, but I'
          • That ESI link looks like what I read about 5 years ago when discussing Akamai with someone who thought having a Squid server in their local network would do the same thing and we decided that no, there are big differences.

            I agree that a lot of this was far from obvious when the patent was made. And I agree that this is not a patent troll issue: they actually use the patent, and the patents they've filed seem surprisingly legible. It's not like Microsoft's "57 patent violations by Linux" that they refuse to
    • by tambo ( 310170 ) on Sunday March 02, 2008 @08:49PM (#22619094)
      As I am not a lawyer, it was not obvious to me what they were patenting.

      Let's start first with the definition of "obviousness." Patent law doesn't go by the common English or Webster's definitions of the term - it has a very technical meaning, refined by probably a thousand patent law cases. Many nuances. And unhelpfully, the definitive section [uspto.gov] on the topic is circularly defined.

      There are at least two reasons for the technicalities. First, virtually anything is "obvious" in hindsight. Second, what is "obvious" to one (unbiased) person may be completely non-obvious to another (unbiased) person, and the patent office would produce radically inconsistent results if examiners were permitted such subjectivity.

      - David Stein

    • Is this patenting having the html on one server and the rest (pictures etc) on other ones?
      No, it is absolutely NOTHING like that at all. I suggest you RTFA and RTFP and come back.

      If it is that
      It isn't.
  • by QuantumG ( 50515 ) <qg@biodome.org> on Sunday March 02, 2008 @07:36PM (#22618594) Homepage Journal
    in retrospect.

    The sex and violence of a patent is in the claims. Go read 'em [uspto.gov], and now look at the date the patent was filed: May 19, 1999, which means it was being written for 6 to 8 months before that. You're saying that rewriting URLs in a web page to fetch objects from geographically different servers was obvious in late 1998?

    Not defending the patent system in the US or anything, but claiming that something is "obvious" now when the patent was filed in '99 is pretty freakin', well, obvious!

    • by Anonymous Coward on Sunday March 02, 2008 @07:47PM (#22618692)
      You're saying that rewriting urls in a web page to fetch objects from geographically different servers was obvious in late 1998?

      Yes.

      Well, maybe not if you were in high school then. But to people actually doing content delivery over the web, yes. And there were starting to be big web sites around even then.
    • Re: (Score:3, Insightful)

      > You're saying that rewriting urls in a web page to fetch objects
      > from geographically different servers was obvious in late 1998?

      Technically, yes. Remember Image bandwidth-stealing? A guy hosting images would find others not only presenting those images in a different website, but to add insult to injury, would load those images from _his_ servers? (i.e. they had modified their IMG tags to load images from the unwitting originator.) Now, if the originating servers were clustered and/or geographicall
      • Re: (Score:3, Insightful)

        A guy hosting images would find others not only presenting those images in a different website, but to add insult to injury, would load those images from _his_ servers? (i.e. they had modified their IMG tags to load images from the unwitting originator.) Now, if the originating servers were clustered and/or geographically distributed, you've got a setup just like Akamai.

        Not really. What you describe is basically just offloading static images to an unsuspecting third party. If it's a popular website then t
        • Re: (Score:3, Interesting)

          by russotto ( 537200 )
          There may be all sorts of trickery involved in what makes Akamai work, but the patent covers more than that trickery. It covers any system where a webserver modifies a URL to include a hostname whose DNS entry is served up by two DNS servers in the system, and whose content is served up by a host other than the webserver.

          For example, if I have a webserver at example.com, and it modifies image URLs within it to point to foo.bar.example.com, and there's an 'example.com' DNS server which contains the NS record
        • by WNight ( 23683 )
          Yeah, it's more than just links to an outside server, it's links to an outside server that's close to you and has the data.

          But, that's pretty damn obvious. Round-robin DNS for load-balancing is an old and obvious technology. Instead of spreading the load in strict order, or by load average, you'd instead use a lookup table of IPs to physical locations.

          And that's pretty trivial. I've seen MUDs in the 80s that knew where you were connecting from.

          Caching is "hard", in that hairy issues exist, but simple enoug
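          The "lookup table instead of strict rotation" idea in this comment can be sketched as follows. This is a hypothetical illustration, not any CDN's real logic: the server addresses and client prefixes are made up.

```python
import itertools

SERVERS = ["198.51.100.1", "198.51.100.2", "198.51.100.3"]

# Plain round-robin: hand out servers in strict rotation.
_rotation = itertools.cycle(SERVERS)

# Geo-aware variant: first consult a (made-up) table mapping client
# network prefixes to the physically nearest server.
GEO_TABLE = {
    "192.0.2.": "198.51.100.1",
    "203.0.113.": "198.51.100.2",
}

def pick_server(client_ip):
    """Prefer the geographically nearest server; fall back to rotation."""
    for prefix, server in GEO_TABLE.items():
        if client_ip.startswith(prefix):
            return server
    return next(_rotation)  # unknown client: plain round-robin
```

          The point of the sketch is how small the conceptual step is from round-robin to geo-aware selection; the hard engineering is in building and refreshing a good table, not in consulting it.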
        • > describe is basically just offloading static images to an unsuspecting third party.

          What if the 'unsuspecting third party' was a geographically distributed server setup like CPAN or sourceforge.net?

          Nefarious intent does not invalidate prior art.

          I have no hassles with Akamai protecting something innovative, say their precise server-determination algorithms, but geographically distributed HTTP load balancing itself is not worth protecting.
          • What if the 'unsuspecting third party' was a geographically distributed server setup like CPAN or sourceforge.net?

            CPAN, SourceForge, etc. don't use DNS load balancing like Akamai does. They just use a collection of static mirrors and they randomly pick one or let you pick from a list. Now if you could just type in "download.sourceforge.net" and have it return a download location logically close to you that would be a different situation. But how would you implement this? If there were hundreds of downlo
      • by Alomex ( 148003 ) on Sunday March 02, 2008 @09:29PM (#22619312) Homepage
        I was doing web caching at the time (I got my hands very early on the original hotspots paper by Akamai's founders). When I learned of the embedded elements redirection I found the Akamai idea totally non-obvious and far more reaching in terms of web caching than their hotspots contribution. Of course, once I saw it, all I could say was "why didn't I think of that, it's so obvious!"
    • Re: (Score:3, Insightful)

      by rastoboy29 ( 807168 ) *
      No, it was way obvious by then.  You must not be old enough to remember.

      And even so, it is in no way a brilliant idea.  I was making web pages with content sucked from multiple sites in 1994, and I was no genius.

      It may not be obvious to a non-technical judge or jury, however, even today.
      • It may not be obvious to a non-technical judge or jury, however, even today.

        Which is really the core of the whole issue, isn't it? Obvious TO WHOM? Under patent law, that "whom" is "a person skilled in the art." In other words, a techie. Now, a lot of techies are vile, petty, competitive creatures with a great disdain for humanity as a whole. For reference, see Slashdot (http://www.slashdot.org/). Of course we are going to find numerous tech professionals who will claim that any damn thing under the Sun

        • by WNight ( 23683 )
          Perhaps because 'we', presumably uber-logical geeks, are smarter than everyone else. Maybe not by touchy-feely standards of IQ, but by the metric of being able to better parse complex statements, or design complex systems, yes.

          As to smarter than other techs? Well, there are many people who assume that because they didn't know how to do something it must be black magic. There are others who know, from experience, that there is little you can't do well with a well-intentioned group of smart developers. (Polis
      • by AusIV ( 950840 )
        Yes, but did you suck the content from the site nearest the requesting user? This patent doesn't cover pulling content from somewhere other than the server that offered the HTML document, it covers algorithms that determine which server is best able to provide content to a particular user.
        • Are you seriously suggesting that the patent is worthy?  Gimme a break, it's merely a relatively minor efficiency tweak--anybody would have thought of the same thing when faced with the same problem.
    • by bit01 ( 644603 ) on Sunday March 02, 2008 @08:14PM (#22618834)

      in retrospect.

      No it is not, and your hand waving is not helping. The PTO loves to push this self-serving nonsense as if it were fact. People are perfectly capable of evaluating whether something is obvious or not after the fact. They don't mystically lose their intelligence simply because they have more facts at their disposal.

      This is obvious, if for no other reason than that the HTTP/HTML protocols have built in the ability to get different elements of the one page from different servers and to URL-redirect a client from one server to another, plus the address rewriting rules in popular servers like Apache. All of these capabilities existed for years before this "patent". Not to mention DNS referral, caching, network throttling etc., which existed for decades before this "patent". Don't be fooled by patent "claims" which list standard techniques together and then claim the assembly is somehow "different".
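      For instance, the Apache address-rewriting capability the comment refers to looks roughly like this. The mod_rewrite directives are real Apache syntax, but the hosts and paths are invented for illustration:

```apache
# Hypothetical rule: send requests for static images off to a
# separate image host, leaving the base HTML on this server.
RewriteEngine On
RewriteRule ^/images/(.*)$ http://images.example.com/$1 [R=302,L]
```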

      Face it, this "patent" is blindingly obvious to anybody with even basic training in networking. The fact that this got through just shows how incompetent the PTO is. Not surprising, given the chutzpah of claiming that the bureaucrats in a small government department can assess an idea against all of human knowledge to determine whether it is original or not. Only a scientist working a lifetime in a very narrow area can do that, and even then they make mistakes.

      ---

      "It is difficult to get a man to understand something when his job depends on not understanding it." - Upton Sinclair

      • by QuantumG ( 50515 ) <qg@biodome.org> on Sunday March 02, 2008 @08:24PM (#22618922) Homepage Journal
        Actually, around 1998, a DNS server that returned a different IP address for a lookup based on who the request is for was not only novel, it was considered WRONG. Geographical load balancing was your typical dot-com boom idea.

        plus the address rewriting rules in popular servers like Apache.
        Evidence that you didn't even read the patent... and you have the audacity to call the PTO incompetent. Not saying they're not, just saying that you shouldn't be throwing stones here.

        • "Actually, around 1998, a DNS server that returned a different IP address for a lookup based on who the request is for was not only novel, it was considered WRONG."

          Yet nothing about that makes the patent non-obvious to someone in the field, which is how the USPTO is supposed to rate obviousness, not whether the RFC happened to concur with the idea. I specifically remember thinking that the patent was so bloody obvious that I couldn't believe it had been granted in the first place.
        • by bit01 ( 644603 )

          Actually, around 1998, a DNS server that returned a different IP address for a lookup based on who the request is for was not only novel, it was considered WRONG.

          It's just caching at a different network layer. Big deal.

          Geographical load balancing was your typical dot-com boom idea.

          No it was not. "Geographical load balancing" is just an obfuscated name for "caching", something that had been known about for decades. One of the big problems with the PTO is that they are endlessly confusing invented t

      • by rasilon ( 18267 )
        People are perfectly capable of evaluating whether something is obvious or not after the fact. They don't mystically lose their intelligence simply because they have more facts at their disposal.

        The facts beg to differ. Research into negligence cases shows that people are far more likely to rate something as obvious when they've seen the outcome.

        I can't remember who defined genius as "The ability to see the obvious things that others miss.", but it's sometimes useful to remember.
    • by dbIII ( 701233 )

      You're saying that rewriting urls in a web page to fetch objects from geographically different servers was obvious in late 1998

      No, I don't think that is what is being said. It was obvious to some people in 1992.

    • Re: (Score:3, Interesting)

      by Gr8Apes ( 679165 )
      Well, HTML, circa 1993, was designed to allow for referencing external components. Rewriting URLs was a fundamental principle of serving pages and applications, and existed since at least 1996 that I'm aware of, as I did it then. Add to that that commercial IP blocks are owned by companies with definite locality, and I'm not sure what part of Akamai's patent isn't stating things that were already in existence.

      It was already stated that algorithms cannot be patented. And that's all that Akamai se
    • Re: (Score:3, Interesting)

      by TheRaven64 ( 641858 )
      I did some web design for a company around 93-94 (I can't remember exactly when, Netscape 2 was either new and shiny, or just about to be new and shiny, and most people used Mosaic). Because they wanted to be able to modify the page easily, they wanted to host it locally. Since they only had an ISDN line, I put their images on the web space provided by the ISP (slow to update, but lots of capacity) and the HTML on their computer. More recently, but well before 1998, I helped my school IT department set t
      • by QuantumG ( 50515 )
        And none of those things you just described is what the patent is about. Go and read the patent already.
    • by kriegsman ( 55737 ) on Sunday March 02, 2008 @09:39PM (#22619356) Homepage
      Well, it was pretty obvious then, at least to people in the business, especially considering that at least one earlier CDN patent (e.g., US Pat. 5,991,809, originally filed as provisional pat. 60/022,598 on July 25, 1996 [uspto.gov], by me) had already been granted and therefore made completely public in 1997. Clearway Technologies (my company) was already selling a commercial off-the-shelf CDN implementation system starting in September of 1996. Akamai's success has been substantial, and I feel it is truly well-deserved, but they were not the first to invent a CDN, nor the first to patent it, nor the first to commercialize it.

      -Mark Kriegsman
      Founder, Clearway Technologies
    • by tuomoks ( 246421 )
      Yes, very obvious. Drop modifying URL ( or any other mechanism to identify ) - it has been obvious a long, long time. Drop modifying DNS ( or any other mechanism to identify source and/or origin ) - obvious a long, long time ago. And so on. Does anyone remember ordering cars or car parts using a computer in the '80s? Let's see - at least Toyota did that, I think Nissan and Volvo maybe at the same time. The requests got distributed depending on system, network and locality to a link to Japan, the country ma
  • Non-obviousness (Score:5, Insightful)

    by Prime Mover ( 149173 ) on Sunday March 02, 2008 @07:44PM (#22618674)
    Just did a report about business patents. Non-obviousness, a requirement of patents (35 USC 102?), isn't proven by looking at something and saying "Duh!" You need to show prior art, preferably enough prior-art examples to cover all of Akamai's claims.
    • by Titoxd ( 1116095 ) on Sunday March 02, 2008 @07:49PM (#22618702) Homepage
      Damn, and I just used my mod points... people need to start realizing that the best way to argue against a patent is not by saying "but so-and-so did this", but to tell the USPTO (or find somebody who will tell them [eff.org]) that "so-and-so did this"...
    • You need to show prior art preferably enough prior art examples to cover all of Akamai's claims.
      Actually each claim is required to be nonobvious by itself. You can invalidate single claims by showing prior art for them. And nonobviousness is 35 U.S.C. 103.
    • Just did a report about business patents. Non-obviousness, a requirement of Patents (35 USC 102?), isn't proven by looking at something and saying "Duh!" You need to show prior art preferably enough prior art examples to cover all of Akamai's claims.

      I'm not sure that I understand you properly. You are saying that if you can find prior art, this proves obviousness. In other words, the fact that somebody thought of it makes it obvious, i.e., all things are obvious. Since this is clearly not what you mean

      • The prior art has to be dated at least one year prior to the "priority date" which in most cases is the application filing date.
        • But had that prior art been patented, then it would not have been considered obvious. So we have a situation where obviousness is determined by the filing of arbitrary paperwork, which makes no sense.
    • by glwtta ( 532858 )
      Non-obviousness, a requirement of Patents (35 USC 102?), isn't proven by looking at something and saying "Duh!" You need to show prior art

      So, the first person to do something obvious gets a patent on it?

      Apparently the patent system is working as intended, after all.
  • I think this case proves it. They're simply not aware of the technical implementations of popular sites out there, leading to these sort of stupid cases.

    Many advertisers will fall foul of this, since many sites have ad content that is served up by another server which is not their own.
    • Ad sites might not fall into this. Akamai is protecting an entire system that involves the dynamic distribution of cached static content through its servers around the world and the use of DNS tricks to ensure that any user who needs that content gets it from the server closest to him/her. It's much more than just displaying images from another server. The only way an advertiser would run into problems is if they developed their own in-house dynamic caching system for the delivery of their ads. I think
    • And guess what, you can't expect them to. I do not expect them to understand the inner workings of the internet or the www or html. You can't ask me to have basic knowledge of particle physics or advanced chemistry. It doesn't matter if I am a judge or not. If that was the case, the requirements to be a judge would be ridiculously high. So high, that no normal person could do it. I expect judges to know their way when it comes to the law and common knowledge things (geography, calculations etc.). For the res
  • by daeg ( 828071 )
    So JavaScript-based RSS feeds violate the patent, correct? The HTML is served from one or more possibly distributed servers while the owner of the RSS feed has control of the RSS feed's content.
  • it's time to pray for that upcoming Supreme Court decision that will cover the scope of what's patentable.
    • by ADRA ( 37398 )
      The supreme court can only interpret laws, they can't create them. Try picking up the phone and calling your local congressman/woman.

  • Microsoft is partnering with Limelight to build its own CDN network [datacenterknowledge.com]. They're probably the biggest of Limelight's 1,150 customers, but there are plenty of other big companies using Limelight. If the judge issues an injunction, those customers might face tough decisions, as Limelight has said an injunction might force it to shut down its CDN. Appeals would stretch things out, but customers don't like uncertainty.
  • by the eric conspiracy ( 20178 ) on Sunday March 02, 2008 @08:49PM (#22619098)
    In at least two fundamental ways. First, the summary quoted the abstract of the patent, not the claims. The abstract is almost always a simplified extract of the contents of the patent and rarely has any meat to it. Of course it looks obvious.

    READ THE CLAIMS TO FIND OUT WHAT IS BEING COVERED BY THE PATENT!!

    Here is claim 1:

    1. A distributed hosting framework operative in a computer network in which users of client machines connect to a content provider server, the framework comprising:

    a routine for modifying at least one embedded object URL of a web page to include a hostname prepended to a domain name and path;

    a set of content servers, distinct from the content provider server, for hosting at least some of the embedded objects of web pages that are normally hosted by the content provider server;

    at least one first level name server that provides a first level domain name service (DNS) resolution; and

    at least one second level name server that provides a second level domain name service (DNS) resolution;

    wherein in response to requests for the web page generated by the client machines, the web page including the modified embedded object URL is served from the content provider server and the embedded object identified by the modified embedded object URL is served from a given one of the content servers as identified by the first level and second level name servers.
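To make the claim's moving parts concrete, here is a toy sketch of the mechanics claim 1 describes (hypothetical hostnames and addresses, not Akamai's actual code): a routine that prepends a CDN hostname to an embedded object URL, plus two levels of name-server resolution that pick a content server.

```python
# Toy model of claim 1: URL rewriting + first/second-level DNS resolution.
# All names and IPs are invented for illustration.
from urllib.parse import urlparse

def rewrite_embedded_url(url, cdn_host="a1.cdn.example.net"):
    """The 'routine for modifying an embedded object URL': prepend a CDN
    hostname ahead of the original domain name and path."""
    parts = urlparse(url)
    return f"http://{cdn_host}/{parts.hostname}{parts.path}"

# First-level name servers route the client to a regional second-level server.
FIRST_LEVEL = {"us-east": "ns1.east.cdn.example.net",
               "eu-west": "ns1.west.cdn.example.net"}

# Second-level name servers pick among that region's content servers.
SECOND_LEVEL = {"ns1.east.cdn.example.net": ["192.0.2.10", "192.0.2.11"],
                "ns1.west.cdn.example.net": ["198.51.100.20"]}

def resolve(region, client_hash=0):
    second = FIRST_LEVEL[region]                # first-level DNS resolution
    servers = SECOND_LEVEL[second]
    return servers[client_hash % len(servers)]  # second-level DNS resolution

url = rewrite_embedded_url("http://www.provider.com/images/logo.gif")
print(url)  # http://a1.cdn.example.net/www.provider.com/images/logo.gif
print(resolve("us-east", client_hash=1))  # 192.0.2.11
```

The base HTML still comes from the content provider's server; only the rewritten embedded object URL resolves, via the two DNS levels, to a content server.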

    Doesn't seem so obvious now, does it?

    The second is the fact that Akamai is a very innovative company that has pioneered a lot of distributed content delivery starting with the early days of the internet. In my mind it is very obvious that they would have a lot of valid patent material. They are most assuredly NOT patent trolls, and in fact have brought many innovations based on some very advanced work to commercial fruition. It is insane that their work is being shown in this light by Slashdot.

    The company was founded by an MIT graduate student (Dan Lewin) and an applied math professor from MIT, Tom Leighton, who is currently head of the algorithms group at MIT's Computer Science and Artificial Intelligence Laboratory. Lewin was tragically killed when AA flight 11 was crashed during the 9-11 terrorist attack.

    This article is one of the most ridiculous ever posted by Slashdot.

    • Re: (Score:3, Insightful)

      by Rakishi ( 759894 )

      Doesn't seem so obvious now, does it?
      Actually it does. Just because they use many large words doesn't make what you quote anything but obvious. Christ, I mean the patent has 34 sections and you quote one of the most obvious of them.

      You know what your quote says: "serve some of the parts of a webpage from other servers." In other words, if you allow an easy way of hot-linking images, then you meet the criteria.
    • by Wolfbone ( 668810 ) on Sunday March 02, 2008 @09:48PM (#22619404)

      In at least two fundamental ways. First, the summary quoted the abstract of the patent, not the claims. The abstract is almost always a simplified extract of the contents of the patent and rarely has any meat to it. Of course it looks obvious.

      READ THE CLAIMS TO FIND OUT WHAT IS BEING COVERED BY THE PATENT!!
      I did. Where in claim 1 is the non-obvious meat you speak of that is not in the abstract?

      Doesn't seem so obvious now, does it?
      Why not?

      The second is the fact that Akamai is a very innovative company that has pioneered a lot of distributed content delivery... It is insane that their work is being shown in this light by Slashdot.
      It is insane if that invention is Akamai's idea of a contribution to progress and disclosure thereof meriting a 20 year monopoly right to exclude.

      This article is one of the most ridiculous ever posted by Slashdot.
      Not really. The frequency with which articles are posted about hapless re-inventors getting caught out by dreadful patents like this one is rather tedious though.
    • by glwtta ( 532858 ) on Sunday March 02, 2008 @10:08PM (#22619478) Homepage
      Doesn't seem so obvious now, does it?

      All I'm seeing is the same thing as the summary, just with more words.

      If you think this is the sort of thing that needs patent protection, you are high; no matter how many "wherein"'s they throw in there.

      I'm sure they are in fact a very innovative company, that doesn't stop this patent from being complete bullshit.
    • Loyalty a good argument does not make.

      The patent is actually kinda obvious. Whether it was obvious in 1999 when it was filed is another matter, but distributed content serving is certainly quite obvious now. Also, whether or not they're an innovative company is irrelevant if they're suing another company for infringing on their (obvious or not) patent specifically to stop it competing. That's the very definition of anti-competitive behavior, even if it's legally allowed.

      please no
    • by Quothz ( 683368 )

      First, the summary quoted the abstract of the patent, not the claims.

      Does, too.

      The second is the fact that Akamai is a very innovative company that has pioneered a lot of distributed content delivery starting with the early days of the internet.

      1998 was hardly the early days of the Internet. Or even the Web, really, although that's debatable enough.

      Lewin was tragically killed when AA flight 11 was crashed during the 9-11 terrorist attack.

      I'm not sure what Lewin's death has t'do with patent rights.

  • While I haven't been following the litigation very closely, Limelight's position on it seems interesting. They seem to be denying that they infringed upon the patent, rather than taking the approach that the patent is obvious or that there exists prior art. I'm sure there is a sound legal strategy here, I'm not a lawyer - but my instinct would be to go the other way (maybe that's why I'm not a lawyer, ha!)

    It'll also be very interesting to see how the other well-funded CDN players react to this - Level 3 (
  • So they've patented (Score:5, Interesting)

    by n6kuy ( 172098 ) on Sunday March 02, 2008 @09:26PM (#22619284)
    ... automatic redirection to the "nearest" mirror?

    Brilliant!
    What a novel use of technology.

    Surely this is just amazing. Who ever woulda thunk that computers could do things for us automatically?
    • They wouldn't have won their case unless it was a novel use. :) The courts are always right and justice prevails in all instances. Those who mistrust Lady Liberty are anti-American.

      You aren't anti-American, are you n6kuy?
  • by PhrozenF ( 205108 ) on Monday March 03, 2008 @12:16AM (#22620218)
    Guys, I've been at Slashdot for years, and have never seen such blatant disregard for the core subject matter. You guys are all going on writing about how obvious the patent is / how bad Akamai is, without even looking into the matter. I've been an Akamai customer for many years now, and no matter how much of a bloodsucking leech they are, and how exorbitantly they price their services, they do have some massive innovation behind their products.

    First, the patent isn't so obvious. The patent is for Edge Side Includes, which is in no way trivial. It is the method by which you can have a full HTML page (e.g. the Slashdot homepage) cached at the Akamai edge servers, and have one personalized part (welcome USERNAME / you have X private messages / etc.) load from the origin servers, taking into account all cookies etc. Doing so required inventing a whole new method of writing, interpreting, and selectively applying caching to enhanced include tags, across a distributed network, supporting other cool items like tiered distribution, progressive caching, server-side cookies etc.

    Now, realize, this isn't about loading one object, like an image / flash object / javascript from a different server, but transparently loading a part of the core HTML code of a page from the origin server, with full support for cookies / post etc. while making it look like it is coming from the same physical source, so as to maintain cookie coherency. Trust me, before Akamai's founder came around and invented this, web caching static objects with personalized items was like pulling teeth. Also, Akamai is licensing this technology to the whole world http://www.akamai.com/html/support/esi.html [akamai.com], and if they choose not to license this to their competitors, but the competitor goes ahead and implements it "as-is" based on their spec, then hey, the competitor deserves to be sued.
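A toy sketch of the ESI idea described above (assumed behavior for illustration; the real spec at the link is far richer): the edge server holds a cached page template and splices in a personalized fragment fetched fresh from the origin with the client's cookies.

```python
# Minimal illustration of the Edge Side Includes pattern: cached template
# at the edge, personalized fragment from the origin. Fragment paths,
# cookie names, and the include syntax handling here are simplified.
import re

CACHED_TEMPLATE = (
    '<html><body><h1>Front Page</h1>'
    '<esi:include src="/fragment/greeting"/>'
    '<p>Static story list...</p></body></html>'
)

def origin_fetch(src, cookies):
    """Stand-in for a request back to the origin carrying the client's cookies."""
    if src == "/fragment/greeting":
        return f"<p>Welcome {cookies.get('user', 'guest')}!</p>"
    return ""

def render_at_edge(template, cookies):
    """Replace each esi:include tag with the fragment the origin returns."""
    return re.sub(r'<esi:include src="([^"]+)"/>',
                  lambda m: origin_fetch(m.group(1), cookies),
                  template)

page = render_at_edge(CACHED_TEMPLATE, {"user": "CmdrTaco"})
print("Welcome CmdrTaco!" in page)  # True
```

The bulk of the page never leaves the edge cache; only the small personalized fragment round-trips to the origin, which is what makes caching personalized pages tractable.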

    And you know what? Limelight is a bunch of ex-akamai guys, who left with a boatload of trade secrets, and customer lists. I got a call from them within 15 days of their service starting, asking to switch over at half price, but their Super POP model doesn't work for dynamic content like ours.
    • Re: (Score:3, Insightful)

      by thehossman ( 198379 )

      Akamai is licensing this technology to the whole world http://www.akamai.com/html/support/esi.html [akamai.com] , and if they choose not to license this to their competitors, but the competitor goes ahead and implements it "as-is" based on their spec, then hey, the competitor deserves to be sued.

      That almost makes sense, except that even according to the Akamai page you link to...

      The ESI open-standard specification is being co-authored by Akamai, ATG, BEA Systems, Circadence, Digital Island, IBM, Interwoven, Oracle, an

  • A company wins a lawsuit against a competitor because it patented the objective of an algorithm rather than the algorithm itself; nothing new to see here, come back later.
  • by Evets ( 629327 ) * on Monday March 03, 2008 @04:36AM (#22621460) Homepage Journal
    Edge caching in general is obvious. The implementation is not, and that's what this lawsuit is about.

    Limelight copied akamai's patented edge cache implementation, and violated enough of the patent to warrant this decision.

    I can see how a bunch of people jump on the "obvious" bandwagon; developing an edge cache for a single enterprise would be relatively easy to do. Developing an edge cache infrastructure that will work across hundreds and thousands of different enterprises with different business and process infrastructures, with varying and often conflicting traffic load patterns, is an entirely different problem.

    Let's look at the URL rewriting aspect of it. The rewritten URLs are specified to include a popularity flag and use virtual hosts as a serial #. Something that would be obvious today, but in '98/'99, not so much.

    Let's look at geographic dispersion. What is your obvious geographic dispersion methodology? A database like MaxMind has to offer? Those were few and far between in 1999. They use a network map. Further, they also consider the load of their servers when returning DNS responses - so we're not just talking about getting the closest server to the edge, we're talking about getting the closest server with the lowest load at the edge. There's also the problem that the DNS request is in most cases not coming from the client machine, but from a geographically disparate DNS cache. The Akamai system redirects users to a closer server if one exists and that situation is discovered by the web server.
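A minimal sketch of the "closest server with the lowest load" selection described above (the candidate list, distances, and load threshold are invented for illustration): prefer the nearest server whose load is under a cutoff, falling back to distance alone if everything is overloaded.

```python
# Illustrative server selection: nearest server that isn't overloaded.
# Real systems would derive distance from a network map and measure load
# continuously; these numbers are made up.
servers = [
    {"ip": "192.0.2.10",    "distance_ms": 12, "load": 0.90},
    {"ip": "192.0.2.11",    "distance_ms": 14, "load": 0.20},
    {"ip": "198.51.100.20", "distance_ms": 80, "load": 0.05},
]

def pick_server(candidates, max_load=0.8):
    """Prefer the nearest server whose load is under the threshold;
    if every server is overloaded, fall back to pure distance."""
    usable = [s for s in candidates if s["load"] < max_load]
    pool = usable or candidates
    return min(pool, key=lambda s: s["distance_ms"])

print(pick_server(servers)["ip"])  # 192.0.2.11 (nearest is over-loaded)
```

Note the nearest server (12 ms) is skipped because of its load; the system trades a little latency for an un-congested server.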

    There's also the situation that the network load of a particular server becomes too heavy while serving a particular request midstream (i.e. for audio and video). This patent covers switching the responsibility for handling a request to another server midstream.

    This patent also includes anti-DOS/DDOS technology - a no brainer today, but in 1999? Not so much.

    I wouldn't categorize Akamai as good or bad. I don't know much about the company itself. I can see the possibility of working around their patent to build a competitor - it's not an all-inclusive-only-we-can-do-CDN patent. But given the fact that limelight's foundation was built by a bunch of Akamai's ex-employees, it certainly isn't surprising that they chose the same path for resolving issues that Akamai did. And given Limelight's close ties with Microsoft, it's also not very surprising that they chose to emulate what they knew rather than innovating and improving upon the model.
  • slashdot has patented kneejerk reaction. Fox news contests.
