Amiga Demonstration Helps Win Against Patent Troll
Amigan writes "Over on Groklaw, PJ is reporting that an actual demonstration of the Amiga OS (circa 1988) on an Amiga A1000 may have been the turning point in the lawsuit of IP Innovation v. Red Hat/Novell."
It's True. (Score:5, Funny)
No respect. (Score:3, Insightful)
Re: (Score:2)
Comment removed (Score:5, Informative)
Re:It's True. (Score:5, Funny)
As long as we are reminiscing about ye olden times of porn, I remember when it was groundbreaking to incrementally display the porn as it was being transferred over the modem. Ahhhh... the memories.
To this day that magical sound of two modems negotiating a connection gets me excited.
Re: (Score:2, Funny)
3 days? (Score:2)
The boss calls up on Monday morning: "Hey, are you coming yet?"
Re:It's True. (Score:5, Funny)
You reminded me of a story... of a long time ago.. in a far away place....
Normally I am very careful about posting real events that occurred in my life, since I fanatically guard my privacy and anonymity... but this needs to be told, and it is time to tell it.
Quite a few years back I was attending a university and lived in a quasi-fraternity house off campus. One of my friends was in his room, connected to some chat service over the modem. I came in and sat down at the beginning of what turned out to be a horrifically depraved example of cybersex.
Towards the end there were at least a dozen guys in the room and every one of us kept trying to one-up each other on what we thought we could get this chick to do. No webcams at this point in history, and I know our collective wisdom today sets off alarms like, "It's really a dude".
This chick was off the hook perverted. Depravity at a level you could only hope to find and marry as quickly as possible. I think one dude passed out at some point (kidding). Finally one of us had the bright idea of asking her to do a file transfer over the modem with one of her pictures... naked. She agreed all too quickly.
Now there are about 12 guys pushing each other to get a prime viewing position for the monitor. Line by line the picture starts to form. It starts at the top of her head, we get to see her ears, and then........... the picture just keeps getting WIDER. It never got any thinner, and her head was like the tip of an iceberg. Literally.
Pandemonium ensues. After a minute or two of absolute hysterical laughter, only my friend and I are left in the room with the creature from beyond all comprehension staring at us from the monitor with 300 pounds of tits. I tell him not to feel bad, and the best advice I could give him was to roll her in flour and find the wet spot. I then beat a hasty retreat.
It gets better.....
Two days after what is now simply referred to as "Cybersex with Godzilla," five of us were in a fast food restaurant in the middle of the afternoon. My back was turned to the door, and I remember that suddenly it seemed as if there was a total eclipse. My friends in front of me look like they are in a state of total shock. I look behind me and see the entire frame of the door taken up by none other than Godzilla herself. She was 6'3" and at least 500 pounds. After literally squeezing through the door she made her way to the front to order the restaurant to go.
Guess who was with us? Yes... the man that started it all. What followed was a hushed and tense negotiation of what he was going to provide us over the next 30 days to NOT shout out his name and bail with the car.
To this day the only way I can explain how I felt about the whole thing is to say, "Imagine if you saw a picture of Sasquatch and then met it the very next day?"
Re: (Score:2)
Ah.... ASCII porn... The good ol' days!
Re: (Score:3, Funny)
Ahhhh... the mammaries.
There, FTFY.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Interesting)
There's nothing that Amiga demos cannot accomplish.
I recall the first time I saw an Amiga demo IRL. It was set up to run the Mac OS, not just Workbench. Next to it was a new Mac running the same Mac OS. The Amiga ran the Mac OS faster than the Mac did. Another Amiga was running MS-DOS and Windows 3.x.
Falcon
Re: (Score:2)
Wait a moment. MacOS and Win 3.1 in their time being able to run on the same hardware?
Win 3.1 has always been restricted to x86 processors.
MacOS in its time ran on, what was it, the Motorola 68-something or so. Definitely not Intel. Win 3.1 was from even before the PPC era, especially if Amigas were still around as anything other than museum pieces.
Now, I don't know the hardware of the Amiga, but I cannot believe it would be able to happily emulate a totally different hardware layout and still be speedy.
Or am I missin
Re: (Score:3, Interesting)
Wait a moment. MacOS and Win 3.1 in their time being able to run on the same hardware?
Win 3.1 has always been restricted to x86 processors.
Back in the olden days, it was possible to buy an expansion card for several types of non-x86 system that had all the x86 hardware necessary to run DOS and Windows.
I had one for my parents' Apple IIe - the Applied Engineering PC Transporter. IIRC, it was similar to the Atari 2600 module for the ColecoVision in that it really just used the Apple for its keyboard and monit
Amiga marketing (Score:2)
If only amiga was able to make their OS more wide-spread and accepted... Sigh.
Yep, the Amiga marketing was shitty. When Gateway bought the Amiga from Escom, I was hoping they'd revive it, but it looks like all they did was waste money. There is AROS [sourceforge.net], but I don't know how that's going.
Falcon
Re: (Score:3, Interesting)
I was a fully-initiated member of the Church of Amiga, but truth be told... the good old days weren't really as great as we like to remember them. I remember my endless frustration over software (mostly European, but that was because American Amiga games totally sucked and were basically warmed-over ports of their EGA PC versions) that crashed and burned if you had anything besides an Amiga 500 with no fast ram and a floppy drive. 2-8 megs of fast ram? Guru. Hard drive? Guru (but less wait to get to it). 68
Re:It's True. (Score:5, Insightful)
Nice comment, but I'm not sure you're speaking the absolute truth. John Carmack pretty much reinvented the side scroller for PC hardware with the Commander Keen series (scrolling is easy on the Amiga, but difficult to do well on a primitive EGA/VGA screen), and he wasn't an Amiga programmer. When he went on to make the more influential Wolfenstein and Doom, he still wasn't an Amiga programmer. On the demo scene, the legendary Future Crew apparently moved from the C64. Wing Commander, the game that finally took the computer gaming crown to the PC, was certainly not done in Amiga style -- it was full of DOS hacks, and the graphics didn't replicate any of the techniques made popular by the Amiga.
Re: (Score:3, Interesting)
>>>Euro games that crashed and burned if you had anything besides an Amiga 500 with no fast ram and a floppy drive. 2-8 megs of fast ram?
My Amiga 500 has 1 megabyte of RAM (half chip/half fast) and runs everything just fine. The only time I get a Guru is when I'm doing something stupid, like trying to run two games at once. The MMU in the 68020 eliminates most of those conflicts, by stopping programs from overwriting one another.
Perhaps the problem you had was trying to run those 50 hertz games
Re:It's True. (Score:5, Interesting)
Former Amiga programmers realized
Look, I was a game programmer back then. I worked on PC, Apple //, and Commodore 64-series products among others ... and you're just wrong. You don't know what you're talking about. I never did anything for the Amiga, and I find it kinda irritating that you believe programmers like me required Amiga experience in order to be good at our jobs. Geez, you Amiga guys sound like Mac fanboys sometimes. You just make shit up. I spent seven years hacking high-speed graphics code on a number of different microprocessors before that failed attempt at a personal computer ever hit the market, and neither I nor my employers ever felt that I needed to study the Amiga to write graphics and animation code for other systems.
You're giving all the credit to ex-Amiga coders for driving the game market forward and that's just ridiculous. Most of the guys I knew that bought into the Amiga hype went over to the Mac because they didn't want to be dealing with the bare metal. Most of them hadn't a clue what an I/O port was, much less how to screw around with refresh timing or anything else on a VGA card. They let the custom ASICs do all the work. Arcade game development on the IBM compatibles of the era was a lot like it was on the Apple ][... pretty much bare metal and raw assembler all the way through. That's because the CPU had to do everything, except maybe sound if you had an early Soundblaster. No fancy graphics or sound chips, no sirree.
The Amiga had many hardware and other advantages, and the reality is that experience with the Amiga's custom chips didn't count for SQUAT when it came to coding for what passed as video on PCs at the time. Matter of fact, the Amiga's hardware support spoiled the typical Amiga developer and put him at a distinct disadvantage when it came to working on the PC or Apple // lines. That's because many things that were easy on the Amiga took some very sharp, largely ex-Apple ][ programmers to do well on the PC.
That's the real history. You can assign credit any way you like, but those of us who were there will likely go all Guru Meditation on you.
Re:It's True. (Score:5, Insightful)
The IIgs's 3200 color mode literally had a unique 16 color palette for each and every scanline, hence 16 * 200 = 3200.
HAM had a 16 color (4-bit) base palette for the entire screen, and then a pixel (which was 6-bit, not 4-bit) could be flagged to be a modification of the previous pixel's color (the pixel immediately to its left on the same scanline). HAM mode was an ugly thing to program for and was certainly not suitable for efficient rendering.
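To make that concrete, here is a minimal sketch of how a HAM6 scanline decodes, written as a hypothetical C decoder rather than actual hardware or Amiga OS code. The function and type names are illustrative, and the start-of-line handling is simplified (real hardware starts each line from the border color):

    #include <stdint.h>

    typedef struct { uint8_t r, g, b; } Rgb4;  /* 4-bit color components */

    /* palette: the 16 base colors; pixels: one 6-bit value per pixel */
    void ham6_decode_line(const Rgb4 palette[16], const uint8_t *pixels,
                          int width, Rgb4 *out)
    {
        Rgb4 prev = palette[0];  /* simplified: treat color 0 as the line's start color */
        for (int x = 0; x < width; x++) {
            uint8_t ctrl = (pixels[x] >> 4) & 0x3;  /* top 2 bits select the mode */
            uint8_t data = pixels[x] & 0xF;         /* low 4 bits carry the value */
            switch (ctrl) {
            case 0: prev = palette[data]; break;  /* set pixel from the base palette */
            case 1: prev.b = data; break;         /* hold red/green, modify blue     */
            case 2: prev.r = data; break;         /* hold green/blue, modify red     */
            case 3: prev.g = data; break;         /* hold red/blue, modify green     */
            }
            out[x] = prev;  /* each pixel depends on the one to its left */
        }
    }

The dependency of each pixel on its left neighbor is exactly why HAM was awkward for general rendering: changing one pixel can ripple color fringes down the rest of the scanline.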
The IIgs thrived on its per-scanline capabilities. Each scanline could literally have a different palette and resolution.
It was lacking a blitter chip, so it was deficient compared to the Amiga in 2D sprite-based stuff, but it was much better at vector and 3D rendering (because of its Fill Mode) than the Amiga.
It also had 32 channel mono wavetable synthesis (16 stereo), compared to Amiga's 4 pannable mono channels.
So no, the Amiga was not way ahead of the pack in capabilities. The Amiga was good, but it really wasn't as special as Amiga users made it out to be. The Amiga had a much bigger install base so got a lot more games written for it. Apple was playing two-faced during this period, pushing the Mac instead of the IIgs.
Amiga demos rocked! (Score:5, Insightful)
Re:Amiga demos rocked! (Score:4, Funny)
Commodore had sushi and sold it as fish, sadly.
That's actually an insightful analogy on more levels than was probably intended. There was a time in the Western world, before sushi had ascended to its current status, that it was much easier to sell fish & chips than it was to sell sushi. People were actually grossed out by the idea - "Raw fish? Ewwww. Plus it's ethnic food!"
So, the decision to market it as fried fish or sushi was not so clear-cut in the 1980s. Nobody really knew what to make of the home computer market. It was a quirky world that could have become anything, and monumental marketing/strategy blunders were commonplace. Although there's little that can top the hilarity of an earlier era's bizarre attempt at marketing computers. [wikipedia.org]
Re: (Score:2)
Some parts of the Western world eat sea fish that is basically raw. It's just that it is not considered to be anything "fancy"... (herring, sardine, sprat, mackerel in salt or vinegar; also in oil or sour cream, though those are, possibly, somewhat less raw)
Re: (Score:2)
There was a time in the Western world, before sushi had ascended to its current status, that it was much easier to sell fish & chips than it was to sell sushi. People were actually grossed out by the idea
People in the Netherlands, northern Germany, and Scandinavia eat raw herring and have done so for ages -- it's a delicacy. Also, raw oysters, steak tartare... I'll grant that raw dead animals were never as prominent on the European menu as in Japanese cuisine, but they were certainly there before the current wave of Asian food started. :-)
Re: (Score:2)
People in the Netherlands, northern Germany, and Scandinavia eat raw herring and have done so for ages -- it's a delicacy.
I probably should have narrowed my scope to "the English-speaking Western World" but that doesn't quite work either. Lacking another meaningful category, let's just say Britain, America, Australia, etc. The white trash countries, basically.
Also, I wouldn't quite put things like oily fish, caviar and oysters in the same category as sushi (although they are used as ingredients in sushi). They're a lot more meat-like and strongly-flavored than things like white fish served raw.
Anyway - I was just talking about
Re: (Score:2)
Although there's little that can top the hilarity of an earlier era's bizarre attempt at marketing computers.
Now that's really disturbing. Not the sexist tone of an old ad from a bygone day, no... that's expected. What's disturbing is how women are treated like pieces of meat, drooled over, ridiculed, sexually harassed and frightened away from computer science IN THIS DAY AND AGE. I mean for fuck's sake, here's an advertisement talking about a woman's place in the kitchen and then nonchalantly, without ev
Re: (Score:2)
Re: (Score:2, Insightful)
Steak Tartare
Time for another overrated comment.
David Gay
Re: (Score:3, Insightful)
Sushi became popular exactly because of this
really? Here I thought it was because it was yummy...
Re:Amiga demos rocked! (Score:5, Insightful)
Eating meat raw has always been a sign of unsophistication in Western culture.
As the other reply notes - Steak Tartare and Carpaccio have long been considered at the heights of sophisticated Western dining.
Sushi became popular exactly because of this - by rejecting our own culture and embracing an alien one, you show how sophisticated and different you are from the masses. In addition, the high cost (in the 80s anyway) kept the morons out.
I don't buy this argument. Firstly, it contradicts itself - if you are eating a certain food just to show how different you are, doesn't that make you a moron? So if this were the case, wouldn't it be keeping the morons in?
I think there's a much simpler explanation - Globalization exposed people to different foreign cultures, and sushi is delicious. Over time, foreign foods become normalized. In the 1980s, there just weren't very many sushi restaurants outside of Japan, so few people got exposed to it. I very much doubt that most customers ate it simply to be snobby or different.
So what would be your current day example of such behavior? I mean, you don't see people going to, say, Danish restaurants and acting "oh, look how edgy and different I am eating this food that hardly anybody eats!"
Re: (Score:2)
"I don't buy this argument. Firstly, it contradicts itself - if you are eating a certain food just to show how different you are, doesn't that make you a moron? So if this were the case, wouldn't it be keeping the morons in?"
Indeed, but his assumption that keeping the morons out would lead to widespread popularity is false. Keeping the morons in is the road to widespread success.
Re:As seen on Oprah (Score:4, Insightful)
As a matter of fact, Oprah has Rugbrød flown in straight from Denmark for her breakfast (google Oprah, Rugbrød, and check the Danish-press articles. I couldn't find a decent English one).
But that doesn't mean she's trying to show how edgy and different she is. Maybe she just really likes it? There's a difference between being a foodie and eating food to make a statement.
The point of such behavior: what we eat is the primary social differentiator.
Why would Oprah need anything to differentiate herself? She has a fuckton of money more than the average person, and is one of the most influential people in America. She doesn't need to prove herself by trying to be different.
But the ability to serve sushi, and to eat it, indicates belonging to a social group of wealthy, educated elites.
Oh bullshit, even middle-class people in the US can afford fine sushi. Hell, I make it from scratch, and it can cost less than what people typically spend on a fast-food meal for the family.
That's also why in the US, they make sickly sweet "blush" wines and overoaked chardonnays: Americans associated drinking wine with bourgeois status, but many don't like the taste.
Again, they buy them because they prefer the taste. It has little to do with social status. Nobody seriously links drinking wine with sophistication anymore.
Re: (Score:2)
I'd guess that eating "raw" fish has long been a tradition of hunter/gatherer cultures. At least Nordic/Scandinavian societies have recipes that are typically consumed during holidays - they might not be raw in the same sense as sushi, but dried and salted fish are still respected delicacies.
Re:Amiga demos rocked! (Score:4, Insightful)
Nah. Eating meat raw has always been a sign of the most expensive dining. Go into a very posh restaurant and try ordering a steak "well done" and see the looks it'll get you; your choice is "rare" or "blue" if you want to fit in. Fish is often served raw in Western culture too; smoked salmon is basically uncooked, oysters are usually served raw, and sea bass is best uncooked (in the best restaurants).
Well, cooking food is a peasant thing - if the meat is cheap, you need to cook it a lot to stop it killing you. If the meat is raw, it has to be high quality and expensive.
Anyway, sushi is yummy. That's all that really matters.
It really wasn't marketing (Score:2)
There were two big things that worked to kill Amiga, other than simply not being DOS (which was the standard even back then):
1) Cost. Amigas cost a whole lot more than other computers. Also let's not forget that in general computers were expensive. So when you were already talking something that was a major purchase and then talking something that was more expensive on top of it, well that gets real hard for people to justify. Sure the higher cost bought you something better, but the money isn't always ther
Re: (Score:2)
It was too expensive compared to other computers to ever become the system most people owned, and it fell behind when it came to pros
Sounds like bad marketing to me...
Re: (Score:2)
Amigas were cheaper than the IBM PCs and short list of clones available at the time and much less expensive than the Macs. They weren't cheaper than the 8 bit computers of their day, but then they weren't 8 bit computers. The Amiga had four channel, stereo sound compared to the bleeps and bloops of PCs and Macs, 4096 colors instead of 16 or plain black and white, and a multitasking operating system a decade before Windows. And because of the assistance of parallel processing, the Amiga could execute
Re: (Score:2)
They weren't cheaper than the 8 bit computers of their day, but then they weren't 8 bit computers.
Actually, if I recall correctly, there was a time when Amigas cost less than the Apple IIe. They certainly weren't expensive compared to Macs and IBM PCs.
The Amiga had four channel, stereo sound compared to the bleeps and bloops of PCs and Macs,
Arrgh, this is the second time in the last 24 hours that somebody has claimed that Macs only made beeping sounds. It's just not true. From the very first Mac they had audio. Not as good as an Amiga's, but certainly not "bleeps and bloops." You could even get sound digitizers for them.
What's more baffling is how people make this claim, when at the very laun
Re: (Score:2)
...and the GS has an Ensoniq chip with 32 channels (though they're paired, and one pair is used by the ROM routines... so essentially 15-channel sound)...
Re: (Score:2)
True. When I bought an A500 as my first computer in 1990 it was clearly the smart choice for a poor student.
The senator from Disney is needed (Score:5, Funny)
Won't someone legislate to close this prior art loophole?
Re: (Score:2)
Seriously. What part of prior do they not understand?
Re: (Score:2)
Seriously. What part of prior do they not understand?
The part about them not inventing or developing it ... or being able to profit from it by suing people.
Upton Sinclair to the rescue by way of Al Gore (Score:5, Insightful)
Re: (Score:2, Funny)
What part of prior do they not understand?
Richard?
Re: (Score:2)
In their defense, his comedy could be pretty cutting edge.
Re: (Score:2)
Richard?
You misunderstand Richard Prior at your own peril! That shit has consequences.
Re: (Score:2)
"prior"
What's an Amiga? (Score:5, Funny)
Ahead of the curve (Score:5, Interesting)
I always loved the way the Amiga offered functions other computers of the same era never came close to matching.
I love the quote from the owner who produced the working model.. "My Amiga Killed a Troll!"
Re: (Score:2)
You know, with a little twist (and still discarding NeXT machines - more than "close" to the Amiga, but hideously expensive), saying "other computers of the same era never came close to matching" (emphasis mine) is too strong a statement.
Even contemporary 8-bit computers came somewhat close, after a while; with operating systems like Contiki [wikipedia.org] or SymbOS [wikipedia.org]. And let us not forget what was rather quickly possible with Ata...uhm...ok, I won't go there ;)
Re: (Score:2)
Even contemporary 8-bit computers came somewhat close, after a while; with operating systems like Contiki or SymbOS.
I was an 8-bit computer user. I owned an Amiga. You, sir (?) never used an Amiga, obviously. 8-bit? Contiki? You're out of your mind. But that's okay, because the Amiga is dead, and I'm no longer required to kill infidels like you. :)
Re: (Score:2)
I also started on 8-bits. Plus, I'm actually from a place where the Amiga, for half a decade, was the standard home computer; PCs got their foothold only in the late '90s.
Notice I objected (and not very strongly) only to the word "never"; I mentioned not only Contiki but also SymbOS (watch this video [google.com]; will you really tell me that's not "somewhat close"?)
Re: (Score:2)
Re: (Score:3)
((Anyone remember when WUStL was the hub of the warez scene? *sheds a silent tear*))
Re:Ahead of the curve (Score:4, Interesting)
Man, that brings back memories. I remember in the early 90's when I heard that wuarchive was being upgraded to have a... wait for it... a 14GB hard drive (although in retrospect it was probably a RAID array rather than a single spindle) and I was simply dumbfounded by that amount of storage and wondered how on earth they'd ever fill it.
Now, my phone has more storage than that...
Re:Ahead of the curve (Score:4, Insightful)
Jesus christ, they're not that rare. (Score:2, Funny)
I have five working Amigas sitting next to me. FIVE. All with Commodore branding, and including an A1000. University dumpsters were a gold mine for these things a few (by which I mean five) years ago. Groklaw speaks as if someone restored a System/360 or something!
Say what? (Score:3, Insightful)
Re:Say what? (Score:5, Informative)
Multiple screens and switching.
This was the original shout-out requesting prior art from readers:
http://www.groklaw.net/article.php?story=20071011205044141 [groklaw.net]
About 10 comments into the discussion, someone mentions that this is exactly what the Amiga had.
http://www.groklaw.net/comment.php?mode=display&sid=20071011205044141&title=M%24+Virtual+Desktop+Manager+licensed+by+IP+Innovation%3F&type=article&order=&hideanonymous=0&pid=634370#c634821 [groklaw.net]
Re: (Score:3, Informative)
Although this is somewhat tangential, I have to mention that what the Amiga had was actually much cooler than the facility to switch between screens with different resolutions. You could slide each screen down by grabbing the bar at the top of the screen with your mouse, to reveal those beneath. So at times, and quite commonly, you would have different visible parts of your monitor displaying parts of screens with different resolutions (and, if I recall correctly, their own color depths as well).
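For anyone curious how that trick worked under the hood: the display was driven by the Copper, a simple coprocessor that executed a list of WAIT/MOVE instructions every frame, so the video registers could be reprogrammed at the exact scanline where one dragged screen ends and the next begins. Below is a conceptual sketch of such a list in C; the instruction encoding is simplified (the real Copper packs each instruction into two words) and the scanline and register values are illustrative, not a working Amiga program:

    #include <stdint.h>

    enum { COP_WAIT, COP_MOVE };  /* simplified instruction set */

    typedef struct {
        uint8_t  op;    /* COP_WAIT = wait for a beam position, COP_MOVE = write a register */
        uint16_t reg;   /* scanline for WAIT, or custom-chip register offset for MOVE */
        uint16_t val;   /* value to write (unused for WAIT) */
    } CopIns;

    /* One frame: a hi-res screen on top, with a lo-res screen dragged
     * up so it becomes visible from scanline 120 down (illustrative). */
    CopIns copper_list[] = {
        { COP_MOVE, 0x100 /* BPLCON0 */, 0xA200 },  /* top: hi-res, 2 bitplanes */
        { COP_WAIT, 120, 0 },                       /* wait until the beam reaches line 120 */
        { COP_MOVE, 0x100 /* BPLCON0 */, 0x5200 },  /* bottom: lo-res, 5 bitplanes */
        { COP_MOVE, 0x180 /* COLOR00 */, 0x037F },  /* ...and its own palette entries */
    };

Because the switch happens per scanline in hardware, each screen keeps its own resolution, color depth, and palette with no compositing cost, which is part of why the dragging felt so fluid.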
Re:Say what? (Score:5, Interesting)
You could slide each screen down by grabbing the bar at the top of the screen with your mouse, to reveal those beneath. So at times, and quite commonly, you would have different visible parts of your monitor displaying parts of screens with different resolutions (and, if I recall correctly, their own color depths as well).
That really was super-cool. I believe you are correct about the different color depths, too. There was just something compelling about that mechanism; it was like peeking behind a curtain to see backstage, perhaps? Maybe the youngsters would say it would be like seeing the Matrix or something. It just had this incredible fluidity to it. Editing a document or program, and want to take a peek at how your 3D render in the background is going? Oooh... nice, just 8 more hours to go, looking good so far.
Re: (Score:2)
Yeah, I remember that; it was very cool. I don't remember if you could do that with a full-screen game running, though... I was using truespace3d back in the day to do all my modeling, and it was very cool to be able to work on other stuff while that was happening. Except it took a lot of processing power and slowed the machine down.
Re:Say what? (Score:5, Interesting)
The success is all very nice and all, but what was the disputed issue?
The actual dispute is irrelevant ... Linux won a patent suit and that's all we care about. A patent troll lost and will have to pay court costs. Double bonus points!
Here's the slashdot story [slashdot.org] about the court victory
Here's a link to the post that details the patents [slashdot.org]
Re: (Score:2)
And an article that came out when this began:
http://www.linux-watch.com/news/NS2013674721.html [linux-watch.com]
Re: (Score:2)
OS-9 (Score:5, Interesting)
Let us not forget that OS-9 was doing it before the Amiga... and it was also submitted by someone as prior art, from 1983:
http://www.post-issue.org/prior_art/83/detail [post-issue.org]
OS-9 was my first "real" OS, before eventually switching to Unix, then Linux. Back in the day, it was extremely impressive.
Re: (Score:2, Interesting)
Re: (Score:3, Interesting)
The C=128 could also do this; you could hook up a split-composite (now called S-video) monitor and an RGBI monitor at the same time and have an app display different output on each screen.
Re: (Score:2)
Loser...I saw the 4k coco in the 'Shack and chose the 16k model...
Dungeons of Daggorath? Polaris? The mining donkey kong clone (can't believe I can't remember the name)
Re: (Score:2)
Mac? Who said anything about Mac? OS-9 [wikipedia.org] was not an Apple OS.
The original ALE mailing list post (Score:2)
patent troll (Score:2)
Re:MORE (Score:5, Interesting)
More prior art plskthx.
But that's the problem right there. Yes, chances are that there is little "new" being done in software for the most part, and that someone has done [patent idea] before, but just imagine trying to find exactly the right bit of software, or the right platform, to show it's been done before.
The patent office couldn't institute a "prove no one has done it before" process, as that would be just ludicrous, but at the same time, having the right people on hand to show exactly where it HAS been done before may not be 1) cheap, 2) practical, or 3) possible.
There simply isn't an easy solution to this. If you abolish software patents, it makes it very difficult for companies to realistically spend millions on development of new concepts and ideas when someone can then just take the groundbreaking UI or process, etc. If you don't abolish patents, you still end up with the farcical joke that we have now.
Here, it really is a lose-lose scenario, except if you are a patent lawyer.
Re:MORE (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
For a start, there is such a thing as "copyright," which covers software nicely.
Secondly, there is something called a "design patent," which allows you to protect a specific design, e.g. of a machine, or of a UI. Some years ago here on /. (sorry, too lazy to search) it was mentioned that Apple was granted a patent on the waste basket of OS X. That was a design patent. Other vendors may still implement waste baskets, but they are not allowed to look just like Apple's.
Re: (Score:2)
And on top of that, the patent system (used to) allow technically different implementations of the same result.
E.g. fans: the result is blowing air. Still, there will be many different ways to (mechanically) blow air, each of which is patentable, and rightfully so.
It is not the idea of blowing air that is patentable, however interesting it may be in itself; it is the implementation of how to do it that is patentable.
Re: (Score:2, Informative)
Re:MORE (Score:5, Insightful)
Which is an illustration of the IP problem. A design is a textbook case of something which clearly belongs to copyright protection, not patent.
Re:MORE (Score:4, Informative)
Your comment is a textbook case of the IP problem--ignorance of the issues that is popularly, and blindly, reinforced as a worthwhile statement.
A copyright cannot be used to protect a useful article. A patent cannot be used to protect nonfunctional aspects of an object. A trademark has limited application and cannot protect objects clearly marked as unrelated. Thus, a design patent (which is usually known as an "industrial design" in most countries and is not a patent in the ordinary sense, having different application procedures, a shorter term, and a narrower scope of protection) bridges any gap that might arise, providing protection for the nonfunctional, distinctive design of a useful object, as well as provides an alternative to seeking independent protection of individual aspects of a creation.
There is certainly some overlap with copyright, but an industrial design is not copyrightable unless its form can be separated from its medium--you can't copyright a car. You can copyright photographs, drawings, paintings, sculptures, songs, and stories of the car, but the car itself needs an industrial design registration to protect it. In the US, that's called a design patent.
An industrial design registration simultaneously protects creative enterprise, promotes distinctiveness of competing products, and rewards successful integration of art and science. There is little legitimate reason to be upset about having to come up with an original design, given that it is difficult to infringe accidentally.
(And FYI, it's 'something that clearly', not 'something which'.)
Re: (Score:3, Interesting)
Re: (Score:3, Insightful)
Best scenario that I can think of is make the USPTO website really a lot easier to use. I think they do a good job considering the volume of crap they have to deal with, but it could be easier.
Second, allow anyone to submit comments regarding any prior art relevant to the claims of any patent application. So if someone posts an application with claims X, Y and Z and it's a rehash of an old idea, someone can just post a comment "Yo examiner, this was done in FVWM in 1995. Reject this shit."
And voila, it is r
Re: (Score:2)
So if someone posts an application with claims X, Y and Z and it's a rehash of an old idea, someone can just post a comment "Yo examiner, this was done in FVWM in 1995. Reject this shit."
But do you sit there sifting through applications? I don't. I have better things to do with my free time. I think just about everyone else would too. Perhaps a different twist on this is that a patent can be quickly and easily invalidated if someone shows prior art after it has been granted. However, in that case, would it actually then be transferred to the people whose work was used to throw it out?
Re: (Score:2)
But do you sit there sifting through applications? I don't.
And that must mean that no one does? Companies do look at their competitors' patent applications and should be given the opportunity to say "hey, I've already done that" before the patent is approved, rather than being forced to fight it in an expensive lawsuit later on.
Re:MORE (Score:4, Funny)
"This whole Linux thing won't work because I have better things to do with my free time than program a computer." **
**quote taken from slashdot comment in 1994***
***actually a hypothetical quote taken in 1994 if slashdot had existed in 1994
Re: (Score:2)
"But do you sit there sifting through applications? I don't. I have better things to do with my free time. I think just about everyone else would too."
Do you sit there correcting spelling and grammar errors in random Wikipedia articles? I don't. I have better things to do with my free time. I think just about everyone else would too.
Of course I, and likely you, would be wrong. People do exactly this. All you need is some kind of recognition or reward system and people will do it. This carries extra kudos. If y
Re: (Score:2)
Re:MORE (Score:4, Insightful)
There simply isn't an easy solution to this. If you abolish software patents, it makes it very difficult for companies to realistically spend millions on development of new concepts and ideas when someone can then just take the ground breaking UI or process etc.
How about the fact that one company will be first to market and can develop continuously improving iterations, staying ahead of the competition? To take graphics cards as an example, the designs are often started 3-4 years in advance. Let's say a competitor starts now with a released card and probably spends the first year reverse engineering it; whatever they learn might be out in 2015. And then they'll be five years behind, copying the 2015 models. You have to weigh that against the impact of granting a monopoly for 20 years - why should a company continue to invent when it has an essential patent and can basically price gouge the market any way it wants? It's really important to understand that software patents stifle innovation too, and they're only worth it if the good outweighs the bad.
Re: (Score:2)
If you abolish software patents, it makes it very difficult for companies to realistically spend millions on development of new concepts and ideas when someone can then just take the ground breaking UI or process etc.
Sorry... which new concepts and ideas that companies have spent millions on would you be referring to? Name one useful software patent that is not obvious... Please.....
Re: (Score:2)
However, they are about the only thing I think deserves patents in the field of computing. Everything else is just a rehash of ideas first implemented in the 1960s.
And compression algorithms are different from pure mathematics, how...?
Re:MORE (Score:5, Interesting)
(Ecclesiastes 1:9-14 NIV) What has been will be again, what has been done will be done again; there is nothing new under the sun. {10} Is there anything of which one can say, "Look! This is something new"? It was here already, long ago; it was here before our time. {11} There is no remembrance of men of old, and even those who are yet to come will not be remembered by those who follow.
citation [globalchristians.org]
/I'm not prone to cite Bible verse, but there you go. All your software patents are invalid. It sez so in the Good Book. The verse itself is an uncited theft of the work of Sophocles c. 429 BCE - himself a synthesist who didn't cite the vast realms of prior art from which he distilled his digests of the written and performed arts into their purest forms. Sophocles was a hack, but we don't have records of the prior art he stole, or today he'd be a pirate. His synthesis, though? Timeless art in and of itself. It's a good thing for us ancient Greece didn't have the DMCA, DRM, and eternal copyright, or he'd be Sophowho? To most he already is.
If only ancient Greece, or modern Phoenix, had a sort of distributed Library of Alexandria where one's works could not be forgotten - where the wisdom of our fathers and their fathers (and their foolishness too) might be preserved and so remain available to our children and their children. Something like a Google for books. Alas, copyright prevents it, and copyright is now eternal in every practical sense. So it is that each new generation, constrained by previously patented and copyrighted art, has diminishing realms of imagination to work with - until the lawyers finally abolish imagination altogether and we reach the asymptote where creation ends. So then we lay upon our children the duty to rethink the thoughts we've had, to re-invent our inventions, and to do so in peril of the trolls who lay claim to third-degree ownership of any potential perceived reference to characters or invented places in a brief manuscript published in 100 copies only, 200 years before - and upon their children we lay a logarithmically greater burden.
As patents are the death of invention, copyrights are the death of art. A pity our children must climb these mountains we've built for them without the benefit of a culture, but culture itself is deprecated in this regime in preference to whatever mindless new drivel can escape lawsuits long enough to become popular - and then is itself extinguished in a flurry of lawyers and cocaine.
We might have stood on the shoulders of giants, but now we huddle in fear of lawyers.
Re: (Score:2)
If you abolish software patents, it makes it very difficult for companies to realistically spend millions on development of new concepts and ideas when someone can then just take the ground breaking UI or process etc
And yet almost all of the software on your PC was written by people who neither relied on patent protection for their ideas nor paid others for the use of theirs.
Re: (Score:2)
Without patents on physical goods, innovators stand to lose hundreds of millions of dollars, at least. This already accounts for at least 2 orders of magnitude di
Re: (Score:2)
"If you abolish software patents, it makes it very difficult for companies to realistically spend millions on development of new concepts and ideas when someone can then just take the ground breaking UI or process etc."
This is complete FUD. Software patents only keep the little guy out of the game as it is. The big guys all rip one another's patents off because their competition is violating enough of their own patents that they don't want to face the fallout of a patent war.
Re: (Score:2)
Proving prior art by demonstration can indeed be hard and expensive, but it is often used as a powerful tool. I am working as a patent engineer in Europe, and I am currently in the process of tracking down a certain model of vintage car to show a technical feature which is clearly prior art in an ongoing patent case, but which has for some reason never been documented in any written piece I could get my hands on. Not exactly an easy task, indeed, but sometimes it is the best way. It is a purely mechanical featur
Re:"Fake" (Score:4, Interesting)
If they know it's not a fake, then ultimately they will face the same situation.
They will be spending more of their own time and money, and possibly be liable for the additional court costs of the winning side.
That sounds like a potentially large risk to them.
Re: (Score:2)
"Of course, wheeling the Amiga 1000 in, and booting it up would have a better affect."
Exactly!
Prior art on a computer over 20 years old, running an equally "ancient" OS.
Doing that in an emulator would have been nowhere near as impressive and convincing to the jury.
In fact, there could have been a Mac Plus running System 6 and Switcher, along with a C=64 + RAM Expansion Unit running an app (the name of which escapes me) that swapped running C=64 apps into and out of system RAM from the REU, to demonstrate p