Desktops (Apple)

Razer Slims Down Blade, Debuts MacOS-Compatible eGPU Enclosure (arstechnica.com) 39

An anonymous reader quotes a report from Ars Technica: Today, Razer debuted big updates to its Razer Blade laptop, focusing on design and performance to usher the gaming notebook into 2018. While the new Blade still looks unmistakably "Razer," its design has changed dramatically for the better. Razer upped the screen size from 14 inches to 15.6 inches, reducing the surrounding bezels to just 4.9mm so that the device fits in with the other nearly bezel-less ultrabooks popular today. Razer is offering 1080p 60Hz or 144Hz panels, along with a 4K touchscreen option. The larger display panel makes the laptop slightly heavier than its predecessor, and it's a bit wider overall, too (4.7 pounds and 9.3 inches, respectively). However, the slimmer bezels, sharper edges, and aluminum unibody make the new Razer Blade look like a clear upgrade from the previous model.

Another new addition to the Razer lineup is the Core X, a Thunderbolt 3 external graphics enclosure with space for large, three-slot-wide graphics cards. The Core X joins the Core V2 graphics enclosure as one of Razer's solutions for gamers who want to add desktop-like graphics power to their laptops -- and it's more affordable than the V2 as well. While it's a bit stockier than Razer's existing enclosure, the Core X has an aluminum body with open vents to properly handle heat, regardless of the task at hand. The Core X connects to a compatible notebook through one Thunderbolt 3 port, providing eGPU access and 100W of power thanks to its 650W ATX power supply. It's both cheaper and seemingly easier to use than the V2, but that comes with some compromises: the Core X doesn't have Chroma lighting, and it lacks USB and Ethernet ports.
Some other specs of the new Blade include an Intel Core i7-8750H processor, Nvidia GTX 1060 or 1070 with Max-Q graphics, up to 32GB of RAM, up to 2TB of PCIe-based SSD, and an 80Wh battery. There are three USB-A 3.1 ports, one proprietary charging port, one Thunderbolt 3 port, a Mini DisplayPort, and an HDMI port.
Intel

Intel's First 10nm Cannon Lake CPU Sees the Light of Day (anandtech.com) 184

Artem Tashkinov writes: A Chinese retailer has started selling a laptop featuring Intel's first 10nm CPU, the Intel Core i3-8121U. Intel promised to start producing 10nm CPUs in 2016, but the rollout has been postponed until almost the second half of 2018. It's worth noting that this CPU has its integrated graphics disabled and features only two cores.

AnandTech opines: "This machine listed online means that we can confirm that Intel is indeed shipping 10nm components into the consumer market. Shipping a low-end dual core processor with disabled graphics doesn't inspire confidence, especially as it is labelled under the 8th gen designation, and not something new and shiny under the 9th gen -- although Intel did state in a recent earnings call that serious 10nm volume and revenue is now a 2019 target. These parts are, for better or worse, helping Intel generate some systems with the new technology. We've never before seen Intel commercially use low-end processors to introduce a new manufacturing process, although this might be the norm from now on."

AMD

AMD Integrates Ryzen PRO and Radeon Vega Graphics In Next-Gen APUs (zdnet.com) 76

The three biggest PC OEMs -- Dell, HP, and Lenovo -- are now offering AMD Ryzen PRO mobile and desktop accelerated processing units (APUs) with built-in Radeon Vega graphics in a variety of commercial systems. There are a total of seven new APUs -- three for the mobile space and four for the desktop. As AMD notes in its press release, the first desktops to ship with these latest chips include: the HP Elitedesk G4 and 285 Desktop, the Lenovo ThinkCentre M715, and the Dell Optiplex 5055. ZDNet's Adrian Kingsley-Hughes writes about what makes Ryzen PRO so appealing: Ryzen PRO has been built from the ground up to focus on three pillars -- power, security and reliability. Built-in security means integrated GuardMI technology, an AES 128-bit encryption engine, Windows 10 Enterprise Security support, and support for fTPM/TPM 2.0 Trusted Platform Module. One of the features of Ryzen PRO that AMD hopes will appeal to commercial users is the enterprise-grade reliability the chips are backed with: everything from 18 months of planned software availability and 24 months of processor availability to a commercial-grade QA process, a 36-month warranty, and enterprise-class manageability.

There are no worries on the performance front either, with the Ryzen PRO with Vega Graphics being the world's fastest processor currently available for ultrathin commercial notebooks, with the AMD Ryzen 7 PRO 2700U offering up to 22 percent more productivity performance than Intel's 8th-generation Core i7-8550U in testing carried out by AMD. AMD has also designed the Ryzen PRO processors to be energy-efficient, enabling up to 16 hours of battery life in devices, or 10.5 hours of video playback. The Ryzen PRO with Vega Graphics desktop processors are also no slouches, opening up a significant performance gap when compared to Intel Core i5 8400 and Core i3 8100 parts.
AMD also announced that it is sampling its second-generation Threadripper 2900X, 2920X and 2950X products. "For Threadripper Gen2 you can expect a refresh of the current line-up; an 8-core Threadripper 2900X, a 12-core Threadripper 2920X and of course a 16-core Threadripper 2950X," reports Guru3D.com. "AMD will apply the same Zen+ tweaks to the processors; including memory latency optimizations and higher clock speeds."

AMD has something for the datacenter enthusiasts out there too. Epyc, AMD's x86 server processor line based on the company's Zen microarchitecture, has a new promo video, claiming more performance, more security features, and more value than Intel Xeon. The company plans to market Epyc in an aggressive head-to-head format similar to how T-Mobile campaigns against Verizon and AT&T. Given Intel Xeon's 99% market share, they sort of have to...
Graphics

Nvidia Shuts Down Its GeForce Partner Program, Citing Misinformation (theregister.co.uk) 82

In a blog post on Friday, Nvidia announced it is "pulling the plug" on the GeForce Partner Program (GPP) due to the company's unwillingness to combat "rumors" and "mistruths" about the platform. The GPP has only been active for a couple of months. It was launched as a way for gamers to know exactly what they're buying when shopping for a new gaming PC. "With this program, partners would provide full transparency regarding the installed hardware and software in their products," reports Digital Trends. From the report: Shortly after the launch, unnamed sources from add-in card and desktop/laptop manufacturers came forward to reveal that the program would likely hurt consumer choice. Moreover, they worried that some of the agreement language may actually be illegal, and that the program itself could disrupt the current business they have with AMD and Intel. They also revealed one major requirement: the resulting product must sport the label "[gaming brand] Aligned Exclusively with GeForce." As an example, if Asus wanted to add its Republic of Gamers (RoG) line to Nvidia's program, it wouldn't be allowed to sell RoG products with AMD-based graphics. Of course, manufacturers can choose whether or not to join Nvidia's program, but membership supposedly had its "perks," including access to early technology, sales rebate programs, game bundling, and more.

According to Nvidia, all it asked of its partners was to "brand their products in a way that would be crystal clear." The company says it didn't want "substitute GPUs hidden behind a pile of techno-jargon." Specifications for desktops and laptops tend to list their graphics components, and PC gamers are generally intelligent shoppers who don't need any clarification. Regardless, Nvidia is pulling the controversial program because the "rumors, conjecture, and mistruths go far beyond" the program's intent.

Businesses

GPU Prices Are Falling (venturebeat.com) 149

An anonymous reader shares a report: If you were looking for a new graphics card for your PC over the last year, your search probably ended with you giving up and slinging some cusses at cryptocurrency miners. But now the supply of video cards is on the verge of rebounding, and I don't think you should wait much longer to pull the trigger on a purchase. Earlier this week, Digitimes reported that GPU vendors like Gigabyte, MSI, and others were expecting to see their card shipments plummet 40 percent month-over-month. The market for digital currencies like Bitcoin and Ethereum is losing some of its momentum, and at the same time, large mining operations are pulling back on their investment in GPUs in anticipation of dedicated mining hardware built around ASICs that is due out before the end of the year. These factors working in conjunction seem like they are leading to more supply, which in turn is forcing retailers to cut prices. For example, the Gigabyte GeForce GTX 1080 video card is selling on Amazon right now for $700. Other retailers even have it listed at the original MSRP of $600. These are the lowest prices of 2018 so far.
AMD

AMD Wants To Hear From GPU Resellers and Partners Bullied By Nvidia (forbes.com) 127

An anonymous reader quotes a report from Forbes: Nvidia may not be talking about its GeForce Partner Program, but AMD has gone from silent to proactive in less than 24 hours. Hours ago Scott Herkelman, Corporate VP and General Manager of AMD Radeon Gaming, addressed AMD resellers via Twitter, not only acknowledging the anti-competitive tactics Nvidia has leveraged against them, but inviting others to share their stories. The series of tweets coincides with an AMD sales event held in London this week. This was preceded by an impassioned blog post from Herkelman yesterday where he comes out swinging against Nvidia's GeForce Partner Program, and references other closed, proprietary technologies like G-Sync and GameWorks.

AMD's new mantra is "Freedom of Choice," a tagline clearly chosen to combat Nvidia's new program, which is slowly taking gaming GPU brands from companies like MSI and Gigabyte and locking them exclusively under the GeForce banner. The GeForce Partner Program also seems to threaten the business of board partners who are not aligned with the program. Here's what Herkelman -- a former GeForce marketing executive at Nvidia -- had to say on Twitter: "I wanted to personally thank all of our resellers who are attending our AMD sales event in London this week, it was a pleasure catching up with you and thank you for your support. Many of you told me how our competition tries to use funding and allocation to restrict or block [...] your ability to market and sell Radeon based products in the manner you and your customers desire. I want to let you know that your voices have been heard and that I welcome any others who have encountered similar experiences to reach out to me..."
The report adds that Kyle Bennett of HardOCP, the author who broke the original GPP story, "says that Nvidia is beginning a disinformation campaign against him, claiming that he was paid handsomely for publishing the story."
Government

Palantir Knows Everything About You (bloomberg.com) 111

Palantir, a data-mining company created by Peter Thiel, is aiding government agencies in tracking American citizens in the name of the War on Terror, Bloomberg reports. From the report: The company's engineers and products don't do any spying themselves; they're more like a spy's brain, collecting and analyzing information that's fed in from the hands, eyes, nose, and ears. The software combs through disparate data sources -- financial documents, airline reservations, cellphone records, social media postings -- and searches for connections that human analysts might miss. It then presents the linkages in colorful, easy-to-interpret graphics that look like spider webs.

[...] The U.S. Department of Health and Human Services uses Palantir to detect Medicare fraud. The FBI uses it in criminal probes. The Department of Homeland Security deploys it to screen air travelers and keep tabs on immigrants. Police and sheriff's departments in New York, New Orleans, Chicago, and Los Angeles have also used it, frequently ensnaring in the digital dragnet people who aren't suspected of committing any crime.
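As a toy illustration of the kind of cross-source link analysis described above -- not Palantir's actual method, and with every record below invented -- joining records from different sources on shared identifiers is enough to surface connections a human reading one database at a time would miss:

```python
from collections import defaultdict

# Invented toy records from three hypothetical "sources"
records = [
    {"source": "airline", "person": "A. Smith", "phone": "555-0101"},
    {"source": "telecom", "person": "B. Jones", "phone": "555-0101"},
    {"source": "social", "person": "B. Jones", "handle": "@bjones"},
]

# Index every person by each identifier they appear with
by_key = defaultdict(set)
for r in records:
    for field, value in r.items():
        if field not in ("source", "person"):
            by_key[(field, value)].add(r["person"])

# Identifiers shared by more than one person are candidate links
links = {k: people for k, people in by_key.items() if len(people) > 1}
print(links)
```

Here the shared phone number links two people who never appear in the same source -- the "spider web" edges are just shared keys at scale.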

Operating Systems

'Fuchsia Is Not Linux': Google Publishes Documentation Explaining Their New OS (xda-developers.com) 245

An anonymous reader quotes a report from XDA Developers: You've probably seen mentions of the Fuchsia operating system here and there since it has been in development for almost 2 years. It's Google's not-so-secretive operating system which many speculate will eventually replace Android. We've seen it grow from a barely functional mock-up UI in an app form to a version that actually boots on existing hardware. We've seen how much importance Google places on the project as veteran Android project managers are starting to work on it. But after all of this time, we've never once had either an official announcement from Google about the project or any documentation about it -- all of the information thus far has come as a result of people digging into the source code.

Now, that appears to be changing as Google has published a documentation page called "The Book." The page aims to explain what Fuchsia, the "modular, capability-based operating system" is and is not. The most prominent text on that page is a large section explaining that Fuchsia is NOT Linux, in case that wasn't clear already. Above that are several readme pages explaining Fuchsia's file systems, boot sequence, core libraries, sandboxing, and more. The rest of the page has sections explaining what the Zircon micro-kernel is and how the framework, storage, networking, graphics, media, user interface, and more are implemented.

Graphics

Intel Reportedly Designing Arctic Sound Discrete GPU For Gaming, Pro Graphics (hothardware.com) 68

MojoKid shares a report from HotHardware: When AMD's former graphics boss Raja Koduri landed at Intel after taking a much-earned hiatus from the company, it was seen as a major coup for the Santa Clara chip outfit, one that seemed to signal that Intel might be targeting to compete in the discrete graphics card market. While nothing has been announced in that regard, some analysts are claiming that there will indeed be a gaming variant of Intel's upcoming discrete "Arctic Sound" GPU. According to reports, Intel originally planned to build Arctic Sound graphics chips mainly for video streaming chores and data center activities. However, claims are surfacing that the company has since decided to build out a gaming variant at the behest of Koduri, who wants to "enter the market with a bang." Certainly a gaming GPU that could compete with AMD and NVIDIA would accomplish that goal. Reportedly, Intel could pull together two different version of Arctic Sound. One would be an integrated chip package, like the Core i7-8809G (Kaby Lake-G) but with Intel's own discrete graphics, as well as a standalone chip that will end up in a traditional graphics cards. Likely both of those will have variants designed for gaming, just as AMD and NVIDIA build GPUs for professional use and gaming as well.
Media

Ask Slashdot: How Do You Stream/Capture Video? 155

datavirtue writes: I am starting to look at capturing and streaming video, specifically video games in 4K at 60 frames per second. I have a Windows 10 box with a 6GB GTX 1060 GPU and a modern AMD octa-core CPU recording with Nvidia ShadowPlay. This works flawlessly, even in 4K at 60 fps. ShadowPlay produces MP4 files which play nice locally but seem to take a long time to upload to YouTube -- a 15-minute 4K 60fps video took almost three hours. Which tools are you fellow Slashdotters using to create, edit, and upload video in the most efficient manner?
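One likely culprit is file size: ShadowPlay records 4K60 at a very high bitrate, so re-encoding to a smaller file before uploading can cut transfer time substantially. A minimal sketch of driving ffmpeg from Python follows; the filenames and CRF value are illustrative assumptions, and ffmpeg must be installed separately:

```python
import subprocess

def build_reencode_cmd(src, dst, crf=23, preset="slow"):
    """Build an ffmpeg command line that re-encodes a capture to H.264.

    Lower CRF means higher quality and a larger file; ~18-28 is the
    usual range. The audio track is copied through unchanged.
    """
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264", "-crf", str(crf), "-preset", preset,
        "-c:a", "copy",
        dst,
    ]

cmd = build_reencode_cmd("capture_4k60.mp4", "upload_ready.mp4")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually transcode
```

The tradeoff is local encode time against upload time; on a slow uplink the smaller file usually wins.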
Bitcoin

GPU Prices Soar as Bitcoin Miners Buy Up Hardware To Build Rigs (computerworld.com) 157

"Bitcoin and other cryptocurrency miners have created a dearth of mid-range and high-end GPU cards that are selling for twice as much as suggested retail," reports Computerworld. "The reason: miners are setting up server farms with the cards." Lucas123 writes: GPU prices have more than doubled in some cases... Some of the most popular GPUs can't even be found anymore as they've sold out due to demand. Meanwhile, some retailers are pushing back against bitcoin miners by showing favoritism to their traditional gamer customers, allowing them to purchase GPUs at manufacturer's suggested retail price. Earlier this year, NVIDIA asked retailers of its hardware to prioritize sales to gamers over cryptocurrency miners.
Graphics

Programmer Unveils OpenGL Bindings for Bash (opensource.com) 47

Slashdot reader silverdirk writes: Compiled languages have long provided access to the OpenGL API, and even most scripting languages have had OpenGL bindings for a decade or more. But, one significant language missing from the list is our old friend/nemesis Bash. But worry no longer! Now you can create your dazzling 3D visuals right from the comfort of your command line!
"You'll need a system with both Bash and OpenGL support to experience it firsthand," explains software engineer Michael Conrad, who created the first version 13 years ago as "the sixth in a series of 'Abuse of Technology' projects," after "having my technical sensibilities offended that someone had written a real-time video game in Perl.

"Back then, my primary language was C++, and I was studying OpenGL for video game purposes. I declared to my friends that the only thing worse would be if it had been 3D and written in Bash. Having said the idea out loud, it kept prodding me, and I eventually decided to give it a try to one-up the 'awfulness'..."
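One plausible way to give a shell-like language GL access -- sketched here in Python, and not necessarily how Conrad's project is actually implemented -- is to serialize draw calls as text lines and stream them to a rendering coprocess that owns the real OpenGL context:

```python
def gl_cmd(name, *args):
    """Serialize one GL-like call as a text line for a rendering coprocess."""
    return " ".join([name, *map(str, args)])

# A minimal "frame": clear, draw one triangle, swap buffers
frame = [
    gl_cmd("glClear", "GL_COLOR_BUFFER_BIT"),
    gl_cmd("glBegin", "GL_TRIANGLES"),
    gl_cmd("glVertex3f", -1.0, -1.0, 0.0),
    gl_cmd("glVertex3f", 1.0, -1.0, 0.0),
    gl_cmd("glVertex3f", 0.0, 1.0, 0.0),
    gl_cmd("glEnd"),
    gl_cmd("glutSwapBuffers"),
]
print("\n".join(frame))
```

A text protocol like this is exactly what a Bash script can produce with `echo` into a pipe, which is why the idea is feasible at all, however "awful."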
Anime

Animation Legend Isao Takahata, Co-founder of Studio Ghibli, Dies at 82 (nbcnews.com) 27

Isao Takahata, co-founder of the prestigious Japanese animation house Studio Ghibli, which stuck to a hand-drawn "manga" look in the face of digital filmmaking, has died. He was 82. From a report: Takahata started Ghibli with Oscar-winning animator Hayao Miyazaki in 1985, hoping to create Japan's Disney. He directed "Grave of the Fireflies," a tragic tale about wartime childhood, and produced some of the studio's films, including Miyazaki's 1984 "Nausicaa of the Valley of the Wind," which tells the horror of environmental disaster through a story about a princess. Takahata died Thursday of lung cancer at a Tokyo hospital, the studio said in a statement Friday.

He was fully aware of how the floating sumi-e brush sketches of faint pastel in his works stood as a stylistic challenge to Hollywood's computer-graphics cartoons. In a 2015 interview with The Associated Press, Takahata talked about how Edo-era woodblock-print artists like Hokusai had an understanding of Western-style perspective and the use of light, but purposely chose to depict reality with lines, and in a flat way, with minimal shading.
"Pom Poko," a movie released in 1994, is often considered Takahata's best work. The New York Times described it as "a comic allegory about battling packs of tanuki (Japanese raccoon dogs) joining forces to fight human real estate developers. It's earthy and rollicking in a way that his co-founder's films aren't." In a 2015 interview with Wired, Takahata was asked how he felt about people regarding him as the heart of Studio Ghibli. "Now you've both finished your final films, what are your feelings on Ghibli's legacy and reputation?" the interviewer asked. Takahata said, "I'm not sure I can respond in any meaningful way. What Hayao Miyazaki has built up is the greatest contribution. The existence of that thick trunk has allowed leaves to unfurl and flowers to bloom to become the fruitful tree that is Studio Ghibli."

Further reading: Isao Takahata's stark world of reality (The Japan Times).
Displays

Latest macOS Update Disables DisplayLink, Rendering Thousands of Monitors Dead (displaylink.com) 331

rh2600 writes: Four days ago, Apple's latest macOS 10.13.4 update broke DisplayLink protocol support (perhaps permanently), turning what may be hundreds of thousands of external monitors connected to MacBook Pros via DisplayLink into paperweights. Several days in, DisplayLink has yet to announce a solution, and, most worryingly, there are indications that this is a permanent change to macOS moving forward. Mac Rumors is reporting that "users of the popular Mac desktop extension app Duet Display are being advised not to update to macOS 10.13.4, due to 'critical bugs' that prevent the software from communicating with connected iOS devices used as extra displays." Users of other desktop extension apps like Air Display and iDisplay are also reporting incompatibility with the latest version of macOS.
Intel

Intel Unveils New Coffee Lake 8th Gen Core Line-Up With First Core i9 Mobile CPU (hothardware.com) 73

MojoKid writes: Intel is announcing a big update to its processor families today, with new 8th Gen Coffee Lake-based Core chips for both mobile and desktop platforms. On the mobile side of the equation, the most interesting processors are no doubt Intel's new six-core Coffee Lake parts, starting with the Core i7-8750H. This processor comes with base/max single-core turbo boost clocks of 2.2GHz and 4.2GHz respectively, while the Core i7-8850H bumps those clocks to 2.6GHz and 4.3GHz respectively. Both processors have six cores (12 threads), a TDP of 45 watts and 9MB of shared Smart Cache. However, the new flagship processor is without question the Intel Core i9-8950HK, which is the first Core i9-branded mobile processor. It retains the 6/12 (core/thread) count of the lower-end parts, but features base and turbo clocks of 2.9GHz and 4.8GHz respectively. The chip also comes unlocked since it caters to gaming enthusiasts, and bumps the amount of Smart Cache to 12MB. Intel is also announcing a number of lower-powered Coffee Lake-U series chips for thin and light notebooks, some of which have on-board Iris Plus integrated graphics with 128MB of on-chip eDRAM, along with some lower-powered six-core and quad-core desktop chips that support the company's Optane memory in Intel's new 300 series chipset platform.
Graphics

Ask Slashdot: Should CPU, GPU Name-Numbering Indicate Real World Performance? 184

dryriver writes: Anyone who has built a PC in recent years knows how confusing the letters and numbers that trail modern CPU and GPU names can be because they do not necessarily tell you how fast one electronic part is compared to another electronic part. A Zoomdaahl Core C-5 7780 is not necessarily faster than a Boomberg ElectronRipper V-6 6220 -- the number at the end, unlike a GFLOPS or TFLOPS number for example, tells you very little about the real-world performance of the part. It is not easy to create one unified, standardized performance benchmark that could change this. One part may be great for 3D gaming, a competing part may smoke the first part in a database server application, and a third part may compress 4K HEVC video 11% faster. So creating something like, say, a Standardized Real-World Application Performance Score (SRWAPS) and putting that score next to the part name, letters, or series number will probably never happen. A lot of competing companies would have to agree to a particular type of benchmark, make sure all benchmarking is done fairly and accurately, and so on and so forth.

But how are the average consumers just trying to buy the right home laptop or gaming PC for their kids supposed to cope with the "letters and numbers salad" that follows CPU, GPU and other computer part names? If you are computer literate, you can dive right into the different performance benchmarks for a certain part on a typical tech site that benchmarks parts. But what if you are "Computer Buyer Joe" or "Jane Average" and you just want to glean quickly which two products -- two budget priced laptops listed on Amazon.com for example -- have the better performance overall? Is there no way to create some kind of rough numeric indicator of real-world performance and put it into a product's specs for quick comparison?
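For what it's worth, a rough composite indicator is not hard to define; the standard trick in benchmarking is a weighted geometric mean of per-workload speedups over a reference part. A hypothetical SRWAPS-style score could be sketched like this -- all part names, workloads, and numbers below are invented:

```python
def composite_score(results, baseline, weights=None):
    """Weighted geometric mean of per-workload speedups vs. a baseline part.

    results/baseline: dicts of workload -> measured score (higher is better).
    Returns a single number where 100 means "same as the baseline part."
    """
    if weights is None:
        weights = {w: 1.0 for w in results}
    total = sum(weights.values())
    score = 1.0
    for w in results:
        score *= (results[w] / baseline[w]) ** (weights[w] / total)
    return round(100 * score)

# Invented numbers: part A is faster at gaming, slower at database work
baseline = {"gaming_fps": 60, "db_tps": 1000, "encode_fps": 30}
part_a = {"gaming_fps": 90, "db_tps": 900, "encode_fps": 33}
print(composite_score(part_a, baseline))
```

The geometric mean keeps one outlier workload from dominating the score, but the hard part the submitter identifies remains: getting every vendor to agree on the workloads and the weights.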
Cloud

Move Over Moore's Law, Make Way For Huang's Law (ieee.org) 55

Tekla Perry writes: Are graphics processors a law unto themselves? Nvidia's Jensen Huang says a 25-times speedup over five years is evidence that they are. He calls this the 'supercharged law,' and says it's time to start counting advances on multiple fronts, including architecture, interconnects, memory technology, and algorithms, not just circuits on a chip.
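The arithmetic behind the claim is simple to check. Assuming steady exponential growth, 25x over five years implies roughly a 1.9x speedup per year, versus about 1.41x per year for the classic "2x every two years" reading of Moore's Law:

```python
# Implied per-year speedups, assuming steady exponential growth
huang_annual = 25 ** (1 / 5)   # 25x over five years
moore_annual = 2 ** (1 / 2)    # the classic 2x every two years
print(f"Huang: {huang_annual:.2f}x/yr  Moore: {moore_annual:.2f}x/yr")
```

Note that Huang's own framing credits architecture, interconnects, memory, and algorithms jointly, so this is not a like-for-like comparison with a transistor-density law.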
Operating Systems

macOS 10.13.4 Enables Support for External GPU (engadget.com) 53

With the latest release of macOS High Sierra, Apple has officially delivered on a couple of items in the works since WWDC 2017 last June. macOS 10.13.4 brings the external GPU (eGPU) support that lets developers, VR users, gamers, and anyone else in need of some extra oomph plug in a more powerful graphics card via Thunderbolt 3. From a report: While that may not make every underpowered laptop VR ready, it certainly makes staying macOS-only more palatable for some power users. Another notable addition is Business Chat in Messages for users in the US. Twitter, Facebook, WhatsApp and others have tweaked their services to enable customer service linkups, and now Apple has its version available on the desktop. With it, you can interact with business representatives or even make purchases. Other tweaks include waiting for the user to select login fields before autofilling password information in Safari, a smoke cloud wallpaper that had previously been restricted to the iMac Pro, and a Safari shortcut for jumping to the rightmost tab by pressing Command-9. Further reading: Gizmodo.
AI

NVIDIA Unveils 2 Petaflop DGX-2 AI Supercomputer With 32GB Tesla V100, NVSwitch Tech 41

bigwophh writes from a report via HotHardware: NVIDIA CEO Jensen Huang took to the stage at GTC today to unveil a number of GPU-powered innovations for machine learning, including a new AI supercomputer and an updated version of the company's powerful Tesla V100 GPU that now sports a hefty 32GB of on-board HBM2 memory. A follow-on to last year's DGX-1 AI supercomputer, the new NVIDIA DGX-2 can be equipped with double the number of Tesla V100 processing modules for double the GPU horsepower. The DGX-2 can also have four times the available memory space, thanks to the updated Tesla V100's larger 32GB of memory. NVIDIA's new NVSwitch technology is a full crossbar GPU interconnect fabric that allows NVIDIA's platform to scale up to 16 GPUs and utilize their memory space contiguously, whereas the previous DGX-1 platform was limited to 8 total GPU complexes and associated memory. NVIDIA claims NVSwitch is five times faster than the fastest PCI Express switch and offers an aggregate 2.4TB per second of bandwidth. A new Quadro card was also announced. Called the Quadro GV100, it too is powered by Volta. The Quadro GV100 packs 32GB of memory and supports NVIDIA's recently announced RTX real-time ray tracing technology.
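A quick back-of-envelope check puts the claimed aggregate number in per-GPU terms, assuming traffic splits perfectly evenly across all 16 GPUs (a simplification, not a measured figure):

```python
# Claimed NVSwitch aggregate bandwidth, divided across a full 16-GPU DGX-2
aggregate_gb_s = 2.4 * 1000    # 2.4 TB/s expressed in GB/s
gpus = 16
per_gpu_gb_s = aggregate_gb_s / gpus
print(per_gpu_gb_s)            # GB/s per GPU under an even split
```

That works out to 150 GB/s per GPU of fabric bandwidth under this idealized split, which is why NVIDIA can describe the pooled memory as usable "contiguously."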
Graphics

Ask Slashdot: How Did Real-Time Ray Tracing Become Possible With Today's Technology? 145

dryriver writes: There are occasions where multiple big tech manufacturers all announce the exact same innovation at the same time -- e.g. 4K UHD TVs. Everybody in broadcasting and audiovisual content creation knew that 4K/8K UHD and high dynamic range (HDR) were coming years in advance, and that all the big TV and screen manufacturers were preparing 4K UHD HDR product lines because FHD was beginning to bore consumers. It came as no surprise when everybody had a 4K UHD product announcement and demo ready at the same time. Something very unusual happened this year at GDC 2018 however. Multiple graphics and GPU companies, like Microsoft, Nvidia, and AMD, as well as other game developers and game engine makers, all announced that real-time ray tracing is coming to their mass-market products, and by extension, to computer games, VR content and other realtime 3D applications.

Why is this odd? Because for many years any mention of 30+ FPS real-time ray tracing was thought to be utterly impossible with today's hardware technology. It was deemed far too computationally intensive for today's GPU technology and far too expensive for anything mass market. Gamers weren't screaming for the technology. Technologists didn't think it was doable at this point in time. Raster 3D graphics -- what we have in DirectX, OpenGL and game consoles today -- was very, very profitable and could easily have evolved further the way it has for another 7 to 8 years. And suddenly there it was: everybody announced at the same time that real-time ray tracing is not only technically possible, but also coming to your home gaming PC much sooner than anybody thought. Working tech demos were shown. What happened? How did real-time ray tracing, which only a few 3D graphics nerds and researchers in the field talked about until recently, suddenly become so technically possible, economically feasible, and so guaranteed-to-be-profitable that everybody announced this year that they are doing it?
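Part of the answer is that the core of ray tracing has never been mysterious, only expensive at scale: each ray is a small intersection problem repeated billions of times per second. The classic ray-sphere test, for example, is just a quadratic equation. A minimal sketch -- not any vendor's implementation -- for a normalized ray direction:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance t along a normalized ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t
    (the t^2 coefficient is 1 because direction is normalized).
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2       # nearer of the two roots
    return t if t > 0 else None          # hits behind the origin don't count

# Ray from the origin straight down +z toward a unit sphere at z=5
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))
```

What changed is not this math but the hardware and algorithms around it: acceleration structures, denoising that lets scenes converge with far fewer rays per pixel, and GPUs fast enough to make the remaining ray budget fit in a frame.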
