The Tokyo Institute of Technology has announced plans to launch Japan's "fastest AI supercomputer" this summer. The supercomputer is called Tsubame 3.0 and will use Nvidia's latest Pascal-based Tesla P100 GPU accelerators to double its performance over its predecessor, the Tsubame 2.5. Slashdot reader kipperstem77 shares an excerpt from a report via The Next Platform: With all of those CPUs and GPUs, Tsubame 3.0 will have 12.15 petaflops of peak double precision performance, and is rated at 24.3 petaflops single precision and, importantly, is rated at 47.2 petaflops at the half precision that is important for neural networks employed in deep learning applications. When added to the existing Tsubame 2.5 machine and the experimental immersion-cooled Tsubame-KFC system, TiTech will have a total of 6,720 GPUs to bring to bear on workloads, adding up to a total of 64.3 aggregate petaflops at half precision. (This is interesting to us because that means Nvidia has worked with TiTech to get half precision working on Kepler GPUs, which did not formally support half precision.)
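The quoted peak figures follow the usual Pascal-era pattern in which each halving of precision roughly doubles peak throughput. A quick sanity check of the numbers above:

```python
# Sanity check of the quoted Tsubame 3.0 peak figures: each halving of
# floating-point precision roughly doubles peak throughput.
dp = 12.15    # peak petaflops, double precision (FP64)
sp = 24.3     # single precision (FP32)
half = 47.2   # half precision (FP16), as rated

assert abs(sp - 2 * dp) < 1e-9       # FP32 is exactly 2x FP64 here
assert abs(half / sp - 2) < 0.1      # FP16 is roughly 2x FP32 (47.2 vs 48.6)
```

The half-precision figure lands a bit under a clean doubling of the single-precision number, which is consistent with it being a system-level rating rather than a per-chip theoretical peak.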
Brian Fagioli, writing for BetaNews: System76 is refreshing three of its laptops with some high-end parts. The Oryx Pro, Serval WS, and Bonobo WS are now all equipped with 7th generation Intel Kaby Lake processors. In addition, all three can be had with 4K displays and NVIDIA GTX 10 series graphics too. While the Oryx Pro already had the option of 4K and GTX 10, it is the 7th gen Intel chips that are new to it. In fact, all of the company's laptops now come with Kaby Lake standard. The computer seller throws some shade at Apple by saying, "The HiDPI displays that ship on the laptops have 3.1 million more pixels than Apple's 'Retina' displays, enabling sharper text, 4K video, and higher res gaming. Beyond that, the displays give video and photo professionals the ability to work more easily with higher resolution multimedia."
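The "3.1 million more pixels" claim checks out arithmetically, assuming the comparison is a UHD 4K panel against the 15-inch MacBook Pro's Retina panel (the specific Retina model is our assumption, not stated in the quote):

```python
# Back-of-envelope check of System76's "3.1 million more pixels" claim,
# assuming UHD 4K versus the 15-inch MacBook Pro Retina resolution.
uhd = 3840 * 2160        # 8,294,400 pixels
retina_15 = 2880 * 1800  # 5,184,000 pixels
extra = uhd - retina_15  # 3,110,400 -- about 3.1 million
```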
Here's another area where Microsoft, whose cloud services are doing very well, continues to make a lot of money: video games. Microsoft has minted its fifth billion-dollar video-game franchise. The "Forza" racing series in December topped $1 billion in lifetime sales since the first game's release 12 years ago, Microsoft said. From the company's blog post: As of December, more than 14 million unique players were involved in the Forza community on Xbox One and Windows 10, the award-winning Forza Horizon 3 sold through 2.5 million units, and Forza continued its run as the best-selling racing franchise of this console generation. Additionally, our online racing community expanded significantly: over three million players joined us online each month and we launched the Forza Racing Championship, an eSports league for players of all skill levels to compete for glory and real-world prizes. "Since the beginning, Forza has combined stunning graphics, racing's leading simulation engine, and an emphasis on fun and accessibility," said Phil Spencer, head of Xbox. "With the Forza series, Turn 10 Studios has built the world's largest racing community. We couldn't be more proud of their success." Other game franchises in Microsoft's billion-dollar club are "Halo," "Minecraft," "Gears of War," and "Age of Empires."
schwit1 quotes a report from IEEE Spectrum: Ever since the 2015 Consumer Electronics Show, quantum dots have been in a market struggle to displace light-emitting diodes (LEDs) as a backlight source for liquid crystal displays (LCDs). Now an advance by a team of researchers from the University of Illinois at Urbana-Champaign, the Electronics and Telecommunications Research Institute in South Korea, and Dow Chemical may turn the display market on its head by eliminating the need for backlights in LCD devices. They have produced an LED pixel out of nanorods capable of both emitting and detecting light. In research described in the journal Science, the international team of researchers mixed three types of semiconductors to produce engineered nanorods. "The nanorods contain three different semiconductor materials," explains Shim. "The first semiconductor, which is attached at the tips of the nanorod, is the quantum dot that emits and absorbs visible light." The other two semiconductors are the main body of the rod and the shell around the quantum dot. These components facilitate and control the flow of electrons (negative charges) and holes (positive charges) to and from the quantum dot. The semiconductor materials in the rod and the shell each have a band gap in which no electron states can exist, as well as band alignment. With these two semiconductors in contact with the quantum dot, the nanorods are extremely efficient at both emitting and detecting light.
Mikey Campbell, writing for AppleInsider: Apple's WebKit team on Tuesday proposed a new Community Group at the World Wide Web Consortium (W3C) that will focus on developing a new standard API, perhaps based on Metal, for accelerating GPU-based 3D graphics and general computation for the web. Announced through Apple's WebKit blog, the new 'GPU for the Web' Community Group will discuss a potential next-generation web graphics API that can better leverage modern GPUs. Along with 3D content, Apple proposes GPU architecture might also be used to accelerate general web computations. As noted by Dean Jackson from the WebKit team, advancements in the GPU hardware space have led to corresponding enhancements in software APIs. He cites platform technologies like Apple's Metal, Microsoft's Direct3D 12 and the Khronos Group's Vulkan as offering lower overhead, and thus better performance, than the OpenGL standard. Unfortunately, the new graphics APIs contain nuanced architectural differences and are not available across all platforms, making them unsuitable for wide implementation on the web.
Late last month, we learned that LG's UltraFine 5K Display, which was designed in part by Apple to work with the new MacBook Pro and as a replacement for the Thunderbolt Display, would flicker, disconnect, or freeze computers if placed within two meters of a router. The company has acknowledged the issue, and says it will add enhanced shielding to its 5K monitors to prevent interference with nearby wireless routers in the upcoming batch. From a report: "LG apologizes for this inconvenience and is committed to delivering the best quality products possible, so all LG UltraFine 27-inch 5K displays manufactured after February 2017 will be fitted with enhanced shielding," the company said in an email. Existing models can be retrofitted with the enhanced shielding, which will allow the monitor to be placed near a router.
Nvidia is putting a stop to the resale of bundled promotional game keys by tying them to a specific graphics card purchase, according to Ars Technica. Users will now have to redeem codes via the GeForce Experience (GFE) app, which is directly linked to third-party services like Steam and Uplay. Users must also ensure that the requisite graphics card is "installed before redemption." GFE then performs "a hardware verification step to ensure the coupon code is redeemed on the system with the qualifying GPU." From the report: Previously, retailers sent promotional game codes to customers who purchased a qualifying product. Those codes could then be redeemed on Nvidia's website, which spit out the relevant Steam, Uplay, Origin, or Microsoft Store key. Since the promotional game codes were not tied to a specific account, many users took to either gifting spare keys to friends or selling them on eBay in order to offset the cost of the graphics card purchase. [Ars Technica has updated their report with additional information:] Nvidia has confirmed that while GFE checks to ensure a user has installed a qualifying graphics card like a GTX 1070 or GTX 1080, the game itself is not permanently linked to the hardware. GFE's hardware check is based only on the wider product range, and not on a specific serial number. The company has also confirmed that the redemption process permanently adds the game to the appropriate third-party service. For example, if a user redeems a promotional game key through Steam, that game will be usable on any other device, just like normal Steam games. Users can also opt to uninstall GFE, or install a different graphics card, once the promotional code has been redeemed and still retain full ownership of the game. A full set of instructions for redeeming codes is now available on Nvidia's website.
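The flow Ars describes can be sketched as follows. This is a hypothetical illustration of the logic, not Nvidia's actual code: the names, structure, and qualifying products are assumptions. The key points are that the check matches on product range rather than serial number, and that a successful redemption is a permanent grant.

```python
# Hypothetical sketch of the GFE redemption flow described above.
# The check matches on product range (e.g. "GTX 1070"), not on a
# specific card's serial number, and a successful redemption adds
# the game to the third-party account permanently.

QUALIFYING_RANGES = {"GTX 1070", "GTX 1080"}   # illustrative, not exhaustive

def can_redeem(installed_gpu_name: str) -> bool:
    """True if any qualifying product range appears in the GPU name."""
    return any(r in installed_gpu_name for r in QUALIFYING_RANGES)

def redeem(code: str, installed_gpu_name: str, steam_library: set) -> bool:
    """Redeem a promo code: hardware check first, then a permanent grant."""
    if not can_redeem(installed_gpu_name):
        return False
    # Once granted, the game stays in the account's library for good --
    # uninstalling GFE or swapping the card later does not revoke it.
    steam_library.add(code)
    return True
```

Under this model, a key redeemed on a system with a GTX 1080 ends up as an ordinary library entry, which is why the game remains playable on other devices afterward.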
The LG UltraFine 5K Display was designed in part by Apple to work with the new MacBook Pro and as a replacement for the Thunderbolt Display, which was discontinued late last year. According to 9to5Mac, the display apparently wasn't designed to work next to routers as it will flicker, disconnect, or freeze computers when it's within two meters of a router due to electromagnetic interference. The Verge reports: In emails to 9to5Mac, LG acknowledged the problem -- which LG says isn't an issue for any of its other monitors -- noting that routers "may affect the performance of the monitor" and that users should "have the router placed at least two meters away from the monitor" to avoid issues. Once the monitor was moved into a different room away from the router, 9to5Mac says the issues subsided. Despite the fact that it's insane to require a router to be far away from what is likely the main computer in your home, there's been no indication that LG is working on a fix for the issue, which may prove more troublesome.
An anonymous reader quotes a report from Softpedia: It's finally here! After so many months of development and hard work, during which over 6,600 bugs were patched, the Wine project is happy to announce today, January 24, 2017, the general availability of Wine 2.0. Wine 2.0 is the biggest and most complete version of the open-source software project that allows Linux and macOS users to run applications and games designed only for Microsoft Windows operating systems. As expected, it's a massive release that includes dozens of improvements and new features, starting with support for Microsoft Office 2013 and 64-bit application support on macOS. Highlights of Wine 2.0 include the implementation of more DirectWrite features, such as drawing of underlines, font fallback support, and improvements to font metrics resolution, font embedding in PDF files, Unicode 9.0.0 support, Retina rendering mode for the macOS graphics driver, and support for gradients in GDI enhanced metafiles. Additional Shader Model 4 and 5 shader instructions have been added to the Direct3D 10 and Direct3D 11 implementation, along with support for more graphics cards, support for Direct3D 11 feature levels, full support for the D3DX (Direct3D Extension) 9 effect framework, as well as support for the GStreamer 1.0 multimedia framework. The Gecko engine was updated to Firefox 47, IDN name resolutions are now supported out-of-the-box, and Wine can correctly handle long URLs. The included Mono engine now offers 64-bit support, as well as the debug registers. Other than that, the winebrowser, winhlp32, wineconsole, and reg components received improvements. You can read the full list of features and download Wine 2.0 from WineHQ's website.
Reader Krystalo writes: Mozilla today launched Firefox 51 for Windows, Mac, Linux, and Android. The new version includes a new warning for websites which collect passwords but don't use HTTPS, WebGL 2 support for better 3D graphics, and FLAC (Free Lossless Audio Codec) playback. Mozilla doesn't break out the exact numbers for Firefox, though the company does say "half a billion people around the world" use the browser. In other words, it's a major platform that web developers target -- even in a world increasingly dominated by mobile apps.
AppleInsider has obtained a note to investors from KGI analyst Ming-Chi Kuo that says Apple's 2017 laptop line will focus on internal component updates, including the platform-wide adoption of Intel's Kaby Lake architecture. What's more, Apple is expected to manufacture a 15-inch MacBook Pro with up to 32GB of RAM in the fourth quarter of 2017. AppleInsider reports: Apple took flak in releasing its latest MacBook Pro with Touch Bar models with a hard memory cap of 16GB, a minimal allotment viewed as a negative for imaging and video professionals. Responding to customer criticism, Apple said the move was made in a bid to maximize battery life. Essentially, the Intel Skylake CPUs used in Apple's MacBook Pro only support up to 16GB of LPDDR3 RAM at 2133MHz. Though Intel does make processors capable of addressing more than 16GB of memory, those particular chipsets rely on less efficient DDR4 RAM and are usually deployed in desktops with access to dedicated mains power. In order to achieve high memory allotments and keep unplugged battery life performance on par with existing MacBook Pro models, Apple will need to move to an emerging memory technology like LPDDR4 or DDR4L. Such hardware is on track for release later this year. As for the 12-inch MacBook, Kuo believes next-generation versions of the thin-and-light will enter mass production in the second quarter with the same basic design aesthetic introduced in 2015. New for 2017 is a 16GB memory option that will make an appearance thanks to Intel's new processor class.
An anonymous reader quotes a report from Ars Technica: The Raspberry Pi Compute Module is getting a big upgrade, with the same processor used in the recently released Raspberry Pi 3. The Compute Module, which is intended for industrial applications, was first released in April 2014 with the same CPU as the first-generation Raspberry Pi. The upgrade announced today has 1GB of RAM and a Broadcom BCM2837 processor that can run at up to 1.2GHz. "This means it provides twice the RAM and roughly ten times the CPU performance of the original Compute Module," the Raspberry Pi Foundation announcement said. This is the second major version of the Compute Module, but it's being called the "Compute Module 3" to match the last flagship Pi's version number. The new Compute Module has more flexible storage options than the original. "One issue with the [Compute Module 1] was the fixed 4GB of eMMC flash storage," the announcement said. But some users wanted to add their own flash storage. "To solve this, two versions of the [Compute Module 3] are being released: one with 4GB eMMC on-board and a 'Lite' model which requires the user to add their own SD card socket or eMMC flash." The core module is tiny so that it can fit into other hardware, but for development purposes there is a separate I/O board with GPIO, USB and MicroUSB, CSI and DSI ports for camera and display boards, HDMI, and MicroSD. The Compute Module 3 and the lite version cost $30 and $25, respectively.
An anonymous reader quotes a report from Apple Insider: Apple on Tuesday was granted a patent detailing technology that allows for ear speakers, cameras and even a heads-up display to hide behind an edge-to-edge screen, a design rumored to debut in a next-generation iPhone later this year. Awarded by the U.S. Patent and Trademark Office, Apple's U.S. Patent No. 9,543,364 for "Electronic devices having displays with openings" describes a method by which various components can be mounted behind perforations in a device screen that are so small as to be imperceptible to the human eye. This arrangement would allow engineers to design a smartphone or tablet with a true edge-to-edge, or "full face," display. With smartphones becoming increasingly compact, there has been a push to move essential components behind the active -- or light-emitting -- area of incorporated displays. Apple in its patent suggests mounting sensors and other equipment behind a series of openings, or through-holes, in the active portion of an OLED or similar panel. These openings might be left empty or, if desired, filled with glass, polymers, radio-transparent ceramic or other suitable material. Positioning sensor inputs directly in line with said openings facilitates the gathering of light, radio waves and acoustic signals. Microphones, cameras, antennas, light sensors and other equipment would therefore have unimpeded access beyond the display layer. The design also accommodates larger structures like iPhone's home button. According to the document, openings are formed between pixels, suggesting a self-illuminating display technology like OLED is preferred over traditional LCD structures that require backlight and filter layers. Hole groupings can be arranged in various shapes depending on the application, and might be larger or smaller than the underlying component. If implemented into a future iPhone, the window-based HUD could be Apple's first foray into augmented reality.
Apple leaves the mechanics unmentioned, but the system could theoretically go beyond AR and into mixed reality applications.
MojoKid writes: AMD has a lot riding on Ryzen, its new generation CPU architecture that is supposed to return the chip designer to a competitive position versus Intel in the high-end desktop X86 processor market. Late last week at CES 2017, AMD lined up over a dozen high-performance AM4 motherboards from five hardware partners, including ASRock, ASUS, Biostar, Gigabyte, and MSI. All AM4 motherboards are built around one of two desktop chipsets for Ryzen, the AMD X370 or X300. Motherboards based on the X370 chipset are intended for power users and gamers. These boards bring more robust overclocking controls and support for dual graphics cards, along with more I/O connectivity and dual-channel DDR4 memory support. The X300 is AMD's chipset for mini-ITX motherboards for small form factor (SFF) system platforms. The X300 also supports dual-channel DDR4 memory, PCIe 3.0, M.2 SATA devices, NVMe, and USB 3.1 Gen 1 and Gen 2. Finally, AMD representatives on hand at CES also reported that all Ryzen processors will be multiplier unlocked, hopefully for some rather flexible overclocking options. There will also be several processors in the family, with varying core counts depending on SKU, at launch.
MojoKid writes: Over the past couple of years, Dell has been driving a redesign effort of its consumer and commercial product lines and has systematically been updating both design signatures and the technology platforms within them. Dell's premium consumer XPS product line, perhaps more so than any other, has seen the most significant design reinvention with the likes of its XPS 13 and XPS 15 notebook line. At CES 2017, Dell announced the XPS 27 7760 all-in-one PC that has a radically new look that draws at least one design cue from its XPS notebook siblings, specifically with respect to the display bezel, or the lack thereof. Though Dell isn't officially branding the touch-enabled version of the XPS 27 with an "InfinityEdge" display, the side and top bezel is cut to a minimum, accentuating a beautiful 4K IPS panel. However, the machine's display might not be the most standout feature of the 2017 Dell XPS 27. Under that display, Dell actually expanded things mechanically to make room not only for a Windows Hello capable camera but also for a 10-speaker sound system designed in conjunction with Grammy Award-winning music producer and audio engineer JJ Puig, which takes the system's audio reproduction and output capabilities to a whole new level. Its sound system is very accurate, with dual 50-watt amplifiers at less than 1% THD (Total Harmonic Distortion) and a 70Hz to 20kHz frequency response. Though the system is currently built on Intel's Skylake platform, Kaby Lake versions are imminent, and with discrete AMD Radeon R9 M470X graphics, it has decent gaming and multimedia chops as well.
MojoKid writes: AMD lifted the veil on its next generation GPU architecture, codenamed Vega, this morning. One of the underlying forces behind Vega's design is that conventional GPU architectures have not been scaling well for diverse data types. Gaming and graphics workloads have shown steady progress, but today's GPUs are used for much more than just graphics. In addition, the compute capability of GPUs may have been increasing at a good pace, but memory capacity has not kept up. Vega aims to improve both compute performance and addressable memory capacity, however, through some new technologies not available on any previous-gen architecture. First, Vega has the most scalable GPU memory architecture built to date with 512TB of address space. It also has a new geometry pipeline tuned for more performance and better efficiency with over 2X peak throughput per clock, a new Compute Unit design, and a revamped pixel engine. The pixel engine features a new draw stream binning rasterizer (DSBR), which reportedly improves performance and saves power. All told, Vega should offer significant improvements in terms of performance and efficiency when products based on the architecture begin shipping in a few months.
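The 512TB figure corresponds to a 49-bit virtual address space, which is easy to verify:

```python
# The quoted 512TB address space corresponds to 49 bits of addressing:
# 2^49 bytes = 512 TiB.
bytes_addressable = 2 ** 49
tib = bytes_addressable / 2 ** 40   # convert bytes to binary terabytes
assert tib == 512.0
```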
When a technology company like Apple releases a new product, chances are it's going to be thinner than its predecessor -- even if it may be slightly worse off for it. HP is taking a different approach with its new 15.6-inch Spectre x360 laptop, which was recently announced at CES. The machine is slightly thicker than its predecessor, and HP claims it features three hours of additional battery life. The Verge reports: The difference between the new x360 and the old x360, in terms of thickness, is minimal, from 15.9mm to 17.8mm. (For reference, the 2015 MacBook Pro was 18mm thick.) It's an increase of 1.9mm for the Spectre, but HP says it's now including a battery that's 23 percent larger in exchange. At the same time, the laptop is also getting narrower, with its body shrinking from 14.8 inches wide to 14 inches wide. Unfortunately, the claimed three hours of additional battery life aren't meant to make this laptop into some long-lasting wonder -- they're really just meant to normalize its battery life. HP will only be selling the 15.6-inch x360 with a 4K display this year, and that requires a lot more power. By increasing the laptop's battery capacity, HP is able to push the machine's battery life from the 9.5 hours it estimated for the 4K version of its 2016 model to about 12 hours and 45 minutes for this model. So it is adding three hours of battery life, but in doing so, it's merely matching the battery life of last year's 1080p model. The x360 is also being updated to include Intel's Kaby Lake processors. It includes options that max out at an i7 processor, 16GB of RAM, a 1TB SSD, and Nvidia GeForce 940MX graphics. It's supposed to be released February 26th, with pricing starting at $1,278 for an entry-level model.
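The battery-life arithmetic above holds up: the jump from the 2016 4K model's estimate to the new figure is a bit over the "three hours" HP claims.

```python
# Checking the battery-life numbers quoted above: the gain over the
# 2016 4K model is about three and a quarter hours.
old_hours = 9.5            # HP's estimate for the 4K version of the 2016 model
new_hours = 12 + 45 / 60   # 12 hours 45 minutes for the new model
gain = new_hours - old_hours
assert abs(gain - 3.25) < 1e-9
```

Note that the gain exceeds the 23 percent capacity increase alone (9.5 × 1.23 ≈ 11.7 hours), so the Kaby Lake efficiency improvements presumably account for the rest.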
Qualcomm has detailed the Snapdragon 835 processor, which will power most of the leading Android smartphones this year. It's designed to grab information from the air at gigabit speeds and turn it into rich virtual and augmented reality experiences, according to several executives at a pre-CES briefing. Qualcomm SVP Keith Kressin said, "The 835 is going to be one of the key devices that propels the VR use case." PC Magazine reports: The hardest thing to understand about the Snapdragon 835, especially if you're thinking from a desktop CPU space, is how much Qualcomm has been prioritizing elements of the system-on-chip other than the CPU. This has been coming for years, and it can be tricky because it relies on firmware and the Android OS to properly distribute work to non-CPU components of the chip. During the briefing, it was striking how little Qualcomm talked about its Kryo 280 CPU, as compared to other components. Qualcomm tries to counter that by pointing out that this is the first 10nm mobile processor, which will improve efficiency, and also by saying the CPU is "tightly integrated" with other components using the new Symphony system manager, which operates automatically yet can be customized by application developers. This distributes work across the CPU, GPU, DSP, and more exotic components, letting the Snapdragon 835 work better than it would with CPU alone. How that will combine with Qualcomm's recent announcement that it will support Windows 10 on mobile PCs, including legacy Win32 apps, is yet to be seen. The Snapdragon 835 consumes 25 percent less power than the 820, according to Qualcomm. That means seven hours of 4K streaming video and two hours of VR gaming on a typical device, the company said. These new uses are really power hungry. Since Qualcomm can only do so much on power efficiency, it's also introducing Quick Charge 4, which supposedly charges a phone to five hours of use in five minutes and is USB-C power delivery compliant. 
The new Adreno 540 graphics chip improves 3D performance by 25 percent over the previous generation, Qualcomm said. But it also enables features like HDR10, which improves colors; foveated rendering, which most clearly renders what you're looking at rather than elements in the periphery of a scene; and low latency, which allows you to move your head smoothly around VR scenes. With one 32MP or two 16MP cameras running at the same time, the Snapdragon 835 supports various dual-camera functions. The Snapdragon 835 will feature the X16 modem, which Qualcomm announced last year and will be able to boost LTE to gigabit speeds. The keys to gigabit LTE are triple 20MHz carrier aggregation with 256-QAM encoding and 4x4 MIMO antennas, said Qualcomm's senior director of marketing, Peter Carson. That's going to be first introduced with a Netgear hotspot in Australia this January, but Sprint and T-Mobile have said they're trying to assemble this set of technologies.
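A rough back-of-envelope calculation shows why that recipe reaches gigabit territory. This assumes the standard LTE resource grid (100 resource blocks of 12 subcarriers per 20MHz carrier, 14 OFDM symbols per 1ms subframe) and ignores control and coding overhead, so it is an upper bound rather than a real-world figure:

```python
# Rough peak-rate arithmetic for the gigabit LTE recipe quoted above
# (3x20MHz carrier aggregation, 256-QAM, 4x4 MIMO), ignoring control
# and coding overhead.
subcarriers = 100 * 12        # 100 resource blocks x 12 subcarriers per 20MHz
symbols_per_sec = 14 * 1000   # 14 OFDM symbols per 1 ms subframe
bits_per_symbol = 8           # 256-QAM carries 8 bits per symbol
layers = 4                    # 4x4 MIMO spatial layers
carriers = 3                  # three aggregated 20MHz carriers

raw_mbps = (subcarriers * symbols_per_sec * bits_per_symbol
            * layers * carriers) / 1e6
# raw_mbps is 1612.8; after typical overhead this lands around a gigabit
```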
Reader joshtops writes: Ars Technica has reviewed the much-anticipated Intel Core i7-7700K Kaby Lake, the recently launched desktop processor from the giant chipmaker. And it's anything but a good sign for enthusiasts who were hoping to see significant improvements in performance. From the review, "The Intel Core i7-7700K is what happens when a chip company stops trying. The i7-7700K is the first desktop Intel chip in a brave new post-"tick-tock" world -- which means that instead of major improvements to architecture, process, and instructions per clock (IPC), we get slightly higher clock speeds and a way to decode DRM-laden 4K streaming video. [...] If you're still rocking an older Ivy Bridge or Haswell processor and weren't convinced to upgrade to Skylake, there's little reason to upgrade to Kaby Lake. Even Sandy Bridge users may want to consider other upgrades first, such as a new SSD or graphics card. The first Sandy Bridge parts were released six years ago, in January 2011. [...] As it stands, what we have with Kaby Lake desktop is effectively Sandy Bridge polished to within an inch of its life, a once-groundbreaking CPU architecture hacked, and tweaked, and mangled into ever smaller manufacturing processes and power envelopes. Where the next major leap in desktop computing power comes from is still up for debate -- but if Kaby Lake is any indication, it won't be coming from Intel. While Ars Technica has complained about the minimal upgrades, AnandTech looks at the positive side: The Core i7-7700K sits at the top of the stack, and performs like it. A number of enthusiasts complained when Intel launched the Skylake Core i7-6700K with a 4.0/4.2 GHz rating, as this was below the 4.0/4.4 GHz rating of the older Core i7-4790K. At this level, 200-400 MHz has been roughly the difference of a generational IPC upgrade, so users ended up with similar performing chips and the difference was more in the overclocking.
However, given the Core i7-7700K comes out of the box with a 4.2/4.5 GHz arrangement, and support for Speed Shift v2, it handily mops the floor with the Devil's Canyon part, consigning it to history.
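The out-of-box clock bump AnandTech refers to works out to a 5-7 percent gain over the Skylake part, which is in the same ballpark as a generational IPC improvement per the review's own framing:

```python
# Clock-speed comparison between the i7-6700K and i7-7700K out of the
# box, using the base/turbo ratings quoted above.
skylake_6700k = (4.0, 4.2)    # base / turbo GHz
kabylake_7700k = (4.2, 4.5)

base_gain = (kabylake_7700k[0] / skylake_6700k[0] - 1) * 100    # 5.0 percent
turbo_gain = (kabylake_7700k[1] / skylake_6700k[1] - 1) * 100   # ~7.1 percent
```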
AMD announced Tuesday it is introducing Radeon FreeSync 2, a new display technology that will enable monitors to show the exact intended image pixels that a game or other application wants to display. The result will be better image quality for gamers, according to AMD. From a report on VentureBeat: With the FreeSync 2 specification, monitor makers will be able to create higher-quality monitors that build on the two-year-old FreeSync technology. Sunnyvale, Calif.-based AMD is on a quest for "pixel perfection," said David Glen, senior fellow at AMD, in a press briefing. With FreeSync 2, you won't have to mess with your monitor's settings to get the perfect setting for your game, Glen said. It will be plug-and-play, deliver brilliant pixels that have twice as much color gamut and brightness over other monitors, and have low-latency performance for high-speed games. AMD's FreeSync technology and Nvidia's rival G-Sync allow a graphics card to adjust the monitor's refresh rate on the fly, matching it to the computer's frame rate. This synchronization prevents the screen-tearing effect -- with visibly mismatched graphics on different parts of the screen -- which happens when the refresh rate of the display is out of sync with the computer.
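The refresh-matching idea behind FreeSync and G-Sync can be sketched in a few lines. This is a simplified model of the behavior described above, not AMD's implementation; the function name and the 40-144Hz panel range are illustrative assumptions:

```python
# Minimal sketch of variable refresh: the display's refresh rate tracks
# the GPU's frame rate, clamped to the panel's supported range. A fixed
# 60Hz display instead refreshes mid-frame when frame times drift,
# producing the tearing described above.

def next_refresh_hz(frame_time_ms: float, min_hz: float = 40.0,
                    max_hz: float = 144.0) -> float:
    """Match the refresh rate to the frame rate, within the panel's range."""
    frame_hz = 1000.0 / frame_time_ms
    return max(min_hz, min(max_hz, frame_hz))
```

For example, a 20ms frame (50fps) would be shown at 50Hz rather than forcing the GPU to either tear or wait for the next fixed 60Hz refresh; frame rates beyond the panel's range clamp to its minimum or maximum.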