Power

California Is Giving Away Free Solar Panels To Its Poorest Residents 21

Posted by samzenpus
from the you-get-a-panel-and-you-get-a-panel-and-you-get.... dept.
MikeChino writes: Oakland-based non-profit GRID Alternatives is giving away 1,600 free solar panels to California's poorest residents by 2016. The initiative was introduced by Senator Kevin de León and launched with funds gathered under the Greenhouse Gas Reduction Fund (GGRF), the state's cap-and-trade program. SFGate reports: "Kianté London used the program to put panels on his three-bedroom North Richmond home, which he shares with two sons and a daughter. 'It helps me and my family a great deal to have low-cost energy, because these energy prices are really expensive,' said London, 46, whose solar array was installed this week. 'And I wanted to do my part. It’s clean, green energy.' London had wanted a solar array for years, but couldn’t afford it on his income as a merchant seaman — roughly $70,000 per year. Even leasing programs offered by such companies as SolarCity and Sunrun were too expensive, he said. The new program, in contrast, paid the entire up-front cost of his array."
Earth

Heat Wave Kills More Than 1,100 In India 125

Posted by Soulskill
from the rest-in-peace dept.
An anonymous reader sends word that a week-long heat wave in India has resulted in the deaths of more than 1,100 people. Temperatures reached 47C (117F) on Monday and are expected to stay dangerously high throughout the week. The heat and extreme dryness are being accompanied by strong westerly winds. "About one-third of the country's 1.2 billion people have access to electricity, meaning millions are enduring the blistering heat without relief." The local power grid has been struggling under high demand from fans and air conditioning. In some states, citizens are being advised to stay indoors during the middle of the day, when the sun is at its peak. Many hope the upcoming monsoons will return temperatures to less dangerous levels.
Programming

Google's Diversity Chief: Mamas Don't Let Their Baby Girls Grow Up To Be Coders 427

Posted by samzenpus
from the starts-at-home dept.
theodp writes: Explaining the reasons for its less-than-diverse tech workforce, Google fingered bad parenting for its lack of women techies. From the interview with Google Director of Diversity and Inclusion Nancy Lee: "Q. What explains the drop [since 1984] in women studying computer science? A. We commissioned original research that revealed it's primarily parents' encouragement, and perception and access. Parents don't see their young girls as wanting to pursue computer science and don't steer them in that direction. There's this perception that coding and computer science is ... a 'brogrammer' culture for boys, for games, for competition. There hasn't been enough emphasis on the power computing has in achieving social impact. That's what girls are interested in. They want to do things that matter." While scant on details, the Google study's charts appear to show that, overall, fathers encourage young women to study CS more than mothers. Google feels that reeducation is necessary. "Outreach programs," advises Google, "should include a parent education component, so that parents learn how to actively encourage their daughters."
Transportation

GM's Exec. Chief Engineer For Electric Vehicles Pam Fletcher Answers Your Questions 107

Posted by samzenpus
from the read-all-about-it dept.
Pam Fletcher was propulsion system chief engineer on the first Chevrolet Volt plug-in hybrid and is now executive chief engineer for electrified vehicles at GM, overseeing electrified vehicles company-wide. A while ago you had a chance to ask about her work and the future of electric cars. Below you'll find her answers to your questions.
Java

The Reason For Java's Staying Power: It's Easy To Read 413

Posted by samzenpus
from the easy-on-the-eyes dept.
jfruh writes: Java made its public debut twenty years ago today, and despite a sometimes bumpy history that includes its parent company, Sun Microsystems, being absorbed by Oracle, it's still widely used. Mark Reinhold, chief architect of Oracle's Java Platform Group, offers one explanation for its continuing popularity: it's easy for humans to understand it at a glance. "It is pretty easy to read Java code and figure out what it means. There aren't a lot of obscure gotchas in the language ... Most of the cost of maintaining any body of code over time is in maintenance, not in initial creation."
Power

Hydrogen-Powered Drone Can Fly For 4 Hours at a Time 116

Posted by samzenpus
from the different-way-to-fly dept.
stowie writes: The Hycopter uses its frame to store energy in the form of hydrogen instead of air. With less lift power required, its fuel cell turns the hydrogen in its frame into electricity to power its rotors. The drone can fly for four hours at a time, or 2.5 hours when carrying a 2.2-pound payload. “By removing the design silos that typically separate the energy storage component from UAV frame development - we opened up a whole new category in the drone market, in-between battery and combustion engine systems,” says CEO Taras Wankewycz.
Power

Energy Dept. Wants Big Wind Energy Technology In All 50 US States 256

Posted by Soulskill
from the any-way-the-wind-blows dept.
coondoggie writes: Bigger wind turbines and towers are just part of what the U.S. needs in order to more effectively use wind energy in all 50 states. That was the thrust of a wind energy call-to-arms report called "Enabling Wind Power Nationwide" issued this week by the Department of Energy. The report details new technology that can reach higher into the sky to capture more energy, and more powerful turbines to generate more gigawatts. These new turbines are 110-140 meters tall, with blades 60 meters long. The Energy Department forecasts strong, steady growth of wind power across the country, both on land and offshore.
Cloud

Google Offers Cheap Cloud Computing For Low-Priority Tasks 59

Posted by Soulskill
from the guaranteed-7-7s-of-uptime dept.
jfruh writes: Much of the history of computing products and services involves getting people desperate for better performance and faster results to pay a premium to get what they want. But Google has a new beta service that's going in the other direction — offering cheap cloud computing services for customers who don't mind waiting. Jobs like data analytics, genomics, and simulation and modeling can require lots of computational power, but they can run periodically, can be interrupted, and can even keep going if one or more of the nodes they're using go offline.
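For the curious, the "can be interrupted" part boils down to checkpointing: a worker that persists its progress can be preempted and then resume on another node. A generic sketch of that pattern in Python (nothing here is Google's API; the checkpoint file and work function are made up for illustration):

```python
# Generic checkpoint/resume loop for a preemptible (interruptible) batch worker.
# Nothing here is Google-specific; the checkpoint path and work items are illustrative.
import json
import os

CHECKPOINT = "progress.json"

def load_progress() -> int:
    """Return the index of the next unprocessed item, or 0 on a fresh start."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["next_item"]
    return 0

def save_progress(next_item: int) -> None:
    with open(CHECKPOINT, "w") as f:
        json.dump({"next_item": next_item}, f)

def process(item: int) -> None:
    pass  # placeholder for the real analytics/genomics/simulation step

def run(total_items: int) -> None:
    start = load_progress()
    for i in range(start, total_items):
        process(i)
        save_progress(i + 1)  # if this node is preempted, a replacement resumes here

run(1000)
```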
Entertainment

Marvel's Female Superheroes Are Gradually Becoming More Super 228

Posted by Soulskill
from the superduper-heroines dept.
New submitter RhubarbPye writes: A new study shows an increasing trend in the power and significance of female superhero characters in the Marvel comic book universe. Several criteria were used to examine the trend, including cover art, dialog, and the actual superpowers. Over 200 individual comic books from Marvel's 50+ year history were compared for the study. What's of particular interest is that the study's author is a 17-year-old high school student from Ohio.
AMD

AMD Details High Bandwidth Memory (HBM) DRAM, Pushes Over 100GB/s Per Stack 98

Posted by timothy
from the lower-power-higher-interest dept.
MojoKid writes: Recently, a few details of AMD's next-generation Radeon 300-series graphics cards have trickled out. Today, AMD has publicly disclosed new info regarding the High Bandwidth Memory (HBM) technology that will be used on some Radeon 300-series and APU products. Currently, a relatively large number of GDDR5 chips are necessary to offer sufficient capacity and bandwidth for modern GPUs, which means significant PCB real estate is consumed. On-chip integration is not ideal for DRAM because it is not size- or cost-effective with a logic-optimized GPU or CPU manufacturing process. HBM, however, brings the DRAM as close as possible to the logic die (GPU). AMD partnered with Hynix and a number of other companies to help define the HBM specification and design a new type of memory chip with low power consumption and an ultra-wide bus width, which was eventually adopted by JEDEC in 2013. They also developed a DRAM interconnect called an "interposer," along with ASE, Amkor, and UMC. The interposer allows DRAM to be brought into close proximity with the GPU and simplifies communication and clocking. HBM DRAM chips are stacked vertically, and "through-silicon vias" (TSVs) and "bumps" are used to connect one DRAM chip to the next, and then to a logic interface die, and ultimately the interposer. The end result is a single package on which the GPU/SoC and High Bandwidth Memory both reside. 1GB of GDDR5 memory (four 256MB chips) requires roughly 672mm2 of PCB area. Because HBM is vertically stacked, that same 1GB requires only about 35mm2. The bus width on an HBM chip is 1024 bits, versus 32 bits on a GDDR5 chip. As a result, the High Bandwidth Memory interface can be clocked much lower yet still offer more than 100GB/s per stack, versus 25GB/s with GDDR5. HBM also requires significantly less voltage, which equates to lower power consumption.
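The bandwidth gap follows directly from bus width times per-pin data rate. A quick back-of-the-envelope check in Python (the per-pin rates below are illustrative assumptions chosen to match the figures above, not AMD's published specs):

```python
# Peak memory bandwidth = bus width (bits) * per-pin data rate (Gbit/s) / 8 bits-per-byte.
def bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for one memory chip or stack."""
    return bus_width_bits * pin_rate_gbps / 8

# GDDR5 chip: 32-bit bus at an assumed ~6.25 Gbit/s per pin.
print(bandwidth_gbs(32, 6.25))    # 25.0 GB/s, matching the figure above
# HBM stack: 1024-bit bus at an assumed ~1 Gbit/s per pin (low clock, double data rate).
print(bandwidth_gbs(1024, 1.0))   # 128.0 GB/s, i.e. "over 100GB/s per stack"
```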
AI

New Chips Could Bring Deep Learning Algorithms To Your Smartphone 40

Posted by samzenpus
from the smarter-smart-phone dept.
catchblue22 writes: At the Embedded Vision Summit, a company called Synopsys showed off a new image-processor core tailored for deep learning. It is expected to be added to chips that power smartphones, cameras, and cars. Synopsys showed a demo in which the new design recognized speed-limit signs in footage from a car. The company also presented results from using the chip to run a deep-learning network trained to recognize faces. A spokesperson said that it didn't hit the accuracy levels of the best research results, which have been achieved on powerful computers, but it came pretty close. "For applications like video surveillance it performs very well," he said. Being able to use deep learning on mobile chips will be vital to helping robots navigate and interact with the world, he said, and to efforts to develop autonomous cars.
Power

Wind Turbines With No Blades 164

Posted by Soulskill
from the can-finally-take-them-through-airport-security dept.
An anonymous reader writes: Wired has a profile of Spanish company Vortex Bladeless and their unusual new wind turbine tech. "Their idea is the Vortex, a bladeless wind turbine that looks like a giant rolled joint shooting into the sky. The Vortex has the same goals as conventional wind turbines: To turn breezes into kinetic energy that can be used as electricity." Instead of relying on wind to push a propeller in a circular motion, these turbines rely on vorticity — how wind can strike an object in a particular way to generate spinning vortices of air. Engineers usually try to avoid this — it's what brought down the Tacoma Narrows Bridge. But this Spanish company designed the turbine computationally to have the vortices occur at the same time along its entire height. "In its current prototype, the elongated cone is made from a composite of fiberglass and carbon fiber, which allows the mast to vibrate as much as possible (an increase in mass reduces natural frequency). At the base of the cone are two rings of repelling magnets, which act as a sort of nonelectrical motor. When the cone oscillates one way, the repelling magnets pull it in the other direction, like a slight nudge to boost the mast's movement regardless of wind speed. This kinetic energy is then converted into electricity via an alternator that multiplies the frequency of the mast's oscillation to improve the energy-gathering efficiency."
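The parenthetical about mass and natural frequency is just the standard spring-mass relation, f = sqrt(k/m) / (2*pi). A quick illustration in Python with made-up stiffness and mass values (not Vortex Bladeless figures):

```python
import math

def natural_frequency_hz(stiffness_n_per_m: float, mass_kg: float) -> float:
    """Natural frequency of a simple spring-mass oscillator: f = sqrt(k/m) / (2*pi)."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)

# Illustrative values only, not the company's numbers.
k = 5000.0  # effective stiffness of the mast, N/m
print(natural_frequency_hz(k, 10.0))  # lighter mast -> higher natural frequency (~3.6 Hz)
print(natural_frequency_hz(k, 40.0))  # 4x the mass -> half the frequency (~1.8 Hz)
```

This is why the write-up stresses a light fiberglass/carbon-fiber mast: keeping mass down keeps the natural frequency up, so the structure oscillates readily when the shed vortices push on it.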
Microsoft

In-Database R Coming To SQL Server 2016 94

Posted by Soulskill
from the r,-me-hearties dept.
theodp writes: Wondering what kind of things Microsoft might do with its purchase of Revolution Analytics? Over at the Revolutions blog, David Smith announces that in-database R is coming to SQL Server 2016. "With this update," Smith writes, "data scientists will no longer need to extract data from SQL server via ODBC to analyze it with R. Instead, you will be able to take your R code to the data, where it will be run inside a sandbox process within SQL Server itself. This eliminates the time and storage required to move the data, and gives you all the power of R and CRAN packages to apply to your database." It'll no doubt intrigue Data Scientist types, but the devil's in the final details, which Microsoft was still cagey about when it talked-the-not-exactly-glitch-free-talk (starts @57:00) earlier this month at Ignite. So, brush up your R, kids, and you can see how Microsoft walks the in-database-walk when SQL Server 2016 public preview rolls out this summer.
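For a sense of what "taking your R code to the data" looks like in practice, here is a minimal sketch in Python using pyodbc to submit an R script via sp_execute_external_script, the stored procedure that ultimately shipped with SQL Server 2016 R Services. The connection string, table, and column names are hypothetical, and (as noted above) the exact interface was still under wraps at the time of this posting:

```python
# Sketch: pushing an R script into SQL Server rather than pulling data out over ODBC.
# Assumes the SQL Server 2016 R Services interface (sp_execute_external_script);
# the server, database, table, and column names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 13 for SQL Server};"
    "SERVER=myserver;DATABASE=sales;Trusted_Connection=yes;"
)

tsql = """
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'OutputDataSet <- data.frame(mean_amount = mean(InputDataSet$amount))',
    @input_data_1 = N'SELECT amount FROM dbo.orders'
WITH RESULT SETS ((mean_amount FLOAT));
"""

for row in conn.cursor().execute(tsql):
    print(row.mean_amount)  # computed inside SQL Server's sandboxed R process
```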
Graphics

Oculus Rift Hardware Requirements Revealed, Linux and OS X Development Halted 227

Posted by Soulskill
from the sad-penguin dept.
An anonymous reader writes: Oculus has selected the baseline hardware requirements for running their Rift virtual reality headset. To no one's surprise, they're fairly steep: NVIDIA GTX 970 / AMD 290 equivalent or greater, Intel i5-4590 equivalent or greater, and 8GB+ RAM. It will also require at least two USB 3.0 ports and "HDMI 1.3 video output supporting a 297MHz clock via a direct output architecture."

Oculus chief architect Atman Binstock explains: "On the raw rendering costs: a traditional 1080p game at 60Hz requires 124 million shaded pixels per second. In contrast, the Rift runs at 2160×1200 at 90Hz split over dual displays, consuming 233 million pixels per second. At the default eye-target scale, the Rift's rendering requirements go much higher: around 400 million shaded pixels per second. This means that by raw rendering costs alone, a VR game will require approximately 3x the GPU power of 1080p rendering." He also points out that PC graphics can afford a fluctuating frame rate — it doesn't matter too much if it bounces between 30-60fps. The Rift has no such luxury, however.
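Binstock's numbers are straightforward multiplication of resolution by refresh rate. A quick check in Python (the 1.3x per-axis eye-target scale is an assumption chosen to land near the ~400M figure, not an official Oculus value):

```python
# Shaded pixels per second = width * height * refresh rate,
# optionally scaled by a per-axis render-target (eye-target) factor.
def pixels_per_second(width: int, height: int, hz: int, render_scale: float = 1.0) -> float:
    return width * render_scale * height * render_scale * hz

print(pixels_per_second(1920, 1080, 60))       # ~124 million (traditional 1080p @ 60Hz)
print(pixels_per_second(2160, 1200, 90))       # ~233 million (Rift panels @ 90Hz)
# The ~400M figure implies a per-axis eye-target scale of roughly 1.3 (assumed here):
print(pixels_per_second(2160, 1200, 90, 1.3))  # ~394 million
```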

The last requirement is more onerous: Windows 7 SP1 or newer. Binstock says their development for OS X and Linux has been "paused" so they can focus on delivering content for Windows. They have no timeline for going back to the less popular platforms.
AI

Baidu's Supercomputer Beats Google At Image Recognition 115

Posted by samzenpus
from the all-the-better-to-see-you-with dept.
catchblue22 writes: Using the ImageNet object classification benchmark, Baidu’s Minwa supercomputer scanned more than 1 million images, taught itself to sort them into about 1,000 categories, and achieved an image identification error rate of just 4.58 percent, beating humans, Microsoft, and Google. Google's system scored 95.2 percent accuracy and Microsoft's 95.06 percent, Baidu said. “Our company is now leading the race in computer intelligence,” said Ren Wu, a Baidu scientist working on the project. “I think this is the fastest supercomputer dedicated to deep learning,” he said. “We have great power in our hands—much greater than our competitors.”
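Note that the three systems are quoted in different units (Baidu as an error rate, Google and Microsoft as accuracy); converting everything to one scale makes the comparison clearer:

```python
# Convert between the quoted error rates and accuracies on the ImageNet benchmark.
def accuracy_from_error(error_pct: float) -> float:
    return 100.0 - error_pct

print(accuracy_from_error(4.58))  # Baidu Minwa: 95.42% accuracy
print(100.0 - 95.2)               # Google: 4.8% error
print(100.0 - 95.06)              # Microsoft: 4.94% error
```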
Programming

Is Agile Development a Failing Concept? 507

Posted by timothy
from the surely-you're-not-all-out-of-buzzwords dept.
Nerval's Lobster writes: Many development teams have embraced Agile as the ideal method for software development, relying on cross-functional teams and adaptive planning to see their product through to the finish line. Agile has its roots in the Agile Manifesto, the product of 17 software developers coming together in 2001 to talk over development methods. And now one of those developers, Andy Hunt, has taken to his blog to argue that Agile has some serious issues. Specifically, Hunt thinks a lot of developers out there simply aren't adaptable and curious enough to enact Agile in its ideal form. 'Agile methods ask practitioners to think, and frankly, that's a hard sell,' Hunt wrote. 'It is far more comfortable to simply follow what rules are given and claim you're 'doing it by the book.'' The blog posting offers a way to power out of the rut, however, and it centers on a method that Hunt refers to as GROWS, or Growing Real-World Oriented Working Systems. In broad strokes, GROWS sounds a lot like Agile in its most fundamental form; presumably Hunt's future postings, which promise to go into more detail, will show how it differs. If Hunt wants the new model to catch on, he may face something of an uphill battle, given Agile's popularity.
Power

Wireless Charging Tech Adopted By Ford, Chrysler, and Toyota Goes Open Source 75

Posted by timothy
from the cautious-optimism dept.
An anonymous reader writes: The in-vehicle wireless charging technology adopted by Ford, Chrysler, Dodge, RAM, and Toyota has been released to the public domain without royalties or licenses. This technology, which you've probably never heard of before, is in 12 vehicles, more than all the other wireless charging standards combined. The open standard web page shows schematics, app notes, and certification information to get companies to make compatible wireless charging products.
Intel

Intel NUC5i7RYH Broadwell Mini PC With Iris Pro Graphics Tested 80

Posted by timothy
from the why-pay-for-big-any-more? dept.
MojoKid writes: In addition to ushering in a wave of new notebooks and mobile devices, Intel's Broadwell microarchitecture has also found its way into a plethora of recently introduced small form factor systems like the company's NUC platform. The new NUC5i7RYH is a mini-PC packing a Core i7-5557U Broadwell processor with Iris Pro graphics, which makes it the most powerful NUC released to date. There's a 5th-gen Core i7 CPU inside (dual-core, quad-thread) that can turbo up to 3.4GHz, an Iris Pro 6100 series integrated graphics engine, support for dual-channel memory, M.2 and 2.5" SSDs, 802.11ac and USB 3.0. NUCs are generally barebones systems, so you have to build them up with a drive and memory before they can be used. The NUC5i7RYH is one of the slightly taller NUC systems that can accommodate both M.2 and 9.5mm 2.5" drives, and all NUCs come with a power brick and VESA mount. With a low-power dual-core processor and on-die Iris Pro 6100-series graphics engine, the NUC5i7RYH won't offer the same kind of performance as systems equipped with higher-powered processors or discrete graphics cards, but for everyday computing tasks and casual gaming, it should fit the bill for users who want a low-profile, out-of-the-way tiny PC.
Technology

Ask Slashdot: After We're Gone, the Last Electrical Device Still Working? 403

Posted by Soulskill
from the all-the-robots-that-killed-us dept.
Leomania writes: After watching a post-apocalyptic Sci-Fi short on YouTube (there are quite a few) and then having our robot vacuum take off and start working the room, I just wondered what would be the last electric/electronic device still functioning if humans were suddenly gone. I don't mean sitting there with no power but would work if the power came back on; rather, something continuously powered, doing the task it was designed for. Are we talking a few years, decades, or far longer?
Data Storage

Enterprise SSDs, Powered Off, Potentially Lose Data In a Week 184

Posted by timothy
from the other-side-of-solid-state's-speed dept.
New submitter Mal-2 writes with a selection from IB Times of special interest for anyone replacing hard disks with solid state drives: The standards body for the microelectronics industry has found that solid state drives (SSDs) can start to lose their data and become corrupted if they are left without power for as little as a week. ... According to a recent presentation (PDF) by Seagate's Alvin Cox, who is also chairman of the Joint Electron Device Engineering Council (JEDEC), the period of time that data will be retained on an SSD is halved for every 5 degrees Celsius (9 degrees Fahrenheit) rise in temperature in the area where the SSD is stored. If you have switched to SSDs for either personal or business use, do you follow the recommendation here that spinning-disk media be used as backup as well?
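The halving-per-5-degrees claim is a simple exponential decay in retention time. Here is what it implies in Python, using an assumed one-year baseline at 30C purely for illustration (the baseline figure is not from the presentation):

```python
# Retention halves for every 5 degrees C rise in storage temperature,
# per the claim attributed to the JEDEC presentation above.
def retention_weeks(baseline_weeks: float, baseline_temp_c: float, storage_temp_c: float) -> float:
    return baseline_weeks * 0.5 ** ((storage_temp_c - baseline_temp_c) / 5.0)

# Assumed baseline for illustration: 52 weeks of retention when stored at 30C.
print(retention_weeks(52, 30, 30))  # 52.0 weeks
print(retention_weeks(52, 30, 40))  # 13.0 weeks (two halvings)
print(retention_weeks(52, 30, 55))  # ~1.6 weeks, the "as little as a week" territory
```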