Tuesday, March 23, 2010

xFruits - 21st Century Green Tech - 10 new items

How Super Computing is Revolutionizing Nuclear Power  

2010-03-24 02:05

Katie Fehrenbacher - Clean Power

Out of all the carbon-free power options, nuclear power faces some of the highest hurdles to commercial-scale deployment. The upfront costs for reactors are in the billions, the projects take years to site and build, and nuclear materials and designs have to undergo testing for decades to make sure they can be used in the field. That’s one reason why nuclear research costs a whole lot of money and the pace of innovation seems incredibly slow. But that’s also the reason why supercomputing has started to truly revolutionize the world of nuclear power innovation.

Supercomputing, or “extreme computing” as the Department of Energy described it during a workshop on computing and nuclear power last year, involves computers at the petaflop scale, and will eventually reach even the exaflop scale. A computer running at a petaflop can do 1 million billion (10^15) calculations in a second, and an exaflop of performance can deliver a billion billion (10^18) calculations per second (see Supercomputers and the Search for the Exascale Grail, subscription required).
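
For a sense of what that thousand-fold jump means in practice, here's a quick back-of-the-envelope sketch in Python; the 10^21-operation workload is an illustrative assumption, not a figure from the DOE or TerraPower:

```python
# Rough arithmetic only: how long a hypothetical simulation workload
# takes at petaflop vs. exaflop speeds.
PETAFLOP = 1e15  # calculations per second (1 million billion)
EXAFLOP = 1e18   # calculations per second (1 billion billion)

total_ops = 1e21  # assumed size of a large reactor simulation

for name, rate in (("petaflop", PETAFLOP), ("exaflop", EXAFLOP)):
    seconds = total_ops / rate
    print(f"at one {name}: {seconds:,.0f} s (~{seconds / 86400:.2f} days)")

# at one petaflop: 1,000,000 s (~11.57 days)
# at one exaflop:  1,000 s (~0.01 days)
```

The same job that ties up a petaflop machine for nearly two weeks finishes in under 20 minutes at exaflop speed, which is the whole appeal for decades-long fuel-cycle studies.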

That massive amount of number crunching can help developers of nuclear power technology simulate next-generation designs of nuclear reactors, show how advanced fuels in a reactor could be consumed over time, and model more efficient waste disposal and refueling efforts. It’s all about being able to go through very complex and lengthy research and development processes much more quickly and with far less cost compared to both physical testing and using less powerful computers.

TerraPower, the nuclear startup backed by Microsoft Chairman Bill Gates that is working on a traveling wave reactor design, has leaned heavily on supercomputing to design and model its reactor and the lifecycle of the fuel. The TerraPower team says they are using "1,024 Xeon core processors assembled on 128 blade servers," which is a cluster that is "over 1,000 times the computational ability as a desktop computer."

Intellectual Ventures, which is led by former Microsoft chief technology officer Nathan Myhrvold and which spun out TerraPower, explains the importance of computer modeling for nuclear power on its web site:

Extensive computer simulations and engineering studies produced new evidence that a wave of fission moving slowly through a fuel core could generate a billion watts of electricity continuously for well over 50 to 100 years without enrichment or reprocessing. The hi-fidelity results made possible by advanced computational abilities of modern supercomputer clusters are the driving force behind one of the most active nuclear reactor design teams in the country.

Supercomputing can also help extend the lives of more traditional nuclear reactors, can make them more efficient and safer, and, as the costs come down, can be used by average scientists. The Department of Energy has been developing the Nuclear Energy Modeling and Simulation Hub, which is intended to help nuclear engineers use computing for predicting how nuclear reactors can be extended and upgraded. (The deadline to apply for the DOE hub project is coming up this month.)

This type of computing for nuclear power has in the past mostly been used by computing specialists. But in the future, through programs like the Nuclear Energy Modeling and Simulation Hub, more scientists can model virtual reactors running under different scenarios and safety conditions.

One ironic twist to this whole equation: petascale and exascale computing require lots of power to run. As David Turek, IBM's VP of Deep Computing, put it to us for our article Supercomputers and the Search for the Exascale Grail (subscription required):

“Today the energy required for memory is still measured in kilowatts: but exascale memory takes 80 megawatts and then you add 60 or 80 megawatts for interconnect and the whole energy profile would be measured in gigawatts. You’re going to need a nuclear power plant to run one of these.”

A123 Insiders Soon Free to Sell as Lock-Up Period Ends  

2010-03-24 00:24

Josie Garthwaite - Energy Storage

Just five more days. And on the sixth day, officers, directors, employees and early investors in A123Systems holding more than 71 million shares of common stock in the battery maker will be free to cash out. While certain volume restrictions can still apply under federal securities law, Monday marks the end of what’s called a “lock-up period,” lasting 180 days from A123’s $371 million initial public offering.

The point of lock-up agreements like those that underwriters secured with A123 insiders ahead of the company’s September IPO is to prevent a company’s stock from gushing too quickly onto the market. Too much stock unloaded in a short period could put a damper on the stock price in early trading.

A123 explained the worst-case scenario in the risk section of its prospectus last fall, writing that when its lock-up agreements with shareholders expire, “A significant portion of our total outstanding shares may be sold into the public market…which could cause the market price of our common stock to drop significantly, even if our business is doing well.”

A123 notes that shares don’t actually have to be sold en masse to have this effect. Rather, the simple “market perception that the holders of a large number of shares intend to sell shares, could reduce the market price of our common stock.”

The battery maker’s successful IPO last fall boosted its cachet and signaled a go-ahead for its ambitious manufacturing plans. It also has helped to renew other cleantech companies' hope that the public markets are once again open to them as a source of funding. After a much-lamented drought of cleantech IPOs, the industry greeted A123’s successful IPO as a sign that investor appetite could be back.

Similarly, how well A123’s stock performs after Monday’s milestone, and how much insiders decide to unload in coming weeks, will offer a signpost for the industry. While A123 has a potentially bountiful future, investors have proven impatient when it comes to revenue and profits, which aren’t expected to ramp up until after 2012. The stock has been up and down over the past several months. By Tuesday afternoon, A123's stock had dropped to $15.50, off former highs above $20.

A123’s investors include General Electric, Procter & Gamble, Motorola, Qualcomm, North Bridge Venture Partners, Sequoia Capital, CMEA Ventures, FA Technology Ventures, OnPoint and the Massachusetts Institute of Technology. Check out our chart showing the company’s major shareholders and how much they own.


Daily Sprout  

2010-03-24 00:11

Josie Garthwaite - Misc

Modeling Climate Change at the City Level: The “massive grid cells favored by climate models run on today’s supercomputers [aren’t] as useful as they could be for planning purposes, given that they can encompass 10,000 square kilometers.” But now the U.S. government is launching a $50 million effort to develop new “computer models aimed at revealing the anticipated effects of climate change at the regional level.” — Scientific American

How Much Will Coal Interests Influence the Climate Bill?: “Just as the insurance companies — and to a lesser extent, the pharmaceutical industry — were powerfully positioned to first influence the writing of the legislation and then to attempt to block health care reform, the coal industry and other polluters are in the same spot when it comes to a climate and energy bill.” — SolveClimate

Tribes Key for Green Energy Buildout: A new report from the National Wildlife Federation finds, “Tribal lands make up almost 5 percent of the United States and hold around 10 percent of the country’s renewable energy resources.” — Associated Press

Valero CEO: CO2 Regulations Would Freeze Investment: “The U.S. refining industry will freeze investment in anything beyond maintaining operations if the Environmental Protection Agency moves to regulate carbon pollution,” according to the CEO of Valero Energy Corp. — Reuters

Hybrids vs. Pricey Small Cars: “Producers of upscale small cars such as the Volkswagen Golf, Mazda3 and Volvo C30 face stiff competition from hybrid vehicles for reasons unrelated to fuel mileage. Upscale cars are priced at the top end of their individual segment…. The buyer expects others to notice that their vehicle is more than just basic transportation.” — Autoblog Green


BioFuelBox Goes Bust: Report  

2010-03-23 20:43

Josie Garthwaite - Biofuels

BioFuelBox, a San Jose, Calif.-based startup that has spent a quiet four years working to turn waste grease, oil and fat into low-sulfur biodiesel, has ceased operations, according to a peHUB report, “after being unable to reach agreement on a new round of funding with its investors.”

About six months ago, the company told us it was in the process of raising its second round of funding. That apparently didn’t pan out. Backed by Draper Fisher Jurvetson and Element Partners, the startup had raised its first round of $9.46 million back in 2007. And last fall, the company announced it had begun operating its first biodiesel refinery, a 1-million-gallon-per-year plant.

It marked a risky move at a time when other biodiesel plants (generally much larger than BioFuelBox’s “containerized, modular micro-refineries”) were being idled, shuttered and put up for sale (see our Biofuels Deathwatch map). BioFuelBox claimed at the time that it was already bringing in revenue and producing biodiesel at rates cost-competitive with diesel from petroleum.

The company aimed to set up 10 of its modular refineries by the middle of 2011, and to see its first profits early in 2011. Richard Reddy, vice president of marketing, told us the company would need only "a couple dozen" of these, located near grease supplies, to become "wildly profitable."

The idea of eliminating much of the cost of transporting biofuels by moving production closer to the feedstock or pumping station isn’t unique to BioFuelBox. Several of Vinod Khosla's biofuel investments have adopted similar strategies. But for BioFuelBox, at least, it seems the promise of wild profitability along that route wasn’t enough for investors to keep the faith.


25 Cities That Have Gone Gaga for Green Building  

2010-03-23 17:52

Josie Garthwaite - Energy

At the end of 2009, just nine metropolitan areas in the U.S. could boast more than 100 commercial and industrial buildings that had earned the Energy Star label for energy efficiency, according to the just-released list of the EPA’s top 25 cities for Energy Star buildings.

The list (included in full below) offers a general sense of which metro areas are really pushing for greener buildings. But the actual rank on the list shouldn’t be taken at face value, because the number of buildings in a city with superior energy efficiency doesn’t directly correlate with reductions in energy consumption. In this case, size matters.

To determine a facility’s energy performance (the first step toward an Energy Star label), the EPA compares the amount of energy used among similar types of facilities on a scale of 1-100. To qualify for the Energy Star label, a commercial building must earn a score of 75 or higher, while industrial facilities have to score in the top 25 percent (the EPA says its rating system accounts for variables such as operating conditions and regional weather data).
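
In code form, that qualification rule reduces to a simple threshold check. The sketch below is only an illustration of the cutoff described above, not the EPA's actual benchmarking model, which weighs operating conditions and regional weather:

```python
def qualifies_for_energy_star(score: int) -> bool:
    """The EPA benchmarks a facility against similar facilities on a
    1-100 scale; a score of 75 or higher (the top 25 percent of its
    peer group) makes the facility eligible for the label."""
    if not 1 <= score <= 100:
        raise ValueError("score must be on the EPA's 1-100 scale")
    return score >= 75

print(qualifies_for_energy_star(75))  # True: just makes the cutoff
print(qualifies_for_energy_star(74))  # False: below the top quartile
```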

For example, New York City ranks as No. 10 on the EPA’s list, with only 90 buildings. But those 90 buildings include 50.4 million square feet of floor space — more than four times the total floor space of the 120 Energy Star labeled buildings in Lakeland, Fla. So Lakeland ranks higher on the list (No. 7), but its greener buildings deliver only $8.3 million in cost savings, and prevent emissions equivalent to those resulting from 6,300 homes’ electricity use. By comparison, New York City’s 90 buildings deliver $88.3 million in cost savings and prevent emissions equivalent to those from 31,000 homes.

Los Angeles, Calif. holds the top spot on the EPA list, with 293 Energy Star labeled buildings, $93.9 million in cost savings and prevention of emissions equivalent to the impact of 34,800 homes. But the 133 buildings in Houston, Tex. prevent emissions equivalent to 53,400 homes (more than any other metro area on the list), while saving only $73.9 million and earning the city slot No. 6 on the list.
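
A quick per-building division of the figures quoted above makes the point concrete. The numbers come straight from the EPA list; the arithmetic is ours:

```python
# (rank, Energy Star buildings, cost savings in $M, homes-equivalent emissions)
cities = {
    "Los Angeles":   (1, 293, 93.9, 34_800),
    "Houston":       (6, 133, 73.9, 53_400),
    "Lakeland":      (7, 120, 8.3, 6_300),
    "New York City": (10, 90, 88.3, 31_000),
}

for name, (rank, buildings, savings_m, homes) in cities.items():
    print(f"No. {rank:>2} {name:<14}: "
          f"${savings_m * 1e6 / buildings:,.0f} saved per building, "
          f"{homes / buildings:.0f} homes-equivalent per building")
```

On a per-building basis, each of New York City's 90 buildings averts more than six times the emissions of each of Lakeland's 120, despite New York's lower rank on the list.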

That said, spurring competition among cities to boost the efficiency of commercial buildings certainly comes as a welcome effort. According to Energy Star, the energy used in commercial buildings and manufacturing plants accounts for nearly half of the country’s power consumption and costs more than $200 billion per year — more than any other sector of the economy.

The Energy Star program is widely used to identify top energy-performing buildings, and Matthew Macko, a principal with San Francisco-based Environmental Building Strategies, has told us that part of its appeal stems from the fact that the web-based system is quick. It also helps that the label has a federal agency vouching for it that’s well known outside the building industry (the EPA).

But alternative scoring systems have begun to crop up. The American Society of Heating, Refrigerating and Air-Conditioning Engineers unveiled a program last year called Building EQ that’s meant to “expand on the type and amount of information the Energy Star program provides,” presenting buildings with a kind of report card of their energy use. A typical commercial building today would get about a C rating under Building EQ, while an Energy Star-rated building would likely earn a B.

The basic goal of these types of programs is to highlight information about the environmental impact of a building. From there the hope is for the market to provide a premium for structures that are cleaner and more efficient, and in turn, give owners and developers an incentive to seek out the rating with greener designs.

Photo courtesy of Flickr user O b s k u r a

Related reports on GigaOM Pro (subscription required):

Green IT, Meet Green Building

Is Energy Management the Killer App for the Home Automation Market?


Intel Spinout SpectraWatt Snags $41.4M for Solar Cell Plant  

2010-03-23 16:06

Josie Garthwaite - Clean Power

SpectraWatt, a maker of crystalline silicon solar cells spun out of Intel back in 2008, has dusted itself off after funding troubles and manufacturing delays, and now says it has the capital necessary to finish work on its first factory, allow for capacity expansions and continue developing its technology. This morning the company announced that it has raised $41.4 million in convertible debt from a subsidiary of Goldman Sachs, Intel’s VC arm Intel Capital, and the PCG Clean Energy & Technology Fund. A portion of the new funds will be used to ready SpectraWatt’s factory in Hopewell Junction, New York, to produce cells for the company’s first customer shipments in the second quarter of this year.

SpectraWatt describes itself as having “semiconductor manufacturing roots,” and it aims to shrink the cost of solar energy generation in part by improving the manufacturing processes for solar cells. Already, SpectraWatt says it has completed construction of the plant (designed to house up to 200 megawatts in production capacity), and begun initial cell production on its first 60-megawatt line.

The startup has survived a few challenges to get here. Back in mid-2008, when it spun out of Intel with a $50 million Series A financing round led by Intel Capital (the computing giant’s investing arm), SpectraWatt said it hoped to "break ground" on a 60-megawatt factory in Oregon by the end of that year, with plans to ship product by the middle of 2009.

SpectraWatt put its plans to build a factory in Hillsboro, Ore., on hold in January 2009 after being unable to get enough financing. CEO Andrew Wilson told The Oregonian newspaper at the time that SpectraWatt was searching for an existing building that it could retrofit for less than the cost of building a new factory, and had started considering leaving the state. He also said the change of plans would delay SpectraWatt's first solar-cell shipments by five or six months.

SpectraWatt ended up setting up headquarters in Hopewell Junction last spring, although it still has some operations in Oregon. If it sticks with its current timeline and starts shipping cells to customers within the next three months, that will put it about a year behind the original schedule.

But after a year in which the solar industry struggled with scarce credit, slack demand, module overproduction and plunging prices, and solar companies canceled, scaled back or delayed projects right and left, the startup is still standing — no small feat.

Related reports on GigaOM Pro (subscription required):

Cleantech Financing Trends: 2010 and Beyond

Renewable Energy Charging Up Electrical Transmission Tech

Getting Solar Onto the Smart Grid


TerraPower In Talks With Toshiba for Mini Nuclear  

2010-03-23 14:52

Katie Fehrenbacher - Clean Power

Having the world’s most famous billionaire tech tycoon in your corner can really open a lot of doors. Nuclear startup TerraPower, which counts Microsoft Chairman Bill Gates as a principal owner and advocate, is reportedly in talks with Japanese giant Toshiba to jointly develop a small nuclear reactor.

First reported by Japan’s Nikkei business daily, the partnership could focus on traveling wave reactor technology, a relatively new type of small nuclear reactor design that can use the waste byproduct of the enrichment process, or waste uranium, for fuel. Traveling wave nuclear reactors have been under development since the 1990s, but TerraPower is one of the first companies to develop a practical design for the technology. (See 6 Nuclear Power Startups To Watch and Nuclear Power By the Numbers.)

TerraPower is a nuclear spinoff project from incubator Intellectual Ventures, which is run by former Microsoft chief technology officer Nathan Myhrvold. The startup uses a small amount of enriched uranium at the beginning of the process (see slides below), but then the nuclear reactor runs on waste product and can make and consume its own fuel.

The benefits of the design are that the reactor doesn't have to be refueled or have its waste removed until the end of the reactor's life, which is theoretically a couple hundred years. Using waste uranium reduces the amount of waste in the overall nuclear life cycle, and stretches the world's available uranium supply for nuclear power many times over. According to this presentation by TerraPower CEO John Gilleland, "operation of a traveling wave reactor can be demonstrated in less than ten years, and commercial deployment can begin in less than fifteen years."

Not surprisingly, with its Microsoft connection, TerraPower has leaned heavily on supercomputing to design and model the reactor and the lifecycle of the fuel. The TerraPower team is using "1,024 Xeon core processors assembled on 128 blade servers," which is a cluster that is "over 1000 times the computational ability as a desktop computer."

Bill Gates’ talk at the TED conference, which mentioned TerraPower:

For more research check out GigaOM Pro (subscription required): Cleantech Financing Trends: 2010 and Beyond


How to Improve Vehicle Software: Open Up the Data, Dive In  

2010-03-23 07:00

Josie Garthwaite - @Not for Syndication

A string of recent events has helped to focus a strong — and not too favorable — light on the growing amount of software embedded in the cars we drive: Apple co-founder Steve Wozniak made headlines last month complaining that his Prius had a “scary” software glitch. Not long afterwards, Toyota recalled its 2010 Prius and Lexus HS 250h hybrids for a software update related to the anti-lock brake system. And on top of that, documents have emerged in the last couple weeks showing that federal regulators asked the automaker back in 2007 to install software to prevent sudden acceleration in its vehicles (an action Toyota didn’t take until this year).

The scrutiny has prompted calls for tougher standards for vehicle software. Well, SAE International, the auto industry’s main standards development group, has just launched a database system that in the coming years could help move vehicle software quality in the right direction. Called the SAE Software Assessment Repository, the web-based system will allow automakers to get a more detailed look at potential software suppliers’ strengths and capabilities. The database will give those suppliers a place to post and share the most salient results — rather than just a general capability level — from the assessments that are already widely used in the industry.

According to SAE’s Caroline Michaels, Ford and General Motors plan to mandate use of the repository for their suppliers, and SAE has “begun talks” with BMW and other automakers, which Michaels anticipates will follow Ford and GM’s lead in making it mandatory. (We’ve reached out to Ford, GM and Toyota for comment.)

“It’s important that you don’t have a recall because of software,” said Michaels. But far from a hasty response to the current sensitivity to high-tech cars as a result of Toyota’s recent recalls, the database system has been in the making for nearly five years, as a response to “the incredible rise in automotive software,” she said.

By some accounts, she said, “high-end cars now have more software than jets.” And over time, the introduction of greener vehicles — hybrids, plug-in hybrids and all-electric vehicles — will add to the trend of vehicles relying more and more on embedded software.

The SAE J2746 Software Assessment Repository Task Force came together in the fall of 2005 with a mission to help answer a range of questions — from whether the quality of automotive software is generally improving, and whether it’s getting cheaper, to whether suppliers are marketing their software appropriately.

Part of the repository’s intended function — making it easy for an automaker to see what capabilities a potential supplier has in a particular area (such as powertrain, safety or chassis systems) — seems to highlight a remarkable gap in the industry. As SAE’s Embedded Software Standards Committee put it in a technical report during the development of the repository, “When a customer is interested in the capabilities of a supplier, the relevance of those capabilities to the customer’s need is critical.”

In other words, if a company is looking for software related to the powertrain, it wants a supplier with a strong track record developing high-quality software for powertrain systems. It wants to know how long ago the supplier was assessed, and how long the team has been together, Michaels explained. But in the past, the industry has not had a standard process for sharing detailed assessment results in those specific areas.

That has left a door open for some less-than-stellar marketing practices. Michaels explained that without this type of standard reporting system, a large software supplier could acquire a small developer that had performed well in assessments, and then market the entire organization using those assessment results. She described it as “manipulating the system.” There simply hasn’t been enough detailed information reported on a uniform basis across the industry. Providing guidance on how automakers should interpret those details, the database is also meant to help “prevent surprise issues due to loose interpretations of the results,” according to the technical report.

Still, Michaels emphasized that the creation of this repository is not an indication that the assessments themselves are lacking. In fact, all of the data going into the new system is already captured in today’s assessments — but how it’s reported varies from supplier to supplier. “Reporting needs to be uniform,” Michaels said. In part, that will “help developers who are capable to get more exposure.”

That’s one reason for developers to welcome the new database. SAE has also put some competitive safeguards in place: “The repository was not developed to allow suppliers to view each other’s capability,” said Michaels. Automotive software suppliers will be able to set who gets to view their assessment data, and for how long. In addition, the system is meant to help reduce assessment costs for suppliers and OEMs alike.
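
To make that scoped sharing concrete, here is a minimal sketch of what a repository entry could look like. Every class, field and name below is a hypothetical illustration, not the actual J2746 schema:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AssessmentRecord:
    """Hypothetical repository entry: results scoped by capability area,
    with supplier-controlled, time-limited viewing grants."""
    supplier: str
    domain: str            # e.g. "powertrain", "safety", "chassis"
    assessed_on: date      # automakers want to know how recent the results are
    capability_level: int  # the headline rating from the assessment
    grants: dict[str, date] = field(default_factory=dict)  # viewer -> expiry

    def visible_to(self, automaker: str, today: date) -> bool:
        # The supplier decides who may view the record, and until when.
        expiry = self.grants.get(automaker)
        return expiry is not None and today <= expiry

record = AssessmentRecord("Acme Embedded", "powertrain",
                          date(2009, 11, 2), 3,
                          grants={"Ford": date(2010, 12, 31)})
print(record.visible_to("Ford", date(2010, 3, 23)))  # True
print(record.visible_to("BMW", date(2010, 3, 23)))   # False
```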

"This is an important step both in terms of improving the quality and reliability of automotive embedded software and reducing developmental costs,” Peter Abowd, president of worldwide automotive at Altia and chairman of the J2746 repository committee, said in a statement last week. He added that the repository “promotes higher fidelity and more responsible disclosure of software development capability.”

The repository will also provide a valuable source of data to help improve vehicle software down the road, said Michaels. SAE will be able to mine that data to help determine whether new standards or targets are needed, and if the industry and organization can “do something to improve capability” in a given area.

Related reports on GigaOM Pro (subscription required):

The App Developer’s Guide to Working with Ford Sync

Electric Cars Need Software, Not Just Hardware

Intel’s Atom Paves the Way for Smarter Electric Vehicles


Peabody Energy Pours $15M Into Carbon Recycler Calera  

2010-03-22 22:30

Josie Garthwaite - Clean Power

Peabody Energy, the world’s largest coal company, has taken a shine to an idea from a Silicon Valley startup to capture and recycle carbon emissions. The coal giant announced today that it has invested $15 million into 3-year-old Calera, which has developed technology to capture carbon dioxide emissions from power plants, refineries and other industrial facilities and, with the addition of waste water or brine, use them to produce cement and other building materials.

New York Times columnist Thomas Friedman got the scoop from an unnamed source earlier this month that Peabody would soon announce a stake in Calera (venture capitalist Vinod Khosla’s “favorite baby right now”), but the companies have just publicly confirmed and revealed the amount of the investment this morning.

Khosla, whose firm Khosla Ventures has invested some $50 million in Calera, boasted in a Times piece this weekend, “With this technology, coal can be cleaner than solar and wind, because they can only be carbon-neutral.”

High-profile backers aside, the company’s claims have drawn criticism in recent months. Last spring (when the company had a diagram of its process on display at the California Academy of Sciences) Ken Caldeira, a professor in the Carnegie Institution Department of Global Ecology who studies carbon sequestration, said that from the publicly available info about Calera's technology, it seems to go "in the wrong direction and will tend to increase and not decrease atmospheric CO2 content."

Calera currently has a demonstration project at a natural gas-fueled power plant near Moss Landing, Calif., but like the rest of the carbon capture and sequestration sector, it has yet to be proven at commercial scale. As Google CEO Eric Schmidt put it in a talk with Department of Energy chief Steven Chu, carbon capture remains in the beta phase and still needs some “debugging.”

Emerging Energy Research noted in a recent report that it will be the six global supermajors — BP, Chevron, ConocoPhillips, ExxonMobil, Shell and Total — that will be at the front of the line to benefit from hefty carbon capture investments coming down the pike from countries that rely heavily on energy from coal. But for startups that offer a key technology for the carbon capture process itself, Lux Research analyst Mark Bunger, who heads up the firm’s Biosciences division, has told us, “there are going to be buyers.”

Related reports on GigaOM Pro (subscription required):

Facebook’s Coal-Powered Problem


10 Things Outta the Smart Grid World of DistribuTECH  

2001-12-05 00:00

Jeff St. John - Smart Grid

DistribuTECH, the once-sleepy power grid trade show, has been transformed into a high-profile smart grid showcase over the past couple of years — and this year's show in Tampa, Fla., is no exception. Only two days into the event and we've got a host of announcements detailing new partnerships between IT giants and home energy management startups, new ways to do demand response, and new technologies for keeping the grid running smoothly. Here are 10 things you should pay attention to coming outta DistribuTECH:

1). Smart grid consumer outreach: Today the Smart Grid Consumer Collaborative, a partnership aimed at bridging the gap between smart grid technologies and power consumers, launched. Given the consumer backlash mounting against smart meters in California and Texas, and the focus on consumer friendliness from the federal agency setting smart grid standards, it's something the industry needs. Partners include IBM and General Electric, home automation company Control4, smart grid networking vendor Silver Spring Networks, and utility Southern California Edison. While the SGCC is focused on educating utilities, vendors and consumers, it could also be seen as a list of companies that might be partnering with one another in the future.

2). Consumer-facing partnerships: Of course, in the patchwork world of smart grid vendors, having multiple routes to the customer can only help. GridPoint, the richly-funded smart grid software maker that's working on Xcel Energy's showcase SmartGridCity project in Boulder, Colo., already has its own home energy management system courtesy of its acquisition of Lixar last year. But that hasn't stopped it from inking two interoperability deals at DistribuTECH this week. One is with home energy display maker EnergyHub and another is with smart thermostat maker ecobee.

3). New standards for demand response: Tendril Networks, the Boulder-based home energy management startup that makes software and devices to link smart meters to customers, launched its new Vision home energy dashboard at the show this week. Tendril also announced two new partnerships aimed at getting it connected en masse — a licensing agreement with smart metering heavyweight Landis+Gyr and plans to integrate with Utility Integration Solutions (UISOL), a company working on automating the demand response systems that allow utilities to ask customers to turn down power to shave peak demand. The latter project is interesting, as it melds two different standards for managing the automation of getting utility power-down signals to customers.

Tendril supports both Smart Energy Profile, an emerging standard from the ZigBee Alliance aimed at the residential market, and OpenADR, a Berkeley Labs-developed open source standard used today by commercial and industrial customers. UISOL is developing kits for utilities to implement OpenADR, and Smart Energy Profile has been named as a likely smart grid standard by the National Institute of Standards and Technology (NIST), so both are likely to play an important part in future demand response systems (for a rough sketch of the kind of signal these standards automate, see the code after this list).

4). Wi-Fi for the home energy network? So far, the world of the energy-sensing and controlling home area network (HAN) has been dominated by ZigBee, the low-power wireless technology that is being included in many of the smart meters now being deployed in North America. But there’s a growing number of home energy devices looking to good old Wi-Fi to get the networking job done, and smart meter maker Aclara joined that group this week.

The division of Esco Technologies said it was working with Intwine Connect to deliver a Wi-Fi home area network to utilities. The Wi-Fi Alliance said last week that it would work with the ZigBee Alliance on a new version of its smart energy profile, indicating a path for Wi-Fi networks to mash up with evolving smart grid standards.

5). Other roads to demand response: There are multiple ways to get utilities and customers talking to one another, such as Comverge's Apollo software system. The big demand response aggregator said Tuesday that it will work with big smart meter maker Sensus to run Comverge signals across Sensus' network. Of course, it's a "non-exclusive alliance" that mentions a host of other linkages, such as ZigBee home automation systems and the U-SNAP Alliance, a group that wants to make smart grid devices with modular communications slots.

6). Smart grid security: Control4, the high-end home automation company with a lower-cost system for measuring household energy, announced a partnership to deliver residential demand response with Lockheed Martin, the defense contractor that also has a host of utility partnerships. Lockheed and fellow defense contractors like Boeing and Raytheon say they can provide military-grade security to the grid — and security is a particular focus of NIST.

7). Opening up smart grid platforms: Control4 also made a nod toward open smart grid systems with news that it will open its new Advantage software platform to third-party application developers. That follows the example of many startups, as well as IT giants such as Google and Microsoft, which both plan to open their nascent home energy management platforms to third-party developers as well. It's all part of a growing push toward opening smart grid data for new uses, one that could come into conflict with rising data privacy and security concerns.

8). Silver Spring expands: No smart grid shindig could go without mention of Silver Spring Networks, the well-funded smart grid startup voted most likely to go public this year. On Tuesday, the Redwood City, Calif.-based company added a new customer, Australia's Western Power, to its growing list of utilities under contract. Notably, the utility will use both Silver Spring's smart grid networking and its CustomerIQ home energy management platform, based on technology it got when it bought startup Greenbox in September.

9). New grid control systems: While linking consumers to the smart grid takes center stage, the business of actually running the grid is building up steam. The Department of Energy directed some $2 billion of its smart grid stimulus funds at integrating and "crosscutting" different smart grid technologies. S&C Electric Co. launched just such an integrated platform on Tuesday, one that links software, radio networks and the Chicago-based company's grid control gear, such as switches and fault restoration devices.

All of those devices have embedded computing power, a way to distribute intelligence throughout the grid and restore power after failures in seconds, rather than minutes, Witold Bik, vice president of automation systems for S&C, told us. Utilities face increasing regulatory and business pressure to shorten the time it takes to restore outages, as well as the challenge of integrating more intermittent renewable power onto their grids.

10). Faster smart grid networks: Part of making the smart grid work is having a network fast enough to trip grid equipment to match fluctuations in grid power, and S&C said it is also forming a new business line to market its SpeedNet radio system for such uses. S&C will face competition from the likes of Tropos Networks, the muni Wi-Fi company that's moved into smart grid.

Tropos announced a partnership with leading smart meter maker Itron at DistribuTECH this week, aimed at integrating their platforms. Itron, like many smart meter networking companies prevalent in North American smart meter deployments, uses a lower-cost, lower-bandwidth 900-megahertz network to link smart meters at the neighborhood level, but needs partners to provide the "backhaul" to connect those smart meter networks with utilities.

Other companies working on richer, faster wireless networks include Trilliant, Arcadian Networks and the host of companies working on WiMax-based smart grid networks, such as Grid Net.
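
The demand response items above (Nos. 3 and 5) all boil down to getting a utility power-down signal to a customer device. Here is a heavily simplified sketch of that interaction; the event fields and thermostat logic are invented for illustration and follow neither the actual OpenADR nor Smart Energy Profile specifications:

```python
from dataclasses import dataclass

@dataclass
class CurtailmentEvent:
    """Hypothetical utility signal; not a real OpenADR or SEP payload."""
    level: int             # 0 = normal operation, higher = shed more load
    duration_minutes: int

class SmartThermostat:
    def __init__(self, cooling_setpoint_f: float):
        self.cooling_setpoint_f = cooling_setpoint_f

    def handle(self, event: CurtailmentEvent) -> None:
        # Raise the cooling setpoint 2 degrees F per curtailment level,
        # trimming peak air-conditioning demand for the event's duration.
        offset = 2.0 * event.level
        print(f"level {event.level} event: setpoint "
              f"{self.cooling_setpoint_f}F -> "
              f"{self.cooling_setpoint_f + offset}F "
              f"for {event.duration_minutes} minutes")

SmartThermostat(72.0).handle(CurtailmentEvent(level=2, duration_minutes=90))
# level 2 event: setpoint 72.0F -> 76.0F for 90 minutes
```

The standards fight is over the wire format and transport of that signal, not the device-side response, which is why vendors like Tendril hedge by supporting more than one.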

