
March 2012

March 31, 2012

Study suggests eco-driving techniques could reduce public transit fleet fuel consumption by up to 18.7%

Transit fleets could reduce fuel consumption on average by as much as 18.7% by engaging in fuel-efficient, eco-driving best practices, according to a Public Transit Fuel Efficiency Study released by SmartDrive Systems, a provider of fleet management and driver safety systems and services.

Eco-driving best practices for public transit include smooth acceleration and deceleration; reducing excess idling; avoiding hard turning (anticipating turns and smoothly decelerating into the turn to take advantage of the bus’s forward momentum and smoothly accelerating out of it); and maintaining constant vehicle speed.

The study evaluated several hundred transit buses and drivers in multiple US locations, from manufacturers including Eldorado, Flxible, Gillig, New Flyer, Orion, Prevost, Nova Bus, and Thomas Built, to assess the effect of driving performance on fuel consumption and to determine the impact of training and in-vehicle instant feedback on improving fuel economy. Study data were captured by SmartDrive sensors and recorders and then analyzed, and training recommendations were provided.

The study measured several key indicators and driving maneuvers known to affect fuel use and economy (an illustrative event-classification sketch follows the list):

  • Actual fuel use: as measured in miles per gallon from the Engine Control Unit (ECU)
  • Idling time: how much time was spent with the engine running while no movement was recorded for greater than three minutes
  • Acceleration: the incidence, frequency and severity of quick starts and sudden acceleration during travel as measured by accelerometer and ECU
  • Braking: hard braking defined by speed from the ECU, duration of deceleration and G-force effect measured by the accelerometer
  • Turning: hard turning/cornering measured by speed from the ECU and G-force from the accelerometer
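
The event definitions above map naturally onto simple threshold rules applied to the telemetry stream. The following is a minimal illustrative sketch, not SmartDrive's actual algorithm; the threshold values and sample fields are assumptions chosen for illustration only.

```python
from dataclasses import dataclass

# Illustrative thresholds only -- assumptions for this sketch, not SmartDrive's published criteria.
HARD_ACCEL_G = 0.30    # longitudinal g above this counts as a hard acceleration
HARD_BRAKE_G = -0.35   # longitudinal g below this counts as hard braking
HARD_TURN_G = 0.30     # lateral g magnitude above this counts as a hard turn
IDLE_LIMIT_S = 180     # engine running with no movement for > 3 minutes = excess idling

@dataclass
class Sample:
    speed_mph: float   # vehicle speed reported by the ECU
    long_g: float      # longitudinal acceleration from the accelerometer
    lat_g: float       # lateral acceleration from the accelerometer
    engine_on: bool
    dt_s: float        # seconds elapsed since the previous sample

def count_events(samples):
    """Count inefficient driving maneuvers in a stream of telemetry samples."""
    events = {"hard_accel": 0, "hard_brake": 0, "hard_turn": 0, "excess_idle": 0}
    idle_s = 0.0
    for s in samples:
        # Idling: engine on while the vehicle is effectively stationary.
        if s.engine_on and s.speed_mph < 1.0:
            idle_s += s.dt_s
            if idle_s > IDLE_LIMIT_S:
                events["excess_idle"] += 1
                idle_s = 0.0   # reset after logging one excess-idling event
        else:
            idle_s = 0.0
        # Hard maneuvers: simple g-force thresholds per sample.
        if s.long_g > HARD_ACCEL_G:
            events["hard_accel"] += 1
        elif s.long_g < HARD_BRAKE_G:
            events["hard_brake"] += 1
        if abs(s.lat_g) > HARD_TURN_G:
            events["hard_turn"] += 1
    return events
```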

During the control period, the SmartDrive system recorded the following inefficient driving maneuvers:

  • 17.8 hard accelerations performed on average per hour;
  • 9.5 hard braking events performed on average per hour; and
  • 3.9 hard turns performed on average per hour.

Real-time in-vehicle feedback on driving maneuvers and idling gave drivers the ability to adjust their driving as it happened. Post-training measurements showed substantial reductions in the number and severity of hard accelerations, hard decelerations and hard turns.

With the volatility of fuel prices, reducing fuel consumption is increasingly important in controlling operating expenses for public transit fleets. Our study documented a significant opportunity to increase fuel efficiency by addressing the 84.8% of fuel waste that can be improved through softer driving. The study also shows that training and real-time in-cab feedback combine to dramatically lower the incidence of wasteful maneuvers.

Within one month, the top 25% of drivers improved their fuel economy from 3.87 mpg to 4.59 mpg, or 18.7%.

—SmartDrive President Jason Palmer

Conclusions and recommendations for public transit fleets resulting from the study include:

  1. The greatest opportunity in fuel efficiency comes from the way a vehicle is operated, particularly hard driving maneuvers. Identifying inefficient driving habits and reinforcing best practices leads directly to improved performance and reduced operating costs.

  2. Providing drivers with immediate feedback in the vehicle allows them to make quick corrections and learn over time how to most efficiently operate their vehicle. Substantial week-to-week improvements indicate that drivers are adopting and adhering to eco-driving techniques that improve fuel efficiency.

  3. For quick results, deliver additional training with constructive guidance to the drivers that show the highest number of inefficient driving events. The combination of real-time feedback and focused training drives significant and immediate impact on overall fleet fuel consumption. In this study, in less than a month, the top 25% of drivers with the greatest improvement in fuel economy reduced fuel use by an average of 18.7%, resulting in an annual average fuel savings of $3,392 per vehicle.
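
As a rough check on how the reported figures relate, the sketch below reproduces the fuel-economy improvement from 3.87 mpg to 4.59 mpg and shows how an annual per-vehicle saving in the neighborhood of the quoted $3,392 follows; the annual mileage and fuel price are illustrative assumptions, not values from the study.

```python
# Reported figures from the study
MPG_BEFORE = 3.87
MPG_AFTER = 4.59

# Illustrative assumptions (not from the study)
ANNUAL_MILES = 21_000   # assumed miles driven per bus per year
FUEL_PRICE = 4.00       # assumed diesel price, USD per gallon

improvement = (MPG_AFTER - MPG_BEFORE) / MPG_BEFORE
gallons_saved = ANNUAL_MILES / MPG_BEFORE - ANNUAL_MILES / MPG_AFTER
annual_savings = gallons_saved * FUEL_PRICE

print(f"Fuel economy improvement: {improvement:.1%}")      # ~18.6% on the rounded mpg values (study reports 18.7%)
print(f"Gallons saved per year:   {gallons_saved:,.0f}")   # ~850 gallons
print(f"Annual savings per bus:   ${annual_savings:,.0f}") # ~$3,400, in line with the quoted $3,392
```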

To further help drivers improve their fuel efficiency, SmartDrive has released a short Eco-Driving Training Video, designed specifically for public transit.

March 31, 2012 in Behavior, Fleets, Fuel Efficiency, Heavy-duty

Way cleared for major Alaska North Slope natural gas pipeline project

Proposed route for the Alaska Pipeline project.

Alaska Governor Sean Parnell announced that two major milestones have been met in the state’s effort to bring Alaska’s North Slope natural gas to Alaskans and markets beyond. The North Slope holds more than 35 trillion cubic feet of discovered natural gas.

First, the State of Alaska resolved its long-running litigation with ExxonMobil and other leaseholders regarding the Point Thomson field, which holds almost a quarter of the North Slope’s known natural gas. Second, the three major producers—ExxonMobil, ConocoPhillips and BP—delivered a letter to the governor announcing that they are now aligned with the Alaska Pipeline Project (APP) parties, and working on a gasline project focusing on bringing North Slope gas to tidewater in Alaska.

Point Thomson. Point Thomson, located 60 miles east of Prudhoe Bay, is Alaska’s largest undeveloped oil and gas field, holding an estimated 8 trillion cubic feet of natural gas and hundreds of millions of barrels of oil and gas liquids. ExxonMobil is the unit operator at Point Thomson, with BP, ConocoPhillips and Chevron holding the majority of the leases. The state’s legal dispute with the companies resulted from the lack of development at Point Thomson for the past 30 years.

Today’s settlement lays out strong near-term production commitments and a clear path for full development of Point Thomson’s significant oil and gas resources, and it establishes clear consequences if the companies do not follow through. The companies have agreed to firm timetables for production at Point Thomson. This will result in significant new investment, increased work for Alaskans and increased revenue for state and local government.

The animating principle of this settlement is that the companies must earn their acreage. The more work, more commitment, more investment and more production that occur, the more acreage the companies will retain.

—Alaska Natural Resources Commissioner Dan Sullivan

Major components of the settlement include:

  • Increasing liquids production into the Trans Alaska Pipeline System (TAPS).

  • Opening the Eastern North Slope to new development opportunities by adding infrastructure and a 70,000 barrels per day common carrier pipeline connecting to TAPS.

  • Incentivizing and laying out a clear path and alternatives for full-field development, each of which will require billions of dollars in investment if pursued.

  • Positioning North Slope gas for a large-scale gas pipeline project.

  • Providing potential for significant gas volumes for in-state use no later than 2019.

  • Requiring a commitment to develop a separate oil reservoir within Point Thomson.

Alaska Pipeline Project. The Alaska Pipeline Project proposes to design, permit and construct a new natural gas pipeline system beginning near Alaska’s Prudhoe Bay field and extending over one of two alternative routes.

  • The Alberta option would extend from Prudhoe Bay to points near Fairbanks and Delta Junction, and then to the Alaska-Canada border, where the pipeline would connect with a new pipeline in Canada. The pipeline in Canada would extend from the Alaska-Canada border to link up with pipeline systems near Boundary Lake, Alberta, providing the capability of transporting natural gas into the United States.

  • The Valdez LNG option would extend from Prudhoe Bay through points near Fairbanks and Delta Junction, and then to LNG facilities (to be built by third parties) near Valdez, Alaska.

  • In both options, a minimum of five in-state connections to the main pipeline in Alaska (off-takes) would provide local natural gas suppliers the opportunity to obtain natural gas to meet community needs. For the Alberta option, local off-takes will also be available along the pipeline route in Canada.

  • The Alaska Pipeline Project proposes to design, permit and construct a new gas treatment plant (GTP) as an integral component of the project’s facilities. It would be located near existing Prudhoe Bay facilities and operate in conjunction with either the Alberta or the Valdez options.

  • A natural gas transmission pipeline connecting the Point Thomson field to the GTP is also a proposed component of both options.

ExxonMobil, ConocoPhillips, BP and TransCanada, through its participation in the Alaska Pipeline Project, announced in a joint letter to Governor Parnell that they have agreed on a work plan aimed at commercializing North Slope natural gas resources within an Alaska Gasline Inducement Act (AGIA) framework. Because of a rapidly evolving global market, large-scale liquefied natural gas (LNG) exports from south-central Alaska will be assessed as an alternative to a natural gas pipeline through Alberta.

In addition to broadening market access, a south-central Alaska LNG approach could more closely align with in-state energy demand and needs. We are now working together on the gas commercialization project concept selection, which would include an associated timeline and an assessment of major project components including in-state pipeline routes and capacities, global LNG trends, and LNG tidewater site locations, among others.

—Producers’ letter to Gov. Parnell

In their joint letter, the CEOs of ExxonMobil, ConocoPhillips and BP also said that the “unprecedented commitments of capital for gas development will require [that] competitive and stable fiscal terms with the State of Alaska first be established”.

With Point Thomson legal issues now settled, the producers are moving forward with the initial development phase of the Point Thomson project.

March 31, 2012 in Infrastructure, LNG, Natural Gas

NRC clears path for issuing licenses for two AP1000 reactors at Summer site in South Carolina

The Nuclear Regulatory Commission (NRC), in a 4-1 vote, found its staff’s review adequate to make the necessary regulatory safety and environmental findings, clearing the way for the NRC’s Office of New Reactors (NRO) to issue the two Combined Licenses (COLs) for the South Carolina Electric & Gas (SCE&G) and Santee Cooper application for the Summer site in South Carolina. The COLs will authorize SCE&G and Santee Cooper to build and operate two AP1000 reactors at the Summer site, adjacent to the company’s existing reactor approximately 26 miles northwest of Columbia, S.C.

The AP1000 is a 1,100 megawatt electric pressurized-water reactor that includes passive safety features that would cool down the reactor after an accident without the need for electricity or human intervention.

The Commission’s findings impose two conditions on the COLs, with the first requiring inspection and testing of squib valves, important components of the new reactors’ passive cooling system. The second requires the development of strategies to respond to extreme natural events resulting in the loss of power at the new reactors. The Commission also directed NRO to issue to SCE&G and Santee Cooper, simultaneously with the COLs, an Order requiring enhanced, reliable spent fuel pool instrumentation, as well as a request for information related to emergency plant staffing.

The NRC staff is expected to issue the COLs within 10 business days.

SCE&G and Santee Cooper submitted the COL application on 27 March 2008. The NRC’s Advisory Committee on Reactor Safeguards (ACRS) independently reviewed aspects of the application that concern safety, as well as a draft of the staff’s Final Safety Evaluation Report (FSER). The ACRS provided the results of its review to the Commission in a report dated 17 February 2011. The NRC completed its environmental review and issued a Final Environmental Impact Statement for the Summer COLs on 15 April 2011. The NRC completed and issued the FSER on 17 August 2011. The NRC certified Westinghouse’s amended AP1000 design on 30 December 2011.

March 31, 2012 in Brief

ARRA funding raises R&D expenditures within federally funded R&D centers 11% to $16.8B in FY 2010

Research and development expenditures at the US’ 39 federally funded R&D centers (FFRDCs) rose from $15.2 billion in FY 2009 to $16.8 billion in FY 2010, according to data from the National Science Foundation (NSF) FFRDC Research and Development Survey. More than $1 billion of the FY 2010 total was supplied by funds from the American Recovery and Reinvestment Act of 2009 (ARRA).

This 10.6% increase is the largest one-year increase since 2002, when expenditures increased 14.5% to $11.5 billion. Between FY 2002 and 2009, FFRDC expenditures increased an average of 5% each year, with the exception of a brief stall in FY 2006.

FFRDCs are privately operated R&D organizations that are exclusively or substantially financed by the federal government. FFRDCs provide their sponsoring federal agencies with capabilities to meet special long-term R&D needs that cannot be met as effectively by existing in-house or contractor resources. Each FFRDC is operated, managed, and/or administered by a university or university consortium, a nonprofit organization, or an industrial firm, either as an autonomous organization or as a separate operating unit.

Federal funding accounted for 97.3% ($16.4 billion) of the FFRDCs’ total expenditures in FY 2010. Since 2001 the federal government has consistently provided over 96% of FFRDC funding each year, and the FFRDCs’ federally funded R&D expenditures have increased 69%, from $9.7 billion in FY 2001 to $16.4 billion in FY 2010.
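
The growth figures quoted above follow directly from the reported totals; a quick arithmetic check (small differences from the published percentages reflect rounding of the dollar figures):

```python
# Reported FFRDC R&D expenditure figures (billions of USD)
fy2009_total   = 15.2
fy2010_total   = 16.8
fy2001_federal = 9.7
fy2010_federal = 16.4

one_year_growth = (fy2010_total - fy2009_total) / fy2009_total
decade_growth   = (fy2010_federal - fy2001_federal) / fy2001_federal
federal_share   = fy2010_federal / fy2010_total

print(f"FY 2009 -> FY 2010 increase:       {one_year_growth:.1%}")  # ~10.5% here; 10.6% on unrounded figures
print(f"FY 2001 -> FY 2010 federal growth: {decade_growth:.1%}")    # ~69%
print(f"Federal share of FY 2010 total:    {federal_share:.1%}")    # ~97.6% here; reported as 97.3%
```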

The laboratories, centers, and institutes that are designated as FFRDCs conduct work within a variety of fields, such as physics, engineering, astronomy, computer science, and psychology. Similar to the proportions in previous years, in FY 2010 basic research activities accounted for 39% of total FFRDC R&D expenditures; applied research, 31%; and development, 30%.

Five FFRDCs reported ARRA-funded expenditures of more than $100 million in FY 2010: the Jet Propulsion Laboratory ($101 million); Pacific Northwest National Laboratory ($113 million); Oak Ridge National Laboratory ($135 million); Los Alamos National Laboratory ($139 million); and the National Cancer Institute at Frederick ($183 million). ARRA provided the funding for nearly 30% of the National Cancer Institute’s total R&D expenditures in FY 2010 and was responsible for 50% ($48 million) of the FY 2010 R&D expenditures at the Thomas Jefferson National Accelerator Facility.

March 31, 2012 in Brief

March 30, 2012

EnerDel parent Ener1 completes financial restructuring and emerges from Chapter 11

Ener1, Inc., the parent of lithium-ion battery maker EnerDel, has completed its financial restructuring and emerged from Chapter 11 bankruptcy as a privately-held company. On 26 January, the company had filed a pre-packaged bankruptcy case to implement a restructuring plan agreed on by its primary investors and lenders. (Earlier post.)

The US Bankruptcy Court in the Southern District of New York confirmed the Plan of Reorganization on 28 February; the Plan became effective on 30 March 2012.

We have emerged from bankruptcy with significantly less debt, more working capital and a stronger financial position to enable us to compete more effectively in pursuing business opportunities to provide energy storage solutions for electric grid, transportation and industrial applications.  We are grateful for the strong support of our primary investors, customers, employees and suppliers throughout this process.

—Alex Sorokin, Ener1’s interim CEO

Ener1 restructured its long-term debt and secured an infusion of up to $86 million of new equity funding, which will support the continued operation of Ener1’s subsidiaries.  In addition to the new equity funding, the holders of the existing senior notes, the convertible notes and a line of credit have restructured their debt in a partial debt-for-equity exchange. 

In accordance with Ener1’s prior announcements, and as provided for by the Plan, Ener1’s common stock (which had traded over the counter with the symbol HEVV) was cancelled effective as of 30 March 2012, and Ener1 will no longer be an SEC-reporting company. Holders of the cancelled common stock did not receive any distribution of any kind under the Plan.

Ener1 issued new shares of preferred stock in consideration of the new equity funding that will flow into the Company and in repayment of the debtor-in-possession loan received by the Company in connection with the restructuring.  The existing senior notes were exchanged for a combination of cash, new common stock and new notes, while the convertible notes were exchanged for a combination of cash and new common stock.  The amount due under the existing line of credit was cancelled in exchange for new common stock.

March 30, 2012 in Batteries, Market Background

Illinois Governor launches EV fast-charging network, currently US’ largest

Illinois Governor Pat Quinn and representatives from the Illinois Tollway, 350Green LLC and 7-Eleven, Inc. announced the availability of the US’ largest network of fast-charging electric vehicle (EV) stations to date.

Through the Chicago-Area EV Infrastructure Project, 26 fast-chargers have been installed, with 73 total fast-chargers planned. Eight of these are currently in place at Tollway Oases along the Jane Addams Memorial Tollway (I-90) at the Des Plaines Oasis and on the Tri-State Tollway (I-94/I-294/I-80) at the Lake Forest Oasis, O’Hare Oasis and Chicago Southland Lincoln Oasis. The installations, managed by 350Green, were performed by Chicago-based JNS Power & Control Systems.

The City of Chicago is overseeing the project’s installation of 280 charging stations overall to increase accessibility to EV charging. With a budget of $8.8 million, including $1.9 million in public funding and $6.9 million in private investment, EV stations are being installed mostly in areas with dense residential and worker populations and in high-traffic areas.

Partnerships with charging station hosts such as 7-Eleven have been key to the rollout of the network across the state. 7-Eleven convenience stores at four of the Illinois Tollway Oases now have dedicated space for fast-charging stations.

Before plugging in to one of the Tollway’s fast-chargers, drivers must purchase a payment card from 350Green. The $21 card includes three 15-minute sessions at fast-charging stations.
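
As a rough illustration of what that pricing implies, the sketch below works out the per-session cost and, under an assumed charger power rating (the article does not state one), the approximate energy delivered and effective price per kWh.

```python
# Reported pricing for the Tollway fast-chargers
CARD_PRICE = 21.00        # USD for the 350Green payment card
SESSIONS_PER_CARD = 3
SESSION_MINUTES = 15

# Assumption for illustration only: the article does not give a charger power rating.
CHARGER_KW = 50.0         # a typical DC fast-charger rating of the era

cost_per_session = CARD_PRICE / SESSIONS_PER_CARD
energy_per_session_kwh = CHARGER_KW * SESSION_MINUTES / 60.0
price_per_kwh = cost_per_session / energy_per_session_kwh

print(f"Cost per 15-minute session: ${cost_per_session:.2f}")                          # $7.00
print(f"Energy per session at {CHARGER_KW:.0f} kW: {energy_per_session_kwh:.1f} kWh")  # 12.5 kWh
print(f"Effective price per kWh: ${price_per_kwh:.2f}")                                # ~$0.56 under these assumptions
```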

The Chicago-Area EV Infrastructure Project is funded in part by the state’s Illinois Jobs Now! capital plan and Clean Cities Grant funds that the city of Chicago received through the American Recovery and Reinvestment Act (ARRA) of 2009. The project is one of several initiatives now underway in Illinois to promote the adoption and use of electric vehicles.

The Illinois Jobs Now! plan includes up to $10 million in capital funding for the Illinois Department of Commerce and Economic Opportunity (DCEO) to award EV manufacturing and infrastructure incentives, which will begin rolling out this spring. In July 2011, Governor Quinn signed the Electric Vehicle Act to form the Illinois Electric Vehicle Advisory Council, a group of public- and private-sector electric vehicle stakeholders collaborating to develop policies and programs that support EVs.

In addition, the Illinois Environmental Protection Agency (IEPA) offers the Illinois Alternate Fuels Rebate Program, which provides a rebate of up to $4,000 toward EV purchases. The Illinois Commerce Commission (ICC) also launched a Plug-in Electric Vehicle Initiative (PEV) Initiative to explore regulatory issues related to EV deployment.

March 30, 2012 in Brief

Ford investing $1.3B in Hermosillo Stamping and Assembly Plant for Fusion and Lincoln MKZ

Ford is investing $1.3 billion in its Hermosillo (Mexico) Stamping and Assembly Plant for production of the Ford Fusion and Lincoln MKZ lineups.

Ford is making a significant investment in this facility and a significant commitment to the employees here while also transforming our vehicle lineup for customers throughout North America. The midsize sedan market is significant, growing and one of the most competitive in the industry.

—Mark Fields, Ford president of The Americas

Ford’s latest investment is in addition to $3 billion invested in Mexico during the past decade. In addition to the Fusion, which has been built in Hermosillo since 2005 and has sold more than 1.1 million units, Ford started building Fiestas at its Cuautitlán plant in 2008. The company also builds diesel engines for small and medium trucks in Chihuahua, as well as transmissions through a joint venture with Getrag (GTF).

The all-new Lincoln MKZ marks an important milestone in Lincoln’s reinvention and is one of seven new or significantly upgraded products by 2015.

March 30, 2012 in Brief

CAR releases study on use of bio-based materials in automotive sector; potential for the Great Lakes Region

There is significant potential for the expansion of bio-based automotive parts and components manufacturing in the US Great Lakes region, according to a newly-released study conducted by the Center for Automotive Research (CAR), a nonprofit research organization based in Ann Arbor, Michigan.

The report defines bio-based materials as industrial products made from renewable agricultural and forestry feedstocks, which can include wood, grasses, and crops, as well as wastes and residues. These materials may replace fabrics, adhesives, reinforcement fibers, polymers, and other, more conventional, materials.

Encouraging bio-manufacturing and its associated value chain development, and building upon its current expertise in producing conventional parts for automakers, may position the Great Lakes region at a global competitive advantage as oil prices climb, and the demand for more bio-based parts increases. The objective of this study is to assist in identifying business opportunities for increased market penetration of bio-based products into the automotive supply chain.

—The Bio-Based Materials Automotive Value Chain

The study, produced by the Sustainability and Economic Development Strategies group at CAR, notes that the automotive industry’s adoption of bio-based materials has been gradually accelerating over the last several years. This emphasis has been spurred by government regulations, consumer preferences, and, in some cases, financial savings that can be realized from the adoption of these materials and technologies.

In this way, bio-based materials face some of the same challenges to broader adoption that are also faced by alternative powertrain technologies and alternative fuels (many of which are also bio-based). Environmentally-friendly materials and technologies hold the promise of accelerated adoption, as costs drop due to economies of scale made possible by high volume production. This promise of lower costs with increased volumes holds as true for lithium-ion batteries as it does for bio-based floor mats.

—The Bio-Based Materials Automotive Value Chain

Bio-based materials background. Beyond traditional uses for bio-based materials in autos—such as wood trim, cotton textiles, and leather seats—there are two primary applications for these materials: as reinforcement and filler or to create polymers.

  • Bio-based composites may be reinforced or filled with natural fibers including bast fibers, which come from the stem of plants that are specifically grown for fiber (such as hemp, kenaf, flax, and jute); fibers from a variety of wood sources or crop residues; or leaf fibers such as sisal, abaca, and banana fibers.

  • Bio-based polymers can be made from a variety of sources including soybean, castor bean, corn, and sugar cane. These feedstocks are usually fermented and go through a series of conversions to produce polymers that can be used in plastic composites. Just like their conventional counterparts, bio-based polymers can be extruded, blown, molded, injection-molded, foamed, and thermoformed.

Natural fiber fillers and reinforcements are the fastest-growing polymer additive, according to the report. Use of castor and soy-based polyols for interior foams has now become more widespread as well.

Bio-based materials have been tested and deployed in a number of automotive components. Flax, sisal, and hemp are used in door interiors, seatback linings, package shelves, and floor panels. Coconut fiber and bio-based foams have been used to make seat bottoms, back cushions, and head restraints. Cotton and other natural fibers have been shown to offer superior sound proofing properties and are used in interior components. Natural latex is used to enhance the safety of interior components by making the surfaces softer. Abaca fiber has been used to create under-floor body panels.

Recently, there also have been attempts to use natural fiber composites in structural applications; some researchers are interested in combining natural fibers with nano-materials to develop structural components that could potentially be used in automotive components. Though exterior, under-the-hood, and structural applications are more limited and frequently still in various stages of research, they represent some of the more advanced technology and high-value applications of bio-based materials and could potentially become an important part of the market, the report suggests.

Currently, there are no standards in place regulating what can and what cannot be called a bio-based material, although standards organizations such as the American Society for Testing and Materials (ASTM) and the International Organization for Standardization (ISO) have developed standards for measuring bio-based content and conducting life cycle assessments (LCAs).

The bio-based materials industry already uses these standards to demonstrate the benefits of using these materials and to verify the materials’ renewable content. The difficult issue is determining how to create a labeling system for complex products (such as automobiles) that integrate numerous components. It is not currently feasible to make a significant portion of an automobile out of bio-based materials, and it may never be feasible to make some components out of bio-based materials. A voluntary labeling program could, however, be beneficial to promote the expanded use of bio-based materials in automobiles.

Challenges with bio-based plastics. Although bio-based plastics are coming closer to meeting or exceeding performance parameters and cost of conventional plastics than ever before, there are still some drawbacks which prevent wider application in the automotive industry. These include:

  • Bio-based plastic parts are typically marginally less strong and rigid than their conventional counterparts. While not necessarily a critical factor for a variety of applications (particularly those in the interior of the vehicle), it does limit the use of bio-based plastics for certain structural components.

  • Bio-based plastics are subject to the seasonal and geographic differences found in the plants from which they are made. Weather, climate, soil, and other factors can alter the qualities of a given plant oil or fiber, making it necessary for suppliers to monitor these materials and adapt their manufacturing processes as necessary.

  • Natural variations, particularly among filler materials, can cause natural variation in the appearance and texture of components made from bio-based materials.

  • Bio-based materials have unique negative characteristics, such as odor issues and increased susceptibility to moisture and heat damage, as well as not being sufficiently flame-retardant.

  • Bio-based plastic fillers are more sensitive than their conventional counterparts to high temperatures, which can break down their inner structure. This characteristic limits their use in applications that require high temperature manufacturing processes.

Commercialization. CAR conducted three case studies of successful automaker bio-based product utilization to provide a basis for understanding how a component that integrates bio-based materials is developed and how these materials move from farm to factory:

  1. Wheat Straw Reinforced Composite in a Storage Bin of the Ford Flex
  2. Bio-Based Material Commercialization Fund Managed by the Ontario BioAuto Council
  3. Castor Oil Based Nylon in the Radiator End Tank of the Toyota Camry

Drawing on the literature review, case studies, and meetings with industry representatives, CAR documented lessons learned and obstacles encountered, and developed recommendations for increased commercialization and adoption of bio-based materials into automotive supply chains. Lessons learned included:

  • Involving industry in research consortiums;
  • Moving technology beyond university research into the pilot stage;
  • Partnering among companies along the supply chain; and
  • Implementing bio-based materials beyond initial applications.

While this report provides a solid understanding of the current status of use of bio-based materials in automotive parts and components, additional inquiry is needed to understand the current and potential size of the bio-based material market in the automotive sector. Further research could examine the realistic potential of the market and investigate the maturity curve for bio-based technologies in automobiles.

A supplemental study could also address in detail the role of the government in motivating the bio-based automotive market. The study could make recommendations to strategically support the development of the bio-based materials industry specific to businesses and other organizations in the Great Lakes region. Future work could also include the establishment of an automotive bio-based products network in Great Lakes region. Such a network could facilitate relationships and business development among interested stakeholders and cultivate the bio-based automotive components market.

—The Bio-Based Materials Automotive Value Chain


March 30, 2012 in Bio-polymers, Biomass, Materials

NETL to install new supercomputer to simulate carbon capture, utilization and storage scenarios

The US Department of Energy (DOE) Office of Fossil Energy’s National Energy Technology Laboratory (NETL) will install a new supercomputer this summer to help to develop solutions to carbon capture, utilization and storage (CCUS) technology barriers.

Housed at NETL’s Simulation-Based Engineering User Center, a facility primarily devoted to advancing CCUS science and technology, the new supercomputer will be used to develop and deploy advanced simulation tools.

Researchers from partnering organizations, such as the five universities that are part of the NETL-Regional University Alliance, will be able to access the supercomputer via user centers at NETL’s Albany, Morgantown, and Pittsburgh locations. The three user centers will also provide advanced visualization hardware and software. This arrangement allows collaborators to simulate phenomena that are difficult or impossible to probe experimentally without the expense of building dedicated supercomputing facilities.

March 30, 2012 in Brief

Obama Administration launches $200M Big Data Research and Development Initiative

The White House Office of Science and Technology Policy (OSTP), in concert with several Federal departments and agencies, launched a $200-million “Big Data Research and Development Initiative” to improve greatly the tools and techniques needed to access, organize, and glean discoveries from huge volumes of digital data.

The Big Data initiative is intended to: (1) advance state-of-the-art core technologies needed to collect, store, preserve, manage, analyze, and share huge quantities of data; (2) harness these technologies to accelerate the pace of discovery in science and engineering; and (3) expand the workforce needed to develop and use Big Data technologies.

In the same way that past Federal investments in information-technology R&D led to dramatic advances in supercomputing and the creation of the Internet, the initiative we are launching today promises to transform our ability to use Big Data for scientific discovery, environmental and biomedical research, education, and national security.

—Dr. John P. Holdren, Assistant to the President and Director of the White House Office of Science and Technology Policy

The initiative responds to recommendations by the President’s Council of Advisors on Science and Technology, which last year concluded that the Federal Government is under-investing in technologies related to Big Data. In response, OSTP launched a Senior Steering Group on Big Data to coordinate and expand the Government’s investments in this critical area.

The first wave of agency commitments to support this initiative includes:

  • Department of Energy – Scientific Discovery Through Advanced Computing. The Department of Energy (DOE) will provide $25 million in funding to establish the Scalable Data Management, Analysis and Visualization (SDAV) Institute. Led by the Energy Department’s Lawrence Berkeley National Laboratory, the SDAV Institute will deploy, and assist scientists in using, technical solutions addressing challenges in three areas:

    Data Management: infrastructure that captures the data models used in science codes, efficiently moves, indexes, and compresses this data, enables query of scientific datasets, and provides the underpinnings of in situ data analysis.

    Data Analysis: application-driven, architecture-aware techniques for performing in situ data analysis, filtering, and reduction to optimize downstream I/O and prepare for in-depth post-processing analysis and visualization.

    Data Visualization: exploratory visualization techniques that support understanding ensembles of results, methods of quantifying uncertainty, and identifying and understanding features in multi-scale, multi-physics datasets.

    SDAV is a collaboration tapping the expertise of researchers at six national laboratories (Argonne, Lawrence Berkeley, Lawrence Livermore, Los Alamos, Oak Ridge, and Sandia) and seven universities (Georgia Tech, North Carolina State, Northwestern, Ohio State, Rutgers, the University of California at Davis, and the University of Utah). Kitware, a company that develops and supports specialized visualization software, is also a partner in the project. The team will build on its successes from the SciDAC Scientific Data Management (SDM) Center for Enabling Technologies, the Visualization and Analytics Center for Enabling Technologies (VACET), and the Institute for Ultra-Scale Visualization (UltraVis) and provide the tools and knowledge required to achieve breakthrough science in this data-rich era.

  • US Geological Survey – Big Data for Earth System Science. USGS is announcing the latest awardees for grants it issues through its John Wesley Powell Center for Analysis and Synthesis. The Center catalyzes innovative thinking in Earth system science by providing scientists a place and time for in-depth analysis, state-of-the-art computing capabilities, and collaborative tools invaluable for making sense of huge data sets. These Big Data projects will improve our understanding of issues such as species response to climate change, earthquake recurrence rates, and the next generation of ecological indicators.

  • National Science Foundation and the National Institutes of Health - Core Techniques and Technologies for Advancing Big Data Science & Engineering. “Big Data” is a new joint solicitation supported by the National Science Foundation (NSF) and the National Institutes of Health (NIH) that will advance the core scientific and technological means of managing, analyzing, visualizing, and extracting useful information from large and diverse data sets. This will accelerate scientific discovery and lead to new fields of inquiry that would otherwise not be possible.

    NIH is particularly interested in imaging, molecular, cellular, electrophysiological, chemical, behavioral, epidemiological, clinical, and other data sets related to health and disease.

  • National Science Foundation. In addition to funding the Big Data solicitation, and in keeping with its focus on basic research, NSF is implementing a comprehensive, long-term strategy that includes new methods to derive knowledge from data; infrastructure to manage, curate, and serve data to communities; and new approaches to education and workforce development.

    Specifically, NSF is: (1) encouraging research universities to develop interdisciplinary graduate programs to prepare the next generation of data scientists and engineers; (2) funding a $100-million Expeditions in Computing project based at the University of California, Berkeley, that will integrate three powerful approaches for turning data into information (machine learning, cloud computing, and crowdsourcing); (3) providing the first round of grants to support “EarthCube”, a system that will allow geoscientists to access, analyze and share information about our planet; (4) issuing a $2-million award for a research training group to support training for undergraduates to use graphical and visualization techniques for complex data; (5) providing $1.4 million in support for a focused research group of statisticians and biologists to determine protein structures and biological pathways; and (6) convening researchers across disciplines to determine how Big Data can transform teaching and learning.

  • Department of Defense – Data to Decisions. The Department of Defense (DoD) is “placing a big bet on big data” investing approximately $250 million annually (with $60 million available for new research projects) across the Military Departments in a series of programs intended to: (1) harness and utilize massive data in new ways and bring together sensing, perception and decision support to make truly autonomous systems that can maneuver and make decisions on their own; (2) improve situational awareness to help warfighters and analysts and provide increased support to operations. The Department is seeking a 100-fold increase in the ability of analysts to extract information from texts in any language, and a similar increase in the number of objects, activities, and events that an analyst can observe.

    To accelerate innovation in Big Data that meets these and other requirements, DoD will announce a series of open prize competitions over the next several months. In addition, the Defense Advanced Research Projects Agency (DARPA) is beginning the XDATA program, which intends to invest approximately $25 million annually for four years to develop computational techniques and software tools for analyzing large volumes of data, both semi-structured (e.g., tabular, relational, categorical, meta-data) and unstructured (e.g., text documents, message traffic).

    Central challenges to be addressed include: (1) developing scalable algorithms for processing imperfect data in distributed data stores; and (2) creating effective human-computer interaction tools for facilitating rapidly customizable visual reasoning for diverse missions. The XDATA program will support open source software toolkits to enable flexible software development for users to process large volumes of data in timelines commensurate with mission workflows of targeted defense applications.

  • National Institutes of Health – 1000 Genomes Project Data Available on Cloud. The National Institutes of Health is announcing that the world’s largest set of data on human genetic variation—produced by the international 1000 Genomes Project—is now freely available on the Amazon Web Services (AWS) cloud. At 200 terabytes, the current 1000 Genomes Project data set is a prime example of big data, where data sets become so massive that few researchers have the computing power to make best use of them. AWS is storing the 1000 Genomes Project as a publicly available data set for free, and researchers will pay only for the computing services that they use.
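
As a side note for readers who want to explore the 1000 Genomes data, public AWS data sets can typically be browsed without credentials. The sketch below uses anonymous S3 access via boto3; the bucket name 1000genomes is an assumption here, so check the AWS public data set registry for the current location.

```python
# Minimal sketch of anonymous (unauthenticated) access to a public S3 data set with boto3.
# The bucket name "1000genomes" is an assumption; consult the AWS public data set
# registry for the current location of the 1000 Genomes data.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

# List a handful of objects to confirm the data is reachable without credentials.
resp = s3.list_objects_v2(Bucket="1000genomes", MaxKeys=5)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```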

OSTP was created by Congress in 1976 to serve as a source of scientific and technological analysis and judgment for the President with respect to major policies, plans, and programs of the Federal Government.

March 30, 2012 in Market Background, Policy, Research

Green Car Congress © 2013 BioAge Group, LLC. All Rights Reserved.