Artists under siege by artificial intelligence (AI) that studies their work, then replicates their styles, have teamed with university researchers to stymie such copycat activity.

A team of researchers, led by Professor Hyong-Ryeol Park from the Department of Physics at UNIST, has introduced a technology capable of amplifying terahertz (THz) electromagnetic waves by over 30,000 times. This breakthrough, combined with artificial intelligence (AI) based on physical models, is set to revolutionize the commercialization of 6G communication frequencies.

Joyce Loaiza lives alone, but when she returns to her apartment at a Florida senior community, the retired office worker often has a chat with a friendly female voice that asks about her day.

Imagine a flying dragon that doesn’t spout fire, but instead extinguishes it with blasts of water. Thanks to a team of Japanese researchers, this new kind of beast may soon be recruited to firefighter teams around the world, to help put out fires that are too dangerous for their human teammates to approach.

Artificial intelligence is poised to upend much of society, removing human limitations inherent in many systems. One such limitation is information and logistical bottlenecks in decision-making.

In With The New, Then Out With The Old – A Managed Transition Is Key To Maintaining Resource Adequacy

This post is the second in a series titled “Real Talk on Reliability,” which will examine the reliability needs of our grid as we move toward 100 percent clean electricity and electrify more end-uses on the path to a climate stable future. It was written by Michelle Solomon, Senior Policy Analyst in the Electricity Program. A shorter version of this article was published in Utility Dive. The first post in this series covered Rethinking the Reliability of the Grid.

 

A significant aspect of the Biden administration’s plans to reduce emissions from the power sector is currently under debate – the Environmental Protection Agency’s (EPA) proposed power plant greenhouse gas emission rules, which would establish emissions limits for new and existing natural gas plants, as well as existing coal plants.

If adopted, the proposed rules will require steep emissions reductions by the early 2030s from any coal plants that do not retire before 2040. For existing gas plants, the rules require emissions controls such as carbon capture or hydrogen blending for any large gas unit that operates as a baseload plant. For new gas, the rules place similar restrictions on all units that operate more than 20 percent of the time.
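Applying the operating-share threshold above comes down to a capacity-factor calculation. A minimal sketch in Python, using a hypothetical 500 MW unit (all figures invented for illustration, not drawn from the rule text):

```python
# Capacity factor: the fraction of a year's maximum possible output
# that a plant actually produced.
def capacity_factor(annual_generation_mwh, nameplate_mw, hours=8760):
    return annual_generation_mwh / (nameplate_mw * hours)

# A hypothetical 500 MW gas unit generating 1,000,000 MWh in a year:
cf = capacity_factor(1_000_000, 500)
print(f"{cf:.1%}")  # above a 20 percent operating share, so restrictions
                    # of the kind described above would apply
```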

Power providers, grid operators, and clean energy advocates have offered reactions to the proposed rules, and Congress recently held hearings on reliability of the grid in the context of the rule. Industry representatives have raised concerns surrounding resource adequacy – whether there are in fact enough resources to supply energy and capacity to meet rising demand.

There are two separate questions that underlie concerns about maintaining resource adequacy through the clean energy transition.

First, is it technically feasible to ensure resource adequacy with the energy resources that would be allowable under the proposed EPA rules? And if so, how might the methods of measuring and planning for resource adequacy need to change to account for the future resource mix?

And second, is it practically feasible to bring enough resources online fast enough to replace those that are projected to retire?

The answer to each of these questions is yes, if good policy enables a managed transition that balances retirement of the old with installation of the new.

We Can Reach 80-90 Percent Clean Electricity With Existing Technology 

Researchers have explored deep-decarbonization scenarios and agree that the U.S. can achieve up to 90 percent clean electricity generation using only existing technology. For example, the 2035 Report 2.0 found that a 90 percent clean grid could meet demand at all hours of the year through the addition of existing energy technologies like solar, wind, and batteries. In addition, no new coal or gas plants would need to be built, even with increased demand from the high electrification of transportation, buildings, and industry.

The Net Zero America study similarly finds that clean sources of energy can supply 70-85 percent of U.S. electricity by 2030. Here, the electricity mix is largely wind and solar, with hydro and nuclear remaining relatively constant while gas usage decreases by about 25 percent and coal generation goes to zero. The National Renewable Energy Lab research agrees, finding that 71-90 percent of electricity could come from clean sources by 2030, again all with existing energy technologies.

Regional studies support the same conclusion, with GridLab and Telos Energy finding that California could reach 85 percent clean electricity by 2030 while maintaining resource adequacy, primarily through the addition of wind, solar, and batteries. Here, the use of a diverse set of clean resources, including offshore wind and geothermal, significantly decreases the deployment rate necessary to meet the 85 percent clean threshold.

To be clear, none of these studies suggests that natural gas can be eliminated during the energy transition. Existing gas plants will be an integral part of the power system for the foreseeable future. However, their value will shift increasingly toward use as capacity resources for reliability during risk periods, while their total annual energy contributions are expected to drop significantly, just as the EPA rules propose.

Research has also explored the pathway from 90 or 95 percent to 100 percent clean electricity, but these studies tend to rely on technologies not yet commercialized. We are far from that point, however, which leaves time for technologies and grid operations to evolve to meet the last five to ten percent. Keeping the lights on with solely wind, solar, and batteries may be possible at these higher percentages, though modeled costs tend to be prohibitively high without dispatchable clean resources or significant flexible demand.

For example, the “Moonshot study” by GridLab, which uses the Public Service Company of New Mexico as a case study, finds that there are several viable supply-side pathways to 100 percent clean electricity, likely combining possible future technologies such as multi-day energy storage and dispatchable clean sources like geothermal, nuclear, hydrogen combustion turbines, or thermal resources with carbon capture and storage. Priya Sreedharan, program director at GridLab and an author of the study, highlights the importance of not letting uncertainty about this final stage delay action on building a lot of clean energy now, saying “It’s okay that we don’t know exactly what the last 10-20 percent will be. The focus needs to be on building the stuff we know we need, and not get hung up on what that perfect clean firm resource is.”

Research shows mature technologies can get us cost-effectively to high shares of clean electricity, and there are viable pathways to 100 percent clean. However, to plan for a resource adequate system using clean energy, some changes are needed.

Resource Adequacy Planning Should Adapt For Weather-Dependent, Energy-Limited Systems 

Resource adequacy is undoubtedly more complicated in a high renewables world, but planners can take several actions to adapt, including consistently accrediting each resource type, accounting for the interdependent nature of clean resources, and updating planning practices for changing risks.

First, while critics continually highlight that wind and solar energy are weather-dependent and have a variable energy output, many do not apply the same scrutiny to fossil fuel resources and consider them to be always available. This is one of the biggest pitfalls in resource adequacy planning, and one that has had particularly serious implications during extreme weather.

Derek Stenclik, founder of the independent modeling firm Telos Energy and lead author of a recent Energy Systems Integration Group paper on future resource adequacy, emphasizes that “there is no such thing as perfect capacity. We need to recognize that all resources have challenges in meeting reliability needs,” and that the impression that there is a type of electricity generator that can be considered “firm,” or available to be dispatched at any time, is a widespread myth. For example, during Winter Storm Uri, un-winterized gas plants across the state of Texas failed simultaneously, making up 58 percent of the unplanned outages. During Winter Storm Elliott, it was nearly the same story, with 70 percent of the unexpected outages coming from gas plants. Weather-related correlated outages will continue to be an issue as power systems add renewables, so ensuring all power plants are held to the same standard is crucial.
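The danger of treating every thermal unit as independently and almost always available can be illustrated with a toy Monte Carlo sketch. The fleet size, unit rating, and outage rates below are invented for illustration; a shared stressor like a constrained gas supply during a cold snap is modeled crudely as a jump in every unit's outage probability:

```python
import random

random.seed(0)

N_PLANTS, UNIT_MW = 20, 500          # hypothetical gas fleet
FLEET_MW = N_PLANTS * UNIT_MW
TRIALS = 20_000

def fleet_outage_mw(p_outage):
    """MW lost when each unit fails independently with probability p_outage."""
    return sum(UNIT_MW for _ in range(N_PLANTS) if random.random() < p_outage)

def prob_losing(share, p_outage):
    """Estimated probability of losing more than `share` of the fleet."""
    threshold = share * FLEET_MW
    hits = sum(1 for _ in range(TRIALS) if fleet_outage_mw(p_outage) > threshold)
    return hits / TRIALS

# Normal conditions: roughly a 5 percent forced outage rate per unit,
# so losing over 30 percent of the fleet at once is vanishingly rare.
print(prob_losing(0.30, 0.05))
# Cold snap: a common fuel-supply failure pushes every unit's outage
# probability up together, and a large simultaneous loss becomes likely.
print(prob_losing(0.30, 0.40))
```

The point of the sketch is the gap between the two printed probabilities: a planning method that assumes independent outages captures the first number but badly underestimates the second.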

Second, in a clean electricity future, the reliability value of each resource becomes increasingly dependent on the others. Perfectly determining each resource’s value would require complex calculations that evaluate the entire generation portfolio and the relationships among its resources. However, transparency and certainty about future accreditation values are important for those trying to bring new resources online, and sometimes, says Sreedharan, we will have to “accept that none of these methods will be perfect” in order to keep markets accessible and resources coming online quickly.

Third, resource adequacy analysis has long operated by identifying the time of day or year in which the peak electricity demand occurs, and then planning to have enough capacity available, plus an additional margin of around 15 percent to account for any unexpected outages. However, this paradigm is changing rapidly as the risky periods on the grid no longer occur at the time of peak demand.
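The traditional paradigm described above reduces to simple arithmetic. A minimal sketch, with hypothetical numbers:

```python
# Traditional planning-reserve-margin check: capacity must cover the
# forecast peak plus a fixed margin. All figures are hypothetical.
peak_demand_mw = 50_000          # forecast annual peak demand
reserve_margin = 0.15            # ~15 percent planning margin

required_capacity_mw = peak_demand_mw * (1 + reserve_margin)
installed_capacity_mw = 58_000   # hypothetical fleet total

surplus_mw = installed_capacity_mw - required_capacity_mw
print(f"Required: {required_capacity_mw:,.0f} MW")
print(f"Surplus:  {surplus_mw:,.0f} MW")
```

The simplicity is exactly the problem: a single peak hour and a fixed margin say nothing about when risk actually occurs or whether energy-limited resources can sustain output through it.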

Stenclik highlights that while most planners now “understand that the risk hours are shifting to the evening as the sun sets,” not all yet recognize that the system risks will be “transitioning to winter – partially because of solar, but also due to cold snaps constraining gas supplies, increased electrification for electric winter heating, and the lower efficiency of electric vehicles in cold weather.”

Furthermore, considering instantaneous periods of risk will no longer suffice. Increasingly, a new limiting factor for adequacy will be whether energy in one period is enough to charge batteries or other storage technologies to supply capacity in another. While more sophisticated utilities and all ISOs already analyze risk across all hours of the year using chronological modeling, this approach is becoming more of a requirement than it has been in the past. Planners will need to assess a diversity of portfolios against metrics like expected unserved energy and loss of load expectation that examine all hours of the year.
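A chronological screen of the kind described above can be sketched in a few lines. The net-load profile, firm capacity, and battery parameters below are invented for illustration; real studies run many weather years across all 8,760 hours:

```python
# Toy chronological adequacy screen: walk the hours in order, charging a
# battery with surplus and discharging it when net load exceeds firm
# capacity, then report all-hours metrics in the spirit of loss-of-load
# expectation and expected unserved energy.
def chronological_screen(net_load, firm_mw, batt_energy_mwh, batt_power_mw):
    soc = 0.0  # battery state of charge, MWh
    lol_hours, unserved_mwh = 0, 0.0
    for load in net_load:
        gap = load - firm_mw
        if gap > 0:  # shortfall: discharge as much as the battery allows
            discharge = min(gap, batt_power_mw, soc)
            soc -= discharge
            if gap - discharge > 1e-9:
                lol_hours += 1
                unserved_mwh += gap - discharge
        else:        # surplus: recharge, limited by power and energy ratings
            soc = min(soc + min(-gap, batt_power_mw), batt_energy_mwh)
    return lol_hours, unserved_mwh

# Hypothetical two-day net-load shape (MW) with evening peaks.
net_load = [60, 55, 50, 70, 95, 110, 90, 65] * 2
print(chronological_screen(net_load, firm_mw=80,
                           batt_energy_mwh=40, batt_power_mw=30))
```

Because the battery's state of charge carries over from hour to hour, the result depends on the order of the hours, not just the peak; that coupling is what instantaneous peak-plus-margin checks miss.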

Because weather systems are typically confined to one region of the country, interregional transmission has significant resource adequacy benefits, especially in high-renewable systems: it allows a region in need to import from neighboring regions with excess supply, as seen in Winter Storm Uri. Demand response and energy efficiency can be particularly important during short, rare events; they are much cheaper than new power plants and can shift or reduce energy usage to lower the net load peak without building new capacity. The demand-side considerations of resource adequacy have great potential and will be explored further in the next installment in this series.

These are just a few of the ways resource adequacy is evolving across the country, and several resources explore principles for this new paradigm in depth, such as a deep dive on capacity accreditation from Stenclik and the Energy Systems Integration Group.

 

Addressing Uncertainties About Clean Energy Technologies

While inverter-based resources (IBRs) such as wind, solar, and batteries are quickly adapting their programming to enhance grid performance, some recent incidents have raised concerns among reliability experts. For example, ERCOT has seen large amounts of solar and wind trip offline in response to grid faults. The largest of these, the Odessa Disturbance 2 incident in June 2021, involved 14 solar facilities and resulted in the loss of over 1.5 gigawatts of solar power.

While these incidents are uncommon, they spotlight the need for appropriate responses to avoid their recurrence. ERCOT has established an IBR working group to recommend improvements and mitigate future risks. The North American Electric Reliability Corporation (NERC) has formed an IBR performance task force to develop solutions. The Energy Systems Integration Group is another notable collaborative network for research and emerging practices, alongside numerous efforts spearheaded by the U.S. Department of Energy and national laboratories.

Early efforts to achieve consensus around technical performance and any accompanying standards will aid grid operators eager for near-term solutions and new approaches.

New Policies Are Needed To Bring A Managed Transition To Fruition 

No accreditation or probability calculation can avoid reliability issues if new resources are not brought online at the pace of retirements. The risk of a capacity shortfall is not a problem driven specifically by the proposed EPA rules, but a trend that has developed over several years, largely because uneconomic coal plants have closed before their previously planned retirement dates while the new clean resources that could replace the retiring capacity have faced barriers to entry. Whether or not the EPA rules are finalized as proposed, grid operators, utilities, and the policymakers who support them will need to contend with this trend.

The interconnection queue presents one of the biggest sources of project delay and cost increases, but it is also an area where grid operators have the most control. FERC Order 2023 has reckoned with many of the sources of interconnection delay, but RTOs should go even further. One of the reforms that goes beyond Order 2023 that could represent a step-change in interconnection is moving to an energy-only interconnection approach, which involves more limited studies and upgrades but requires resources to take additional curtailment risk.

Beyond improving interconnection, long-term resource planning that includes transmission will be the foundation of a managed transition to clean energy. To quickly increase transmission capacity, utilities and grid operators should use grid-enhancing technologies and advanced conductors to upgrade existing transmission lines. With more advance notice of planned retirements, grid operators can also proactively plan transmission to maintain reliability through retirements, rather than waiting until retirement is imminent. Being proactive here can avoid last-minute findings that transmission upgrades are needed to maintain stability, which force reliance on FERC’s “reliability must run” process and cost ratepayers money to keep uneconomic plants running.

There is an opportunity through the EPA’s proposed rule to create more certainty around the timeline for the clean energy transition that we are already undergoing. The poor economics of coal plants have been driving the transition to date, creating sudden retirements, and catching grid operators by surprise. Now, it’s time to turn the technically feasible clean energy future into reality via a managed transition. We have the chance to look decades ahead and plan a clean future that will have the best outcome for reliability, customers, and the climate.

The post In With The New, Then Out With The Old – A Managed Transition Is Key To Maintaining Resource Adequacy appeared first on Energy Innovation: Policy and Technology.


Since the advent of fusion research, scientists have published thousands of documents on the subject—papers, conference proceedings, and even written logs from previous experiments at fusion reactors around the world. Such a wellspring of information would easily take a lifetime to read, and even longer to comprehend.

Harvard researchers have realized a key milestone in the quest for stable, scalable quantum computing, an ultra-high-speed technology that will enable game-changing advances in a variety of fields, including medicine, science, and finance.

Artificial intelligence programs can’t be named as an inventor for patents, the U.K.’s top court said in a crucial ruling refusing to put machines on a near-equal footing with humans.
