The Evolution of Hosting Capacity Analysis as a Grid Modernization Tool (2024)

For anyone in the distributed energy industry, the term “hosting capacity analysis” is one to know. Hosting capacity analysis (HCA) is a new analytical tool that can help states and utilities plan for and build a cleaner electric grid that optimizes customer-driven distributed energy resources (DERs), such as rooftop solar and energy storage.

An HCA provides a snapshot in time of the conditions on a utility’s distribution grid that reflect the grid’s ability to “host” additional DERs at specific locations without upgrades. The information gleaned from this analysis can help regulators, utilities, developers, and customers make more proactive, cost-effective, and efficient decisions about DER investments.

By providing greater transparency into the grid, displayed in the form of maps with supporting datasets, such analyses can help reveal the operational limits of the grid, which might impact the ability of new DERs to interconnect quickly or affordably. They can also identify areas where DERs may be able to provide beneficial services by addressing existing grid constraints and inform more strategic grid investments over the long term.

HCAs are being deployed as part of broader grid modernization or distribution planning efforts. Hawaii, California, New York, Nevada, and Minnesota have already mandated that their utilities produce an HCA, and there are active proceedings in Colorado, Maryland, Connecticut, and other states, all likely to result in the development of HCAs.

IREC’s Optimizing the Grid: A Regulator’s Guide to Hosting Capacity Analyses for Distributed Energy Resources, published December 2017, provided early lessons and insights on HCA as a key informational and navigational tool to enable and optimize more customer-driven DERs. In this in-depth article, we address some of the recent developments on HCA as it continues to evolve and as healthy debates about the tool and its uses play out in more states. We set out to ask—and explore answers to—the following key questions surrounding HCA:

  1. Can hosting capacity analysis be used to streamline the interconnection process for DERs?
  2. To what extent is computational intensity a barrier to the adoption of detailed hosting capacity methodologies?
  3. How should the accuracy of hosting capacity results be evaluated?
  4. How much methodological transparency is needed?

Recent HCA Developments
In Optimizing the Grid, IREC provided case studies of several early state and utility efforts (namely California, New York, Minnesota, and Pepco Holdings Co.), which have continued to evolve. In California, a working group convened by the California Public Utilities Commission is moving forward with developing a framework to integrate HCA into its interconnection standards. In both Minnesota and New York, the utilities have continued to make improvements to their analyses based upon public input and commission direction.

In addition, multiple new HCA providers have entered the field, and the menu of methodological options has expanded. In January 2018, the Electric Power Research Institute (EPRI) released a report, Impact Factors, Methods, and Considerations for Calculating and Applying Hosting Capacity (hereinafter “EPRI’s report”), which compares EPRI’s proprietary hosting capacity methodology—DRIVE—with other methodologies that have been tested or implemented. San Diego Gas & Electric Company (SDG&E) of California also released a report comparing the results of a test-run of EPRI’s DRIVE on five SDG&E feeders with results from the iterative methods being deployed by SDG&E and the other California investor-owned utilities.

The HCA tool and collective understanding of its use are quickly evolving as a result of these efforts. Though most states and utilities are still in the early stages of development and rollout of HCA, their experiences have shown that the overall value and usefulness of an HCA largely depend on the process and framework that regulators put in place to guide the development, design, and adaptation of the tool.

In addition, the following key questions have emerged from conversations playing out across the states moving forward with HCA deployment. Exploring the answers will help inform the ongoing evolution of this important tool.

  1. Can HCA Streamline the Interconnection Process for DERs?

A key question for regulators to consider is whether and in what ways an HCA can be used to inform, streamline, and accelerate the interconnection process for DERs. As explained in Optimizing the Grid, there are two components to the interconnection “use case” for HCA. First, hosting capacity maps and data spreadsheets can provide customers the grid visibility that allows them to target locations where they can interconnect more rapidly and at a lower cost.

To this end, an HCA can help project developers design systems to fit within operational constraints of the grid at the proposed point of interconnection. Second, HCA results can be used by utilities in the interconnection review process to more accurately screen projects and identify those that can interconnect safely without the need for lengthy study, which can save time for both the customer and the utility. See Figure 1.

Figure 1. Interconnection Use Case for HCA (Source: IREC’s Optimizing the Grid)

While there is widespread agreement on the first component of the interconnection use case (though with some variation on the level of detail of the HCA results that should be shared), the second has proved more controversial. For example, in Minnesota, the Public Utilities Commission in its Order in Docket No. E-002/M-17-177 has required Xcel Energy to produce hosting capacity reports that are “detailed enough to provide developers with a reliable estimate of the available level of hosting capacity per feeder” as a “starting point for interconnection applications.”

But the Minnesota commission has thus far deferred stakeholder requests to ensure that Xcel’s analysis is accurate enough to integrate into the interconnection review process. Likewise, EPRI’s report argues that “hosting capacity can inform [the interconnection] process” but “should not be used as a solution to automate the technical review process.”

EPRI suggests that a hosting capacity analysis could inform engineering assessments in the supplemental review stage of the interconnection process but should not play a role in Fast Track screening. The initial screening stage of a Fast Track process for DERs relies on a series of technical screens – which are generally based on rough rules of thumb – against which a project is evaluated to identify whether it can readily interconnect without a more detailed grid study.

California is taking a different approach and currently has a proceeding underway to incorporate the results of the three investor-owned utilities’ Integration Capacity Analyses (ICA)—California’s version of an HCA—into Rule 21 interconnection review rules. In initiating this effort, the California Public Utilities Commission in its Order Instituting Rulemaking recognized that doing so “may better inform interconnection siting decisions and streamline the Fast Track process for certain projects.”

Leading up to this effort, the California ICA working group report acknowledged that some of these screens, such as the 15% of peak load screen, may be addressed more accurately by hosting capacity analysis. Rather than relying on these screens that approximate system conditions, utilities can use hosting capacity results that model actual conditions to identify projects that can be safely greenlighted for interconnection.
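To make the contrast concrete, the difference between a rule-of-thumb screen and a hosting-capacity-based check can be sketched in a few lines of Python. All feeder values, names, and thresholds below are illustrative assumptions for this article, not any utility's actual screening rules:

```python
# Illustrative sketch: a rule-of-thumb screen vs. a hosting-capacity check.
# All feeder values and thresholds are hypothetical.

def passes_peak_load_screen(project_kw: float, existing_der_kw: float,
                            feeder_peak_load_kw: float) -> bool:
    """15%-of-peak-load screen: aggregate DER must stay under 15% of peak."""
    return (existing_der_kw + project_kw) <= 0.15 * feeder_peak_load_kw

def passes_hosting_capacity_check(project_kw: float,
                                  hosting_capacity_kw: float) -> bool:
    """HCA-based check: compare the project against modeled headroom
    at the proposed point of interconnection."""
    return project_kw <= hosting_capacity_kw

# A feeder where the blanket screen is overly conservative: the 15% screen
# fails, but direct modeling shows ample headroom for the project.
feeder = {"peak_load_kw": 4000.0, "existing_der_kw": 500.0,
          "hosting_capacity_kw": 900.0}
project_kw = 250.0

print(passes_peak_load_screen(project_kw, feeder["existing_der_kw"],
                              feeder["peak_load_kw"]))             # False
print(passes_hosting_capacity_check(project_kw,
                                    feeder["hosting_capacity_kw"]))  # True
```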

Moving forward from that assessment, the California Rule 21 working group is currently working to identify those screens that can be supplemented or supplanted by the ICA, which will then be integrated into the interconnection process. The California utilities' current ICA would need to include additional factors to address all the technical issues covered in the screens (e.g., the ICA does not include analysis of single-phase laterals and would need more thorough modeling of coordination).

However, the analyses are already capable of providing more accurate information about several of the more commonly failed screens and can help inform which screening process—e.g. Fast Track or more detailed review—a project should go through.

Beyond whether HCA can substitute or supplement the technical interconnection screens, another issue under debate is the extent to which HCA can help automate the interconnection application process. EPRI’s report states that HCA “should not be used to pre-screen every location or create a ‘click to claim’ approach” because hosting capacity changes “as [distributed energy resource] penetration grows on a feeder.”

It is true that hosting capacity results change as soon as new resources are interconnected to a feeder; as such, using HCA to significantly streamline the interconnection process requires the HCA results to be updated more frequently than most currently are. The majority of HCAs are updated on an annual or, at best, monthly basis.

But the frequency of updates is not so much an inherent limitation of HCA as it is a technical or logistical hurdle (i.e., more frequent updates require more frequent runs of the model and thus more computational intensity). As HCA developers and utilities work to achieve near real-time updating, a semi-automated interconnection approach becomes more viable.
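As an illustration of what event-driven updating might look like, the following Python sketch recomputes only the feeder affected by a new interconnection rather than refreshing the entire system on a fixed schedule. The data structure and the stubbed recompute step are hypothetical:

```python
# Hypothetical sketch of event-driven HCA updating: instead of refreshing
# every feeder on a monthly or annual schedule, recompute only the feeder
# whose conditions changed when a new DER interconnects.

hosting_capacity = {"feeder_A": 900.0, "feeder_B": 1200.0}  # kW, illustrative

def recompute_feeder(feeder_id: str, new_der_kw: float) -> float:
    """Stand-in for a full HCA model run on a single feeder.

    A real implementation would re-run power flow simulations; here we
    simply debit the interconnected capacity to keep the sketch runnable.
    """
    return max(0.0, hosting_capacity[feeder_id] - new_der_kw)

def on_interconnection(feeder_id: str, new_der_kw: float) -> None:
    """Event handler: update one feeder's result as soon as a project
    interconnects, leaving the rest of the system untouched."""
    hosting_capacity[feeder_id] = recompute_feeder(feeder_id, new_der_kw)

on_interconnection("feeder_A", 250.0)
print(hosting_capacity)  # {'feeder_A': 650.0, 'feeder_B': 1200.0}
```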

As the California Rule 21 experience illustrates, the HCAs that exist today also do not evaluate every relevant technical factor or location, and thus it is not yet possible to have full automation of interconnection through the use of HCA. However, IREC considers automation to be a long-range goal worth keeping in mind – particularly for the typical residential rooftop PV system.

In addition, whether HCA can automate interconnection application decisions depends on the accuracy of the results—discussed in more detail below—produced by the HCA methodology. To move toward automation, the HCA will need to not only produce results more accurate than the existing screens, but also yield the confidence achieved by a more detailed study.

Ultimately, it is important to keep in mind that HCA use cases may incorporate near and longer-term objectives. In the case of interconnection streamlining, the California Rule 21 experience provides one pathway toward designing and implementing HCA to supplant or supplement technical screens in the short term to improve the interconnection review process.

While achieving full automation may seem at present like a very remote objective, it is important to keep in mind that HCA is still at a nascent technical stage, and tomorrow’s enhancements may render today’s perceived limitations obsolete.

HCA developers are constantly working to improve the accuracy and efficiency of their tools. Utility regulators may find it helpful to ensure that whatever tools their utilities adopt are flexible enough to accommodate improvements so that longer-term goals can be achieved without needing to redo foundational decisions.

  2. To What Extent Is Computational Intensiveness a Barrier to Implementation of Detailed Hosting Capacity Methodologies?

As explained in Optimizing the Grid, multiple HCA methodologies have been developed and deployed, and the differences between methodologies impact the overall use and usefulness of the tool. Iterative hosting capacity methodologies, like the ICA models being implemented by the California utilities, use power flow simulations to directly model distributed energy resources on the grid. Power flow simulations are run iteratively at each node on the distribution system until a violation of a power system limitation is identified.

Rather than relying on simplifying algorithms or heuristics to approximate hosting capacity, iterative hosting capacity analyses model it directly, much in the same way that a utility engineer does in performing a detailed review of a DER project seeking to interconnect at a specific node on the grid. For instance, the California commission decision 17-09-026 ordered the utilities to implement an iterative ICA methodology for the interconnection use case using 576 hourly load profiles as inputs to capture the full range of system conditions. EPRI's report acknowledges that this similarity between the iterative approach and interconnection studies is a key advantage of the method.
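In outline, the iterative method reduces to the loop below. This is a simplified Python sketch: `run_power_flow` is stubbed with a toy linear model standing in for a real power flow engine, and the limits and step size are illustrative, not the values any utility actually uses:

```python
# Simplified sketch of an iterative hosting capacity loop. run_power_flow
# is a placeholder for a detailed power flow solver; it is stubbed with a
# toy linear model so the sketch runs. All limits and steps are illustrative.

VOLTAGE_LIMIT_PU = 1.05    # upper voltage bound, per unit
THERMAL_LIMIT_KW = 1500.0  # illustrative feeder section rating
STEP_KW = 10.0             # DER size increment per iteration

def run_power_flow(node: str, der_kw: float) -> dict:
    """Stub standing in for a detailed power flow simulation."""
    return {"voltage_pu": 1.0 + der_kw * 5e-5,  # toy voltage-rise model
            "loading_kw": 400.0 + der_kw}

def hosting_capacity_at_node(node: str, max_kw: float = 5000.0) -> float:
    """Grow DER size at one node until the first limit violation."""
    der_kw = 0.0
    while der_kw < max_kw:
        result = run_power_flow(node, der_kw + STEP_KW)
        if (result["voltage_pu"] > VOLTAGE_LIMIT_PU
                or result["loading_kw"] > THERMAL_LIMIT_KW):
            break
        der_kw += STEP_KW
    return der_kw

# An hourly-profile analysis (such as California's 576-profile ICA) would
# repeat this loop per node per load condition and keep the minimum.
print(hosting_capacity_at_node("node_42"))  # 1000.0 with the toy model
```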

While this type of methodology currently tends to inspire the most confidence in the accuracy of its results, that confidence comes at a computational cost. As IREC identified in Optimizing the Grid, the iterative method is computationally intensive relative to streamlined or algorithmic approaches. EPRI agrees but goes further in its report, asserting that “based on the time and intensity to perform the iterative ICA analysis, this method is not practically scalable from a planning standpoint” and that “this type of direct DER modeling for time-based hosting capacity should be reserved for analysis of select feeders and locations.”

EPRI derives its conclusions about computational intensiveness from ICA demonstration projects performed by the three California investor-owned utilities, asserting in the report that “the complete implementation of the CA ICA iterative analysis took an average of 27 hours per feeder.” However, elsewhere in the report, EPRI notes that 27 hours per feeder was the average computational time achieved by only one of the utilities—San Diego Gas & Electric Co. (SDG&E). The other two utilities achieved far faster computational times: 23 minutes per feeder for Southern California Edison (SCE) and 1.4 hours per circuit for Pacific Gas & Electric Co. (PG&E).

These computational differences were due in significant part to hardware choices. For instance, SDG&E reported that it chose to perform its iterative analysis on office laptop computers with only 16.0 GB of memory. By contrast, PG&E used a “combination of local machines and servers which relied upon many parallel computing streams” to reduce run times.

Ultimately, as the PG&E ICA Demo Report noted, “computing power” is a significant determinant of computational efficiency. Solutions like parallel and cloud computing may help to alleviate the computational barrier, though there may be other challenges associated with utility adoption of such options (including internal restrictions on the use of cloud computing solutions or the need to get additional software licenses).
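Because each feeder can be analyzed independently, the kind of parallelism PG&E describes maps naturally onto standard tooling. A minimal sketch using Python's standard library, with `analyze_feeder` standing in as a hypothetical placeholder for an hours-long iterative run:

```python
# Hypothetical sketch of per-feeder parallelism: feeders are independent,
# so iterative runs can be farmed out across cores (or across machines in
# a real cluster). analyze_feeder stands in for a full model run.

from concurrent.futures import ProcessPoolExecutor

def analyze_feeder(feeder_id: int) -> tuple[int, float]:
    """Placeholder for an hours-long iterative HCA run on one feeder."""
    return feeder_id, 500.0 + (feeder_id % 7) * 100.0  # toy result, kW

if __name__ == "__main__":
    feeders = range(100)
    # Each feeder's analysis shares no state with the others, so run
    # times scale down roughly with the number of available workers.
    with ProcessPoolExecutor() as pool:
        results = dict(pool.map(analyze_feeder, feeders))
    print(results[0], results[13])
```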

At the same time, HCA developers that use the iterative method have achieved significant advances in computational efficiency through software improvements. Synergi Electric, whose ICA tool is used by SDG&E, was able to reduce run-times dramatically through changes in software alone, according to Synergi’s senior principal electric engineer, Larry Trussell.

Representatives from CYME, whose ICA tool is used by PG&E and SCE, reported similar progress. According to Marco Andrade, CYME’s customer success manager, “CYME has achieved significant improvements in computational time through software optimization alone, and utilities that use CYME are speeding computations up even further through CYME’s distributed computing platform and other techniques.”

Underlying the issue of computational intensiveness is a cost question with respect to adopting faster or more efficient alternatives. To date, there is scant information available on the costs or benefits associated with reducing model run-times, which is an area ripe for further investigation going forward.

Ultimately, some of the assumptions about computational efficiency may simply be outdated. According to Larry Trussell of Synergi, “HCA is computationally intensive over a massive amount of data, but it can be run as a back-office process. We do not see analysis time as an issue with a good engineering approach to balance data relevance and data availability and the use of consistent network, load, and generation models.” Continued reporting on hardware and software advances is essential to support fully informed choices about which HCA methodologies and tools should be adopted.

  3. How Should the Accuracy of Hosting Capacity Results Be Evaluated?

As alluded to above, a key issue in developing hosting capacity analyses is ensuring that results are sufficiently accurate for the identified use cases. The interconnection use case in particular puts a premium on accuracy, but accuracy is necessary for planning decisions and other purposes as well. The challenge is identifying the best practice to evaluate the accuracy of results.

At present, the principal standard for evaluating accuracy of streamlined and DRIVE methods is to compare results from those methods to those produced by iterative methods on representative test circuits. Since iterative methods mimic detailed studies, the assumption is that they reflect the most accurate hosting capacity values.

The California utilities, under commission direction, undertook such a comparison of streamlined and iterative methods, and SDG&E recently performed another such comparison of DRIVE with its adopted iterative method. EPRI’s report asserts that such comparisons assess not accuracy but precision—that is, “how precise the methods are relative to each other in producing similar results.” This raises the question: how should accuracy be gauged?
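In practice, a cross-method comparison reduces to agreement statistics over per-node results. The sketch below uses fabricated values purely for illustration; a real study would feed in per-node outputs from, say, an iterative run and a streamlined run on the same test circuit:

```python
# Comparing two methodologies on the same test circuit. The per-node
# hosting capacity values (kW) below are fabricated for illustration.

iterative = {"n1": 820.0, "n2": 460.0, "n3": 1100.0, "n4": 300.0}
streamlined = {"n1": 790.0, "n2": 520.0, "n3": 980.0, "n4": 310.0}

errors = [streamlined[n] - iterative[n] for n in iterative]
abs_pct_errors = [abs(streamlined[n] - iterative[n]) / iterative[n] * 100
                  for n in iterative]

print(f"mean error: {sum(errors) / len(errors):+.1f} kW")
print(f"mean absolute error: {sum(map(abs, errors)) / len(errors):.1f} kW")
print(f"mean absolute % error: {sum(abs_pct_errors) / len(abs_pct_errors):.1f}%")

# Close agreement here shows only that the two models are consistent
# (precision); judging accuracy requires an external ground truth.
```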

EPRI posits that results of detailed studies of actual interconnection applications must serve as the appropriate comparator, and yet comparison to results from interconnection review comes with limitations. Xcel Energy’s attempt to compare DRIVE results to interconnection studies in Minnesota exemplifies these shortcomings.

As Xcel noted in its Distribution System/Hosting Capacity Study, interconnection studies do not necessarily measure hosting capacity; rather they evaluate whether there is sufficient capacity for the current project, making comparisons inapt. Xcel could identify only 15 feeders with appropriate interconnection study data, leading to an extremely small comparator sample from which it was difficult to draw meaningful conclusions.

An alternative approach may be to identify and correct any shortcomings in the iterative method that would lead to discrepancies between its results and those derived from detailed interconnection studies. Iterative methods tend to approximate the detailed analysis performed in the interconnection process, but they may do so imperfectly. Modeling errors, the introduction of heuristics, and infrequent updating tend to reduce the accuracy of iterative results.

HCA developers are working to identify and correct these issues. Developers could also investigate whether it would be possible to expand the iterative analyses to more closely mimic detailed studies by, for instance, introducing impact factors that iterative models do not currently consider.

On the other hand, comparison to iterative results is an appropriate way to gauge accuracy if what we want to evaluate is whether algorithms employed in methodologies like DRIVE achieve the same results as direct modeling. As SCE explained in its comparison study, iterative simulations achieve “increased confidence in accuracy due to direct modeling of resource[s].” Marco Andrade echoed this view, explaining that “iterative analyses like CYME’s ICA module run simulations at each node using direct system modeling to capture actual hosting capacity limits rather than estimating them.”

By contrast, EPRI’s report indicates that methodologies like DRIVE that take a streamlined approach rely on algorithms to evaluate the same criteria. So long as iterative methods model circuit conditions and resources directly rather than relying on simplifying heuristics, it may be appropriate to view their results as ground-truth against which other models can be compared.

At the same time, comparison with iterative methods raises its own set of challenges. As SDG&E explained in its comparison study of iterative and DRIVE methodologies, “no two hosting capacity methodologies are the same. Inputs, outputs, and assumptions vary,” making it difficult to faithfully compare results. For instance, the iterative ICA uses 576 hourly load profiles as inputs, whereas DRIVE uses only two load conditions (maximum and minimum load) as inputs. To address this discrepancy, SDG&E compared results for only a single time period, which could mask meaningful differences between the results.

These methodological choices require attention, and the questions surrounding accuracy are key to ensuring that HCA is not only informative but also enables a meaningful shift in how the grid is planned, operated, and managed. As noted above, the level of rigor required for the accuracy evaluation may depend on how the results will be used. A robust modeling tool that does not fundamentally improve the grid user’s experience is likely not worth the time and effort to create.

  4. How Much Methodological Transparency Is Needed?

Along with accuracy, transparency in how the hosting capacity results are achieved is key to inspiring confidence in the HCA. But transparency can be a struggle to achieve in a domain where methodologies are the property of private developers. A key question is how much transparency into methodological details should be required.

EPRI’s report posits that visibility into “underlying algorithms” is unnecessary so long as results from test feeders are published. EPRI explains that “similar to smart inverter functions, rather than mandating manufacturers share complicated algorithms, the inverters are instead simply tested in order to evaluate performance.”

However, transparency into results alone may limit the robustness and usefulness of the tool. For instance, results from a test circuit may not scale perfectly to the grid as a whole, and methodologies may incorporate problematic assumptions. The assumptions a utility makes about the performance of its system components and DERs should be reasonable: neither so conservative that they inflate the need for grid investment, nor so lax that they could pose safety or reliability concerns.

Ultimately, utilities, as regulated monopolies, should make publicly accessible the details about how their HCA algorithms were developed, particularly since their results will have direct economic implications for all customers. Transparency is key to ensuring that regulators, stakeholders, and ratepayers have confidence in the fairness and accuracy of the results, which is in turn essential to fulfilling HCA’s unique and powerful promises.

Conclusion
Continued HCA developments will raise new questions and insights over time. Likewise, potential HCA use cases will change as new methodologies emerge. Indeed, several new entrants to the HCA developer field are preparing to release their own hosting capacity methodologies, which will expand the understanding of the tool and create more opportunities for further evolution.

Going forward, regulators, utilities, DER developers, and other stakeholders should remain engaged, querying how HCA should be deployed, evaluated, and compared. And, all involved parties should endeavor to clearly define the intended goals of an HCA and ensure that the use cases, underpinning methodologies, and assumptions are supportive to those ends. IREC looks forward to continuing to participate in this rich and nuanced conversation.

By Stephanie Safdi and Sky Stanfield, attorneys for IREC with the law firm Shute, Mihaly & Weinberger, LLP

Note: References can be made available upon request.

Sky Stanfield

Sky is a partner with Shute, Mihaly & Weinberger, attorneys for IREC. Her practice focuses on the intersection of renewable energy regulation and environmental and land use law, with particular emphasis on regulatory policy implementation, compliance, and permitting processes.
