Surface Transportation News: Congestion is back, for trucks as well as cars

Plus: Addressing diversion from toll roads and bridges, the electric vehicle debacle, and more.

In this issue:

Congestion Is Back, for Trucks as Well as Cars
Addressing Diversion from Toll Roads and Bridges
Study Analyzes NEPA Permitting Delays
The Electric Vehicle Debacle
Federal AV Policy Review and Outlook
The Evolution of Bestpass
News Notes

Congestion Is Back, for Trucks as Well as Cars

Two reports have crossed my screen in recent months, both quantifying U.S. traffic congestion. The first, from several months ago, is the “2023 Urban Mobility Report” from the Texas A&M Transportation Institute (TTI). The second is the American Transportation Research Institute’s “Cost of Congestion to the Trucking Industry, 2024 Update.” Because of the delays in the availability of detailed congestion data, both of these 2024 reports cover U.S. congestion in 2022. The message of both studies is that traffic congestion is back, but not quite to the level of 2019.

The TTI report covers 494 U.S. urban areas, far more than in some of its earlier editions. The most detailed data are from 101 “intensively studied urban areas.” From the entire data set, the average annual delay per auto commuter was 54 hours in 2022, equaling the pre-pandemic delay from 2019. Most of the other 2022 data elements from this largest set of urban areas were 5-8% lower in 2022 than in 2019. Also, in this larger set, total travel volume (billions of miles traveled) was 5% lower in 2022 than in 2019. The only large increase was in urban truck congestion cost, which was 16% higher in 2022 than in 2019.

Turning to the 101 larger urban areas, in 2019 only five of them had less than 30 hours of delay per commuter, but in 2020 that number increased to 73; by 2022 only five urban areas had that little delay, the same as in 2019. In a table tracking key parameters for every year since 1982, the only figure that reached a new high in 2022 was the total cost of congestion, at $224 billion compared with $217 billion in 2019.

The American Transportation Research Institute (ATRI) report on trucking congestion is not solely focused on urban areas, although that is where the most severe truck congestion occurs. It relies on a very large truck GPS database that records truck miles and speed, among other parameters, from more than a million commercial trucks. Truck vehicle miles of travel (VMT) are obtained from the Federal Highway Administration's (FHWA) Highway Statistics tables, which can be segmented by state, region, and metro area. To calculate truck congestion cost, ATRI draws on data on the operational cost per hour of Class 7 and Class 8 tractor-trailer combination trucks, which are the focus of this study. ATRI updates those costs every year.
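
To make the calculation concrete, here is a minimal sketch of the kind of delay-cost arithmetic described above. It is my own illustration, not ATRI's actual methodology, and the speeds and the roughly $90-per-hour operating cost are invented placeholder numbers.

```python
# Illustrative sketch of a truck congestion-cost calculation: extra hours spent below
# free-flow speed, multiplied by an hourly operating cost for a Class 8 combination truck.
# All inputs are invented placeholders, not ATRI's actual figures.
def truck_congestion_cost(vmt, congested_speed_mph, free_flow_speed_mph, cost_per_hour):
    congested_hours = vmt / congested_speed_mph
    free_flow_hours = vmt / free_flow_speed_mph
    return (congested_hours - free_flow_hours) * cost_per_hour

# Example: 1 million truck-miles in a corridor where congestion cuts speeds from 55 to 42 mph
print(f"${truck_congestion_cost(1_000_000, 42, 55, 90):,.0f}")  # about $506,000 of delay cost
```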

The findings on truck delays and congestion costs are somewhat different than TTI’s overall findings. For example, annual average truck speed in “bottleneck” locations was on a downward trend from 2016 through 2019. Not surprisingly, with far fewer vehicles on the roads in 2020, average truck speed increased. As personal travel began to return, truck speeds declined slightly in 2021 but increased again in 2022. So trucks were going faster in 2022 than in 2019, unlike cars.

But that’s not the end of the story. Truck VMT increased significantly in 2022 as trucks operated in less congested conditions. But due to the rising cost of fuel and labor, the industry’s cost of congestion continued to rise, reaching a new high in 2022 even though hours of congested travel declined. Table 1 in the report shows that trucks’ cost of congestion in 2022 was 15% greater than in 2021, at nearly $109 billion.

The states with the highest truck congestion costs in 2022 were Texas, followed by California, Florida, New York, and Georgia. And the states with the largest percentage increases in truck congestion cost were Hawaii (up 92%), followed by Vermont, Minnesota, Kentucky, and Alaska. An indication that post-pandemic economic recovery is somewhat uneven is that 25 states had decreases in truck congestion in 2022 compared with 2021. The largest decreases were in Louisiana (almost 13% less), with smaller decreases in New Mexico, Maryland, California, and Ohio.

Needless to say, very large metro areas still had large percentage increases in truck congestion cost in 2022, with New York leading the pack with a 21.6% increase and by far the highest cost ($6.7 billion), followed by Miami, Chicago, Philadelphia, and Dallas.

For auto commuters, the highest delay (person-hours) in 2022 was once again in Los Angeles, followed by San Francisco/Oakland, New York/Newark, Washington, D.C., and Atlanta. Almost the same ranking appears for annual congestion cost per commuter: Los Angeles, San Francisco, New York, Atlanta, and San Diego, with Washington this time in 6th place.

We can see that for both commuters and truckers, the cost of congestion was higher in 2022 than in 2021. But for commuters in large metro areas, the extent of congestion was greater in 2022 than in 2021, while the opposite was true for truckers. I suspect that when we get the comparable data for 2023, both the extent of congestion and its cost will increase for both commuters and trucking.


Addressing Diversion from Toll Roads and Bridges
By Baruch Feigenbaum

The number of vehicles that avoid toll roads by diverting to non-toll roads has long been a knotty topic. Several studies have attempted to address the question, but due to the large number of variable factors, there has been no consensus. Recently, Robert Bain and Deny Sullivan of CSRB Group released a study titled “The Traffic Impact of Road Pricing,” in which they combined a literature review with their own research to determine a true diversion rate.

The authors examine the increased interest in tolling due to greater regulatory permission, a need for more highway funding, accelerated project construction, blended funding, and advances in technology. Bain and Sullivan studied fixed-rate toll systems only. They did not analyze priced managed lanes or highways using dynamic pricing.

For the United States, the authors cite several other studies in their literature review. The first is Nichols and Belfield’s study of the Midtown and Downtown Tunnels in Hampton Roads, Virginia, as well as the State Route 520 bridge in Washington state. In Virginia, after tolling began, traffic decreased 8% on the Midtown Tunnel and 20% on the Downtown Tunnel. In Washington state, traffic declined by 30%. Breaking down traffic volumes by day of week and time of day, the studies found that diversion was a bigger problem during off-peak hours than during peak periods. A separate meta-analysis of nine tolled facilities in North America found that facilities had 10-36% less traffic after tolling.

A Minnesota Department of Transportation (DOT) study that relied on modeling examined different types of roadways. Urban Interstates were found to have a 15% diversion rate, rural Interstates 20%, urban freeways 20%, and rural freeways 25%. Frontage roads increase diversion by 5%; competing roads within 10 miles increase diversion by 10%.

Bain and Sullivan combined these studies with those in their database for a total sample size of 83 tolled highways. They used research reports, academic papers, media reports, toll operator data, and transportation department websites. Many were not true academic research papers, as they did not undergo double-blind peer review, but they do provide useful information. The U.S. had by far the most toll highways, with Portugal, Australia, the U.K., and Canada rounding out the top five. There were 35 road and 20 bridge projects, the two most common types of tolled infrastructure.

Using all those papers, Bain and Sullivan calculated the median diversion impact of tolling as -25%. In other words, for every four people who used the infrastructure before tolling, only three used it after tolling. But the spread ranged from +4% to -85%, indicating that local conditions are paramount. Further, there were no clear patterns between urban and rural roads.

The one exception is toll bridges with no realistic alternatives. They had a lower diversion rate (-15%) than other highways. Given the lack of alternatives, drivers may have no option other than a boat.

While the Bain study has the largest sample size, its results are consistent with the other studies, which found diversion rates of 20-25%.

Using the results, Bain created a predictive model to explain which toll roads will have the highest diversion rates. The model examines alternative routes and alternative modes and uses a decision tree (choosing one of several options in multiple steps) instead of a mathematical model. The study found that 84% of impacts lie within the model’s predicted range, which would be similar to having an r-squared value of 0.8 if this were a mathematical model.
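
Since the paper itself is not reproduced here, the following is only a hedged sketch of what a decision-tree-style predictor of diversion might look like. The branch structure and the percentage bands are invented for illustration and are not Bain and Sullivan's actual tree.

```python
# Hypothetical decision-tree-style rules for predicting a diversion band after tolling.
# The branches and percentage ranges are invented illustrations, not the study's model.
def predicted_diversion_band(free_parallel_route: bool, parallel_within_10_miles: bool,
                             good_transit_alternative: bool) -> str:
    if not free_parallel_route:
        # Bridges or corridors with no realistic alternative route lose the least traffic.
        return "low (roughly 0% to -15%)"
    if parallel_within_10_miles:
        if good_transit_alternative:
            return "high (roughly -30% or more)"
        return "moderate-to-high (roughly -20% to -30%)"
    return "moderate (roughly -15% to -25%)"

print(predicted_diversion_band(free_parallel_route=True,
                               parallel_within_10_miles=True,
                               good_transit_alternative=False))
```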

Overall, the paper presents the best model so far for toll diversions. The model captures most of the diversion and the predictive model is helpful in explaining what types of roadways will have higher diversion rates. But I’m most interested in how we can reduce toll road diversions. We need to address problems including traffic congestion, high costs, user behavior, and status-quo thinking. 

One option would be to focus on reducing congestion, but that might not always be cost-effective, especially on urban freeways with limited right-of-way.  While they weren’t in the study, I have encountered congestion on the New Jersey, Massachusetts, and Pennsylvania Turnpikes in urban areas. On rural tollways, adding new lanes will reduce congestion.

For the trips that are taken on roads without significant congestion, we could reduce the diversion problem by using carrots or sticks. I recommend carrots.

Why not sticks? The study shows that one factor impacting diversion is the lack of parallel routes. The stick approach would be to pull new non-tolled highways out of regional transportation improvement plans and long-range transportation plans or to tear down existing roads, similar to what USDOT has tried to do with some urban freeways. Technically, this would end diversion, but even if it survives court challenges and potential riots, it has a few problems. It would harm economic development. It would imperil emergency services. And it would make congestion so bad that drivers might choose to stay home or move to a different region.

A carrot-based approach would be to create a win-win for roadway operators and drivers alike. One approach would provide a frequent traveler discount or reward, the way many airlines already do. The provider might lower a driver’s tolls if she used the roadway five or more times per week. New drivers could be incentivized to try the toll road by paying a lower toll rate for the first 30 days. The provider might work with employers to see if their offices could adopt flexible scheduling or telework. Either would decrease the cost of using the toll road. Finally, the provider might add additional exits or motor services to make the toll road more convenient.

Toll diversion is not anywhere close to the biggest problem toll providers face, but it is still a problem. It is not easy to solve, and a small number of drivers will always take another route to avoid tolls. But where we can, we should encourage new toll paying customers to use the toll road by making the experience better and not worse.


Study Analyzes NEPA Permitting Delays

The process that major infrastructure projects must go through to obtain federal permission to build is increasingly costly and time-consuming, as I’ve previously discussed in this newsletter and analyzed in a 2024 policy study. A recent empirical study sheds some light on what factors affect project approval under the legal framework that has evolved since the enactment of the National Environmental Policy Act (NEPA) in 1970. The paper is “A Hazard Analysis of Federal Permitting Under the National Environmental Policy Act of 1970,” by Michael Bennon, Daniel De La Hormaza, and R. Richard Geddes, published in the Journal of Regulatory Economics.

The authors relied on data from the Council on Environmental Quality (CEQ) regarding 1,269 Environmental Impact Statements (EISs). A key variable was the duration from the Notice of Intent to file to the eventual Record of Decision (ROD). They used a statistical technique called a Cox proportional hazards model to estimate the impact of a number of factors on the duration of the permitting process.
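
For readers unfamiliar with the technique, here is a minimal sketch of how such a duration analysis is typically set up in Python with the lifelines library. The tiny dataset and the column names are hypothetical illustrations, not the paper's data.

```python
# Minimal sketch of a Cox proportional hazards analysis of EIS permitting durations.
# The data and column names are hypothetical, not the paper's dataset.
import pandas as pd
from lifelines import CoxPHFitter

eis = pd.DataFrame({
    "duration_months": [38, 54, 27, 71, 45, 60, 33, 80],  # Notice of Intent to Record of Decision
    "reached_rod":     [1,  1,  1,  0,  1,  1,  1,  1],   # 1 = ROD reached, 0 = censored
    "is_p3":           [1,  0,  1,  0,  0,  1,  1,  0],   # privately financed P3 proposal?
    "eis_pages":       [310, 620, 250, 890, 540, 400, 280, 1010],
})

cph = CoxPHFitter()
cph.fit(eis, duration_col="duration_months", event_col="reached_rod")
cph.print_summary()  # hazard ratios above 1 mean a higher "hazard" of finishing, i.e., faster permitting
```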

One of the interesting findings was that projects proposed as privately financed public-private partnerships (P3s) completed the EIS process faster than others. The authors speculate that the typical team proposing such projects may have a better understanding of the process, enabling a faster review than would otherwise be possible. They also speculate that faster permitting for energy projects may be due to the volume of such projects, leading to analysts’ greater familiarity with their impacts.

The authors also compared project permitting durations for projects located in states with restrictive environmental laws. Those projects did take longer to reach the ROD, but this was not due to the state restrictions, per se. They hypothesize that opposition groups may be stronger in those states, leading to more public opposition and threats of litigation.

Another finding concerned projects that were designated for inclusion in the federal permitting “dashboard.” These projects had longer durations, which seems contrary to the intent of the dashboard, but the authors speculate that projects designated for the dashboard are likely to be larger and more complex, leading to a longer permitting process. Projects with higher EIS page counts take longer to reach a final EIS, and they also have a longer duration between the final EIS and the ROD.

One other interesting finding concerns permitting of projects connected with a federal economic stimulus program such as the American Recovery and Reinvestment Act. They hypothesize that because elected leaders and administrators desire quick impact from stimulus projects, they tend to focus on “shovel-ready” projects that have already completed the NEPA process. But they point out that “This may lead to the allocation of stimulus funds to projects with lower expected returns on investment.”


The Electric Vehicle Debacle

The last six months have been dismal for those hoping for a U.S. electric vehicle (EV) future. Here is a small sample of news articles from this period.

The last of these articles includes graphs showing the shrinking cash balances of EV startups and the dismal share prices of those (such as Fisker, Lucid, and Rivian) that are publicly traded. The declining sales of electric vehicles have led to setbacks for plans to build huge EV battery plants, not only here but also in EV-friendly Europe. Volvo has put on hold plans to build a large plant to make batteries for its electric trucks. And the ambitious Swedish EV battery startup Northvolt filed for bankruptcy soon after.

Those failures were warning signs to our federal and state governments that large-scale subsidies to build battery plants would not be a wise move. Several state governments have provided state aid for EV battery plants, but the most enormous subsidies have come from the federal government. The latest was announced in November with great fanfare by the Biden Department of Energy: a $6.6 billion loan to nearly-bankrupt Rivian for an EV factory in Georgia. If built, it would have the capacity to produce 400,000 SUVs and crossovers per year. Rivian has lost about $4 billion on the 37,396 vehicles it sold in the first nine months of 2024, and it has $1.25 billion in debt. The Department of Energy recently finalized an even larger $9.6 billion loan for a Ford EV battery plant in Tennessee. Ford has also been losing tons of money on its poorly selling EVs.

It turns out that the $6.6 billion loan to Rivian is “conditional” on the company meeting certain technical, legal, environmental, and financial conditions. A Wall Street Journal editorial (Dec. 26) explained that these conditions include pro-union policies that Rivian has been resisting at its vehicle factory in Illinois. But Ford recently agreed to a neutrality agreement with the United Auto Workers for its Tennessee EV battery plant.

I can’t imagine any private investor making such loans and expecting them to be repaid. And in the federal government’s nearly insolvent condition, giving away tens of billions of dollars is the last thing it should be doing. Fortunately, with a new administration and new Congress, these policies can be changed.


Federal AV Policy Review and Outlook
By Marc Scribner

There will be many fundamental policy differences between the outgoing Biden administration and incoming Trump 47 administration. One area to watch is automated vehicle (AV) policy. The Biden administration departed from the AV enthusiasm of the previous Trump and Obama administrations. The new Trump administration is expected to pick up where the first Trump administration left off on AV policy. On balance, the Trump administration is likely to be friendlier to AV technology development and deployment than the Biden administration. But the scrambled populist politics of our time introduce some sizeable uncertainty.

While most of its actions did not actively undermine AV technology development and deployment, the Biden administration was content to mostly do nothing on AVs. The Biden administration’s stance on AVs was modulated by its close ties to organized labor. Fearing competition from robots, unions have emerged as the primary opponents of advanced automation technologies in the transportation sector. The official Teamsters union position, for instance, is a national ban on driverless commercial vehicles. Given this political environment, inaction on federal AV policy may have been the best attainable outcome from the self-described “most pro-union, pro-worker President in history.”

The most significant AV policy action by the Biden administration was the National Highway Traffic Safety Administration’s (NHTSA) Standing General Order (SGO) on mandatory crash reporting, which was issued in 2021 and revised in 2023. Under the SGO, poorly defined reporting parameters coupled with aggressive compliance requirements led AV companies, out of an abundance of caution, to submit a lot of incident data that have little bearing on safety. In July 2024, NHTSA announced it would propose a rule to reform and codify the SGO’s incident reporting requirements by the end of the year, although this has since been delayed until at least May 2025.

Aside from the SGO, the most notable action on AVs taken by the Biden administration was the finalization of a rule that revised occupant protection safety standards to account for future vehicles that lack manual driving controls. The Biden NHTSA deserves praise for promulgating this rule in March 2022, although not too much credit because this rule was fully baked and ready for publication at the end of the Trump administration in Jan. 2021.

Much less praiseworthy was the Federal Motor Carrier Safety Administration’s last-minute decision on Dec. 27 to deny an exemption petition submitted by two AV truck developers, in which the companies proposed to use special cab-mounted hazard lights in lieu of placing warning triangles outside a disabled truck on the side of the road. The placement of warning devices around stopped commercial vehicles is a legacy federal requirement with which it is impossible to comply if there is no driver in the vehicle. The supposed basis for denial was insufficient information submitted in the exemption application, but it should not have taken 23 months to review a 15-page document for the claimed basic deficiencies.

But AV policy under the Biden administration didn’t end completely on a sour note. Just before Christmas, NHTSA announced it was releasing its long-promised proposed AV STEP voluntary national framework. The goal of AV STEP is to leverage existing authorities to give AV developers greater latitude to produce and deploy their vehicle technologies in exchange for submitting more information to regulators. However, the final decision on what—if anything—comes from the AV STEP proposal will be made by the incoming Trump administration.

The new Trump administration’s likely emphasis will be on the geopolitical strategic importance of advancing AV technology in the United States—which is to say, “winning the AV race with China.” This means enthusiasm for AV policymaking is likely to return to federal agencies and could bode well for the industry.

However, there is concern that China hawks may disrupt supply chains and limit international market access. The Commerce Department’s Bureau of Industry and Security in September proposed restrictions on transactions involving Chinese or Russian firms that affect AVs. The current Biden proposal preserves AV developer access to most global markets and limits damage to supply chains, but a final rule that is less cautious could do serious damage. The Trump administration should understand that needlessly aggressive trade restrictions on AV technologies could undermine its goal of strategic global AV dominance.

To advance continued U.S. leadership in AV innovation, the Trump administration should focus on modernizing NHTSA’s federal motor vehicle safety standards (FMVSS) so AVs can be incorporated into the national auto safety regulatory ecosystem. This would have the effect of preempting many state regulations and preventing an unworkable compliance patchwork. It would also obviate the need for Congress to act because the primary justification for congressional AV action over the past decade has been to increase the statutory cap on and duration of temporary FMVSS exemptions. But if NHTSA promulgates revisions to FMVSS and thereby allows AVs to self-certify to FMVSS just like other vehicles, there is no need for temporary exemptions.

The Trump administration has a golden opportunity to reinvigorate AV policy in the United States. To do so, it must stay focused on systematically identifying and addressing safety regulatory barriers and gaps. But the Trump administration may face challenges from within related to trade, national security, and even labor that could undermine its AV policy goals.


The Evolution of Bestpass

Unless you are part of the U.S. trucking industry, you’ve probably never heard of Bestpass. I’ve been a fan for many years, ever since I learned that it offers services to the over-the-road trucking industry, including weigh-station bypass and management of company toll accounts. Bestpass was launched in 2001 by people from the Trucking Association of New York. It came to my attention when I learned that, among its toll management services, it assisted subscribing truck fleets in taking advantage of refunds from certain state highway user charges for those using the New York State Thruway and the Massachusetts Turnpike. I’ve gotten to know Bestpass people at various transportation conferences, such as those of the International Bridge, Tunnel & Turnpike Association.

Over the past decade, Bestpass has acquired several other companies that serve the trucking industry. In 2023 it acquired Fleetworthy, a self-described fleet compliance, safety, and risk management solutions provider. Fleetworthy came with a platform called CPSuite, which, as part of Bestpass, provides what CEO Tom Fogarty described to FleetOwner as “a single pane of glass for fleet executives to be able to view what’s working and what’s not in their overall operation.” As a result, Bestpass was rebranded as Fleetworthy.

But that was not all. Last year Fleetworthy acquired Drivewyze, a firm with a long track record in weigh-station bypass (and the largest share of that market). The two companies had already agreed on a partnership in 2023, which progressed to a merger in 2024. Fogarty told FleetOwner that the merger simplifies many tasks that fall outside trucking companies’ core mission: “Their core mission is safety and keeping the fleets operating on the roads, but being able to do that in a much simpler way is something they strive for.”

Recently Fleetworthy announced an agreement with data provider Geotab to interface with that company’s telematics ecosystem. Bestpass customers can use the MyGeotab interface to match Geotab vehicle data and GPS locations with charges reported by Bestpass.

Long-term readers of this newsletter may guess why I’m especially interested in these developments. Sooner or later the United States will need to shift from per-gallon fuel taxes to per-mile charges. Keeping track of who owes what as this transition takes place will be complicated. It strikes me that for the long-haul trucking industry, service providers such as Fleetworthy will be well-positioned to play a key role in this future.


News Notes

Maryland Extends I-95 Express Toll Lanes
Last month the Maryland DOT opened a 6.5-mile extension of the express toll lanes (ETLs) on I-95, extending them to MD 152. Current plans call for a further northward extension to MD 24 by the end of 2027. The lanes charge fixed toll rates for peak, off-peak, and night periods, with rates also based on the number of axles (up to six or more). The lowest rates are for E-ZPass customers, with somewhat higher rates for Pay-By-Plate and Video Toll customers.

New Zealand Planning $5.8 Billion Tollway P3
According to Infralogic (Dec. 6), New Zealand’s relatively new National Party-led coalition government “has rolled out the red carpet to private infrastructure investors.” The project currently entering procurement is the $5.8 billion Northland Expressway, to be built in three sections heading north from Auckland. Expressions of interest from potential public-private partnership (P3) consortiums are being sought, with a request for proposals (RFP) likely to be released in second-quarter 2025. Up to five consortia appear to be organizing to submit proposals once the RFP is issued.

Thailand Plans $1.4 Billion P3 Expressway
The Thai cabinet in December OK’d a plan to use a long-term P3 to finance, develop, and operate the $1.4 billion M9 Motorway linking Bangkok with Nonthaburi. The government plans a 30-year P3 concession for the project. The current schedule calls for developing procurement documents for a tender to be launched near the end of 2025, with the P3 deal reaching financial close in the second half of 2026.

Washington State Developing Port Tollway
To improve access to the Seattle/Tacoma seaports, the Washington State DOT is moving ahead with the SR 167 Completion Project. The plan calls for extending SR 167 westward to the Port of Tacoma by means of a four-lane tollway. The new tollway will facilitate access to the port by trucks (which currently congest local streets). It will also provide faster, uncongested trips for motorists, including those who use the express toll lanes on the existing SR 167. The new corridor appears to be a combination of truck tollway and express toll lanes for motorists. The second of four phases is now under construction, with the last phase to be completed by 2030.

Supreme Court Considering Challenge to California Emissions Mandate
A legal challenge to California’s permission to exceed federal vehicle emission regulations was rejected by the D.C. Circuit Court, which ruled that the plaintiffs lacked standing. Last month the U.S. Supreme Court agreed to hear an appeal of that ruling. If the case is sent back to the circuit court, it will have to analyze whether, under existing law, California can use its long-standing permission to impose tougher regulations on tailpipe smog precursors to also regulate CO2 emissions (which have no impact on smog).

Committee Chair Not Supportive of Federal MBUF
Politico reported (Dec. 16) that Rep. Sam Graves (R-MO), incoming chair of the House Transportation & Infrastructure Committee, is not supportive of replacing federal motor fuels taxes with mileage-based user fees (MBUFs). He suggested that a better way to ensure that the growing population of electric vehicles (EVs) pay their way would be a tax of some kind on EV drivers. He told Politico that he plans to discuss this subject with House Budget Committee chair Jodey Arrington (R-TX). The federal Highway Trust Fund increasingly generates far less than the amount of federal highway and transit spending, and getting EV users to pay their share would help address this problem.

Schneider Battery Electric Trucks Top Six Million Miles
A December news release from trucking company Schneider announced that its fleet of 100 Freightliner eCascadia battery-electric vehicles (BEVs) has surpassed six million “zero emission miles.” The release added that this is a reduction of 20 million pounds of CO2 emissions—which sounds like a lot, but the usual unit of measurement is tons, and 20 million pounds amounts to only about 10,000 tons of CO2. Truck producer Daimler says the eCascadia has a range of 250 miles and can be recharged up to 80% of capacity in 90 minutes, which is a lot more time than refueling a diesel rig.

Louisiana Mississippi River Crossing Location: Decision This Year
Joe Donahue, Secretary of the Louisiana Department of Transportation and Development, announced that the final site for the planned $2 billion toll bridge across the Mississippi River will be announced in 2025. After several years of study and gathering public input, the alternatives have been narrowed down to three. The bridge is needed to address serious congestion on I-10 in the vicinity of Baton Rouge, the state capital. The plan calls for the new bridge to be south of Baton Rouge, with connections to I-10 via a new southern loop on both sides of the river. Current plans assume toll finance and a long-term P3 procurement model, similar to what is in use for the replacement of the I-10 bridge across Louisiana’s Calcasieu River.

San Francisco P3 Bus Yard Decision This Spring
Infralogic reported last month that the planned 30-year P3 project to modernize the San Francisco Municipal Transportation Agency’s (SFMTA) bus yard needs two important votes this spring: one by the SFMTA board and the other by the San Francisco Board of Examiners. SFMTA has a provisional agreement with a Plenary-led consortium for a 30-year availability-payment design-build-finance-operate-maintain P3 concession. Construction cost is estimated at $560 million, and the availability payments are to be $42.2 million per year.

Arizona Plans Another Stretch of New I-11
Thanks to a $26 million federal grant, Arizona DOT plans to upgrade 4.5 miles of US 93 to prepare it for becoming part of the long-planned I-11 between Phoenix and Las Vegas. Over the last several years, ADOT has spent nearly $500 million on upgrades to US 93, which is planned to be the primary component of I-11 in Arizona. Nevada DOT has built about 45 miles of I-11 southeast of Las Vegas since 2018.

Pennsylvania Turnpike Revamps Tolling Policy
In addition to shifting the Pennsylvania Turnpike to all-electronic tolling as of this month, the agency has implemented two changes in its toll rates. All toll charges will now reflect consistent per-mile charges across the system; this is a step toward eventually charging per mile traveled on all American highways. Second, truck tolls will no longer be based on gross weight; instead they will be based on axle-weight, which more accurately reflects the extent of pavement damage from heavy vehicles. Kudos to the turnpike for these sensible changes.

Mixed Results on Bridge Condition
There’s good news and bad news in the Better Roads Bridge Inventory, compiled by Equipment World from FHWA data. Between 2020 and 2024, the number of bridges in “good” condition declined from 278,000 to 274,000, a drop of about 1.3%. On the other hand, the number in only “fair” condition increased by 3.8%, from 293,000 to 305,000. The worst category—bridges in “poor” condition—was down 0.9% and accounts for only 6.72% of all bridges. So it would appear that the increase in fair-condition bridges came largely from good bridges declining to fair condition. The states with the lowest percentage of poor-condition bridges are Nevada, Arizona, Texas, Delaware, and Georgia. At the other end of the scale, the states with the highest share of poor-condition bridges, in order, are Iowa (19.6% poor), West Virginia, South Dakota, Maine, and Rhode Island. States with the highest percentage of bridges in good condition are Georgia, Arizona, Ohio, Florida, and Nevada. And those with the lowest percentage in good condition are Utah, Rhode Island, Maine, Massachusetts, and West Virginia.

Tesla EV Plug Now Standardized
SAE International as of last month was finishing work on an open EV charger standard, J3400, based on the Tesla EV charging connector. Simultaneously, FHWA said it was finalizing a new standard for federally funded EV chargers, based on Tesla’s charging plug. Over the past year and a half, nearly every automaker and charger manufacturer adopted Tesla’s charging standard.

Trucking Industry Loses Rhode Island Truck Tolling Case
Last month a federal appeals court rejected the trucking industry’s case that Rhode Island’s trucks-only toll charges were unconstitutional. Although the court found that daily toll caps are unconstitutional, the overall tolling system, in which the toll revenue is dedicated to bridge improvements, was found to pass muster.

Highway Tolling Discussed Again in Michigan
An article by MLive (Dec. 11) reported on year-end discussions among Michigan legislators and transportation groups about how to pay for upgrading the state’s aging highways. Alternatives discussed included increasing state fuel taxes or vehicle registration fees, using toll revenue to pay for rebuilding and modernizing the state’s Interstate highways and freeways, or replacing fuel taxes with per-mile charges. A major tolling study by HNTB and CDM Smith in 2022 found that modest toll rates could finance the reconstruction and modernization of 545 route-miles of limited-access highways, with $18.5 billion financed against the toll revenue. (For details, see the lead article in the Feb. 2023 issue of this newsletter.)

MARTA Seeks to Enforce Bus-Only Lanes
Georgia’s MARTA transit agency will soon be opening its first dedicated-lanes bus rapid transit system, a 5-mile round trip route between downtown and Summerhill. MARTA has asked the state legislature for automated traffic cameras so that it can ticket drivers who move into the bus lanes. Since adding lanes to a highway is very costly, “bus-only” makes sense only where the bus person-throughput (per lane per hour) is more than what the lane would handle in mixed-flow traffic. A wiser plan would be for “bus toll lanes” that charge motorists variable pricing to keep the traffic moving smoothly to enable fast and reliable bus service. (See “Enhanced Transit and Managed Arterials: A Win-Win Combination,” Reason Foundation, Oct. 2016.)
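
To see why that person-throughput threshold matters, here is a quick back-of-the-envelope comparison. The frequencies, passenger loads, and lane capacities are invented illustrative numbers, not MARTA's figures.

```python
# Back-of-the-envelope person-throughput comparison: dedicated bus lane versus mixed traffic.
# All numbers are invented illustrations, not MARTA's actual service plan.
buses_per_hour = 20                 # a frequent BRT service (one bus every 3 minutes)
passengers_per_bus = 40
arterial_vehicles_per_hour = 800    # rough capacity of one signalized arterial lane
persons_per_vehicle = 1.4           # average car occupancy

bus_lane_people = buses_per_hour * passengers_per_bus                   # 800 people/hour
mixed_flow_people = arterial_vehicles_per_hour * persons_per_vehicle    # 1,120 people/hour
print(bus_lane_people >= mixed_flow_people)  # False: in this example the bus-only lane moves fewer people
```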

Trump Misunderstands Panama Canal Tolls
President-elect Trump misspoke when he claimed that “the fees being charged by Panama are ridiculous [and] highly unfair.” In fact, as the Wall Street Journal pointed out in an editorial on Dec. 26, “Every vessel, regardless of its flag, pays the same rate according to tonnage and type. . . . About 75% of the total price is a toll [to pay for capital costs] and 25% is for services like tugboat or locomotive escorts.” The Panama Canal is a business and is far better run than highly subsidized U.S. inland waterways.

Riverside County’s Express Toll Lanes Credit Upgraded
Fitch Ratings last month announced that it has raised the rating on the Riverside County Transportation Commission’s SR 91 express toll lanes from BBB+ to A. The upgrade reflects traffic and revenue levels exceeding Fitch’s base case. Until recently, the only express toll lane project with a rating of A or above was the world’s first ETL project, on SR 91 in neighboring Orange County. Most other express toll lane projects financed by their toll revenues have Fitch ratings of BBB or BBB-.

Denmark Shifts to Per-Kilometer Tolling
The Danish government, as of Jan. 1, 2025, shifted its heavy-vehicle tolling system from the Europe-wide Eurovignette (a multi-country electronic tolling system) to per-kilometer charges. The system charges vehicles weighing 12 tonnes or more, except for buses. Instead of a transponder, trucks will have to sign up with UTA Edenred’s UTA One system. The Danish toll road system comprises 10,900 km of highways.

New Commentary Suggests NEPA Litigation Reform
In a recent Substack post, R. Richard Geddes and Joshua Rauh review the high cost and time of infrastructure projects getting through the current NEPA process, especially the litigation that often follows the release of the final Environmental Impact Statement (EIS). They offer a menu of changes that could streamline that system.


The future of data center electricity use and microgrids

Is the data center electricity future decentralized?

This commentary is the fifth in a series explaining data center electricity use and the nuances in regulating it. You can read earlier commentaries here, here, here, and here.

The recent earnings announcement from Nvidia brings my data center electricity use series full circle:

Its now-dominant data center segment increased revenue to $26.3 billion—more than 2½ times what that business generated a year earlier. Adjusted operating income for the quarter more than doubled year over year to $19.9 billion. Nvidia’s overall top and bottom lines beat Wall Street’s targets, as did the company’s forecast for the current period ending in October.

I have some closing thoughts (but no predictions!) about how this scenario will evolve into the future, based on my reading and the previous posts in this series.

Data centers are at the nexus of significant technological and economic shifts. Their growth is intertwined with the evolving landscape of electricity generation, distribution, and regulation. Future investments will have to navigate a complex landscape where electricity costs, carbon intensity, and regulatory pressures are all in flux. Technologies will be in flux too.

The biggest uncertainty facing the data center industry is not the price and availability of GPU chips for parallel computing. It’s power.

In this series I applied lots of economics, from how the elasticities of supply and demand change with time to the effects and feasibility of greenhouse gas reduction targets. But I think the most relevant economic insight comes from institutional economics and the make-or-buy decision. That insight leads us to microgrids.

BYOG: Microgrids

Applying the make-or-buy logic, one of the most significant questions for future data center investment is whether data centers will increasingly produce their own electricity through microgrids: bring your own generation. The energy-intensive operations of data centers pose significant challenges for both their commercial viability and their environmental impact. Microgrids offer a potential solution by providing a more resilient, efficient, and sustainable energy supply. The concept of microgrids—localized grids that can operate independently from the broader electricity grid—offers data centers a pathway to enhanced energy security, cost control, and sustainability.

A microgrid is a localized energy system capable of operating independently or in conjunction with the broader power system. It typically integrates various distributed energy resources (DERs) such as solar panels, wind turbines, battery storage, small natural gas generation, and combined heat and power (CHP) systems. The microgrid can function autonomously in “island mode” or connect to the larger grid to provide or receive power as needed. This flexibility allows microgrids to be more resilient, to manage their energy costs, and to support the integration of low-carbon energy sources.

Microgrids are particularly valuable in critical infrastructure applications, such as hospitals, military bases, and, increasingly, data centers. For data centers, the ability to control their energy supply is not just a matter of cost but also a strategic imperative, given their need for continuous, reliable power and their growing role in efforts to reduce greenhouse gas emissions.

The economic rationale for microgrids is compelling. By producing their own electricity, data centers can insulate themselves from grid instability, rising energy costs, and regulatory risks. Microgrids powered by low-carbon energy sources also align with the decarbonization goals that many data center operators have set. Even though it seems an eternity ago in data center evolution, in 2020 Schneider Electric was arguing that microgrids would increase uptime while managing both costs and carbon.

Building Data Centers as Microgrids

Microgrids, especially microgrids that include storage, can serve the dual commercial and climate objectives in several ways:

  1. Enhanced Resilience: Data centers require an uninterrupted power supply. Microgrids can provide a higher level of energy resilience by allowing data centers to operate independently of the main grid during outages or disruptions. By incorporating backup generators, battery storage, and renewable energy sources, a microgrid can ensure that data centers remain operational even in the face of grid instability or natural disasters. Hurricane Beryl in Texas in July provided a recent example, in which the HEB grocery store chain was able to keep power to its stores thanks to the Enchanted Rock microgrid systems that they deploy widely. Enchanted Rock uses natural gas power generators to provide power when the distribution grid is down, and this multi-modal fuel approach creates a more resilient power system for their customers, including data centers.
  2. Energy Cost Optimization: Energy costs are a significant operational expense for data centers. By operating as a microgrid, data centers can manage their energy use and reduce costs through demand response strategies, energy arbitrage, and by selling excess power in either wholesale or local retail energy markets (as those emerge in the future) during peak demand periods. This capability allows data centers to manage their energy expenditures and potentially generate additional revenue (a toy sketch of the arbitrage idea appears after this list). Selling into external markets requires more extensive and time-consuming interconnection procedures, and with the bottleneck that interconnection currently presents, that’s likely to be less of a revenue factor than demand response and arbitrage.
  3. Decarbonization: Many data center operators have committed to ambitious climate goals, including achieving carbon neutrality or running entirely on renewable energy. Microgrids enable data centers to incorporate on-site renewable energy generation, such as solar or wind power, coupled with battery storage to offset their carbon footprint. Natural gas microgrid systems also provide resilience while contributing to decarbonization, and will play an important role in the resource portfolio. If you balk at the thought of natural gas as decarbonizing, think about the opportunity cost of a choice of backup fuel: diesel generators or a natural gas microgrid?
  4. Grid Support and Flexibility: When connected to the main grid, microgrids can provide ancillary services, such as frequency regulation and voltage support, which help maintain grid stability. Data centers can thus play an active role in supporting the broader energy system, contributing to the grid’s flexibility and reliability.
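
As noted in item 2, here is a toy sketch of the energy-arbitrage idea: charge on-site storage during the cheapest hours and discharge during the most expensive ones. The prices, battery size, and efficiency are invented, and the greedy hour-picking ignores real dispatch constraints.

```python
# Toy energy-arbitrage sketch for a data center microgrid with battery storage.
# Prices, battery parameters, and efficiency are invented; real dispatch is far more constrained.
hourly_prices = [32, 28, 25, 24, 26, 35, 48, 62, 70, 66, 55, 44,
                 40, 38, 41, 52, 75, 95, 110, 90, 68, 50, 40, 35]  # $/MWh over a hypothetical day

battery_mwh = 10             # usable energy capacity
battery_mw = 5               # charge/discharge power limit
round_trip_efficiency = 0.88

hours_needed = int(battery_mwh / battery_mw)   # hours to fully charge (or discharge)
cheapest = sorted(hourly_prices)[:hours_needed]
priciest = sorted(hourly_prices, reverse=True)[:hours_needed]

buy_cost = sum(price * battery_mw for price in cheapest)
sell_value = sum(price * battery_mw for price in priciest) * round_trip_efficiency
print(f"Gross arbitrage value for the day: ${sell_value - buy_cost:,.0f}")  # roughly $660 here
```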

The decision to adopt microgrids will depend on several factors, including the cost of generation technologies, the regulatory environment, and the pace of innovation in energy storage solutions.

Regulatory and Utility Barriers to Data Center Microgrids

While the potential benefits of microgrids for data centers are clear, existing utility regulations and the incumbent position of monopoly utilities present significant barriers to their widespread adoption.

In many regions, regulations governing electricity generation, distribution, and sales are designed around a conventional vertically-integrated utility model. These regulations often limit the ability of non-utility entities, such as data centers, to generate and distribute electricity. For example, some jurisdictions may have rules that restrict the sale of excess power generated by microgrids to third parties or the utility itself, reducing the economic viability of microgrid investments.

There’s also a political economy of microgrids, and decentralization in general. Monopoly utilities, which own and operate the transmission and distribution wires networks, may see microgrids as a threat to their business model. As a result, they may lobby state legislators and/or regulators for regulations that hinder the development of microgrids or impose high fees on microgrid operators for grid access. Utilities also typically have exclusive rights to serve their geographic service territory, making it difficult for data centers to establish microgrids that operate independently of the utility unless they don’t cover a lot of ground and don’t cross any public rights of way.

As I alluded to above, grid interconnection can also be a time-consuming and costly challenge. Connecting a microgrid to the broader utility grid involves navigating a complex set of technical and regulatory hurdles. Utilities often impose stringent interconnection standards that can be costly and time-consuming to meet, and the process for obtaining the necessary approvals to operate a microgrid in parallel with the utility grid can be opaque and cumbersome, discouraging data centers from pursuing microgrid projects.

At Powering Spaceship Earth, Josh Smith has an excellent analysis of, among other things, the “connect and manage” low-permission-threshold approach taken to interconnection in ERCOT, the Electric Reliability Council of Texas. It’s one of the several dimensions in which the Texas model performs better than that used elsewhere.

Finally, existing regulated rate structures often do not favor microgrid deployment. Demand charges and fixed costs embedded in electricity tariffs can diminish the financial benefits of generating power on-site. Utilities may also not offer incentives for microgrids to provide grid services, such as demand response or load shifting, further limiting the economic attractiveness of microgrid investments and the ability of the data center to serve as a grid resource and be compensated for it.

Navigating the Path

Despite these challenges, the growth of data center microgrids is likely to continue as technology advances and the pressure grows to meet demand growth and decarbonization goals simultaneously. To overcome regulatory and utility barriers, data center operators can pursue policy strategies, business model innovation, and continued technological change.

Engaging with regulators and policymakers to promote regulatory reforms that support microgrid development is crucial. This engagement includes advocating for changes to interconnection standards, rate structures, and rules governing electricity sales and distribution, making sure that regulators and policymakers are aware of the frictions and the missed opportunities that the status quo represents.

Exploring new business models, such as energy-as-a-service (EaaS), where a third party owns and operates the microgrid on behalf of the data center, can also help mitigate some of the regulatory and financial risks associated with microgrid development. This idea is itself another application of institutional and organizational economics, using a contractual relationship to achieve a mutually beneficial risk allocation. EaaS saves the data center owner from having to develop specialized energy operations expertise and keeps the generation assets off of its balance sheet, which can help with its financial risk profile.

As microgrid technologies continue to evolve, costs are likely to decrease, and the integration of advanced energy management systems will become more seamless. New data centers will face lower costs of building in liquid cooling and new data center architectures that can increase energy efficiency. These new energy systems and data center designs are conducive to microgrid architecture.

The existing regulatory landscape and the dominant position of monopoly utilities pose significant challenges to the widespread adoption of microgrids. Overcoming these barriers will require concerted efforts from data center operators, policymakers, and utilities to create an environment that fosters innovation and supports the transition to decentralized and decarbonized power systems.

AI and Decarbonization: Accelerating the Transition

Overhyped or not, artificial intelligence (AI) is already revolutionizing data center operations, offering opportunities to optimize energy use, reduce costs, and accelerate decarbonization. A lot of people, including Nvidia’s Jensen Huang, argue that AI will influence energy in many ways, not just by increasing the demand for electricity.

AI can enhance the energy efficiency of data centers by predicting energy demand, optimizing cooling systems, and integrating renewable energy sources more effectively. These capabilities will become increasingly valuable as data centers strive to meet growing demand while adhering to stricter environmental standards.

But the relationship between AI and decarbonization is complex. On the one hand, AI can significantly reduce the carbon footprint of data centers by improving operational efficiency. On the other hand, the computational intensity of AI workloads can increase electricity demand, potentially offsetting some of these gains. The timing of decarbonization efforts will therefore be influenced by how quickly AI can be harnessed to manage and reduce energy consumption.

AI-driven automation could also facilitate more dynamic interactions between data centers and the grid, enabling real-time adjustments to energy use in response to grid conditions and supporting the integration of more resilient DERs.

The future of data center electricity use will be shaped by the concatenation of technological innovation, regulatory developments, and market dynamics. Investments in energy efficiency, renewable energy, and microgrids will be critical as data centers strive to balance growth with decarbonization. AI will play an important role in this transition, offering both opportunities and challenges for decarbonization. Microgrids are likely to be an important part of that story.

A version of this commentary was first published at Substack in the newsletter Knowledge Problem.

How AI and data center electricity use impact emission-reduction targets

Is AI a double-edged sword for hyperscalers?

This commentary is the fourth in a series explaining data center electricity use and the nuances in regulating it. You can read earlier commentaries here, here, and here.

Large-scale, dynamic social and economic change is often more difficult, incremental, and slower than anticipated. Consider James Watt and Matthew Boulton in Birmingham in 1776, having invented and refined the double-acting steam engine. Watt patented the invention that year, a breakthrough that would ultimately become a hallmark of the British Industrial Revolution and propel its global spread. Yet it wasn’t until the 1840s that their innovation truly transformed industry.

In 1776, the energy efficiency of Boulton & Watt’s steam engine was a mere 4%, converting only 100 BTUs (British thermal units) of coal into 4 BTUs of useful work, primarily in pumping water out of tin mines in Cornwall. The rest was waste heat.

Despite this low efficiency, the engine produced enough work relative to the horses and humans it displaced to attract willing buyers. These buyers, in turn, helped Boulton and Watt refine their design, reduce fuel waste, and enhance performance. Competing against water wheels—another technology that was improving, but facing diminishing returns and location limitations—the steam engine took nearly 50 years of incremental improvements to surpass its nearest competitors.

Boulton and Watt also suffered from input limitations, usually iron and coal quality and consistency of supply. Supply chain challenges made them figure out how to do more with less.

This economic history parable—one of my favorite stories—offers insights into the challenges facing data center owners and hyperscalers today: managing inputs, efficiency, and waste. One challenge is balancing their electricity demand (input) with their greenhouse gas emissions (waste), for which they have set corporate targets. Another parallel is the inelasticity of input supplies; increasing supply takes time, making change gradual and iterative until it eventually becomes exponential.

Data centers are incredibly energy-intensive, with their global electricity consumption expected to rise significantly in the coming years. In 2022, data centers consumed approximately 200 terawatt-hours (TWh), and this figure is projected to increase to around 260 TWh by 2026. This growth is driven by the exponential increase in digital services, artificial intelligence (AI) workloads, and cloud computing. In the U.S., data centers accounted for about 2.5% of total electricity consumption in 2022, a share that could triple by 2030 (S&P Global March 2024).
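
To put the cited figures in growth-rate terms, here is a quick calculation using only the numbers in the paragraph above; it is a rough illustration, not a forecast.

```python
# Implied growth rates from the figures cited above (rough arithmetic, not a forecast).
start_twh, end_twh, years = 200, 260, 2026 - 2022
global_cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Global data center demand: ~{global_cagr:.1%} per year implied growth")  # ~6.8%

share_multiplier, horizon = 3, 2030 - 2022      # US share of consumption tripling from ~2.5% by 2030
share_cagr = share_multiplier ** (1 / horizon) - 1
print(f"Tripling by 2030 implies ~{share_cagr:.1%} per year growth in that share")  # ~14.7%
```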

In the first three posts in this series, I developed those economic themes and suggested ways hyperscalers could manage this balance by building their own onsite generation.

The rising electricity demand from data centers is putting substantial pressure on power grids, especially in regions where the energy infrastructure is not equipped to handle such rapid increases. For instance, areas like Northern Virginia, a major hub for data centers, are experiencing challenges in power availability, which is driving developers to explore secondary and tertiary markets with better power access.

This demand surge has increased reliance on fossil fuel plants, including those previously slated for retirement, thereby complicating efforts to decarbonize the energy grid. Utilities and hyperscalers are both caught between the need to meet the growing energy demands of data centers and the push to reduce carbon emissions. This has led to complex tradeoffs, such as delaying the retirement of coal plants or increasing the use of natural gas, which poses challenges to achieving long-term sustainability goals while it helps them in the short run by displacing coal. Some of those planned retirements are likely to be delayed.

While tech companies are investing in renewable energy and efficiency improvements, the rapid expansion of data centers has still led to a notable increase in greenhouse gas (GHG) emissions (The Wall Street Journal, July 3, 2024).

Globally, the emissions from data centers have been relatively modest compared to the surge in digital workloads, thanks to improvements in energy efficiency. The cumulative effect is still significant, contributing to the strain on clean energy initiatives. For example, in Ireland, data center electricity use has tripled since 2015, representing a substantial portion of national energy demand (International Energy Agency).

In July 2024, Google reported that its greenhouse gas emissions were 48% higher than in 2019 and attributed the increase to AI and data center activity (S&P Global July 2024). Google has set ambitious targets to achieve net-zero emissions across all of its operations and value chain by 2030. To reach this goal, the company is focused on 24/7 carbon-free energy (including hourly matching of demand and supply, a topic for another analysis), renewable energy procurement, and other forms of supply chain decarbonization.

One example is Google's recent work with Fervo Energy, an innovative developer of enhanced geothermal that uses hydraulic fracturing and horizontal drilling techniques developed in oil and gas production, and its negotiations with the Nevada utility to develop a specific Clean Transition Tariff there that would apply to the Fervo contract (see also this Trellis article on the broader tech interest in nuclear and geothermal).

All of this technological, commercial, and strategic innovation takes a lot of time in complex systems, whether building steam engines in 1776 or redesigning energy and data systems today. Google has come under criticism for these higher emissions. I disagree with, for example, Katie Collins at CNET when she says if “… Google fails to honor its environmental commitments, it will be making a clear statement about how seriously it values profit versus the planet…If we are to take Google seriously, that number should be going down, not up.”

The hackneyed “profit versus planet” framing is naive and shows a shallow understanding of Google’s strategy and how it perceives its market. But she and other critics are correct in pointing out that there’s a tradeoff here that has to be managed.

The growth of data centers is a double-edged sword for clean energy initiatives. Data centers and the growth of AI have the potential to create the kind of orders-of-magnitude improvement in productivity and living standards that the steam engine ultimately created after its five-decade maturation (and yes, I’m glossing over the potential costs of AI; I acknowledge them but will leave that Pandora’s Box closed for now). Those productivity improvements include things like enabling researchers at PNNL to discover 18 new battery materials in two weeks, a research process that would otherwise take years. AI’s productivity boost as a research tool will be vital for improving energy efficiency, reducing costs, and enabling us to get more from less to reduce waste, again hearkening back to Boulton & Watt.

On the other hand, their substantial energy demands and associated emissions pose significant challenges to global decarbonization efforts. Addressing these issues should involve enhancing energy efficiency. The Economist’s Technology Quarterly in Jan. 2024 catalogued the energy efficiency improvements in data centers:

Despite all this the internet’s use of electricity has been notably efficient. According to the IEA, between 2015 and 2022 the number of internet users increased by 78%, global internet traffic by 600% and data-centre workloads by 340%. But energy consumed by those data centres rose by only 20-70%.

Such improved efficiency comes partly from improved thriftiness in computation. For decades the energy required to do the same amount of computation has fallen by half every two-and-a-half years, a trend known as Koomey’s law. And efficiencies have come from data centres as they have grown in size, with increasingly greater shares of their energy use going to computation.
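
To get a feel for how quickly that halving compounds, here is a small back-of-the-envelope calculation. This is a minimal sketch of Koomey's law as described above; the starting value and the time horizons are illustrative, not data from the studies cited.

```python
# Koomey's law: energy per computation halves roughly every 2.5 years.
# The starting value is arbitrary (indexed to 1.0); only the relative decline matters.
HALVING_PERIOD_YEARS = 2.5

def relative_energy_per_computation(years_elapsed: float, start: float = 1.0) -> float:
    """Energy per computation after `years_elapsed`, relative to the starting value."""
    return start * 0.5 ** (years_elapsed / HALVING_PERIOD_YEARS)

for years in (5, 10, 20):
    print(f"After {years} years: {relative_energy_per_computation(years):.4f}x the starting energy per computation")
# After 10 years the same computation needs about 6% of the original energy,
# a roughly 16-fold efficiency gain, which is how workloads can grow several-fold
# while electricity use grows far more slowly.
```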

Taking advantage of economies of scale in data center size helps efficiency, although such scale has diminishing returns. Recent improvements in low-carbon concrete reduce the carbon intensity of the building itself (and it’s a really cool innovation!).

One significant way to reduce greenhouse gas emissions and improve energy efficiency is through the adoption of liquid cooling systems in data centers. Unlike traditional air cooling, liquid cooling employs fluids—typically water or a dielectric liquid—to absorb and dissipate heat directly from high-performance components like central processing units (CPUs) and graphics processing units (GPUs), which generate substantial heat during operation.

Several types of liquid cooling systems are available, including direct-to-chip cooling, where coolant circulates through cold plates attached directly to processors, and immersion cooling, where servers are fully or partially submerged in a dielectric fluid that circulates to remove heat. These methods surpass traditional air cooling in efficiency because liquids have a higher heat capacity and thermal conductivity, allowing for more effective heat transfer and thermal management.
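
A rough sense of scale helps explain why liquids win here. The sketch below compares the volumetric heat capacity of water and air using standard textbook property values at room conditions, rounded for illustration; these figures are not from the article.

```python
# Heat absorbed per cubic meter of coolant per kelvin of temperature rise.
# Property values are standard approximations at room conditions.
SPECIFIC_HEAT_J_PER_KG_K = {"air": 1005.0, "water": 4186.0}
DENSITY_KG_PER_M3 = {"air": 1.2, "water": 1000.0}

def volumetric_heat_capacity(fluid: str) -> float:
    """Volumetric heat capacity in J/(m^3*K)."""
    return SPECIFIC_HEAT_J_PER_KG_K[fluid] * DENSITY_KG_PER_M3[fluid]

ratio = volumetric_heat_capacity("water") / volumetric_heat_capacity("air")
print(f"Water carries roughly {ratio:,.0f}x more heat per unit volume than air")
# ~3,500x: a modest flow of liquid can remove heat that would otherwise require
# moving enormous volumes of chilled air through the facility.
```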

For instance, Nautilus Data Technologies uses a water cooling system where cold water circulates to extract heat from servers. The warm water generated can then be repurposed for other onsite applications. This system is water-efficient and boasts a Power Usage Effectiveness (PUE) of 1.15 or lower, demonstrating high efficiency compared to conventional data center cooling designs.
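
For readers unfamiliar with the PUE metric: it is the ratio of total facility energy to the energy delivered to the IT equipment itself, so 1.0 is the theoretical floor and anything above it is overhead, mostly cooling. A small illustration with hypothetical load figures (the 1.5 comparison value is a stand-in for a conventional air-cooled facility, not a figure from Nautilus):

```python
# Power Usage Effectiveness (PUE) = total facility energy / IT equipment energy.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

IT_LOAD_KWH = 1_000_000  # hypothetical annual IT load

for overhead_share, label in [(0.50, "conventional air-cooled facility (assumed)"),
                              (0.15, "PUE 1.15 liquid-cooled design")]:
    total = IT_LOAD_KWH * (1 + overhead_share)
    print(f"{label}: PUE = {pue(total, IT_LOAD_KWH):.2f}, overhead = {total - IT_LOAD_KWH:,.0f} kWh")
# Going from PUE 1.50 to PUE 1.15 cuts cooling-and-overhead energy from
# 500,000 kWh to 150,000 kWh for the same computing work in this example.
```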

Liquid cooling supports higher power densities, an increasingly crucial factor as data centers evolve to manage more computationally intensive tasks like AI and big data analytics. Enhanced thermal management through liquid cooling enables closer hardware packing, potentially reducing the data center’s physical footprint. These systems can also diminish or even eliminate the need for extensive air conditioning and ductwork, leading to lower operational costs and a reduced environmental impact, particularly in terms of water and energy consumption.

Liquid cooling does have challenges and costs. The initial capital investment for implementing liquid cooling systems is generally higher than for traditional air cooling, due to the specialized equipment and infrastructure required. Maintenance is more complex as well, necessitating specific expertise to manage the liquid systems and prevent leaks or other issues that could jeopardize data center operations. Retrofitting existing data centers with liquid cooling can be especially expensive and technically challenging, which may limit its adoption to new constructions or specific high-performance computing environments—though there are numerous such new projects underway.

Much like Boulton and Watt in their time, modern data center companies pursue commercial objectives with strategies to implement them but face tradeoffs and complexities that often delay their achievement. The challenges of managing inputs and waste are neither new nor unique. The importance of efficiency and innovation in addressing these challenges and achieving objectives remains vital, even if it takes longer than initially anticipated.

A version of this commentary was first published at Substack in the newsletter Knowledge Problem.

The post How AI and data center electricity use impact emission-reduction targets appeared first on Reason Foundation.

Should data centers make or buy the electricity needed to meet AI demands? https://reason.org/commentary/should-data-centers-make-or-buy-the-electricity-needed-to-meet-ai-demands/ Wed, 14 Aug 2024 04:01:00 +0000 https://reason.org/?post_type=commentary&p=75604 Faced with rising grid costs and potential instability, data centers are exploring alternatives to traditional grid-supplied power.

The post Should data centers make or buy the electricity needed to meet AI demands? appeared first on Reason Foundation.

This commentary is the third in a series explaining data center electricity use and the nuances in regulating it. You can read the earlier commentaries here and here.

The exponential growth of data centers, driven by the burgeoning demand for cloud services, artificial intelligence computations, and big data analytics, has significantly increased electricity consumption. In the first two posts of this series, I discussed the increasing data center electricity use, its implications for the electric grid, and how those implications will differ over time due to both demand and supply elasticity.

As data centers proliferate, their energy demands rise, prompting concerns about the grid’s ability to handle this surge. Some fear that the electric grid may not be prepared to adapt, potentially leading to instability and higher electricity prices. Those fears may turn out to be overstated. Transaction cost economics can help us understand why.

The fear of increased data center demand causing havoc in electric grid systems is not unfounded. Grid operators have to balance supply and demand in real-time, a task complicated by the intermittent nature of renewable energy sources and the variable load patterns of data centers (although that variability holds the seeds of flexibility). The Electric Reliability Council of Texas (ERCOT) is concerned, very concerned, given population growth and data center expansion. The Midcontinent Independent System Operator (MISO), the grid operator that stretches from Minnesota to Louisiana and covers 14 states (plus Manitoba), anticipates a shortfall of generation capacity in light of demand growth.

If data center electricity demand indeed strains the grid, we should expect to see rising grid power prices and longer grid interconnection queues, making grid-supplied power a more costly alternative for data centers.

Alternatives to grid-supplied power

Faced with rising grid costs and potential instability, data centers are exploring alternatives to traditional grid-supplied power. One option is long-term co-location contracts with nuclear power plants: These contracts could offer data centers a more stable and potentially faster supply of electricity. However, these nuclear plants were constructed under a regulatory regime that did not anticipate lucrative large customers “barging the queue.” The Wall Street Journal reports:

The owners of roughly a third of U.S. nuclear-power plants are in talks with tech companies to provide electricity to new data centers needed to meet the demands of an artificial-intelligence boom…

The discussions have the potential to remove stable power generation from the grid while reliability concerns are rising across much of the U.S. and new kinds of electricity users—including AI, manufacturing and transportation—are significantly increasing the demand for electricity in pockets of the country. 

Nuclear-powered data centers would match the grid’s highest-reliability workhorse with a wealthy customer that wants 24-7 carbon-free power, likely speeding the addition of data centers needed in the global AI race.

But instead of adding new green energy to meet their soaring power needs, tech companies would be effectively diverting existing electricity resources. That could raise prices for other customers and hold back emission-cutting goals. …

The relatively new arrangements mean data centers can be built years faster because little to no new grid infrastructure is needed. Data centers could also avoid transmission and distribution charges that make up a large share of utility bills.

On-site generation is a second option. Data centers can build and operate their own power generators, which offers greater control over energy supply but requires substantial investment in specialized expertise and infrastructure, which data center companies might lack. This option involves data center vertical integration upstream into one of their crucial inputs.

The make or buy decision in transaction cost economics

The “make or buy” decision framework from transaction cost economics helps us understand the choices data centers face. This framework helps firms decide whether to produce goods and services internally or outsource them to external suppliers through market transactions. Key factors influencing this decision include production costs, market competition, uncertainty, and transaction complexity.

Ronald Coase’s seminal work, “The Nature of the Firm” (1937), introduced the concept of transaction costs—expenses incurred in making an economic exchange. These costs include finding relevant prices, negotiating contracts, and enforcing agreements.

Coase argued that firms exist to minimize these transaction costs, performing certain transactions more efficiently than the market. Firms will expand until the cost of organizing an additional transaction internally equals the cost of executing the same transaction through the market. For a more thorough discussion of Coase’s work, my book The Essential Ronald Coase in the Fraser Institute’s Essential Scholars series is a good place to start.

Building on Coase’s work, Oliver Williamson (“The Economics of Organization: The Transaction Cost Approach,” 1981) developed the framework of “efficient boundaries,” where firms seek to minimize the sum of production and transaction costs. According to Williamson, firms decide which activities to perform internally and which to outsource based on the complexity and uncertainty of transactions. High complexity and uncertainty often lead firms to internalize production to manage risks more effectively.

Empirical studies have tested these theories across various industries. For example, Gordon Walker and David Weber (1984) analyzed a U.S. automobile company’s make-or-buy decisions, finding that relative production costs were the strongest predictor. They also noted that product complexity and uncertainty influenced the decision to internalize production.

Application to data centers’ electricity decisions

For data centers, the make-or-buy decision hinges on the costs associated with grid-supplied power versus self-generation. If grid power prices rise due to increased demand and interconnection delays, data centers may find building their generation more cost-effective. However, self-generation involves significant upfront capital investment and operational complexities, so there are tradeoffs.

The complexity and uncertainty of energy supply further complicate this decision. Grid-supplied power offers simplicity but comes with price volatility and potential reliability issues.

On-site self-generation provides more control but requires expertise in power systems, regulatory compliance, and maintenance. Acquiring that expertise entails a significant expansion of the firm to implement vertical integration. It’s a big capital and managerial change.

Transaction costs also play a crucial role. Engaging in long-term contracts with nuclear power plants or other energy providers involves negotiation and enforcement costs. These costs can be substantial, particularly for controversial energy sources like nuclear power. On the other hand, building and maintaining on-site generation involves internal transaction costs related to coordination and management. Again, tradeoffs.

Data centers also have to consider risk management. Grid dependence exposes them to market fluctuations and regulatory changes. Self-generation and vertical integration, in contrast, provide more predictability but require managing operational risks and potential technological obsolescence.

Strategically, data centers may prefer on-site generation to enhance resilience and sustainability, two of decentralization’s biggest benefits. As corporate sustainability goals become more prominent, on-site low-carbon energy sources can align with environmental objectives and provide long-term energy security.

Technological change will shift the margin between buy and make, making it easier and more cost-effective to vertically integrate. In this case, that new technology is likely to be small modular nuclear reactors (SMRs). As SMRs move from research to commercialization, each data center may be built with its own SMR.

Others, like Helion (backed by Bill Gates), are working on elusive innovations in nuclear fusion.

It depends

Ultimately, the electricity make-or-buy decision for data centers is context-dependent. They have to weigh the costs, complexities, uncertainties, and strategic objectives. Right now, rising grid power prices, interconnection delays, and possible limits on co-location with nuclear plants may push them towards self-generation, but the significant investment and expertise required cannot be overlooked.

The insights from transaction cost economics and the make-or-buy literature offer a useful framework for understanding these decisions. In the dynamic landscape of energy and technology, the optimal solution may vary over time and across different regions.

As always in economics, the answer to the make or buy question is “it depends.”

A version of this commentary was first published at Substack in the newsletter Knowledge Problem.

The post Should data centers make or buy the electricity needed to meet AI demands? appeared first on Reason Foundation.

Data center electricity needs highlight the electricity industry’s lack of effective market signals and price mechanisms https://reason.org/commentary/data-center-electricity-needs-highlight-the-electricity-industrys-lack-of-effective-market-signals-and-price-mechanisms/ Tue, 13 Aug 2024 04:01:00 +0000 https://reason.org/?post_type=commentary&p=75597 When data centers increase activity, electricity suppliers may struggle to meet demand without purchasing power from other regions or using less efficient peaking power plants.

The post Data center electricity needs highlight the electricity industry’s lack of effective market signals and price mechanisms appeared first on Reason Foundation.

This commentary is the second in a series explaining data center electricity use and the nuances in regulating it. Here are the first and third pieces in the series.

Growing data center energy use continues to make headlines. In my first post on data center electricity use, I focused on the technologies that make AI possible and on broad trends in data center investment and electricity demand forecasts out to 2028.

Electricity demand forecasts

Recent demand forecasts in the U.S. and globally point to growing electricity demand. The International Energy Agency forecasts an international average annual growth rate of 3.4% from 2024-2026, 85% of which will come from China and other countries outside the set of advanced economies. The Electric Power Research Institute (EPRI) published a white paper this past spring motivated by the sudden growth in AI-driven computation:

AI models are typically much more energy-intensive than the data retrieval, streaming, and communications applications that drove data center growth over the past two decades. At 2.9 watt-hours per ChatGPT request, artificial intelligence (AI) queries are estimated to require 10 times the electricity of traditional Google queries, which use about 0.3 watt-hours each; and emerging, computation-intensive capabilities such as image, audio, and video generation have no precedent. (EPRI 2024, p. 2)
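
Those per-query figures are small on their own; the fleet-scale arithmetic is what matters. The sketch below scales them to an annual total using an assumed query volume that is purely illustrative, not an EPRI figure.

```python
# Scale per-query energy to an annual total. The daily query volume is an
# illustrative assumption; the per-query values come from the EPRI passage above.
WH_PER_QUERY = {"traditional search": 0.3, "AI chatbot request": 2.9}
QUERIES_PER_DAY = 1_000_000_000  # assumed for illustration

for kind, wh in WH_PER_QUERY.items():
    twh_per_year = wh * QUERIES_PER_DAY * 365 / 1e12  # Wh -> TWh
    print(f"{kind}: {twh_per_year:.2f} TWh/year at {QUERIES_PER_DAY:,} queries/day")
# At a billion queries a day, the AI workload comes to roughly 1.1 TWh/year versus
# about 0.1 TWh/year for traditional search, and the gap scales linearly with volume.
```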

Their analysis of U.S. demand included low, moderate, high, and higher growth scenarios through 2030, with electricity demand growth rate forecasts ranging from 3.7% to 15%. These scenarios suggest that data center share of electricity demand will be in the range of 4.6-9.1% of 2030 consumption, compared to 4% today (A May 2024 Axios article provides a helpful summary of the EPRI analysis).

Source: EPRI (2024)
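
Compounding those scenario growth rates shows how far apart the bookends land by 2030. A minimal sketch: the 2023 baseline index is illustrative, and the rates are the EPRI scenario endpoints cited above.

```python
# Compound the EPRI scenario growth rates for data center electricity use
# from a 2023 baseline (indexed to 1.0) out to 2030.
BASELINE_YEAR, HORIZON_YEAR = 2023, 2030
years = HORIZON_YEAR - BASELINE_YEAR

for label, annual_growth in [("low scenario", 0.037), ("higher scenario", 0.15)]:
    multiple = (1 + annual_growth) ** years
    print(f"{label}: {annual_growth:.1%}/yr compounds to {multiple:.2f}x of baseline by {HORIZON_YEAR}")
# Seven years at 3.7% is roughly a 1.3x increase; seven years at 15% is roughly 2.7x,
# which is why the scenarios imply such a wide range of 2030 demand shares.
```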

A recent Energy Information Administration analysis shows that commercial sector electricity demand has grown fastest in states with rapid computing facility growth (unsurprisingly including Virginia and Texas, but also, more surprisingly, North Dakota).

Electricity demand also grew substantially in Texas, where relatively low costs for electricity and land have attracted a high concentration of data centers and cryptocurrency mining operations. North Dakota stands out with the fastest relative growth at 37% (up 2.6 BkWh) between 2019 and 2023, attributed to the establishment of large computing facilities in the state. In addition, western states such as Arizona and Utah have shown robust growth in commercial electricity demand, further contributing to the overall increase in the top 10 states. (EIA Today In Energy 6-28-2024).

Marshall’s Model of Time, illustrated by fishing

Some basic economic analysis will help us understand what’s happening, why, and what to expect. I’m going to get some help from pioneering economist Alfred Marshall for this analysis.

First, let’s gather the data we want to understand:

  • Technological change is leading to an increase in data center computing, which is creating an increased level of demand and a faster growth rate of demand for electricity.
  • Building a data center requires high capital expenditure and takes 18-24 months.
  • Building new electricity infrastructure (generation, poles, wires, transformers) requires high capital expenditure, and the current regulatory and business structure takes up to 10 or even 20 years.
  • New resource interconnection with a regional transmission grid takes an average of 3.5 years in ERCOT in Texas and up to 6 or more years in other regions. The difference stems from the institutions, or rules, that ERCOT uses, known as “connect and manage.”

Now let’s bring Marshall in. According to Wikipedia, “Alfred Marshall (26 July 1842 – 13 July 1924) was an English economist and was one of the most influential economists of his time. His book Principles of Economics (1890) was the dominant economic textbook in England for many years. It brought the ideas of supply and demandmarginal utility, and costs of production into a coherent whole. He is known as one of the founders of neoclassical economics.”

If you want a quick overview of Marshall’s importance, I recorded a short video on him for my History of Economic Thought class a while back that you may find useful. And if you want to explore his seminal work Principles of Economics, it’s all available for you at Econlib.

One of the most important contributions Marshall made to economic theory was his conceptualization of time. Time is not a chronological phenomenon per se, but is rather divided into categories depending on how people can or cannot change their consumption, production, investment, innovation, and institutional decisions. His model provides a foundational framework for understanding how supply and demand respond to changes over different time horizons. He used the fishing industry to illustrate these time scales, showing how producers’ abilities to respond to changes in demand evolve over different periods.

Marshall introduces three distinct time scales: the immediate run, the short run, and the long run.

  • Immediate Run: In the immediate run, supply is fixed and cannot respond to changes in demand. This period is so short that no adjustments in production can be made. Prices may fluctuate significantly due to demand changes, but the quantity supplied remains constant. Marshall’s use of a day’s catch in the fishing industry illustrates this concept. Fishers can sell only the fish they have already caught, and no additional boats or nets can be deployed on such short notice.
  • Short Run: In the short run, supply is somewhat elastic but still constrained by existing capacity and resources. Producers can adjust their output to a limited extent by using current assets more intensively, such as working longer hours or employing more labor. But they can’t yet make significant capital investments or changes to their production methods. This response might involve fishers using their boats more frequently or employing more crew members, but they still cannot build new boats or expand their fleet immediately.
  • Long Run: In the long run, supply becomes highly elastic as producers can adjust their production capacity fully. This period allows for significant investments in new capital, technology, and infrastructure. In the fishing industry, responses could include building more boats, acquiring better equipment, and improving techniques, thereby increasing the overall catch. Over the long run, the market can reach a new equilibrium where supply meets the altered demand at a more stable price.

In Marshall’s abstract model, this adjustment process is smooth and frictionless, although he was a relentless empiricist, so he recognized inevitable frictions. Let’s apply this general model to the current situation with data center electricity use and analyze the facts laid out above.

Short-run dynamics: Inelastic supply and immediate responses

In the short run, the supply of electricity is relatively inelastic. This means that the quantity of electricity that can be generated and supplied to the market cannot be adjusted quickly in response to changes in demand. This inelasticity is due to the significant time and capital investment required to build new power plants or upgrade existing infrastructure. So when data centers increase activity or new ones come online, electricity suppliers may struggle to meet this demand without resorting to costly measures such as purchasing power from other regions or using less efficient peaking power plants. This struggle also has policy implications, as we will see in future posts, due to the closely regulated nature of the electricity industry and its century-old business model. Note, in particular, the large timing mismatch between data center capacity expansion and electricity infrastructure capacity expansion.

Marshall’s discussion of time in economics provides a useful analogy. Just as fishermen cannot instantly increase their catch in response to rising fish prices due to the time required to build more boats and nets, electricity providers cannot immediately ramp up production in response to increased demand from data centers. In both cases, the immediate run is characterized by fixed supply and price fluctuations driven by demand changes.

Also, the environmental impact of relying on peaking power plants, which often use fossil fuels, can be significant, contributing to higher carbon emissions and undermining efforts to transition to cleaner energy sources. Those impacts are a topic for a future post.

Long-run adjustments: Investment and capacity expansion

The long-run situation differs considerably. Over time, the supply of electricity becomes more elastic as new generation capacity is added and existing infrastructure is upgraded. This long-run elasticity is driven by investment in new power plants, renewable energy projects, and advancements in energy storage technology. Data center operators and electricity providers can plan for and respond to anticipated increases in demand by investing in capacity expansion.

Marshall’s framework helps us understand this transition. In the long run, fishers can build more boats and improve their fishing techniques, thereby increasing their catch and stabilizing prices. Similarly, in the electricity market, long-run adjustments involve substantial capital investment and technological innovation, leading to a more elastic supply curve and a more stable equilibrium between supply and demand.

The investment in new generation capacity to meet the growing energy demands of data centers is already evident. For example, many data center operators are entering into power purchase agreements (PPAs) with renewable energy providers to secure a stable and sustainable energy supply. These agreements not only ensure a steady flow of electricity but also promote the development of wind, solar, and other projects like storage and geothermal. Advancements in battery storage technology enable data centers to store excess energy generated during periods of low demand and use it during peak periods, smoothing out fluctuations in electricity consumption.

Unlike Marshall’s stylized fishing example, these long-run adjustments in electricity take a long time, longer than they could, because of many frictions that slow down or prevent such flexible adjustments. Permitting bureaucracy and delays add years to infrastructure projects, with dubious or nonexistent benefits to those delays. The same holds for grid operator interconnection, long delays with unclear commensurate benefits. Permitting and interconnection are brakes on long-run supply adjustments that could otherwise be more fluid. Supply chain problems are also an adjustment challenge, such as the bottleneck that transformer supply presents for infrastructure expansion projects. That bottleneck itself provides an example illustrating Marshall’s point:

Additionally, the surge in demand for electrical equipment may only last for a few years, which makes suppliers hesitant to invest in greater supply capabilities that could result in over-supply conditions when demand eases. This phenomenon mirrors past experiences in markets like semiconductors, where short-term market booms led to increased capacity, only to face challenges when the market dynamics shifted. (Wood Mackenzie 2024)

The role of technological innovation

Technological innovation will play a crucial role in shaping the long-run dynamics of data center energy use and electricity demand. Innovations in data center design, cooling technologies, and energy efficiency measures can reduce the amount of electricity consumed per unit of computing power significantly.

One important example is how the shift from traditional air cooling to liquid cooling systems has improved energy efficiency by reducing the need for air conditioning (also the topic of a future post). Algorithms to optimize server workloads and minimize idle times can also enhance energy efficiency.

Other production process changes, such as the transition to more energy-efficient hardware like advanced processors and memory modules designed to consume less power, are helping data centers reduce their overall energy footprint. These technological advancements will lower operating costs for data center operators and mitigate the impact on the electricity grid, making it easier to accommodate the growing demand for AI and other digital services. But even these within-firm capital investments take time.

The role of markets and price signals

One of the significant challenges in the electricity industry is the lack of effective market signals and price mechanisms to indicate relative scarcity and manage demand. In a more flexible and responsive market, prices would rise when electricity is scarce and fall when it is abundant, signaling to both producers and consumers to adjust their behavior accordingly. Implementing such price signals could lead to greater flexibility and efficiency in the electricity market.

For example, real-time pricing or time-of-use pricing could be introduced to reflect the actual cost of electricity production and supply at different times of the day. Under real-time pricing, electricity prices would fluctuate based on current supply and demand conditions, encouraging consumers to shift their usage to off-peak times when prices are lower. Data centers that can schedule non-urgent computational tasks could take advantage of lower prices during periods of low demand, reducing their overall energy costs and alleviating pressure on the grid during peak times. Given their substantial and often flexible energy consumption, data centers are also good candidates for bulk-scale demand response services offered by companies like Voltus and CPower. By temporarily scaling down operations or shifting workloads to off-peak periods, data centers can help balance supply and demand, stabilize prices, and reduce the need for expensive and emissions-heavy peaking power plants.
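
To make the load-shifting logic concrete, here is a toy example of a data center facing time-of-use prices. The prices and load figures are hypothetical; the point is the mechanism, not the magnitudes.

```python
# Toy illustration: shifting flexible (non-urgent) data center load off-peak
# under time-of-use prices. All figures are hypothetical.
PEAK_PRICE = 0.18     # $/kWh during peak hours (assumed)
OFFPEAK_PRICE = 0.06  # $/kWh off-peak (assumed)

INFLEXIBLE_LOAD_MWH = 200  # latency-sensitive work that cannot move (per day)
FLEXIBLE_LOAD_MWH = 50     # batch/training work that can be rescheduled (per day)

def daily_cost(share_of_flexible_shifted: float) -> float:
    """Daily energy cost in dollars with a given share of flexible load moved off-peak."""
    shifted = FLEXIBLE_LOAD_MWH * share_of_flexible_shifted
    peak_mwh = INFLEXIBLE_LOAD_MWH + FLEXIBLE_LOAD_MWH - shifted
    return (peak_mwh * PEAK_PRICE + shifted * OFFPEAK_PRICE) * 1000  # MWh -> kWh

for share in (0.0, 0.5, 1.0):
    print(f"Shift {share:.0%} of flexible load off-peak: ${daily_cost(share):,.0f}/day")
# Shifting all 50 MWh of flexible load saves $6,000/day in this example and removes
# 50 MWh from the peak, the same relief a demand response program is designed to buy.
```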

However, the regulatory and market institutions have to enable such markets and price signals to reduce frictions that maintain the timing mismatch between demand growth and increasing supply. They do not. While some demand response integration exists in wholesale power markets, it’s limited and heavily constrained. That constraint, alongside the frictions of permitting and interconnection that act as brakes keeping supply more inelastic than it could otherwise be, shows that we still have a lot to learn from Marshall’s model of time.

Electrons are like fish because catching them requires building stuff. Building stuff is costly and takes time, much more so in the case of electrons than of fish, due to regulatory impediments that are frictions that amplify the timing mismatches in the adjustment of both demand and supply.

A version of this commentary was first published at Substack in the newsletter Knowledge Problem.

The post Data center electricity needs highlight the electricity industry’s lack of effective market signals and price mechanisms appeared first on Reason Foundation.

Data center electricity use: Framing the problem https://reason.org/commentary/data-center-electricity-use-framing-the-problem/ Mon, 12 Aug 2024 04:01:00 +0000 https://reason.org/?post_type=commentary&p=75581 In 2019, data centers accounted for approximately 2% of total electricity consumption in the United States.

The post Data center electricity use: Framing the problem appeared first on Reason Foundation.

This commentary is the first in a series explaining data center electricity use and the nuances in regulating it. Here are the second and third pieces in the series.

The silicon chip platform company Nvidia recently became the most valuable public company in the US, surpassing Microsoft:

Nvidia became the U.S.’s most valuable listed company Tuesday thanks to the demand for its artificial-intelligence chips, leading a tech boom that brings back memories from around the start of this century.

Nvidia’s chips have been the workhorses of the AI boom, essential tools in the creation of sophisticated AI systems that have captured the public’s imagination with their ability to produce cogent text, images and audio with minimal prompting.

The buzzy hype around Nvidia and AI generally is feeling very dot-com-bubbly, for sure, but there’s likely to be some truth in CEO Jensen Huang’s claim: “We are fundamentally changing how computing works and what computers can do,” Mr. Huang said in a conference call with analysts in May. “The next industrial revolution has begun.”

An earlier discussion of that May earnings call in Yahoo Finance is also informative. Huang is right that computing in the blossoming AI era will be qualitatively different from the past five decades, and those changes are showing up in forecasts of increased investment in data centers.

Past computing relied on CPUs (central processing units). These processors perform calculations based on instructions and have advanced to do so at rapid rates; innovations symbolized by Moore’s Law have meant that ever more processing capability can fit on smaller chips, so more and faster computing capability per square inch of space inside the case. However, the chips that Nvidia pioneered in the 1990s had different uses and a different architecture. These GPUs (graphics processing units) were originally designed for graphics in video games, an industry that they transformed. What made them capable of delivering high-quality dynamic graphics was their parallel architecture: their ability to perform many independent calculations at the same time, and at the same speed, turning 3D images into better and better 2D representations that could move and change dynamically.

It turns out that this massively parallel chip architecture is actually a general-purpose technology and not just a graphics processing technology, as Huang and his colleagues saw through the 2000s, building a coding platform and a developer ecosystem around their GPUs. Nvidia has evolved from a GPU company to a machine learning company to an AI company. These capabilities are what drive AI training and, once the model is trained, the ongoing inference that the model performs to deliver answers to users.

If you are at all interested in this rich history, in the history of technology, and/or the history of Nvidia as a company, I cannot recommend highly enough the Acquired podcast three-part series on Nvidia’s history (Part 1) (Part 2) (Part 3), as well as their interview with Jensen Huang in 2023, after the November 2022 AI takeoff.

All of this computing requires energy. Chip manufacturers, Nvidia included, have generally designed chips to minimize energy use per calculation, but they still require energy and still generate waste heat. The way CPUs work, and the density of their deployment, meant that data centers could operate, physically and economically, while air-cooled. Air cooling meant increased electricity demand from air conditioning as computation and data centers grew. But AI training uses GPUs, with their greater capabilities and commensurately greater energy requirements, both for the greater computation load and for cooling (more on that topic in a future post).

In a 2020 Science article, researcher Eric Masanet and co-authors analyzed the growth in data center energy use and improvements in data center energy efficiency in the 2010s, before the AI era:

Since 2010, however, the data center landscape has changed dramatically (see the first figure). By 2018, global data center workloads and compute instances had increased more than sixfold, whereas data center internet protocol (IP) traffic had increased by more than 10-fold (1). Data center storage capacity has also grown rapidly, increasing by an estimated factor of 25 over the same time period. …

But since 2010, electricity use per computation of a typical volume server—the workhorse of the data center—has dropped by a factor of four, largely owing to processor efficiency improvements and reductions in idle power. At the same time, the watts per terabyte of installed storage has dropped by an estimated factor of nine owing to storage-drive density and efficiency gains.

As researcher Alex de Vries noted in his 2023 paper in Joule, AI electricity requirements could soon be as large as those of an entire country, and the energy requirements of the GPU chips are substantial. This is why the topic du jour in electricity is data center electricity demand and its implications for infrastructure systems that are very slow to change.

For most of the past 60 years, the electricity demand has grown at a pretty steady rate. The average annual growth rate of electricity consumption in the US over the past 60 years has been approximately 2.2%, with a significant increase in electricity use from about 700 billion kWh in 1960 to over 4 trillion kWh in recent years (EIA Energy). Within that broad trend, the growth rate has varied over different periods, with more rapid increases in earlier decades and slower growth in recent years due to improved energy efficiency, changes in the economy, and shifts toward renewable energy sources.

The recent demand growth narrative in electricity has been the decrease in 2008 due to the financial crisis; after a rebound demand again grew at a low rate. Between 2008 and 2019, electricity consumption growth was relatively flat, with an overall average annual growth rate of about 0.3% (EIA Energy).

Source: ACEEE 2016

The economics and the business and regulatory models for the electricity industry are predicated on slow, steady growth, so stagnant demand from 2008-2019 caused utilities and their shareholders and bankers considerable concern. Then the pandemic happened. During the early stages of the pandemic, particularly in the second quarter of 2020, there was a notable decline in overall electricity consumption, due primarily to reduced economic activity and widespread lockdowns leading to a decrease in industrial and commercial electricity use and a shift to residential consumption (Cicala 2023). The Federal Energy Regulatory Commission reported a drop in electricity demand by approximately 3.5% during this period compared to the same timeframe in 2019.

Fast forward to 2023. In November 2022, OpenAI published their ChatGPT large language model (LLM) AI, and in the following weeks and months, other companies (Anthropic’s Claude, Google Gemini, among others) published their models. ChatGPT reached one million users within just five days of its launch, making it the fastest-growing consumer application in history. By February 2023, it had surpassed 100 million monthly active users. The growth in Nvidia’s stock valuation is a good reflection of the potential magnitude of these AI breakthroughs (and it’s reflected in Microsoft’s and Apple’s too):

Source: Wall Street Journal 2024

As we’ve been hearing for the past year, AI computation requires considerably more processing and electricity than previous types of computing. We already produce and process enormous amounts of data, and chip architects and data center designers have been attuned to computation’s energy use for decades.

In 2019, data centers accounted for approximately 2% of total electricity consumption in the United States. Data centers are energy-intensive, consuming 10 to 50 times more energy per square foot than typical commercial office buildings. In the early 2000s, data centers saw rapid growth in electricity use, with annual increases of around 24% from 2005 to 2010. This growth rate then slowed to about 4% annually from 2010 to 2014, and it continued at this rate up to 2020 (Lawrence Berkeley National Lab).

That pattern is changing, and changing quickly. 2024 is the year when utility forecasts and investment plans are dominated by data center growth (S&P Global Market Intelligence). Previous 5-year and 10-year demand forecasts are now obsolete and utilities and grid operators are revising their forecasts, challenged by the extent to which the AI data center is a qualitatively different cause of electricity demand, one with substantial productivity repercussions through the economy. Forecasting is always difficult, especially about the future (thank you, Yogi Berra), but it’s even more difficult when the changes are deep technological ones that make your historical data even less of a guide for future expectations.

A recent Grid Strategies report (Grid Strategies 2023) analyzed the implications of data center electricity demand, and the numbers are staggering for an industry whose business model is predicated on slow and steady demand growth.

These forecasts of peak demand for the summer of 2028 (in gigawatts; one gigawatt is one million kilowatts) are striking in how much they changed from 2022 to 2023:

Source: Grid Strategies 2023

This updated forecast from PJM, the grid operator for the mid-Atlantic through Illinois and the largest grid operator in the U.S., indicates how the large and sudden growth in data center electricity demand is forcing it to revise its forecasts:

Source: PJM, Energy Transition in PJM, 2023

Such uncertainty is a substantial challenge, technologically, economically, politically, and culturally within the industry.

In this post, I wanted to frame the problem facing the electricity industry, its regulators, and the data center industry. Over the next couple of weeks, I will tease apart some pieces of this problem and bring in some economic insights that I hope will contribute to our understanding of the problem and some implications we can draw for better policy.

A version of this commentary was first published at Substack in the newsletter Knowledge Problem.

The post Data center electricity use: Framing the problem appeared first on Reason Foundation.

The 2021 Texas Power Crisis: What Happened and What Can Be Done to Avoid Another One? https://reason.org/policy-brief/the-2021-texas-power-crisis/ Thu, 22 Apr 2021 04:02:00 +0000 https://reason.org/?post_type=policy-brief&p=42044 No single cause was responsible and no simple fix will prepare the state to survive the next extreme cold weather event.

The post The 2021 Texas Power Crisis: What Happened and What Can Be Done to Avoid Another One? appeared first on Reason Foundation.

Introduction

The electric power system in Texas failed to meet customer needs during the extreme cold that descended upon the state in mid-February, 2021. The failures generated a lot of finger-pointing: too much wind power, not enough reliable natural gas, too little regulation, failed long-run planning, and too few connections to neighboring grids, among other targets. Most early complaints were wrong.

Extreme cold overwhelmed winter preparations in Texas: this is the main story. High power bills and other financial repercussions also have created challenges. The electric power system failures were severe, but any diagnosis of the failure or proposed remedy focusing solely on the Electric Reliability Council of Texas (ERCOT) will miss the mark. Electric power was not the only industry to see failures, and power systems did not fail only in ERCOT. Natural gas wells and pipelines began freezing up. Municipal water systems broke down in several southern states. Roads were closed due to snow and ice. Ranchers and farmers saw severe losses from the cold.

This report focuses on ERCOT and the electric power system because the power outages were the proximate cause of many hardships suffered during the failures. No single cause was responsible and no simple fix will prepare the state to survive the next extreme cold weather event. Many details will only emerge with time, but this paper aims to provide a clear analysis of what is now known, along with a bit of background on how the system works, to help the public and policymakers understand what happened and what should be done next.

Table of Contents

Part 1 Introduction

Part 2 What Happened?

2.1 How Cold Was It?
2.2 Natural Gas and Electric Power Entanglements
2.3 Not the First Time

Part 3 How Did ERCOT Perform During the Emergency?

3.1 The Financial Fallout Continues

Part 4 What Can Be Done?

4.1 Winterization Requirements
4.2 Resource Adequacy Assessments
4.3 Does ERCOT Need a Capacity Market?
4.4 Interconnecting With Neighboring Grids
4.5 Microgrids, Battery Storage, and Other New and Improving Technologies
4.6 Analyzing the Financial Challenges

Part 5 Recommendations

Part 6 Conclusion: Looking Forward

Read the full Policy Brief here:

Download this Resource

Texas Power Failures: What Happened & What Can Be Done

By Michael Giberson


The post The 2021 Texas Power Crisis: What Happened and What Can Be Done to Avoid Another One? appeared first on Reason Foundation.

How To Prevent Another Texas Power Failure https://reason.org/commentary/how-to-prevent-another-texas-power-failure/ Thu, 22 Apr 2021 04:01:43 +0000 https://reason.org/?post_type=commentary&p=42161 Winterization, financial reforms and new technologies could help improve Texas' power system improve reliability.

The post How To Prevent Another Texas Power Failure appeared first on Reason Foundation.

This is an excerpt from the policy brief—The 2021 Texas Power Crisis: What Happened and What Can Be Done to Avoid Another One?

While the Texas power grid is back in business, the financial fallout is likely to continue for months, maybe years. Resolving financial problems as soon as reasonable will reduce uncertainty and likely help facilitate the investment needed to improve integrated energy systems in Texas. Much analysis has already identified specific problems in the Texas energy system contributing to the outages, but investigations should be continued. As the Texas Legislature was already in session at the time of the outages, hearings already have been held and bills already have been introduced in response.

The Federal Energy Regulatory Commission/North American Electric Reliability Corporation (FERC/NERC) 2011 Report will be one place to look for recommendations, updated to reflect the more extreme cold conditions experienced in 2021. FERC and NERC are collaborating on analysis and recommendations addressing Texas’ 2021 experience. Presumably, the degree of compliance with recommendations from the 2011 Report will be among the topics investigated.

It is likely better to let investigations continue before imposing significant reforms. The high stakes of failure demand a well-informed and well-considered response.

What changes are called for to help the system improve reliability?

Recommendations

Winterization

More-stringent winterization requirements seem politically unavoidable, though again the degree to which winterization is needed depends on a critical assessment of the cold. The severity of failures in 2021 may lead the incautious to say any cost of winterization is justifiable, but that is not true. It is the potential severity of future failures that demands that resources be devoted where they will be most effective. Benefit-cost analysis is the standard approach for answering the question, “Where should resources be devoted to secure the best overall protection?”

Winterization standards should allow power plant operators significant flexibility to adapt plants to colder weather. It may be reasonable to prioritize implementation for Texas power plants that failed in February 2021 or February 2011, and possibly appropriate to excuse plants that performed well through both events from any new rules. It may be reasonable to set standards differently in the northern and southern parts of the state. Whatever winterization requirements ought to apply to panhandle wind turbines, they are likely more stringent than those applied to coastal wind turbines. Rules will likely be tailored to generating technologies, with some rules targeting wind energy, others targeting natural gas generation, and so on. Care should be taken to ensure requirements do not unreasonably burden any one type of generation or region of the state.

Lack of fuel supply is a concern. The loss of natural gas generation came both from plant outages and from a lack of natural gas supply. Winterization standards should not neglect the natural gas production and distribution system. Natural gas plants can be adapted to allow the plants to run on fuel oil when gas is not available, and regulators should consider whether some minimum amount of dual-fuel capability is desirable. In addition, gas pipelines should take the opportunity to have their facilities listed as critical services during rolling outages in order to avoid unintentional cuts to otherwise available gas supplies. While gas generation contributed the largest share of outages, coal-fueled plants and nuclear plants also deserve attention.

In assessing Texas’ winterization requirements, the public and policymakers should be aware that owners of power plants have strong financial incentives to avoid failures and will take steps to improve their plants with or without added regulations. Each additional MWh of power a generator could supply during the grid emergency could have earned $9,000, an amount almost 300 times higher than typical market prices. Any power plant already contracted to supply power, but unable to do so because of the cold, was likely paying that $9,000 MWh price to replace the power they could not provide. The prospects of earning that revenue or avoiding that cost provide a strong market signal. The good news, then, is that regulations can be focused on systemic challenges beyond investments that will already happen.

A related issue arises with calls for “bailing out” companies hard hit financially by the failures. If bailouts provide cover directly or indirectly for losses suffered by generators, it will reduce generators’ willingness to spend their own money to prevent failures. If bailouts cover losses incurred by retail electric suppliers, then it undermines incentives for retail providers to engage in long-term firm contracts that can encourage investment in new power plants. Bailouts for residential customers struck by $1,000 power bills raise more complex issues, but having seen the risks, residential consumers will likely be much more cautious about supply offers that expose them directly to wholesale prices.

As part of ERCOT’s winterization response, it should fully reassess its resource adequacy analysis and the manner in which that analysis figures into its operational decisions.29 Scheduling of maintenance outages and reliability commitment policies for winter weather should be among operational practices updated. The PUC of Texas failed to produce annual reports on electric power winter readiness, as required in a law passed after the February 2011 rolling outages. Had it done so, potential failures may have been foreseen and avoided. As should go without saying, regulators should comply with the law.

Capacity Market

Installing a capacity market would achieve little without a better resource adequacy assessment, but how the resource adequacy assessment should be improved depends upon how and why the assessment was wrong. While the errors of the assessment are clear in hindsight, the relevant question concerns how it can be improved using information available as much as three to six months before the season arrives. Improvements in resource adequacy assessments are critical.

However, a better resource adequacy assessment combined with reasonable winterization of electric power and natural gas systems in Texas are likely adequate to the task. Fundamental changes to the Electric Reliability Council of Texas (ERCOT) market design could impose additional costs without predictable benefits.

Transmission Links

More substantial connections to neighboring grids would have reduced the depth and duration of the crisis. Proposals have been made, but appear to be mired in regulatory processes. The Public Utility Commission of Texas had directed ERCOT to prioritize rule developments needed for the Southern Cross proposal, but rules will mean little if the project cannot obtain regulatory permission from other states involved. FERC does not currently have authority to mandate transmission siting, but does bear significant responsibility for interstate transmission and wholesale power transactions crossing state borders. FERC’s authority over power flows in interstate commerce suggests it examine ways in which it can promote interstate transmission more effectively.

Many state legislatures, including in Texas, have granted existing transmission owners a right of first refusal (ROFR) over the construction of new transmission projects in their states. Supporters of ROFR provisions point to the benefits of working with experienced transmission owners. Critics of ROFR provisions say the provisions unnecessarily add costs and tend to discourage transmission expansion. If transmission expansion is part of the state’s response to the February energy emergency, the legislature may want to reconsider its ROFR law.

ERCOT and the PUC should ensure that rules can accommodate Southern Cross and are then generalized for any subsequent link. The PUC and FERC should adopt standardized procedures for such links to add predictability to regulations. FERC should guard against the use of state regulatory processes to impede interstate commerce in power.

New Technologies

The ERCOT market design has demonstrated an ability to accommodate new and improving technologies from wind and solar to batteries to distributed energy resources. Retail market rules have allowed retail electric providers (REPs) to offer the most diverse selection of retail supply contracts available, including market-based net metering proposals and offers providing home energy management capabilities. Risks associated with retail offers passing through wholesale costs have demonstrated such contracts are not wise for most consumers, but they have not undermined the value of allowing experimentation by retailers. Rather, competition in the market should be protected to foster continued innovation as technology and communications improve and open up new ways of creating customer value.

These changes are not likely to provide more than modest improvements to winter reliability in the short run, but are nonetheless desirable and will continue. Resource adequacy assessments should reflect whatever reliability benefits new technologies offer.

Financial Reforms

Resolving financial problems surrounding the energy emergency will be a particular challenge. A quick resolution reduces uncertainty, which allows market participants to move forward more confidently. Few investors will be willing to put millions of dollars into a system in which billion-dollar obligations remain unresolved. But resolving problems quickly can raise the cost or force the liquidation of market participants that may otherwise have been capable of reestablishing their financial position.

Legislators and regulators also have to be concerned about imposing unnecessary costs on outside investors and financial market participants. The presence of purely financial market participants helps the market run more smoothly by making it easier for physical market participants to enter into both short-term and long-term contracts. Imposing costs that do not reflect the actual costs associated with market participation will unnecessarily raise the cost of capital for market participants, slowing investment and ultimately resulting in somewhat higher prices for consumers.

Retail Competition

Some critics of retail competition in electric power took the opportunity of the Texas power outages to again state their case. One such article stated the point in its headline, “The real problem in Texas: Deregulation.” Reporters at The Wall Street Journal claimed that residential consumers in Texas had paid billions of dollars too much because of retail competition, although their calculations are inadequate to justify their conclusion. Adjusted for inflation, retail power rates in the competitive retail parts of Texas are lower than the rates charged in those areas when they were last regulated by the state PUC, which makes the overcharge claim hard to accept. The best economic analysis of Texas retail power prices, a peer-reviewed academic study published in the journal Energy Economics, found that retail competition brought cost savings to end consumers. Also, it is not the case that savings have come by cutting corners on reliability. Industry veterans Devin Hartman and Beth Garza report competitive markets have a superior reliability record overall.

Looking Forward

In responding to the power system failures, the identification of the root causes of failures will be critical. Many critics and analysts were quick to offer their long-favored prescriptions—limit renewables, add a capacity market, return to vertical integration—but the very rapidity of the prescriptions ensured they were not based on a deep understanding of what happened.

The days following the emergency have allowed a tentative picture of circumstances to be assembled, but more investigation remains. The weather was colder for longer across a larger portion of the state than ever recorded before. The widespread damages caused by a lack of access to electricity, including loss of life as Texans struggled to cope with the extreme cold, were disastrous, but examining ERCOT’s response shows that ERCOT did its job during the emergency.

The major failings happened before the bad weather hit. It may not prove to be cost-effective to fully weatherize every system component against the possible extremes of cold and heat experienced in Texas. Yet the severity of the failures, the lives lost to the cold, and the significant costs imposed on the state demand a careful look at the range of possible alternatives.

We should not overlook the point that the ERCOT power system has performed well under a wide variety of weather conditions. The regulations established to promote competition in ERCOT’s wholesale and retail markets have served the state well. While these regulations must change in response to the failures of February 2021, this fundamental commitment to competition should be maintained.

Full Policy Brief—The 2021 Texas Power Crisis: What Happened and What Can Be Done to Avoid Another One?

The post How To Prevent Another Texas Power Failure appeared first on Reason Foundation.

The Texas Power Fiasco Shows Need to Find a Balance With Wind Power and Other Renewables https://reason.org/commentary/the-texas-power-fiasco-shows-need-to-find-a-balance-with-wind-power-and-other-renewables/ Mon, 05 Apr 2021 04:00:45 +0000 https://reason.org/?post_type=commentary&p=41538 If you want a system that is heavy in intermittent power generation, you need to have adequate backup power standing by to kick in when the wind isn’t blowing.

The post The Texas Power Fiasco Shows Need to Find a Balance With Wind Power and Other Renewables appeared first on Reason Foundation.

Round like a circle in a spiral, like a wheel within a wheel
Neither ending nor beginning, on an ever-spinning reel…

The story of wind power has never been written so well as in the song “Windmills of Your Mind,” by Michel Legrand, with English lyrics by Alan and Marilyn Bergman. The song, written for a great movie, The Thomas Crown Affair, is about the world of high-stakes theft and subterfuge, where all is spin and nothing is as it seems. And the blame vortex surrounding the blackouts during the recent deadly winter storm in Texas is a stunning example of this kind of rotary obfuscation.

The battle lines on the “what we learned from Texas” debate were drawn practically before the rotors had stopped spinning, either on the wind turbines or in the gas, coal, and nuclear power plants that Texas has historically relied on to keep the lights burning. (Let’s leave solar out of this; at 2 percent of power production, it’s not relevant.)

If you wanted to scapegoat wind power, for whatever reason, as Texas Gov. Greg Abbott did, it was supposedly as obvious as the sheaths of ice on the turbines that wind was the culprit. Responsible for over 23 percent of Texas electricity generation at some points of the year, a lot of the wind power didn’t work so well in the deep freeze. So the narrative wrote itself for those looking to blame wind. They suggested wind power is unreliable because the wind is unreliable, the weather is unreliable, and so is the climate that drives the weather. Adding insult to injury, you can’t store the power to use when the wind isn’t blowing, even with Elon Musk in Texas. So, to those people, it was a wind-power failure, QED.

On the other hand, if you love wind power and oppose fossil fuels or conventional power plants, the narrative from your perspective was equally clear. The “obsolete” conventional power generation sector (fossil fuels and nukes) was to blame for a whole bunch of reasons, including a lack of proper maintenance, a failure to weatherize equipment in ways that the state had previously been warned about, an isolationist power grid separated from the rest of the country, and so on. In the face of a predictable storm, Texas’ coal and natural gas power plants weren’t ready to do the job they’re supposed to do.

Both narratives are in some aspects true.

Yes, many wind turbines weren’t producing power during the storm, but it’s also true that far more coal and natural gas power plants were down and unable to meet the demand of Texas customers. The Austin American-Statesman reported the Electric Reliability Council of Texas (ERCOT) breakdown of the problem across the types of facilities on Feb. 17:

ERCOT said all types of facilities, not just the ones that produce renewable energy, were affected by the statewide outages.

As of Wednesday, 46,000 megawatts of generation were offline, with 185 generating plants tripped. ERCOT officials said 28,000 megawatts came from coal, gas and nuclear plants, and 18,000 megawatts were from solar and wind.

Energy demand reached a record high Sunday and didn’t taper off as electricity usage typically does during overnight hours. The issue became critical when several of the grid’s energy generation units began to go offline in rapid progression, affecting more than half of the grid’s winter generating capacity, according to ERCOT Senior Director of System Operations Dan Woodfin.

These failing sources largely included nuclear plants, coal plants and thermal energy generators. Frozen wind turbines were a factor, too, but Woodfin said wind shutdowns accounted for less than 13% of the outages.

The problem in Texas was not simply about particular sources of power, nor the politics of power. The problem was a failure to properly blend the old with the new and to balance a system capable of meeting demand under those cold conditions…

An ERCOT report on generating capacity listed the top sources of power in the state:

Natural gas (51%)
Wind (24.8%)
Coal (13.4%)
Nuclear (4.9%)
Solar (3.8%)
Hydro, biomass-fired units (1.9%)

If you want a system that is heavy in intermittent power generation, like wind, you need to have adequate backup power standing by to kick in when the wind isn’t blowing. That’s obvious. What’s not obvious is that the problem arises not when the wind fails to produce power, but when it succeeds: while the wind is blowing, the backup systems operate at partial capacity and lose the revenue they need to maintain their ability to pick up the slack when the wind dies down. Add to that a general political climate that heavily disfavors properly maintaining, renovating, and upgrading conventional power generation or infrastructure, and you have some of the key contributing factors to the Great Texas Winter Blackout of 2021.

This challenge, balancing the economics of wind power against its effect on conventional forms of power generation, is where the overall concept of replacing conventional power with renewable power is failing and will likely continue to fail. That’s not because there’s anything wrong with wind power or conventional power. It’s a problem of cold, hard economics and governance. The various political actors steering power systems are often uninterested in, or incapable of, balancing the speed of renewable deployment with the economic needs of the established power generation systems that have kept the lights on when the wind doesn’t blow and the sun doesn’t shine. Striking that balance is the key to moving forward toward decarbonizing and expanding the electrification of power systems over time.

Spinning has always been part and parcel of using and generating power: wheels, axles, pulleys, crankshafts, generators, and so on. Similarly, spinning has always been part and parcel (cynically, perhaps the greatest part) of politics. And it’s those politics that led to the power fiasco in Texas. But there is potentially a ray of light. Hopefully, in the forensic aftermath of the Texas fiasco, more knowledge will be gained about how to strike the balance when integrating more renewables, both those we know and those we have yet to discover, into an existing system with massive economic momentum built up over decades of operation.

Politicians, activists, advocates, and opponents of wind, coal, gas, nuclear, hydropower, and even solar power all need to stop spinning around this fundamental truth: Whether one likes it or not, intermittent forms of energy generation are just as subject to the laws of economics as every other form of energy generation. We have to stop pretending it’s all one way or the other, renewable or conventional, and strive to find the proper balance that keeps the system working and prevents it from failing its customers in deadly ways, as it did in Texas. And that correct balance involves technology, economics, and, unfortunately, political rationality, which often seems more fickle than the wind.

The post The Texas Power Fiasco Shows Need to Find a Balance With Wind Power and Other Renewables appeared first on Reason Foundation.

Testimony: Florida Considers Electric Vehicle Fees to Replace Gas Tax Revenue https://reason.org/testimony/testimony-florida-considers-electric-vehicle-fees-to-replace-gas-tax-revenue/ Wed, 10 Mar 2021 14:00:39 +0000 https://reason.org/?post_type=testimony&p=40965 26 states have already implemented minor electric and hybrid vehicle fees to pay for infrastructure maintenance.

The post Testimony: Florida Considers Electric Vehicle Fees to Replace Gas Tax Revenue appeared first on Reason Foundation.

Senate Bill 140 would create annual flat fees for electric vehicles and plug-in hybrid electric vehicles in the state of Florida. It is paired with Senate Bill 138, which would direct the Florida Department of Transportation to create an Electric Vehicle Infrastructure Grant Program that distributes grants to applicants with matching funds to install electric vehicle charging infrastructure throughout the state, and which would provide a one-time $5 million appropriation to implement the grant program.

There are approximately 66,700 electric vehicles (EVs) and plug-in hybrid vehicles currently driven in Florida. Using this number, we can estimate that the new fees established by this legislation would generate roughly $9 million or more in revenue per year.

Fees such as those proposed in SB 140 are common across the country—26 states have imposed them already, and the fee levels proposed in SB 140 are about in the middle of the range compared to other states. States have been motivated to implement such fees mainly by projections of lost transportation user fee revenue in the form of fuel taxes, which electric and hybrid vehicles pay little or none of relative to their use of the infrastructure.

The Florida Department of Transportation, in its EV Infrastructure Master Plan, estimates fuel tax revenue losses by 2040 of between 8.4 percent and 30.0 percent, depending on how rapidly EV adoption grows. Needless to say, there will be no reduction in the need for roads and road maintenance as the mix of vehicles increasingly shifts to electric and the state’s population, economy, and vehicle-miles traveled continue to grow.

It is only fair that owners of electric and plug-in hybrid electric vehicles also pay for the building and maintenance of the roads they use. A new annual fee for these vehicles is an efficient way to do so. And the fee proposed in SB 140 will not discourage the adoption of electric and hybrid vehicles as their $1,650 average savings on gasoline per year is far more than the $135 or $150 annual fee.
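
For readers who want to check the arithmetic, here is a minimal sketch using only the figures cited above (the vehicle count, the $135 and $150 fee levels, and the $1,650 average gasoline savings); it is a rough estimate, not a revenue forecast:

```python
# Back-of-the-envelope check of the revenue and driver-cost figures cited above.
# All inputs come from this testimony; treat the output as a rough estimate only.
ev_and_phev_count = 66_700       # electric and plug-in hybrid vehicles in Florida
fee_low, fee_high = 135, 150     # proposed annual flat fees, in dollars
avg_gas_savings = 1_650          # average annual gasoline savings per vehicle, in dollars

print(f"Annual revenue: ${ev_and_phev_count * fee_low:,} to ${ev_and_phev_count * fee_high:,}")
# -> Annual revenue: $9,004,500 to $10,005,000

for fee in (fee_low, fee_high):
    print(f"${fee} fee: owner still keeps ${avg_gas_savings - fee:,} of annual fuel savings")
```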

Finally, the grant program in SB 138 would use the first five years of revenue from these fees to provide charging infrastructure for electric and plug-in hybrid electric vehicles, providing these drivers a direct user benefit for their user fee. In subsequent years, the fees would help pay for road maintenance in the state. Moreover, the proposed grant program uses a public-private partnership approach where private parties who need charging infrastructure for their workers or visitors share the costs of installing it with users via the state program.

These policies will help Florida achieve the growth in electric and hybrid electric vehicles that so many want to see for environmental reasons by improving electric charging infrastructure while simultaneously creating a system for those vehicles to pay their fair share for the roads they use in the years to come.

The post Testimony: Florida Considers Electric Vehicle Fees to Replace Gas Tax Revenue appeared first on Reason Foundation.

Nevada Ballot Initiative Analysis: Question 6 (2020) https://reason.org/voters-guide/nevada-ballot-initiative-analysis-question-6-2020/ Mon, 28 Sep 2020 18:06:35 +0000 https://reason.org/?post_type=voters-guide&p=36896 Question 6 would double the state’s Renewable Portfolio Standard (RPS) from the current 25 percent to be 50 percent by 2030.

The post Nevada Ballot Initiative Analysis: Question 6 (2020) appeared first on Reason Foundation.

Nevada Question 6: Nevada Renewable Energy Standards Initiative

Summary:

Renewable Portfolio Standards require that regulated electric utilities collect a minimum percentage of electricity from designated renewable sources, such as wind turbines or solar panels. Nevada’s Question 6 would double the state’s Renewable Portfolio Standard (RPS), from the current 25 percent to 50 percent by 2030.

Nevada voters approved this question in 2018 in the first of two required votes to place it into the state constitution.

Fiscal Impact:

The true fiscal impact is unknown as of this writing. Critics suggest costs could run into the hundreds of millions of dollars once potentially higher costs of energy generation and distribution and higher consumer rates are considered. Much of the additional cost would likely be borne directly by electric consumers, but state and local governments could face higher costs as well.

Proponents Arguments For:

Proponents argue that increasing Nevada’s Renewable Portfolio Standards is the best way to make the necessary move to cleaner, renewable energy. Supporters say that moving to 50 percent renewable energy will add thousands of jobs and billions in investments to the state’s economy in the renewable energy sector. Question 6 would make Nevada a self-sufficient leader in renewable clean energy, taking advantage of its natural supply of solar and geothermal resources, and would dramatically reduce carbon and sulfur pollution in our air. Supporters say now is the time to make this change because renewable energy is already more reliable than fossil fuels and it keeps getting more affordable. Nevada currently gets 80 percent of its electricity from out-of-state natural gas, and this over-reliance on fossil fuels leaves Nevadans vulnerable to price spikes and supply disruptions that Question 6 would end, its proponents argue.

Opponents Arguments Against:                                 

Opponents of Question 6 argue that a higher RPS will impose higher electric rates and lead to less-reliable power for households and businesses. Nevada already has one of the most diverse renewable energy portfolios in the nation, including 19 geothermal projects, 14 solar projects, and roughly a dozen wind, biomass, hydro, and waste heat projects, with additional renewable energy projects in the planning process. Forcing a further shift to renewables before the market is ready will hurt consumers, opponents say. Furthermore, prices of renewables are still high, and the measure would cost jobs in manufacturing, agriculture, and tourism. Question 6 would impose the same type of RPS in Nevada that is partially blamed for California losing a large percentage of its industrial base since 2000, opponents say. California’s residential utility rates are 57 percent higher than Nevada’s, and its industrial rates are more than double, according to statistics from the U.S. Energy Information Administration. Putting Question 6 into Nevada’s state constitution means that if it turns out to be impractical to implement or too costly to the state’s economy, the only way to fix it would be with another constitutional amendment.

Discussion:

While Renewable Portfolio Standards may expedite the adoption of renewable energy, the transition does appear to come at a higher cost to customers. A recent comprehensive review of RPS programs by the University of Chicago shows a significant increase in costs alongside more modest gains in renewable energy use. This is in part because renewable power from sources like solar and wind is generated geographically farther from the city centers it serves, resulting in higher infrastructure and transmission costs, but also because of the costs imposed when energy companies are mandated to abandon old technologies and systems. Indeed, Question 6 could mean Nevada utilities have to abandon existing power plants before the end of their useful lives. Furthermore, ratepayers would still be required to pay for those unused power plants.

There is also some controversy over what types of technologies would be designated as renewable. Hydroelectric dams, for instance, provide renewable energy but do not qualify as an approved renewable source in Nevada. Generally, regulated utilities can generate power internally or purchase it wholesale from an independent power producer utilizing an approved technology. Regulated utilities have an incentive to own and operate their own power plants since their allowed rate of return is based on a percentage of equity. NV Energy has been criticized in the past for supporting plans to prematurely shutter its power plants and replace them with solar plants in part to increase the utility’s equity and rate of return for shareholders while causing rates to increase for businesses and households.

It is also possible an RPS could be optimally effective at a requirement lower than 50 percent. There is some evidence that these requirements have helped to lower the costs of renewable energy generation by giving manufacturers of solar panels and wind turbines a guaranteed market into which their products will be sold.

California has recently experienced rolling blackouts, in part, because of its abandonment of traditional power sources for renewable sources. Although solar panels tend to produce electricity reliably during peak hours, wind turbines have highly variable output that tends to peak during off-peak hours, when it is less useful. By contrast, nuclear and coal-fired power plants can supply a steady level of baseload power, and natural gas turbines can be quickly scaled up on demand, as needed. Grid managers must balance the supply and demand for electricity at any given moment, which can be more challenging with renewable sources since the production of wind turbines especially can vary greatly over short time periods. If reducing carbon dioxide emissions is the underlying goal of Renewable Portfolio Standards, there are many ways of achieving the goal without specifying in the state constitution what utilities must do.

Voters’ Guide to Nevada’s Other 2020 Ballot Questions

Voters’ Guides to 2020 Ballot Initiatives In Other States

The post Nevada Ballot Initiative Analysis: Question 6 (2020) appeared first on Reason Foundation.

Hacking the United Kingdom’s Electricity Grid https://reason.org/commentary/hacking-the-united-kingdoms-electricity-grid/ Thu, 07 May 2020 19:00:28 +0000 https://reason.org/?post_type=commentary&p=34048 Across Britain, lone hackers—a term that has come to mean high-tech tinkerer—are creating ways to harness energy price and usage data that has only recently become available.

The post Hacking the United Kingdom’s Electricity Grid appeared first on Reason Foundation.

Kim Bauters, Ph.D., is an expert in artificial intelligence. If you want to know the best gelato recipe to use the ingredients you have on hand, he has developed a smartphone app that will help you. If you want to know the best recipe to make soap, he’s put AI to work for you with another app called Soap Genie.

If, however, you want to save electricity and reduce your carbon footprint, Bauters explains, that’s much easier “because there isn’t an artificial intelligence component.” So, he created an app that helps homeowners calculate the best times of day to use their appliances.

Bauters isn’t the only one helping people go green and save money. Karolis Petruskevicius, the founder of Homely Energy, used his parents as guinea pigs to see if the energy-saving technology he was developing would work. And Mick Wall taught himself several programming languages as he developed a web page showing the lowest-cost times to use household appliances.

Across the U.K., lone hackers, a term that has come to mean high-tech tinkerer, are creating ways to harness energy price and usage data that has only recently become available. Their creations tell people how to reduce CO2 emissions and save money by reducing their electricity use. Remarkably, by reducing electricity demand at specific times, they are also helping balance the electrical grid.

Two innovations made this possible. First, an electric and gas company called Octopus Energy created a variable-rate pricing plan called Octopus Agile that uses prices to reward customers for using electricity when supply is abundant and increases prices when demand is high. Second, Octopus made real-time price data available to anyone with enthusiasm and a little programming skill. Octopus encouraged everyone who would listen to create new tools for their customers.

The results are just beginning to appear, but these innovations are attracting the attention of energy experts and managers in the United Kingdom. Thanks to the democratization of information, individual hackers like Bauters, Petruskevicius, and Wall can create, all by themselves, what used to require a team of experts with a big bank account. Octopus is also adding customers, thanks in part to these innovators who make trying this unique approach to electricity pricing less daunting and more rewarding.

Helping homeowners respond to fluctuations in price—encouraging them to use renewable energy when it is available—will become even more important as the National Grid Electricity System Operator moves toward its goal of being able to operate Great Britain’s grid carbon-free by 2025. Meeting that goal while delivering power when people need it will be very difficult without the help of homeowners. That difficulty will become more pronounced as a growing number of electric vehicles adds to the demand for electricity in the U.K.

In the midst of these large and challenging trends, we are increasingly likely to find individual hackers nibbling away at the problem with tools ranging from simple information sharing to gadgets driven by artificial intelligence and complex algorithms. Without them, the transition to a carbon-free grid would be more expensive and difficult.

A new approach to electricity pricing 

Nobody knows that better than Phil Steele of Octopus Energy, whose official title is “Future Technologies Evangelist.” His job is to proselytize about the need to create the technology and tools customers will need in the changing electricity landscape.

“It is all about trying to get consumers on to renewable energy,” he explains. The best way to do that is to help customers use electricity during the times of day when it is generated by renewables. Currently, that is a challenge.

On a typical day, the amount of electricity that people want increases in the morning as they wake up and head to work. It declines slightly through the middle of the day and then peaks during the late afternoon and evening as people return home, turn on the TV, and cook their dinner. This pattern, while fairly predictable, can double energy demand—or cut it in half—in just a few hours. That large swing requires grid managers to find more supply by turning on and off “dispatchable” sources of electricity, often natural gas or coal. The peaking plants that generate this electricity are not only less environmentally friendly, but they are also the most expensive. Shifting demand away from peak hours would not only cut CO2 emissions, but it would also reduce the need for the most expensive sources of electricity.

Few people, however, realize how expensive it is to turn on their tumble dryer or dishwasher during those peak hours because they have been protected from big fluctuations in the prices electric companies must pay to meet that demand. When customers did not have the capability of following real-time price fluctuations, a stable rate structure made sense. Most electricity rates protected customers, charging based on simple price schemes that often had only two price tiers—one for the daytime, and one for overnight. Most electric companies in the U.K. still use a similar system.

Recognizing the new opportunities created by information technology, Octopus created their Agile rate which varies every half-hour during the day. “We launched Agile Octopus to encourage households to shift their consumption from the daily peak energy demand period of 4 p.m. to 7 p.m. when we’re paying very high wholesale prices,” said Steele. If the electric company can buy less energy when wholesale prices are high, they can pass the savings on to the customer. And the savings can be significant.

For example, at 3:30 p.m., customers could be paying less than three pence for a kilowatt-hour (kWh) of electricity. A half-hour later, that price might jump to 17p per kWh. When 7:30 p.m. rolls around, those prices plunge to eight or nine pence. Price changes during the rest of the day are less dramatic, but they can go from 1p up to 4p in just 30 minutes.
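
To make those swings concrete, here is a small sketch comparing the cost of one appliance cycle started in different half-hour slots, using the illustrative prices just quoted; the 1.2 kWh cycle size is an assumed figure for a typical dishwasher run:

```python
# Cost of the same appliance cycle at the illustrative Agile prices quoted above.
# The 1.2 kWh cycle size is an assumption for a typical dishwasher run.
cycle_kwh = 1.2
prices_p_per_kwh = {"3:30 p.m.": 3.0, "4:00 p.m.": 17.0, "7:30 p.m.": 8.5}

for start, price in prices_p_per_kwh.items():
    print(f"Start at {start}: {cycle_kwh * price:.1f}p")
# Shifting the start by half an hour, or waiting a few hours, changes the cost severalfold.
```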

Perhaps most remarkably, prices can even become negative, when Octopus actually pays customers to use more electricity. This typically happens in the middle of the night, when demand is low and there is an excess of wind power. Last December, when Storm Atiyah blew in, the amount of wind energy on the grid, enough to power about 15 million homes in the U.K., doubled in 24 hours. 

That electricity had to go somewhere. Rather than turning other generating stations off, which is expensive, Octopus decided to pay people to use the short-term surplus. Between 1:30 a.m. and 6 a.m. on Dec. 8, 2019, Octopus paid customers about 1 pence per kWh. As Octopus staff wrote on their blog, “overnight storms triggered the energy equivalent of an ‘everything must go’ sale.” If you owned an electric car, you hit the jackpot. Imagine what would happen if filling stations paid to fill your tank with petrol.

Grid managers noticed what Octopus was doing. The director of operations for the GB grid tweeted “a big thank you” for helping create demand to balance the excess supply.

Paying people to use electricity, however, doesn’t make a difference if nobody knows about it. To help customers, Octopus publishes its rates online, as well as making them available to programmers using an Application Programming Interface, or API. Put simply, the job of an API is to answer questions asked by another app. Programmers send a query to the API—a more technical version of “what will tomorrow’s electricity prices be in London?”—and the API returns the data.

Programmers can ask for the prices customers will pay during the next 24 hours. They can check the energy usage for individual meters. Octopus encouraged programmers to find new ways to use all this data. In 2018, they hosted a “Hack Day,” attracting more than 100 computer programmers and tinkerers. “We wanted to enable other tech innovators in the energy sector access to this incredible data,” Octopus staff explained, “so we …cracked open our Agile Octopus API for public development.”
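
As a concrete illustration of the kind of query these tinkerers write, here is a minimal sketch of a request for Agile half-hourly rates. The endpoint pattern, the product and tariff codes, and the response field names follow Octopus’s publicly documented API but are assumptions here; check the current developer documentation before relying on them:

```python
# Minimal sketch of a query for Agile half-hourly unit rates.
# Product/tariff codes and field names are illustrative assumptions; consult
# Octopus's developer documentation for the codes that apply to your region.
import requests

BASE = "https://api.octopus.energy/v1"
PRODUCT = "AGILE-18-02-21"            # example Agile product code (assumed)
TARIFF = "E-1R-AGILE-18-02-21-C"      # example regional tariff code (assumed)

url = f"{BASE}/products/{PRODUCT}/electricity-tariffs/{TARIFF}/standard-unit-rates/"
response = requests.get(url, params={"page_size": 48})   # about one day of half-hour slots
response.raise_for_status()

for slot in response.json()["results"]:
    print(slot["valid_from"], slot["value_inc_vat"], "p/kWh")
```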

Individuals have taken up that challenge, and the results are beginning to appear.

Tools to empower energy customers

The most basic way to help consumers adjust to the Agile pricing is to provide timely and useable information. When Mick Wall started playing with the Octopus API, he only wanted the information for himself. His home already had solar panels, so he appreciated the value of using electricity when it was available. As an Octopus customer, he wanted to know more about pricing but the detail he wanted was not available. “You could download small amounts of Agile tariff data for a single U.K. region from the Octopus website,” he said, “but not my region.” That’s when he started his “journey of discovery.”

Teaching himself several programming languages, he created a program that downloaded the data into a spreadsheet. Since he already managed the web page for his running club, he created energy-stats.uk and shared what he found. Soon, people began asking him to generate graphs, and over his Christmas break he added the ability for visitors to download the data. The simplicity of the site, providing electricity prices for regions of the U.K., is what makes it useful. When I asked him what innovations were next, Wall said he liked it the way it was. “I fear adding more stuff will take away from what the site is good at, just delivering the Octopus pricing data each day,” he said.

He’s probably right. He notes that the basic information on his site changed how he uses electricity and says many people have been in touch to thank him, saying it has helped them save money too. “The Octopus Agile tariff is a game-changer,” he says. “I defy anyone not to change their habits when they know they can save lots of money.”

That is also what inspired Kim Bauters to put his knowledge of computer science to work. He knew the price data was available, but, “People in my family found it was hard to track, so I decided to create an app.” Initially, he created an app called Octopus Watch for the Apple Watch but has now added versions for the iPhone and Android. Unlike his other apps that use AI to create soap and gelato recipes, the programming for Octopus Watch was straightforward – take the data provided by the API and make it digestible. “All of these incentives don’t necessarily translate into a user doing something,” he said. The amount of data could be overwhelming.

“It is like the Google problem of having too much data,” he said, referencing the trend toward “big data,” where individuals and businesses are inundated by the information. Octopus Watch takes all that data and translates it into simple guidelines about when to turn on energy-intensive appliances like clothes dryers and dishwashers. Before the app, “we didn’t use timers on any appliances,” said Bauters, “but now we routinely use the app to find the best slots.” It is paying off.

The combination of the Octopus Agile prices and his app is yielding a “fairly consistent 35 percent savings,” he told me. For owners of electric cars, the savings “tend to be much more toward 60 to 70 percent.”
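
The “best slot” lookup such an app performs can be sketched in a few lines: given a day of half-hourly prices, find the cheapest contiguous window long enough to run an appliance. This is a generic illustration of the idea, not Octopus Watch’s actual code, and the prices below are made up:

```python
# Find the cheapest contiguous run of half-hour slots for an appliance cycle.
# Generic illustration only; the prices are invented, not real Agile data.
def cheapest_window(slots, slots_needed):
    best_start, best_cost = None, float("inf")
    for i in range(len(slots) - slots_needed + 1):
        cost = sum(price for _, price in slots[i:i + slots_needed])
        if cost < best_cost:
            best_start, best_cost = slots[i][0], cost
    return best_start, best_cost / slots_needed   # start time, average p/kWh over the window

slots = [("13:30", 2.9), ("14:00", 3.1), ("14:30", 3.0), ("15:00", 4.2),
         ("15:30", 2.8), ("16:00", 17.0), ("16:30", 18.5), ("17:00", 16.2)]
print(cheapest_window(slots, 4))   # a 2-hour cycle -> ('14:00', 3.275)
```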

To take the next step in energy savings, however, people will probably need a little help. You may not need artificial intelligence to find the best time to dry your clothes, but it is handy when trying to keep your home comfortable amidst daily price fluctuations. That is the problem Karolis Petruskevicius and his colleague Ignas Bolsakovas wanted to solve.

A Ph.D. candidate in power networks, Karolis has been focusing on modeling electricity prices in the U.K. with a growing number of residential heat pumps and an increase in power production from intermittent renewable sources such as solar and wind. Working with his advisor, he developed a model and he wanted to test it. His parents had a heat pump and he decided to make them his test subjects.

Ground-source heat pumps are expensive to install but can be extremely efficient. Taking heat from the pipes in the ground, heat pumps return about four units of heat for every one unit of electricity. “You are getting three units of energy for free from renewable sources,” Karolis explained. “It is going to be key to decarbonizing.” The problem is to decide when to use electricity to run the pump. 

The algorithm he developed turned on the heat pump before prices jumped during peak demand, relying on the home’s insulation to keep temperatures at a comfortable level until prices fell again. “When price is low you slightly overheat the home. What it would do is overheat the house slightly before 4 p.m. and then let temperatures drop until 7 p.m.” With the tolerance of his parents, he found the system worked. 
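
The preheat-and-coast idea can be sketched as a simple rule: boost the setpoint in the cheap slots just before the 4 p.m. price jump, then let the house coast on its insulation until prices fall again. This is an illustration of the concept only, not Homely Energy’s algorithm; the price threshold and temperature setpoints are invented:

```python
# Illustration of the preheat-and-coast rule described above. Not Homely Energy's
# actual algorithm; the price threshold and temperature setpoints are invented.
def heat_pump_setpoint(hour, price_p_per_kwh, comfort=20.0, boost=1.5, peak_price=10.0):
    """Return a target indoor temperature in degrees Celsius."""
    if price_p_per_kwh >= peak_price:
        return comfort - 1.0      # coast through the expensive evening peak
    if 14 <= hour < 16:
        return comfort + boost    # slightly overheat before the 4 p.m. price jump
    return comfort

# Example afternoon: cheap at 3 p.m., expensive at 5 p.m., moderate at 8 p.m.
for hour, price in [(15, 3.0), (17, 17.0), (20, 8.0)]:
    print(f"{hour}:00 at {price}p/kWh -> setpoint {heat_pump_setpoint(hour, price)} C")
```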

Monitoring the heat pump is more than most people can do on their own. Partnering with Ignas, a computer scientist, he launched Homely Energy, and created a smart thermostat for heat pumps that uses Octopus Agile price data. Their app also works with heat pumps that are already internet-connected, turning them on and off based on the same price algorithm he used at his parents’ home. “We think that we can save up to 30% compared to fixed tariffs,” he says. 

If his projections are right, the market for Homely’s thermostat and app could grow dramatically. Petruskevicius cites estimates from the National Grid that half of the population of the U.K. could have a heat pump by the year 2050. He believes that as people see the economic and environmental benefits of heat pumps, more people will give them a try. 

Another person who is convinced is Jan Rosenow, the European Program Director at the Regulatory Assistance Project, an independent energy think tank. “The main reason I chose Agile is that we installed a heat pump,” he told me. “The interest was to see if we could get benefit out of load shifting.” 

Where does one of Europe’s leading experts on energy demand turn for information about how to manage his personal energy use? “I rely on third-party apps,” he answered. Prior to switching to Octopus Agile, Rosenow was paying about 14p per kWh on average. Now, thanks to the price information, he says he pays about 7p or 8p per kWh.

Rosenow is optimistic about the ability of market prices and consumer information to help consumers adjust to the changing electricity markets that the U.K. and Europe will see in the future. He explains that the main benefit of using prices and information to change the way people use electricity is to “shift that additional load outside peak hours and absorb renewables or solar during the day.” Charging an electric car during the day, rather than when people get home, not only avoids the hours of peak demand, it also uses the solar energy available during the middle of the day. Users get two benefits at once – cheaper energy and smaller environmental impact.

Open data for green energy

The U.K. is just beginning to see these innovations make a difference, and there is still a long way to go. As the number of innovators grows, they realize they are part of a community. Mick Wall is building a list of devices and apps that work with the Octopus API. These range from intelligent plugs like Ecopush, to chargers for electric cars. More innovations are most certainly coming.

The catalyst for those creating tools that take advantage of Octopus’s variable rates was the decision to make the data available. “Without the APIs we wouldn’t be able to do what we are doing,” says Bolsakovas of Homely Energy. That data is also helping improve the performance of future innovations. “We also need more accurate data to make our algorithms work properly,” and instead of relying only on data from Petruskevicius’s parents’ home, they can now use the enormous amount of data provided by the API and their hardware. “The more open data we have the better.”

This seems like small stuff, but the potential is huge. A study by Octopus found that customers with Agile pricing and an electric car shifted their energy use from the more traditional pattern – peaking around 7 p.m. – to peaking in the early morning hours around 2 a.m. This shift reduced their total electricity use during peak hours by nearly 50 percent.

Open data also means an increasing diversity of solutions to suit the needs of customers. Not everyone has an electric car or heat pump. For many people, the primary electrical draw may be a clothes dryer. Whatever the case may be, from a simple web page like energy-stats.uk to a more sophisticated controller using data-driven algorithms like the Homely heat pump controller, customers can already find something that suits their needs. As more electric companies introduce similar pricing systems, the number of customers who can benefit from these tools will grow, and so will the incentives to create more apps.

Octopus seems to agree. After Storm Atiyah moved through, the company managers wrote on their blog, “For years, the mainstream thinking was that renewable or sustainable products were expensive or niche. Over the weekend, a combination of smart meters, smart energy tariffs and a couple of windy nights helped us make the U.K.’s greenest energy the cheapest too.”

Petruskevicius wants to be part of undermining that mainstream mindset. “I think it is the future,” he says, and services like Homely Energy’s will be more necessary “when we have a lot more loads like batteries, EVs and heat pumps.”

These early energy hackers did not intend to overturn the mindset and change the way the U.K. – and, perhaps, others in Europe and North America – think about and use electricity. Curiosity and self-interest were their primary motives. As the changes in electricity generation and markets accelerate, however, hackers like Wall, Bauters, and Petruskevicius will be pioneers in that important transformation.

The post Hacking the United Kingdom’s Electricity Grid appeared first on Reason Foundation.

What The World Needs Now Is A Free Market for Oil https://reason.org/commentary/what-the-world-needs-now-is-a-free-market-for-oil/ Wed, 15 Apr 2020 04:03:32 +0000 https://reason.org/?post_type=commentary&p=33717 "Better to live and die on the free market where a company can decide its own fate."

The post What The World Needs Now Is A Free Market for Oil appeared first on Reason Foundation.

The age of managed oil markets is dead. If that wasn’t apparent before now, Mexico’s decision to unfurl the black flag when asked to contribute to reducing global oil production by 10 percent and risk scuttling the shaky détente between Saudi Arabia and Russia was the final nail in the coffin.

First Russia, then Mexico. 

Russian President Vladimir Putin sparked the current price war with Saudi Arabia and other members of the Organization of Petroleum Exporting Countries (OPEC) more than a month ago with a nyet to further limits on Russian output. In a snit over Russia’s snub, Saudi Crown Prince Mohammed bin Salman (MBS) opened the taps and increased global production by 20 percent almost overnight. 

Oil prices plummeted in response, decimating national budgets and compounding the economic pain caused by the global response to the coronavirus pandemic.  

With the world’s biggest economies against the ropes, Putin and MBS agreed to put down their drill-bits and play nice, which is when Mexico’s President Andres Manuel Lopez Obrador decided to cause trouble. 

To a market managed by this lot, good riddance. 

Even though the Saudi-led OPEC and Russia eventually struck a bargain late Sunday to reduce production by 9.7 million barrels a day, Mexico’s ability to turn global oil markets into a telenovela—with an equally unbelievable resolution—raises the question: why does the world put so much power in the hands of a cartel?

Every oil-producing country involved in the unprecedented negotiations attempted to get a leg up on the competition. Mexico was even able to extract a promise from President Trump to cover most of its portion of the cuts so that it could carry on with plans to rebuild its oil and refining sectors. 

It is time to accept that collective international action will not overcome the national self-interest of individual politicians and that attempts to intervene in the market are doomed to deliver bad results. The answer to countering this type of national self-interest is simple—let the market work. 

The market is far more efficient at solving the problem of supply and demand than governments—the ongoing international circus of tweets and virtual meetings influencing markets is all the evidence needed to end the debate about that. 

The problem is that markets can be brutal and they don’t play favorites. 

Left alone, the market will reward the lowest-cost producer and punish producers with higher breakeven costs. Those forces could lead to more failed companies and job losses in the U.S. oil sector, and in other places with robust private energy sectors (Canada, looking at you), but those jobs are being destroyed now by the decisions of a tiny group of authoritarian leaders. 

Do we really want to continue to leave our economic and national security vulnerable to the whims of Putin or Saudi Crown Prince Mohammed bin Salman or Mexico? 

The traditional argument in favor of active management is that wild fluctuations in the price of oil have global implications because of its universal importance to defense and heavy industry. In that case, we either need to find better management or diversify our energy supply. 

On a global free market, national oil companies that benefit from generous state support could have an unfair advantage over private companies—if they are well managed. The oil world is littered with examples to the contrary. If Mexico wants to keep pumping out 1.6 million barrels of oil a day and building refineries instead of buying lower-cost gasoline from its northern neighbor, best of luck to it.

The United States is one of the few countries without a national oil company, so a free market for oil will require adequate protections against states that subsidize domestic production and then export it. But every market requires clear rules and strict enforcement mechanisms to ensure a level playing field. 

Supply is not the immediate challenge, though. Removing 10 million or 15 million barrels a day from global markets will at best establish a floor below which prices won’t fall further. That price, though, may still be too low for a great number of U.S. oil companies to continue to produce, especially small and mid-sized independents that have traditionally been at the forefront of new discoveries. 

A global agreement on production levels will result in a price that works for Russia and Saudi Arabia—and in the economic strangulation of those U.S. companies. Better to live and die on the free market where a company can decide its own fate.  

Restore demand and the market will right itself. Prices will eventually rise as excess oil is drained off, making room for new competitors to enter the market. Real competition will encourage innovation and efficiency, hallmarks of U.S. producers. 

In the meantime, global leaders should stop squabbling over who gets the biggest slice of the oil market and focus on developing a vaccine for COVID-19 so we can lift the global quarantine and get back to work.

Restart economic activity and demand for oil, along with demand for everything else, will likely follow. Massive infrastructure projects require lots of labor and fuel, so governments could prioritize planned maintenance and expansion projects as well. Modernizing transportation, energy, and communication infrastructure would help work off the build-up of crude in storage and the growing spare tire many of us have acquired after a month of watching Netflix’s “Tiger King” from our couches. And when the bill comes due, hopefully, we’ll have far fewer potholes and crumbling bridges to contend with.

The post What The World Needs Now Is A Free Market for Oil appeared first on Reason Foundation.

PG&E’s Settlement Won’t Fix Its Problems and Consumers Deserve Choices https://reason.org/commentary/pges-settlement-wont-fix-its-problems-and-consumers-deserve-other-options/ Fri, 20 Dec 2019 05:01:44 +0000 https://reason.org/?post_type=commentary&p=30543 Routine power outages and rolling blackouts appear to be Californians’ new normal for at least another decade.

The post PG&E’s Settlement Won’t Fix Its Problems and Consumers Deserve Choices appeared first on Reason Foundation.

Pacific Gas & Electric is finalizing a $13.5 billion settlement that would go to individuals impacted by deadly fires in the state in recent years, including the Camp Fire in Paradise that killed 85 people. PG&E’s recent decisions to proactively create blackouts and shut off power to various parts of the state were an effort to avoid a repeat of its deadly mistakes. PG&E’s transmission infrastructure is so outdated that many fear it could cause more wildfires. It turns out the median age of a PG&E transmission tower is 68 years, some towers are over 100 years old, and the towers have a useful life of about 65 years.

As PG&E looks to emerge from Chapter 11 bankruptcy, there’s a lot of debate about what to do now. In response to the power shutdowns that had many people comparing California’s electricity infrastructure to that of a third-world country, Gov. Gavin Newsom raised the possibility of a state takeover of the utility, saying, “All options are on the table.”

Similarly, two dozen mayors, including those in San Jose, Sacramento, and Oakland, are recommending converting PG&E into a publicly-owned co-op. The plan would potentially put taxpayers on the hook for damages related to future fires caused by PG&E’s equipment.

PG&E CEO Bill Johnson recently admitted that it could take another decade for the utility to improve its grid enough that it no longer has to conduct power shutoffs to avoid causing wildfires. “I think this is probably a 10-year timeline to get to a point where it’s really ratcheted down significantly,” Johnson said of the blackouts.

Thus, routine power outages and rolling blackouts appear to be Californians’ new normal for at least another decade. Rather than accept this failing status quo, and instead of increasing taxpayers’ risk or government’s role in PG&E, California should give serious consideration to an entirely different model—competitive electricity markets.

The state’s regulated monopoly approach has failed, resulting in tragic deaths and massive property damage, not to mention some of the highest energy prices in the nation and unreliable service.

Texas offers an interesting counterpoint.  In 1999, Texas launched an effort to bring competition and consumer choice to its electricity market.  Since that time, rates in its competitive market have declined by 31 percent and Texans now have access to some of the cheapest electricity in the country.  For comparison, according to the most recent data from the US Energy Information Administration, the price per kilowatt-hour across all sectors is $0.09 in Texas and $0.19 in California.

Additionally, as competition increased in its electricity market, Texas also became a leader in renewable energy. “Texas continues to dominate the nation’s wind energy production, adding far more generating capacity than any other state last year and having more installed wind power capacity than all but five countries in the world,” the Houston Chronicle reported last year.

California’s politicians and customers may still have bad memories of the disastrous results of the state’s last flirtation with energy deregulation in the early 2000s. But that complex effort wasn’t deregulation—it only attempted to deregulate prices on the wholesale market while holding them fixed on the retail market. It also prohibited retail providers from signing long-term purchase agreements with power producers that could have safeguarded against price volatility. In short, the poorly designed system paved the way for market manipulation and the rolling blackouts and high prices that consumers faced at the time.

Since then, however, a dozen states have successfully implemented consumer choice into their electricity markets.  As a whole, they’ve fared well, cutting prices and expanding options for consumers.  Instead of accepting a decade of blackouts while PG&E struggles to modernize and improve its infrastructure, this is an opportune time for California to reconsider the direction of its electricity markets.  Californians deserve better than what they’re getting, and competition would deliver choices and alternatives to consumers.

This column originally appeared in the Orange County Register.

The post PG&E’s Settlement Won’t Fix Its Problems and Consumers Deserve Choices appeared first on Reason Foundation.

The Truth About Electric Choice in Nevada https://reason.org/commentary/the-truth-about-electric-choice-in-nevada/ Mon, 01 Oct 2018 12:00:09 +0000 https://reason.org/?post_type=commentary&p=24730 Nearly every argument being offered against electric choice is little more than a straw man.

The post The Truth About Electric Choice in Nevada appeared first on Reason Foundation.


With a campaign season upon us, politicians and other usual suspects are again in full swing, promoting over-the-top rhetoric about how the world will end if we don’t vote a certain way. One area of unusual focus this year has been Question 3, a proposed amendment to the Nevada Constitution that would require the Legislature to create “an open, competitive retail electric energy market.”

It’s rare that citizens get a chance to vote on their right to choose among electricity providers. In the states that have established competitive retail markets for electricity, it has never been accomplished at the ballot. As a constitutional amendment, the measure must pass the ballot twice consecutively. It already garnered 72 percent of the vote in 2016, so, if a simple majority of Nevada voters approve Question 3 this year, it will become law.

This reality has led NV Energy, the state’s current monopoly electricity provider, to pour nearly $12 million into ads intended to scare Nevadans about the implications of choice and competition.

The most oft-cited example is a half-hearted and botched attempt at implementing a retail electricity market by California nearly 20 years ago. But that’s a poor example because California’s regulation included so much price-fixing and needless mandates that it’s hard to characterize the effort as a real attempt at providing consumers with electric choice to begin with.

Essentially, California prohibited any movements in the price of retail electricity but deregulated prices on the wholesale market. It simultaneously required utilities to purchase power from independent power plants but prohibited them from signing long-term contracts. The entire framework was poorly constructed and destined to fail from the outset.

The good news for Nevadans is California’s botched attempt is an extreme outlier in the world of electric choice. At least a dozen states have created competitive retail electricity markets, according to the U.S. Department of Energy, and no others have experienced the crisis that California lawmakers created for themselves with their poor design.

In fact, ending the monopoly structure for electric utilities has been a huge success in most states. Texas created electric choice in 2002. Last year, the Texas Public Utilities Commission reported that retail electric rates have declined by as much as 63 percent since 2001. Texans now face electric rates as low as 4.5 cents per kilowatt-hour, compared to a national average of 13.45 cents. Meanwhile, Nevada’s monopoly structure has led to an overall price increase of 23 percent over the same timeframe and a price of 11.7 cents per kilowatt-hour today for residential customers, according to data from the Department of Energy.

Further, contrary to claims from opponents who say choice is incompatible with so-called “renewable” power sources like wind and solar, Texas’ power generation mix has grown to include more and more renewable sources in the years since its choice program was established.

Nearly every argument being offered against electric choice is little more than a straw man. Of the total $11.9 million raised in opposition to the measure, NV Energy has contributed almost all of it, using money earned from ratepayers. The existing monopoly framework is good for NV Energy, because its profits are broadly determined as a percentage of its costs. As a result, NV Energy knows it can be inefficient and less responsive to its customers than businesses that must compete to attract and retain customers. After all, being tired of dealing with a costly, subpar electricity provider is why Nevadans have forced this issue onto the ballot to begin with.

This column first appeared in the Las Vegas Review-Journal.

The post The Truth About Electric Choice in Nevada appeared first on Reason Foundation.

Permitting is Making Residential Solar Expensive and Reforms can Change That https://reason.org/commentary/permitting-is-making-residential-solar-expensive-and-reforms-can-change-that/ Sun, 10 Jun 2018 10:25:51 +0000 https://reason.org/?post_type=commentary&p=25057 This column originally ran in The Orange County Register.

The post Permitting is Making Residential Solar Expensive and Reforms can Change That appeared first on Reason Foundation.

California recently became the first state to require that homebuilders install solar panels on almost all new houses. In California, most homes built after Jan. 1, 2020, will be obligated to have solar energy panels and systems. The mandate is bad policy for multiple reasons. It will drive up the state’s already astronomical housing prices by about $10,500 per home, according to the California Energy Commission’s own estimates. It is also likely to add layers of government bureaucracy in an area where the state’s typically overzealous regulatory streak has been surprisingly restrained — solar energy.

The cost of installing residential solar electricity generation in the United States has declined significantly over the past few years, due mostly to innovations that have reduced the cost of photovoltaic panels. But, nationwide, American consumers are still being charged far more for solar panels than the average consumer in other countries.

Antiquated regulations, primarily at the state and local levels, are costing American consumers about $70 billion per year. Permitting and other regulations vary significantly from state to state, but onerous permitting processes, strict building codes, and various tariffs cause U.S. consumers to pay, on average, nearly double what consumers abroad pay for installing similar solar systems. That amounts to nearly $10,000 more in costs for a 5-kilowatt residential solar system. The contrast with Australia, where there is no permitting process at all for solar, is stark. Solar installation costs in the U.S. are $3.25 per watt, compared to just $1.34 per watt in Australia.
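
A quick check of what those per-watt figures imply for the 5-kilowatt system mentioned above (a minimal sketch using only the numbers cited in this column):

```python
# Cost gap implied by the per-watt installation figures cited above.
us_cost_per_watt = 3.25          # dollars per watt, U.S.
australia_cost_per_watt = 1.34   # dollars per watt, Australia
system_watts = 5_000             # the 5-kilowatt residential system used as the example

gap = (us_cost_per_watt - australia_cost_per_watt) * system_watts
print(f"Extra cost of a 5 kW system in the U.S.: ${gap:,.0f}")   # -> $9,550
```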

California, not typically known for being a leader in government reform and efficiency, has actually been leading the way in streamlining its solar permitting process. The state has taken a number of steps to standardize its permitting process and has modernized outdated laws and regulations that were impeding Californians’ ability to adopt solar.

Assembly Bill 2188, passed in 2014, requires all local and city governments in California to follow the permitting processes included in the "Solar Permitting Guidebook," which lays out standards and best practices for governments to reduce barriers to entry and cut costs. And the state has seen remarkable progress, with many local governments updating their codes, reducing installation and permitting fees, or waiving them entirely. Each step taken locally to modernize outdated and onerous regulations is one step closer to ensuring the solar market is competitive and attractive to the ordinary consumer.

California's improvements have played a role in the price of solar in the state plummeting 55 percent over the last five years. And California is expected to lead the nation in solar production over the next five years, with a projected 13,402 megawatts of solar, according to the Solar Energy Industries Association.

These successes should be spurring California to continue streamlining its solar permitting process so homebuilders, businesses, and consumers voluntarily select solar. Instead, the state is implementing a mandate that may not scale well or generate positive returns, since California already experiences periods when it has more solar power than the grid needs. Meanwhile, there is one very obvious negative: piling another $10,500, or more, onto the cost of a home only makes the region's housing more unaffordable for first-time homebuyers and middle-class families.
In March, the median home price across Southern California’s six counties was $519,000, according to CoreLogic. In Orange County, the median price hit a record $725,000 and in Los Angeles County the median price was up to $585,000.

Instead of the solar mandate, California would be much better served to continue focusing on reducing the cost of permitting and other regulations that have helped drive down the price of solar. Doing that can help solar installations become cheaper and more attractive to a wider swathe of consumers without pushing sky-high housing prices further out of reach for most Southern Californians.

This column originally ran in The Orange County Register.

The post Permitting is Making Residential Solar Expensive and Reforms can Change That appeared first on Reason Foundation.

Voluntary Energy Standards: ISO 50001 and the Superior Energy Standard https://reason.org/policy-brief/voluntary-energy-standards-iso-50001-and-the-superior-energy-standard/ Wed, 04 Apr 2018 04:01:40 +0000 https://reason.org/?post_type=policy-brief&p=23205 Many larger firms seeking to reduce costs are advancing national energy efficiency significantly through "Energy Management Systems."

Firms continuously seek to improve their energy efficiency in order to reduce their costs and thereby remain competitive. In addition, some firms may seek to improve energy efficiency to signal to consumers their commitment to environmental protection. Some larger firms have implemented energy management systems (EMSs) in order to achieve one or both of these objectives. Over the past decade, EMSs have been developed by the International Organization for Standardization (ISO) and the U.S. Department of Energy. The aims of these EMSs are (1) to provide incentives for innovation and (2) to provide information that eases decision-making among end-consumers.

Although these EMS programs are voluntary, the U.S. Department of Energy (DoE) has been involved in their development and diffusion. This brief considers the extent to which the standards are likely to achieve their aims and the role of DoE in advancing these standards.

Parts 1 and 2 describe the ISO 50001 EMS and the DoE's involvement in its proliferation, which resulted in the creation of the Superior Energy Standard (SEP).

Part 3 discusses the intended benefits of EMSs and whether DoE’s involvement is conducive to achieving those benefits.

Finally, the brief offers recommendations for restructuring the programs sponsored by the DoE.

Full Brief: Voluntary Energy Standards: ISO 50001 and the Superior Energy Standard

The post Voluntary Energy Standards: ISO 50001 and the Superior Energy Standard appeared first on Reason Foundation.

The Effect Of Corporate Average Fuel Economy Standards On Consumers https://reason.org/policy-brief/the-effect-of-corporate-average-fuel-economy-standards-on-consumers/ Sun, 01 Apr 2018 04:00:52 +0000 https://reason.org/?post_type=policy-brief&p=23175 Fuel economy and greenhouse gas emissions standards for vehicles are a very inefficient way to address issues related to fuel consumption and emissions.

Corporate Average Fuel Economy (CAFE) standards require manufacturers to meet minimum fuel economy requirements for their fleets of vehicles sold in the U.S. As a result, manufacturers adjust certain vehicle attributes in order to comply with these standards. Among the many attributes a manufacturer may adjust are weight, power, and drivetrain. Such adjustments have consequences for the cost and performance of vehicles, which in turn affect consumers.

In their assessment of the likely effects of CAFE standards, the National Highway Traffic Safety Administration (NHTSA) and the Environmental Protection Agency (EPA) claim that the new standards introduced since 2011 generate substantial benefits for consumers. Underlying that claim is an assumption that consumers fail to adequately consider the economic benefits of more fuel-efficient vehicles when making purchasing decisions. However, a slew of recent studies questions the assumptions made by NHTSA and EPA. This brief assesses the effects of CAFE standards on consumers.

Proponents of CAFE standards claim that they benefit consumers by reducing the total costs of purchasing and using vehicles. The evidence contradicts this claim. Consumers generally purchase vehicles with characteristics that meet their needs, including their expectation of the total cost of future gas purchases. CAFE standards distort manufacturers’ incentives, forcing them to produce new vehicles with lower gas consumption than would be preferred by consumers. As a result, the range of vehicle options available to consumers is limited and many consumers are effectively forced to purchase vehicles that are less able to meet their preferences.

Among the most adversely affected consumers are those, predominantly in rural areas, who seek to purchase used pickups. The distortions created by CAFE standards artificially raise the cost of these vehicles by more than the average savings from reduced gas usage, increasing the total cost of ownership. Used pickup prices rose steeply as a result of the CAFE standards for the 2012–2016 period and are rising again as the 2017–2021 standards are implemented. Prices would likely rise at an even faster rate if the agencies were to implement standards along the lines of those proposed as "augural" for 2022–2025.

In addition, as noted in a previous paper, fuel economy and greenhouse gas emissions standards for vehicles are a very inefficient way to address issues related to fuel consumption and emissions. Ideally, the federal government would scrap the federal CAFE and greenhouse gas emissions standards. However, this option is not currently on the table.

Nonetheless, the agencies implementing the standards do have the option of setting future greenhouse gas emissions and CAFE standards at the level currently set for model year 2021, which would certainly be preferable to raising the standards further. In addition, to the extent that other extant EPA and NHTSA regulations serve as barriers to the introduction of vehicles that better suit consumer preferences, it behooves the agencies to seek ways to remove those barriers. One example noted herein is the set of essentially arbitrary and unnecessary differences between U.S. and international standards for a variety of vehicle parts; harmonizing these standards would likely result in vehicles that better serve consumers at a lower price. Similarly, to the extent that the threat of antitrust action impedes collaboration between manufacturers on new technologies, a simple process for granting antitrust waivers could facilitate more rapid innovation, not only in more-efficient vehicles but also in many other aspects of automotive technology.

Full Brief: The Effect Of Corporate Average Fuel Economy Standards On Consumers

Related Research:

CAFE and ZEV Standards: Environmental Effects and Alternatives

Climate Change, Catastrophe, Regulation and The Social Cost of Carbon

The post The Effect Of Corporate Average Fuel Economy Standards On Consumers appeared first on Reason Foundation.
