Reason Foundation

Aviation Policy News: Air traffic controller staffing and resignation claims
Dec. 9, 2025

Plus: How air traffic control reforms are described, the costs of modernization, and more.
In this issue:

  • Controllers Discrepancy
  • Post-Mortem on Advanced Air Mobility
  • Update on ATC “Privatization”
  • Europe’s Conundrum on Air Travel
  • Scott Kirby Is Wrong About Newark Slot Controls
  • Is the Moon Race Heating Up?
  • What About that “$20 Billion More” for ATC Modernization?
  • News Notes

Controllers Discrepancy

During the government shutdown, we read article after article about the loss of air traffic controllers. Not only were some controllers calling out sick, but some reportedly were taking part-time jobs to make ends meet, and others took the shutdown as an opportunity to retire.

Yet once the shutdown ended, the media were full of news stories, based on updates from the Federal Aviation Administration, that controllers were returning, as USA Today reported on Nov. 16, “FAA Ends Shutdown-Era Flight Limits as Controller Staffing Rebounds.” Aviation Daily on Nov. 14 headlined that, “FAA Freezes Flight Cuts as Controller Callouts Decline Rapidly.” Within a week or so after the government shutdown ended, airline flights were reported as being essentially back to normal, just in time for Thanksgiving weekend.

There is something wrong with this rosy picture. To begin with, recall that controller staffing pre-shutdown was far below FAA norms, with six-day workweeks and 10- to 12-hour shifts for controllers at some key facilities. If all controllers who were on the roster the week before the shutdown returned to their jobs within the week after it ended, many air traffic control (ATC) facilities would still be seriously understaffed and controllers would still be overworked. Instead of allowing a return to all flight activities as they were pre-shutdown, the FAA could have considered what it would take in terms of targeted flight reductions to reduce the number of six-day controller work weeks and 10-hour shifts.

Adding to my concern are statements by Transportation Secretary Sean Duffy during the shutdown. In Politico on Nov. 9, Duffy said the following: “I used to have four controllers a day retire before the shutdown. I’m now up to 15 to 20 a day are retiring, so it’s going to be harder for me to come back after the shutdown and have more controllers controlling the airspace. So this is going to live on in air travel well beyond the time frame that this government opens up.” (italics added)

Let’s do a bit of arithmetic here. The federal government shutdown lasted 43 days. The net increase in retirements, per Duffy, was 11 to 16 controllers per day. At the low end, 11 retirements per day times 43 days equals 473 retirements. On the high end, 16 retirements per day times 43 days equals 688 controllers retired during the shutdown. The average of those two numbers for Duffy’s retirement claims is 580 fewer air traffic controllers today than before the shutdown.
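The arithmetic can be checked in a few lines. The only inputs are the figures from Duffy’s quoted statement (4 retirements per day before the shutdown, 15 to 20 during it) and the 43-day shutdown length; everything else follows.

```python
# Inputs taken from Secretary Duffy's quoted statement and the
# 43-day length of the shutdown; nothing else is assumed.
SHUTDOWN_DAYS = 43
baseline_per_day = 4
low_rate, high_rate = 15, 20

# Net increase over the normal retirement rate, accumulated over the shutdown.
extra_low = (low_rate - baseline_per_day) * SHUTDOWN_DAYS    # 11/day above normal
extra_high = (high_rate - baseline_per_day) * SHUTDOWN_DAYS  # 16/day above normal
midpoint = (extra_low + extra_high) / 2

print(extra_low, extra_high, midpoint)  # 473 688 580.5
```

The midpoint of roughly 580 net additional retirements is the number the paragraph above relies on.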

So how could air traffic controller staffing possibly be back to pre-shutdown levels? Secretary Duffy owes us an explanation. Perhaps Congress should ask him. If the system is actually staffed with 580 fewer controllers than it had before the shutdown, it’s hard to see how they could be safely handling pre-shutdown levels of air traffic.

» return to top

Post-Mortem on Advanced Air Mobility  

This year has turned out to be the time when reality imposed its judgment on the plethora of advanced air mobility (AAM) start-up companies. In a lengthy article in Aviation Week (Oct. 27-Nov. 9), Ben Goldstein summarized the many losers and the handful of survivors.

This trend was already underway in 2024, which saw the demise of Lilium, Rolls-Royce’s electrical propulsion unit, Universal Hydrogen, and Volocopter. Then came 2025’s deluge of failures. They include Airbus’s CityAirbus project, Germany’s APUS Zero Emission, Spain’s Crisalion Mobility, Cuberg/Northvolt, Eviation, Guardian Agriculture, Overair, Supernal, and Textron eAviation.

Left standing are the U.S. big three: Archer Aviation, Beta Technologies, and Joby Aviation. All three have decent funding, a path toward FAA certification, and some potential as ongoing businesses, whether only as producers of aircraft or also as operators. The same issue of Aviation Week had another article on the growing number of AAM companies in China, none of which appears to be planning to seek FAA certification. Both Joby and Archer aim to launch actual electric vertical take-off and landing (eVTOL) air service in the United Arab Emirates as early as next year, with or without FAA certification.

Why have we seen so many failed start-ups? It’s not because eVTOL (the primary aim of these start-ups) is impossible; we see the survivors’ aircraft flying. One serious problem is the business model. Because battery-powered vertical flight requires a very large amount of power, and batteries are very heavy, an eVTOL’s payload and range are both very limited. Instead of the mass-market fantasy of “flying cars” and go-anywhere air taxis, this is looking more and more like a high-end luxury service for niche markets. In addition, most of the failed start-ups probably underestimated both the time and the cost of obtaining FAA certification.

This is why we are seeing non-eVTOL AAM concepts being developed and tested. One alternative is hybrid propulsion, which can significantly increase payload and/or range. Another is adding actual wings to some of these aircraft for cruise flight. And once wings are taken seriously, we get designs like Electra.aero’s blown-wing innovation, which enables its EL-2 demonstrator to take off and land in less than 150 feet—and it’s a hybrid-electric. Its larger EL-9 can handle a 1,000-lb. payload and still land and take off in less than 300 feet. The U.S. Air Force is seriously interested in the EL-9. This kind of aircraft is a hybrid-electric STOL.

Giving up vertical flight and battery-only power are two keys to more viable advanced air mobility. These lessons are being learned the hard way, but that is what competitive technology development requires. Imagine if a central planner like NASA had defined eVTOL as the “one best way” for advanced air mobility.

» return to top

Update on ATC “Privatization” 

My ongoing effort to change the terminology of de-politicizing U.S. air traffic control, replacing “privatization” with “self-funded public utility,” has thus far not caught on here in the United States. As I have noted in this newsletter, I switched my terminology several years ago to “public utility” because that is what these air traffic control entities are in all serious proposals today.

Since last month’s newsletter, I’ve continued to do interviews, most notably with Scott Simon on Nov. 15 for NPR’s Weekend Edition, which mentioned privatization in the online version’s headline. The Washington Post editorial board endorsed the idea of corporatization and cited my work in an editorial on Nov. 23, but the headline used the word privatize. Similarly, my Reason colleague Marc Scribner and Cornell Prof. Rick Geddes were interviewed by Dan Levin for Straight Arrow News in a piece that noted the plan would be “more akin to a public utility” but was headlined “The Quietly Powerful Group Keeping US Air Traffic Control Privatization Grounded.”

I was pleased to see an earlier op-ed in The New York Times by Binyamin Appelbaum, which explained the FAA’s limitations and cited “stand-alone corporations in Australia, Canada, and Germany” without resorting to using privatization. Former U.S. Department of Transportation official Diana Furchtgott-Roth had a good piece in The National Interest headlined “How to Modernize America’s Air Traffic Control,” but the unfortunate subhead was “Privatizing air traffic control could help prevent flight delays over the holiday season.” Oh, well…

As I explained last month, opponents of last decade’s House bill to create a U.S. version of the nonprofit air traffic control corporation Nav Canada repeatedly attacked the idea as being “for profit” and “dominated by major airlines,” neither of which was true. And the strongest opponents then (and now)—private aviation groups AOPA and NBAA—continue to attack air traffic control “privatization” as if that were the case. In fact, in nearly all of the 95 countries that receive ATC services today from user-funded, de-politicized air navigation service providers (ANSPs), those providers are neither private nor for-profit. That is why, two years ago, I began using the term air traffic control public utility, since that is what the vast majority of de-politicized ATC systems are.

So, to this newsletter’s readers, I repeat my request from last month’s issue: If you support de-politicizing the low-tech, underfunded Air Traffic Organization, please don’t refer to the reform as “privatization.” Doing so only helps opponents of this much-needed reform miscast what is actually being proposed.

» return to top

Europe’s Conundrum on Air Travel

Much of the discussion of air travel in Europe seems to be driven by environmental groups such as Transport & Environment (T&E), which calls for cancelling airport expansion plans, increasing taxes on airline passengers, and other measures. Most airport expansion projects are still going forward, but in parallel with that, the European Commission announced on Nov. 5 a plan to spend $400 billion of taxpayers’ money to greatly expand the current 12,128 km high-speed rail network between now and 2040. The stated goal is to shift travelers from short-haul flights to rail travel.

There is no sign in aircraft sales projections from Airbus and Boeing that air travel will stop growing, let alone shrink. T&E warns that if all French airport expansion plans were carried out, 38 million more people would travel through French airports by 2050, compared to a hypothetical no-build scenario. In its Nov. 14 article on this subject, Aviation Daily notes that the current Groupe ADP airport expansion plan would mean an increase from 82 million to 105 million annual passengers by 2050.

A few European airports are attempting to limit increases in air travel. Schiphol Airport in the Netherlands is still battling airlines over its attempt to reduce the number of annual flights, ostensibly due to noise exposure. In Austria, Vienna Airport recently decided not to proceed with adding a third runway. But those anti-growth efforts are swamped by planned expansions. In the United Kingdom, a non-EU member, long-sought runway additions have this year been approved for both Heathrow (LHR) and Gatwick (LGW). The ongoing expansion of Germany’s Frankfurt Airport (Terminal 3) will add capacity for between 19 and 24 million annual passengers. And there’s also the ADP expansion plan noted above.

Groups like T&E and the European Commission (EC) seem to be ignoring what is going on in the rest of the world’s airspace. Air travel in India is growing by leaps and bounds, and many airport expansion projects are underway in that country. In the Middle East, especially the United Arab Emirates, air travel is booming. Andrew Charlton, in his December newsletter, reported on the Dubai Airshow in mid-November. He noted that a “small order” for new aircraft in this region is 100 aircraft, with an option for 50 more.

The International Air Transport Association (IATA) reports that Europe accounts for 26.7% of global air travel. So if the EC were to succeed in restricting air travel in its domain, the other three-fourths of the world’s market would keep expanding: the United States, because of its affluence, and the developing world (China, India, the Middle East), as their economies continue to grow.

I have written previously that the de facto premise of most climate activists and their followers in government is that every sector of every economy must reduce its share of greenhouse gases (GHGs), regardless of either how costly that is to carry out or the benefits of the activity that generates those gases. If I were an environmental policy central planner, my policy would be to figure out the cost per ton of GHG reduction in every sector of the economy—and focus first on all the low-hanging fruit. My guess is that the cost/ton in aviation would be on the high end, and the economic benefits of air travel would also be high. That would suggest looking for relatively lower-cost air travel measures rather than very costly measures, such as spending $400 billion to expand European high-speed rail.
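The “low-hanging fruit first” logic can be sketched in a few lines. All the cost-per-ton and tonnage figures below are hypothetical, invented purely for illustration; the only point taken from the text is that an expensive sector (here, aviation) naturally drops to the back of the queue.

```python
# Greedy "low-hanging fruit" ordering: sort sectors by $/ton of GHG
# abatement and fund the cheapest reductions first within a budget.
# All numbers are hypothetical, for illustration only.
sectors = [  # (name, abatement cost in $/ton, abatable tons in millions)
    ("power generation", 25, 300),
    ("buildings", 40, 150),
    ("road transport", 60, 200),
    ("aviation", 250, 50),
]
budget = 20_000  # available funds, $ millions

plan = []
for name, cost, tons in sorted(sectors, key=lambda s: s[1]):
    affordable = min(tons, budget // cost)  # tons we can still afford here
    if affordable > 0:
        plan.append((name, affordable))
        budget -= affordable * cost

print(plan)  # aviation, the costliest sector, gets nothing
```

With these made-up numbers, the budget is exhausted on the three cheaper sectors and high-cost aviation abatement is never funded, which is exactly the prioritization argued for above.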

For background reading on approaching climate change policy rationally, I once again recommend Steven Koonin’s important book, Unsettled: What Climate Science Tells Us, What It Doesn’t, and Why It Matters (BenBella Books, 2021). Koonin was Under Secretary for Science at the U.S. Department of Energy during the Obama administration. Earlier in his career, he was a professor of theoretical physics at Caltech, and he has held numerous governance positions at national laboratories.

» return to top

Scott Kirby Is Wrong About Newark Slot Controls
By Gary Leff

United Airlines CEO Scott Kirby recently laid out the major benefits that fixing air traffic control would bring to air service in the United States, reducing delays and cancellations. I don’t think the FAA as a service provider can ever actually deliver them – you need the regulator to be separate from the air traffic organization it oversees. The FAA regulating itself has meant zero accountability for decades.

However, Kirby goes on to argue that there still need to be limits at Newark airport, where United has a hub. His point about “simple math” doesn’t imply the solution that he thinks it does.

“Newark, for what it’s worth, always should have been capped. I mean it was the only airport left in the world that was a large airport that was over-scheduled that doesn’t have slots. It’s the only one and it used to have slots and the reality is at Newark the FAA says in the best of times with full staffing on perfect weather day they can handle 77 operations per hour and they were letting it be scheduled at 86 operations per hour for hours in a row. And that is simple math.”

We should improve throughput at Newark, because there’s a lot of demand for air travel there. We haven’t done so because of air traffic control limitations and because we no longer build infrastructure well in the United States.

However, if Newark is overscheduled, the answer isn’t to hand the exclusive right to operate most of the flights to United, blocking competitors and future new entrants, as a free gift (subsidy) from taxpayers. That’s what slots are – the right to take off and land at an airport, generally given for free, despite huge economic value. Slot controls allow incumbents exclusivity and block anyone else from competing with them. That airlines have succeeded in regulatory capture to make this standard practice doesn’t make it any less bad policy.

Here’s the better approach: congestion pricing.

Slots are a blunt rationing instrument (and a subsidy to the incumbent airlines). Since they’re “use it or lose it,” we get unnecessary flying on small planes hardly anyone wants to fly, just to squat on flight times. Prices encourage airlines to allocate flights to the right aircraft and the right routes that match passenger demand.

Think of a runway like a heavily used road approaching its capacity. As use approaches 100% of capacity, planes have to queue. Each additional flight imposes delay costs on everyone else, but the airline only internalizes its own delay cost. So airlines are incentivized to overschedule.

Slots try to deal with this by capping the number of flights in a period. Congestion pricing says: “You can operate whenever you like, but you must pay the actual total cost of the delay you impose on others.”

Slots are a crude cap: “X movements per hour.” They’re allocated via grandfather rights and use it or lose it. They’re adjusted infrequently and administratively. Once you have the slot, the flight becomes “free” regardless of the delay it causes.

Charging per flight that approximates the marginal delay cost to others works better. When the system is uncongested, the price is low or zero. As demand approaches or exceeds capacity, prices rise sharply. Airlines operate a flight in that time slice if they are willing to pay – if the value of the flight to passengers and the airline is greater than the congestion charge.
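A minimal sketch of how such a charge could behave. The 77-operations-per-hour capacity comes from the Kirby quote above; the delay-cost value and the deliberately stylized queueing formula are my own assumptions, not anyone’s actual pricing model.

```python
# Toy congestion-pricing sketch (illustration only): charge each
# additional scheduled flight the extra delay it imposes on everyone
# else, valued at an assumed cost per aircraft-minute of delay.
CAPACITY = 77       # flights/hour the FAA says Newark can handle (from the quote)
DELAY_COST = 100.0  # assumed $ per aircraft-minute of delay (hypothetical)

def total_delay_min(n, cap=CAPACITY):
    """Stylized total queueing delay for n flights in an hour: zero up to
    capacity; beyond it, each excess flight waits behind the backlog."""
    excess = max(0, n - cap)
    return excess * (excess + 1) / 2 * (60 / cap)  # aircraft-minutes

def congestion_price(n):
    """Charge for the n-th scheduled flight: the marginal delay it adds."""
    return DELAY_COST * (total_delay_min(n) - total_delay_min(n - 1))

for n in (77, 80, 86):  # at capacity, mildly over, and Newark's old 86/hour
    print(n, round(congestion_price(n), 2))
```

The price is zero at or below capacity and rises steeply as scheduling pushes past it, which is the behavior described above: flights worth less than the delay they cause migrate to cheaper times.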

That way, you get the flights that generate the highest value relative to the delay they cause. You also get natural spreading of flights to shoulder or off-peak times, reducing congestion and lowering airlines’ costs. Pricing can also encourage the use of larger aircraft (“up-gauging”) to spread the charge across more passengers.

A slot cap freezes the peak schedule – a “50 slots per hour” rule means you get 50 flights per hour, regardless of the delays they cause and irrespective of whether those are 50 regional jets or widebodies. There’s no incentive to move any of those flights 20 minutes to spread out peak loads.

Slots are also bad at handling weather events and air traffic control problems. Those might reduce an airport’s capacity from (say) 60 to 35 flights per hour. That’s when we get ground delay programs and ad-hoc rationing. Congestion pricing can do the work for you and prioritize the most valuable flights. Instead of stressing the system, airlines contribute towards paying for a better one.

Ultimately, the same price applies to everyone – incumbent airline or new entrant. “Airlines would hate this!” Yes, of course incumbents would. They’re currently getting a valuable property right for free, and instead they’d be charged (though the scheme could be made revenue-neutral).

You’ll likely hear that “congestion charges” will just cement incumbent dominance, which is silly, because that’s what the current slot system does. The claim, though, is that incumbents have deep pockets to pay peak charges, while others get pushed out, worsening competition.

  • Under slots, incumbents hold peak access they received for free (or acquired cheaply in the past). They can sit on grandfathered rights indefinitely. New entrants are often shut out completely.
  • If a new entrant sees high value in a particular peak flight, they can buy in. Under a fixed slot regime, there may literally be no access at any price.
  • If policymakers still want to support entry (they will, usually for their own constituents rather than the public good), they can offer rebates for new carriers on specific routes and use competition policy to scrutinize predatory practices rather than locking in those practices with slots.

There will also be a class argument that peak times will become “rich people’s time slots,” with lower-income travelers pushed into inconvenient off-peak times or other airports. That often happens already: budget travelers get pushed to Spirit and Frontier for lower fares at other airports. And lower-income travelers would face fewer delays. In any case, especially if congestion pricing encourages up-gauging, we’ll likely see more major carriers with excess capacity to discount – at peak times. But if you want redistribution, do it explicitly, not via hidden cross-subsidies embedded in slot allocation.

A fair concern is that low-value flights that few passengers value – often on smaller regional jets to low-volume airports – will lose peak-time service. That’s because these flights are less valuable! But if we’re really going to design policy around these flights, don’t do it in a way that also inefficiently allocates flights, causing delays for the entire air system. Make the subsidy cost of these flights explicit rather than burying it.

A system that sets prices seasonally by day and time of day, in 15- or 30-minute increments, and publishes them in advance is easy for airlines to plan for. Then, major weather or air traffic control outages can trigger surge pricing with a capped multiplier (e.g., 2x). This is easy for airlines to deal with – they manage variable fuel prices and demand risk constantly. And it will lower the costs of ground delays.
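A published schedule with a capped surge multiplier might look like the following toy sketch. The prices, time slices, and capacity figures are invented for illustration; only the idea of a 2x cap comes from the text.

```python
# Sketch of a published per-movement price schedule (hypothetical numbers)
# with a capped surge multiplier for capacity-reducing events.
BASE_SCHEDULE = {  # $ per movement, by 30-minute slice (illustrative)
    "07:00": 500, "07:30": 900, "08:00": 1400, "08:30": 900,
}
SURGE_CAP = 2.0  # surge multiplier never exceeds 2x the published price

def price(slice_, capacity_fraction=1.0):
    """Published base price, scaled up (never above SURGE_CAP x) when
    weather or ATC outages cut the fraction of capacity available."""
    surge = min(SURGE_CAP, 1.0 / max(capacity_fraction, 1e-9))
    return BASE_SCHEDULE[slice_] * surge

print(price("08:00"))        # normal day: published peak price
print(price("08:00", 0.58))  # capacity cut (e.g., 60 to 35/hour): surged
print(price("08:00", 0.30))  # severe outage: surge hits the 2x cap
```

Because the schedule is published in advance and the worst case is bounded by the cap, airlines can hedge it the same way they hedge fuel prices.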

Newark shouldn’t get slot controls. We should abolish them at New York’s JFK and LaGuardia and at Washington National as well. Slots are a rationing mechanism that locks in incumbents and treats all flights in the same time window as equivalent, regardless of the systemic delays they create. And they provide no real incentive to move a flight time or up-gauge.

And slots turn scarcity value into privately-owned assets of the airline, rather than revenue streams to improve system capacity. Congestion pricing does the opposite! Anyone can access takeoffs and landings if the value of their flight is high enough to warrant paying peak prices.

Editor’s Note: This article is a slightly condensed version of Gary Leff’s “View from the Wing” column published Nov. 21, 2025, and is used with the author’s permission.

» return to top

Is the Moon Race Heating Up?

While NASA continues to plan to launch its first crewed SLS/Orion lunar mission as early as February, some observers (including the editor of this newsletter) are very concerned about risking the lives of four astronauts on a spaceship that has flown only once (in 2022), and whose Orion capsule’s heat shield was partly destroyed during re-entry. Instead of fixing the heat shield, NASA is counting on an untested, gentler re-entry path to bring the astronauts back to Earth.

The only reason I can think of for this risky decision is the multi-billion-dollar cost of each SLS/Orion launch. By contrast, because SpaceX and Blue Origin space launchers cost a small fraction of that, they sensibly carry out repeated uncrewed test launches to be sure that when it’s time to launch people, every system and subsystem has had ongoing improvements to increase its operability and level of safety.

I’m encouraged to see both Blue Origin and SpaceX talking with NASA about alternative ways to get people and cargo to the Moon and back. Eric Berger reported in Ars Technica (Nov. 13) that NASA’s acting administrator, Sean Duffy, asked both companies for more nimble plans for their respective lunar landings.

SpaceX disclosed that it has “shared and are formally assessing a simplified mission architecture and concept of operations that we believe will result in a faster return to the Moon while simultaneously improving crew safety.” Could that mean not using the flawed Orion capsules? Berger did not suggest this, but he thinks it might mean working with others beyond those directly involved with Artemis III. He went on to suggest two ideas that might be put forth by SpaceX: expendable Starships and using the company’s proven Dragon (presumably instead of Orion). For the former, instead of depending on propellant transfer in orbit (from one Starship to another), the idea would be to use expendable tankers, which would reduce their launch weight and might reduce the number of tanker missions by up to 50%.

Using SpaceX Dragons instead of Orion would increase safety and reduce cost, though Dragons would need a new heat shield for re-entry to Earth from lunar missions. Berger lays out a mission relying on a combination of Starships and Dragons, which is too complicated to summarize here, but none of its steps involves an SLS or an Orion. This would be a major change from using NASA’s minimally tested vehicles. It would also appear to eliminate having to use the costly (and behind-schedule) Lunar Gateway.

On Dec. 3, the Wall Street Journal reported on new proposals from Blue Origin. The company already plans to send a Blue Moon Mark 1 cargo mission to the lunar surface next year. That could be followed by a larger version of the cargo lander transporting astronauts to the Moon in 2028 for a shorter stay than planned for Artemis III. The modified lander would use storable propellants, which are intended to eliminate in-space fuel transfers. No details are available on that propulsion system.

NASA, per the WSJ report, “will evaluate proposals for a simpler astronaut landing on the Moon from Blue Origin and SpaceX, as well as any other proposals it might receive.” And assuming that Jared Isaacman gets confirmed promptly as NASA administrator, that assessment will be in good hands.

» return to top

What About that “$20 Billion More” for ATC Modernization?

Aviation media reports late last month focused on DOT Secretary Duffy’s call for Congress to provide the “additional” $20 billion for air traffic control modernization that a broad aviation coalition has requested, with all parties deeming the $12.5 billion in borrowed money that Congress provided earlier this year merely a down payment.

Until now, when capital investments in the ATC system were called for, Congress allocated funds from the Airport and Airway Trust Fund, whose dollars come from aviation user fees, primarily the airline passenger ticket tax. Aviation (or at least airlines) has long relied on user-funded infrastructure, for both airports and ATC. Highways are likewise supported largely by user fees, both fuel taxes and tolls.

The balance in the Aviation Trust Fund is expected to be around $20 billion by late 2025, but a large fraction of that will be drawn upon for the FAA’s 2026 operating and facilities and equipment budget needs. So what is the responsible answer to the “additional $20 billion” for ATC modernization?

Increase the aviation user fees. At a time when the federal budget is running a $2 trillion annual deficit, there is no justification for aviation to add to a deficit that is pushing the national debt to unsustainable levels.

» return to top

News Notes

Port Authority Plans P3 for Newark Terminal B
Infralogic’s Eugene Gilligan reported (Nov. 13) that the Port Authority of New York and New Jersey’s $45 billion capital plan includes using a long-term public-private partnership for its new Terminal B. Gilligan’s article noted that infrastructure investment firms have held discussions with Port Authority officials regarding the use of a P3 procurement model for replacing the aging existing terminal. The agency in recent years has used such P3s for new terminals at both Kennedy (JFK) and LaGuardia (LGA) airports. The capital plan also includes a new AirTrain Newark and an “EWR Vision Plan” for revitalizing the airport.

SpaceX Starship Cleared for Cape Canaveral Launches
Politico Space reported (Dec. 5) that the Space Force has cleared SpaceX to launch its huge Starship launch system from Space Launch Complex 37 (SLC-37) at Cape Canaveral. SpaceX hopes to launch up to 76 Starship flights per year from that site within the next few years. Other launch companies expressed concerns about interference with their own launch plans, but the Space Force accepted SpaceX’s plans to identify any new “blast danger areas” that need to be cleared near SLC-37.

First Digital Tower in the Middle East
Hamad International Airport (HIA) in Qatar has received certification for the first virtual/digital control tower in the Middle East. The Virtual Tower was developed by Searidge Technologies, with partners ADB Safegate and NATS, the UK air navigation service provider (ANSP). The vTWR provides 360-degree views of the entire airport, which was not possible from its conventional tower. The new system has two controller workstations in the existing conventional tower and two in HIA’s Backup and Approach Training Center.

Blue Origin Lands New Glenn Booster
On its second launch, Blue Origin’s New Glenn rocket lofted two NASA payloads toward Mars, and the company recovered the reusable first-stage booster for the first time. Blue Origin plans to use the booster on many future launches, following SpaceX’s growing track record with its Falcon 9 and Falcon Heavy launch vehicles.

Nav Canada Breaks Ground for Its First Digital Tower Center
Kingston, Ontario, is the site where Nav Canada has begun construction of an interim digital tower facility. The Kingston Digital Facility (KDF) is intended to lay the groundwork for a future digital tower center serving up to 20 airports. Upon completion in 2026, the KDF will initially provide tower services for Kingston and one other airport, becoming the first digital tower facility in Canada. It is also the first stage in Nav Canada’s Digital Aerodrome Air Traffic Services (DAATS) initiative. Nav Canada’s technology partner on this endeavor is Kongsberg.

DOT Seeks Information on Dulles Airport Revamp
Responding to a White House request, the US DOT on Dec. 3 issued a Request for Information on plans to “revitalize” Washington Dulles Airport (IAD). The RFI includes the idea of public-private partnerships (P3s), like those that have been used to replace aging terminals at LaGuardia (LGA) and Kennedy (JFK) airports. IAD is an airport I avoid whenever possible, in part because of its slow, dangerous people movers called “mobile lounges,” which have no seats and are generally wall-to-wall with standing passengers and luggage. The airport really could use a serious rethink, and it could be a good fit for design-build-finance-operate-maintain (DBFOM) P3s.

JSX Plans Passenger Service from Santa Monica
Public charter carrier JSX has announced daily flights between Santa Monica (SMO) and Las Vegas (LAS), to begin before the end of December. This will be JSX’s first route to use turboprop aircraft (ATR 42-600s). JSX holds options to acquire as many as 25 additional ATR turboprops if its early routes are successful. Even if successful, the SMO-LAS route will not last long-term, since SMO is due to shut down entirely at the end of 2028.

Fengate Plans to Sell its ConRAC
One of the pioneering developer/operators of consolidated rental car centers (ConRACs) is planning to sell its project at Los Angeles International Airport (LAX). Infralogic reported (Nov. 26) that Fengate is in negotiations with BBGI Global Infrastructure to sell the LAX ConRAC and a public school P3 in Prince George’s County, MD. BBGI, which owns 56 P3s in the United States, Canada, and Europe, was recently taken private by British Columbia Investment Management Corporation.

NASA Bans People from Next Boeing Starliner Flight
Ars Technica reported that NASA has approved renewed missions to the International Space Station for Boeing’s ill-fated Starliner capsule, but the first new mission will carry cargo only. Assuming the cargo-only mission is a success, Starliner will be approved to fly three crewed missions to the Space Station before the ISS is de-orbited, as planned. The original 2014 NASA contract called for Starliner to operate six crewed flights to the ISS.

U.K. Takes Steps to Bolster GPS Position, Navigation, and Timing (PNT)
In response to increasing levels of GPS/GNSS spoofing and jamming, the U.K. government has committed £155 million to three projects. First, £71 million will be invested in a new enhanced LORAN (eLoran) program, a system with far higher power that uses a very different part of the spectrum than GNSS. Second, £68 million will continue the development of a National Timing Centre aimed at providing nationwide timing that does not rely on GNSS. Another £13 million will fund a new UK GNSS interference monitoring program.

Eurocontrol Calls for Increased Use of Text Messaging
The 42-government agency Eurocontrol has called for air navigation service providers (ANSPs) and airlines to make significantly more use of controller-pilot data link communications (CPDLC), Aviation Daily reported (Nov. 14). Greater use of text messaging would relieve congestion on voice radio channels. Eurocontrol’s Paul Bosman reported at a recent conference that flights in European airspace average only two data link messages each, adding, “This technology has been available for 20 years; can’t we do better?”

FAA Seeks Input on Replacing ERAM and STARS
On Nov. 20, the FAA released a Request for Information (RFI) about creating a common automation platform that would replace the separate systems that manage en-route flights (ERAM) and terminal-area flights (STARS). The Common Automation Platform (CAP) sounds good in principle, and the FAA is wise to seek a single, state-of-the-art platform to replace the two older systems, which were developed during different time frames. The FAA noted that it is open to several potential approaches to “re-architecting” existing platforms. Responses are due Dec. 19, which does not provide much time for serious brainstorming.

Lockheed Martin Plans Commercial Orion
Aviation Week (Oct. 13-26) reported that the prime contractor for NASA’s Orion moon capsule is planning a commercial version. Lockheed acknowledged NASA’s contract for the Artemis moon missions, but with that program likely to be cut short after only a few launches, the company is looking for possible commercial customers. I am happy to refer them to last month’s article on Orion’s potential shortcomings, beginning with its largely unproven re-entry heat shield. If even half the problems cited by ex-NASA scientist Casey Handmer (in last month’s issue) are valid, my advice is caveat emptor.

Blue Origin Partners with Luxembourg Space Agency on Lunar Prospecting
Project Oasis was recently announced by Blue Origin in conjunction with Luxembourg’s space agency. The plan is to remotely survey the Moon for water ice, helium-3, rare earth elements, and other resources that might support lunar production of materials and fuel that would not have to be transported from Earth. To the extent that promising lunar resources are identified, the project’s second phase, called Blue Alchemist, will experiment (here on Earth) with turning such raw materials into useful products that could later be made on the Moon.

Airport P3 Activity in Brazil
In October, airport operator Motiva announced that its Brazilian airport concessions were for sale, with an estimated value of $1.8-2.2 billion. Twelve airport groups initially expressed interest, including the world’s second-largest (AENA Airports) and fifth-largest (Vinci Airports). In early November, AENA announced that it was working on a $986 million bond issue for its Brazilian airport P3 concessions. It also announced that its partially owned Mexican airport company GAP would merge with its strategic partner AMP. It looks as if more airport deals will be forthcoming in Brazil soon.

Stockholm Arlanda Airport OKs Curved Landing Approaches
Aircraft equipped and certified for required navigation performance (RNP-AR, Authorization Required) may be allowed to make continuous descent approaches on curved arrival paths at Arlanda. Swedavia expects that this will lead to more landings per hour and lower aircraft emissions. Nav Canada has approved RNP-AR approaches at two airports, Calgary and Toronto. I am not aware of any U.S. airports approved by the FAA for this kind of landing.

Newark Controllers Have Two More Years in Philadelphia, Per FAA
When the FAA shifted control of arrivals and departures at Newark from the troubled New York TRACON (N90) to the Philadelphia TRACON, 14 controllers moved to the Philly TRACON. The time period was indefinite, but on Nov. 17, the FAA announced that those controllers would remain at Philly TRACON for two more years.


Quotable Quotes

“The economics of urban air taxis are difficult. An aircraft costing millions must fly many hours per day at high load factors to cover capital and operating costs.  Battery energy density limits range and payload. Downwash, noise, and turbulence make rooftop or street-level operations problematic. Wind and weather limits reduce availability. Certification requires thousands of flight hours and proven safety redundancies. Air traffic management for autonomous or semi-autonomous craft is not ready. Public acceptance of low-flying craft over dense cities remains uncertain. . . .  Supernal’s folding is symbolic. The era of hype is ending. The sector is moving into an attrition phase where many firms will fail, a few will survive, and the market will settle into niches. The original promise of eVTOLs as a mass urban transport solution is receding. The story now is about how a vision of the future met the hard reality of physics, economics, and regulation, and how an industry will be reshaped in the aftermath.”
—Michael Barnard, “From Kitty Hawk to Supernal: The Shrinking Future of eVTOLs,” Clean Technica, Sept. 11, 2025

“I enjoyed your piece in the Wall Street Journal [on ATC reform]. As someone on the front lines, I can tell you that things are certainly not getting better. The most frustrating part of my day is battling all the chatter on the radio. Many times we can’t get a word in edgewise. Meanwhile CPDLC (controller-pilot data link communications) just sits unused. It’s very rare for controllers to use it for anything other than frequency changes. Many of its numerous functions are not even activated, including free text messages. One concern I have is that the feds are going to spend billions on a new elaborate ground-based system, when a better and less-expensive aircraft AI system may be just around the corner. While I seriously doubt that the ground-based network will be eliminated any time soon, I could see significant reductions in the need for hardware, particularly on the en-route part of the system.”
—Greg Ross, email to Robert Poole, May 10, 2025, used by permission. Mr. Ross is a 737 captain for a major U.S. airline.


New study details how legal psychedelic services can treat depression, anxiety
https://reason.org/commentary/new-study-details-how-legal-psychedelic-services-can-treat-depression-anxiety/
Tue, 09 Dec 2025

A new study has found notable improvements in mental health among participants who underwent legal, supervised sessions with psychedelics in Oregon, the first state to legalize such services for adults. Published by Osmind, a mental health research and electronic health record company, the study analyzed treatment outcomes from individuals seeking relief from depression and anxiety under Measure 109 in Oregon. This 2020 voter-approved initiative decriminalized psilocybin, the psychoactive compound in psychedelic mushrooms, for therapeutic use by adults over 21 in state-licensed centers. While clinical trials have hinted at psilocybin’s potential at scale, this report offers early evidence from a commercial setting.

The Osmind study relies on voluntary self-reports, making it “naturalistic” research that captures how these services perform outside the strict protocols of randomized trials (measuring outcomes through self-reported surveys is standard practice in real-world scientific research). The study tracked 88 participants and used standardized tools to measure changes: the PHQ-8 questionnaire for depression (a scale from 0 to 24, where higher scores indicate worse symptoms), the GAD-7 for anxiety, and the WHO-5 for overall well-being. Assessments occurred before the session, one day after, and a month later. No dosages were specified, but sessions followed state guidelines for supervised administration.

Results showed meaningful gains across the board. Depression scores on the PHQ-8 fell by an average of 4.6 points, shifting participants from moderate to mild severity, a change that meets the threshold for clinical significance. Anxiety dropped by 4.8 points (on the GAD-7 scale), and well-being rose by 10.7 points (on the WHO-5 index). No serious adverse events occurred during sessions, though 3 percent reported lingering issues, like heightened anxiety or family strain, a month later. These preliminary improvements suggest that psilocybin could offer rapid relief in a legal therapeutic setting, aligning with the compound’s reputation for fostering emotional resilience.
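To make the “moderate to mild” shift concrete, here is a minimal sketch assuming the commonly used PHQ-8 severity cutoffs; the starting score of 12 is a hypothetical illustration, not data from the study:

```python
def phq8_severity(score: float) -> str:
    """Map a PHQ-8 score (0-24) to the commonly used severity bands."""
    if score < 5:
        return "none-minimal"
    elif score < 10:
        return "mild"
    elif score < 15:
        return "moderate"
    elif score < 20:
        return "moderately severe"
    return "severe"

before = 12.0                 # hypothetical participant with moderate depression
after = before - 4.6          # average improvement reported in the study
print(phq8_severity(before))  # moderate
print(phq8_severity(after))   # mild (score of 7.4)
```

A 4.6-point drop is enough to move a participant from the middle of the moderate band into the mild band, which is why the average change clears the clinical-significance threshold.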

Direct comparisons to other psilocybin studies or clinical trials are tricky, as they rely on different scales, populations, and measures. Some studies report quantified outcomes (“effect sizes”) as the proportion of participants who experienced meaningful change, while others report point changes on a particular scale. For example, in one randomized study, about two-thirds of participants remained in remission from major depressive disorder (MDD) five years after receiving treatment. That study included only participants diagnosed with major depression and measured outcomes with a different metric (the GRID-HAMD scale) than the Oregon study.

Nonetheless, Osmind’s review of real-world data reveals significant effects on depression and anxiety, consistent with more medicalized clinical trials. Oregon’s approach to psychedelic treatment is a novel experiment, not just because it uses psychedelics, but because it created an entirely new mental health services framework. The state had to design training criteria for schools so that non-medical professionals could learn to administer a drug that is still undergoing clinical trials. By law, these “facilitators” did not need prior mental health or medical training.

This new study shows promise both for psychedelics as a mental health treatment and for lowering the cost of licensed mental health services. Psychedelic therapy can be very expensive (over $15,000, based on countries where it is federally legal) when delivered under a medical model in which two licensed therapists see a single patient for three extended sessions. In Oregon, facilitators do not need to attend medical school and can administer group sessions, reducing the total cost per patient.

The Drug Enforcement Administration (DEA) has requested that the Department of Health and Human Services (HHS) review whether psilocybin should continue to be banned as a Schedule I drug (the DEA request was publicly confirmed by Kathryn Tucker, JD, who is involved with the case; it was also confirmed privately by legal counsel to Reason staff). A Schedule I designation reflects the government’s opinion that a substance has no medical value and is highly susceptible to abuse. Businesses that traffic in Schedule I substances, including Oregon psilocybin clinics, are considered federal criminal enterprises. They are generally unable to access financial services and are prohibited from claiming deductions on their federal income taxes under the “ordinary and necessary” standard that applies to other businesses. These federal penalties significantly increase the cost and risk faced by these businesses, and the additional financial burdens are largely passed on to customers.

Data collected by Reason Foundation shows that states with legal psychedelic services do not display increased rates of criminal activity or hospitalizations. Taken together with this latest study, data from Oregon makes a strong case that psilocybin holds clear medical value and does not endanger public health, calling into question whether it should be considered a Schedule I drug.

Mandating inefficiency: Minimum lot size regulation and housing
https://reason.org/commentary/mandating-inefficiency-minimum-lot-size-regulation-and-housing/
Mon, 08 Dec 2025

Introduction

Excessive land use restrictions are a primary contributor to the ongoing housing crisis, and minimum lot size (MLS) regulations are among the most pervasive. MLS requirements dictate the smallest amount of land on which a home can be built. These rules are often coupled with minimum setback and square-footage regulations, creating a template for what homes must look like to obtain a permit. This bundle of laws makes smaller homes either unprofitable for developers or illegal. The homes produced by these design standards are out of reach for many Americans, underscoring a need for flexible housing options in the low-density forms most appealing to buyers.

Rolling back these regulations where they are excessive can open the door for smaller, denser, and less expensive units. Empirical evidence from many municipalities finds that allowing more homes per acre can lead to an influx of new units at the lower end of the market. Understanding the motivations behind MLS regulations, their current state, and what positive policy change would look like can pave the way for reform in this area.

History, rationale, and current state

Minimum lot size requirements are among the oldest and most pervasive elements of zoning, shaping land use patterns across nearly every city in the U.S. Throughout the 20th century, MLS regulation expanded, both in where it was imposed and in the size of lots mandated. According to a report from the American Planning Association (APA), in the mid-20th century it was not uncommon for localities to have lot requirements of over 20,000 square feet, with some areas requiring as much as five acres per unit. Despite growing populations, many of these laws have remained stagnant or become more severe. MLS laws have historically served three functions: generating tax revenue, planning water and sewer distribution, and maintaining aesthetics.

The inception of many MLS laws can be traced back to local tax policy. The Mercatus Center finds that MLS laws expanded during the baby boom, in part, to exclude smaller homes that could not generate enough property tax revenue relative to the number of children who would live in them. Specifically, “by setting a floor for land costs, [MLS] was intended to slow, if not exclude entirely, the entry of such families…”

Additionally, according to the APA, historical justifications for MLS regulations include the ability of local governments to plan their distribution of utilities, regulate congestion, and maintain air quality and health standards. For example, lots must be large enough to accommodate adequate water and sewage service, especially in areas not connected to a central system, which instead rely on disposal methods such as septic tanks. Typically, zoning codes account for different levels of connection by setting lower requirements where connection to central systems is possible and higher requirements where it is not. Laws ensuring a minimum standard of sanitation are understandable. However, the vast majority of American homes are connected to central water and sewage, so sanitary concerns are a dubious justification in most areas today. Instead, MLS requirements are now commonly used to promote spacious residential environments and to exclude denser housing options.

Figure 1 depicts the median lot size in each state. One influence is geography and nature preservation. Nevada, which has the smallest median lot size, is largely uninhabitable desert, with development concentrated around cities, necessitating more efficient use of space. In less environmentally constrained areas, however, local land-use regulations play a significant role in determining lot sizes. Vermont combines extensive land conservation with some of the most stringent MLS requirements in the country, designed to preserve its rural character. The Northeastern Vermont Development Association, for example, mandates a one-acre lot per family even as its densest option.

Source: Reason Foundation, using data from Visual Capitalist

Often, MLS requirements vary significantly on a granular level. Let’s consider just one state. Florida has a wide range of minimum lot size requirements, determined either at the county or city level. Figure 2 shows the minimum lot sizes in a single city, Plantation, with just over 100,000 residents.

The patchwork of 18 different districts with 10 different minimum lot sizes in Plantation is not the result of meaningful differences in infrastructure capacity, but rather a legacy of aesthetic preferences. Plantation was founded in the 1950s with a vision of large lots, fruit gardens, and a unique feel for every home. Initial advertisements for the “anti-development” boasted that there would be “a full acre with every home” to prevent crowding, and the zoning code was written to ensure that outcome.

While a large home far from neighbors is the dream for many, codifying this preference into law has downstream consequences that cannot be ignored. Today, the median home sale price in Plantation is $515,000, well over Florida’s median of $404,400. Prices have climbed so high that the city has signed on to a countywide gap-financing effort, despite a history of resistance to affordable housing measures. Crystallized policy has not allowed Plantation to adapt to its modern challenges, highlighting the need for reform.

While their stories may be less well documented, most other localities across the nation look much the same. Each county has its own version of these regulations, leading to MLS standards that can change from city to city and even street to street. As these areas look to tackle their affordable housing challenges, it would be helpful to reassess existing rules, such as minimum lot sizes, rather than trying another complicated and expensive approach.

Source: Reason Foundation, using data from Plantation, Florida’s Use Regulations

MLS regulations and the cost of development   

The cost of housing can be divided into several categories, and the relative proportions depend on many factors: location, current market conditions, and home construction processes all shape the division. While construction is typically cited as the majority of the cost of new housing, evidence from Redfin suggests that, depending on location, the cost of land acquisition can be substantial. Tracking the cost of land as a share of home values across 40 U.S. metros, Redfin finds that land can account for up to 60% of a home’s value. Of the top 20 metros with the highest land-cost-to-home-value ratio, nine were in California, with the rest in other notoriously high-cost areas, including Boston, New York City, and Seattle. Requiring developers to purchase more land than they need influences the types of projects they can take on and the cost of housing down the road.

Research finds that MLS regulations raise housing prices in two ways: directly and indirectly. The direct effect, which accounts for 78% of the cost increase, comes from mandating larger lots and, with them, larger homes, which are naturally more expensive. The indirect effect captures the amenities that larger lot sizes create, for example, less congestion and higher local tax revenue. A 2011 study of homes in the Boston area corroborates this finding, estimating that areas with restrictive MLS regulations have home prices 20% higher than towns without them. Further, regions that tighten their MLS requirements are likely to experience rapid home price appreciation after the restriction takes effect: the study finds that towns experienced home price increases of up to 40% in the 10 years after an increase in minimum lot sizes, controlling for other factors. Importantly, this relationship goes both ways.

Houston, Texas, has among the most liberal land use rules in the country, and this extends to minimum lot sizes. In 2013, Houston expanded a previous policy, allowing lots as small as 1,400 square feet across most of the city. This reduction is credited as one reason home prices in Houston, even in its urban core, have remained lower than in comparable metros across the country. Flexible policy has made Houston resilient in a time of strain. Minimum lot size rules raise housing costs across the board, and their impact falls hardest on entry-level homes.

Small single-family homes, often referred to as starter homes, have been hit especially hard by large lot size regulations, contributing to the phenomenon of the “missing middle.” This phrase refers to the decline of middle-density development affordable to middle-income earners. By mandating expensive land purchases, MLS rules make small housing unprofitable to build. Simply allowing more homes per acre could yield massive supply additions aimed specifically at middle-income buyers. Estimates by the American Enterprise Institute (AEI) find that single-family homes were built at an average rate of 5.5 units per acre (about 8,000 square feet per lot) from 2000 to 2024. Even a modest increase in density to eight units per acre (about 5,400 square feet per lot) could have added 4.8 million units over this period. These findings are not just theoretical; developers respond to the underlying profit incentives. Data from the U.S. Census Bureau show that between 2009 and 2024, the percentage of single-family homes built on lots of 7,000 square feet or smaller rose from 25% to 39%, indicating that developers want to supply these density levels when they are allowed to do so.
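As a quick sanity check on those per-lot figures, a minimal sketch (the 43,560 square feet per acre conversion is standard; the gross densities are AEI’s, and this ignores land consumed by streets and easements):

```python
SQFT_PER_ACRE = 43_560  # standard acre-to-square-feet conversion

def lot_size_sqft(units_per_acre: float) -> float:
    """Average lot size implied by a gross density of units per acre."""
    return SQFT_PER_ACRE / units_per_acre

print(round(lot_size_sqft(5.5)))  # 7920, i.e. the ~8,000 sq. ft. figure
print(round(lot_size_sqft(8)))    # 5445, i.e. the ~5,400 sq. ft. figure
```

The difference between the two lot sizes, roughly 2,500 square feet per home, is the land that the AEI estimate converts into the 4.8 million additional units.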

While large homes and neighborhood amenities are desirable for many, they should not be the only option. When land is a primary expense, requiring developers to purchase more of it than they need is a substantial barrier to new supply—especially for lower cost options. In combination with other regulatory reforms, many high-cost areas would benefit from allowing smaller land parcels for housing development.

Policy recommendations

Fully addressing current housing challenges requires not only expanding total housing supply but also enabling the construction of the types of homes that reflect the preferences of buyers, particularly single-family housing in the form of starter homes. Reforming minimum lot sizes will lower prices for many housing types, and it is an especially critical step toward opening the single-family market.

Key points

As policymakers work to liberalize land use in their communities through MLS reform, here are several key points to consider:

#1 Reduce and standardize minimum lot sizes

Where central water and sewer connectivity is possible, states and communities should reduce and standardize their MLS to modest entry-level sizes. While different communities will have varying levels of willingness, any change in regulations could increase the supply of affordable units. By reducing and standardizing, states can create a predictable development atmosphere and enable more efficient use of space where desired. Standardization also aids city planners in accounting for utilities, one of the primary motivations behind MLS regulations from a planning perspective. Importantly, reducing lot size minimums does not mean that every neighborhood will be dense—just that developers can offer more options to prospective residents.

#2 Couple MLS and dimension requirement reform

While clear MLS reform is the top priority in this area, these reductions must be supported by accompanying dimension reforms. For example, setback rules specify how far a structure must be from the edge of the property line. Setbacks are often justified on grounds of fire safety or privacy, but are frequently used in practice to maintain a suburban or rural feel for communities. If safety were the true motivation, it raises the question of why urban areas with far smaller setback requirements are not viewed as comparably at risk. Much like MLS, these laws mandate inefficient use of space and result in higher prices for residents. Reducing minimum lot sizes should be paired with adjustments to setback requirements to ensure land is used efficiently.

Further, square-footage requirements for structures restrict small housing options, like tiny homes (defined as having 400 square feet or less of living space). While some areas have changed their square-footage minimums to accommodate these housing options, many maintain larger minimums. For small lots to make sense, they must be paired with small homes. Together, MLS and dimension reforms enable compact development.

#3 Allow lot-splits by right

Lot splitting is dividing one existing lot into two or more lots. Allowing lot splits is critical for infill development and complements MLS reform by creating options for existing neighborhoods. For example, lot splits are a great way to integrate smaller housing options, like tiny homes and accessory dwelling units. Because adjusting minimum lot sizes only affects future development on raw land, it may fail to capture areas that already have homes but include property owners interested in more density. Allowing this practice by right is a valuable tool to ensure MLS reform reaches its full potential.

Recent lot size policy reform: Case studies

As housing becomes an increasingly relevant policy action item, several states and localities have reformed their MLS regulations. Below are three recent examples of putting the policy recommendations above into practice.

Maine, House Paper 1224 (2025)

In April 2025, Maine passed House Paper 1224, banning any municipality from setting an MLS requirement greater than 5,000 square feet per dwelling unit in areas connected to central water and sewer systems. Importantly, the bill includes a provision that setback and other dimension requirements cannot be stricter for multifamily housing than for single-family housing, though no specific maximum is given. In addition to reforming lot size requirements, the bill loosens density rules and allows affordable housing projects an additional story above the existing local height limit. HP 1224 ranks among the most comprehensive, clear, and far-reaching statewide minimum lot size reforms to date.

Texas, Senate Bill 15 (2025)

Texas’ 2025 Senate Bill 15 caps minimum lot sizes at 3,000 square feet for single-family lots in new subdivisions larger than five acres. These provisions apply only to municipalities with more than 150,000 residents that lie at least partially in counties with over 300,000 residents. In qualifying municipalities, SB 15 also imposes maximum setback limits. As of 2025, municipalities within 19 of Texas’ 254 counties meet the county population requirement to be subject to SB 15. While this change may seem incremental, passage of the bill represents a major win for Texas. Its provisions were contentious and were revised from an initially proposed 1,400-square-foot cap due to pushback from House members. While not as sweeping as first intended, SB 15 opens the door for further reform and establishes a significant win for efficient use of land in the populated areas where Texans need it most.

The City of Pittsburgh, Pennsylvania, Legislation 1579 (2025)

In Pittsburgh, Pennsylvania, a recent MLS reform reduced requirements across all existing subdistricts, effective May 7, 2025. Table 1 includes the change in minimum lot size per dwelling unit for each subdistrict.

Table 1: Pittsburgh Minimum Lot Size Reform by Subdistrict

Density Subdistrict | MLS before 5/7/2025 (sq. ft.) | MLS effective 5/7/2025 (sq. ft.)
Very-Low Density | 8,000 | 6,000
Low Density | 5,000 | 3,000
Moderate Density | 3,200 | 2,400
High Density | 1,800 | 1,200
Very-High Density | 1,200 | No minimum lot size

Source: The City of Pittsburgh

While Pittsburgh’s current median home price is $235,000, which is well below the national average, prices in the city are rising. Through this legislation, lawmakers are taking active steps to ensure residents and developers are not faced with arbitrary hurdles.

Conclusion

Reducing MLS requirements does not mean mandating density or erasing existing neighborhood character. Instead, it provides flexibility, allowing communities to grow with their needs and respond to housing challenges as they arise. Buyers need these smaller homes, and developers are willing to answer. Reducing these regulations could substantially increase supply in the coming years through medium-density development. Homeowners can still enjoy traditional large single-family options, but others gain access to homes that better meet their needs, desires, and budgets. Given current home prices, mandating inefficiency through outdated lot size regulations is no longer a viable option.

California’s state and local pension plans have over $265 billion in debt
https://reason.org/commentary/californias-state-and-local-pension-plans-have-over-265-billion-in-debt/
Fri, 05 Dec 2025

California’s public pension plans are taking on more risk than other pension systems while generating relatively poor investment return results, a new Reason Foundation report finds. The California Public Employees’ Retirement System, CalPERS, and California State Teachers’ Retirement System, CalSTRS, are the nation’s two largest government-run pension funds, overseeing $558 billion and $382 billion in assets, respectively.

As it stands, the report finds, California’s state and local governments have the most public pension debt in the country, with total unfunded pension liabilities of more than $265 billion. That’s over $6,000 in pension debt for every state resident. CalPERS has $166 billion in unfunded liabilities, and CalSTRS has $39 billion.
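The per-resident figure follows directly from the headline debt number; a minimal check, where the roughly 39.4 million population figure is an assumption based on recent Census estimates rather than a number from the report:

```python
total_unfunded = 265e9   # total unfunded liabilities, in dollars
ca_population = 39.4e6   # assumed California population (approximate)

per_resident = total_unfunded / ca_population
print(round(per_resident))  # 6726, i.e. "over $6,000" per resident
```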

Since pension benefits promised to government workers are constitutionally protected, taxpayers are on the hook for that debt. In the years ahead, paying this pension debt will consume an ever-larger portion of state and local budgets.

So, to pay for retirement promises already made to government workers while also hoping to keep costs down, public pension systems are chasing new investment return strategies and targets. Worryingly, California’s over-reliance on high-risk, high-return strategies could result in overwhelming losses, a burden that taxpayers would ultimately bear.

Historically, pension plans have relied on traditional investments like publicly traded stocks and bonds. Many plans are now moving away from this strategy and dedicating more assets to higher-risk alternatives, such as real estate, hedge funds, private equity, and commodities. For these assets, it can be challenging to obtain accurate market value information, and reporting periods lag behind those of traditional investments.

The Reason Foundation finds that in 2001, only 11% of California’s pension assets were allocated to alternative investments. However, by 2024, this share had increased to 37%, which is the 18th-highest in the nation.

CalPERS has more than doubled its allocation to private equity over the last four years (from 6.3% of total assets in 2020 to 17% in 2024), and it plans to expand further, so that private assets (private equity and private debt) make up 40% of its portfolio.

As it attempts to make up for the failure to set aside enough money to pay promised retirement benefits, CalPERS is moving away from safer, more predictable investments in the hope of better returns from riskier options. Those options charge high fees, offer less transparency, and carry more risk and volatility that could leave taxpayers holding the bag.

With debt and costs rising, the pressure to take public pensions in this direction is strong because investment outcomes greatly impact overall funding progress and contribution requirements. The pressure is also increasing because California’s pensions have generated investment returns that fall below those of other pension systems nationwide.

Over the past 20 years, CalPERS achieved an average return of 6.8%, and CalSTRS achieved 7.6%, both of which are far below the S&P 500 average of 10.4% for the period.
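To see how much a seemingly small gap in annual returns compounds over two decades, here is a back-of-the-envelope sketch (a hypothetical $100 starting balance at constant average returns; real pension portfolios have contributions, withdrawals, and year-to-year volatility):

```python
# Back-of-the-envelope illustration of compounding at the average
# returns cited above. The $100 starting balance is hypothetical.
def grow(balance, annual_return, years):
    """Compound a balance at a constant annual return."""
    return balance * (1 + annual_return) ** years

calpers = grow(100, 0.068, 20)  # CalPERS' 20-year average return
calstrs = grow(100, 0.076, 20)  # CalSTRS' 20-year average return
sp500   = grow(100, 0.104, 20)  # S&P 500 average for the period

print(f"CalPERS: ${calpers:.0f}, CalSTRS: ${calstrs:.0f}, S&P 500: ${sp500:.0f}")
```

At those rates, the index-tracking balance ends up nearly twice the size of the CalPERS balance over 20 years, which is why even a few percentage points of underperformance translates into large funding shortfalls.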

Even over the last five years, during which CalPERS and CalSTRS have adopted higher-risk strategies in the hope of achieving better investment returns, California still ranked 36th out of 50 states in average investment returns for all public pension plans. California’s average investment return over the past five years was 7.51%, while Nevada ranked first with 9.67% average returns, and Washington state was second with 9.66% average returns for its pension systems during that time.

It is in taxpayers’ best interests for CalPERS, CalSTRS, and other public pension plans to achieve high investment returns, but investment strategies should include a thorough evaluation of the downside risks. Private equity funds charge high fees that primarily benefit fund managers, not retirees or taxpayers. Their accounting practices and market valuations are opaque. They offer the potential for high investment returns, but that potential comes with a high risk of failing to deliver.

Underestimating the risks associated with alternative investments could lead to even more costs. Taxpayers at the state and local level would see more money siphoned away from infrastructure, education, and public safety to make up for investment losses and to pay public pension debt. California’s pension systems should be more cautious about taking risks with taxpayers’ money and workers’ retirement benefits.

A version of this column first appeared at The Orange County Register.

The post California’s state and local pension plans have over $265 billion in debt appeared first on Reason Foundation.

]]>
Interdisciplinary harm reduction: A practical guide https://reason.org/commentary/interdisciplinary-harm-reduction-a-practical-guide/ Thu, 04 Dec 2025 11:30:00 +0000 https://reason.org/?post_type=commentary&p=87218 The goal is to identify where policies may be incongruent, such as through gaps in care, conflicting mandates, or fragmented accountability, and to design coordinated responses that reduce those harms without creating new ones.

The post Interdisciplinary harm reduction: A practical guide appeared first on Reason Foundation.

]]>
Public policy often approaches complex problems as if they can be neatly separated into specific categories, like public health, education, housing, transportation, or justice. Each agency develops solutions within its own silo, narrowly focused on its own specific outcomes of interest. 

While this specialization can increase efficiency, it also leads to significant institutional blind spots. In reality, people do not live within administrative divisions. The conditions that shape a person’s life—where they live, learn, work, and seek care—are deeply intertwined. As a result, a policy that may achieve desired outcomes in one department can unintentionally create harmful consequences in another, ultimately undermining broader goals of improving health and well-being.

For example, a city might fund a highly structured addiction treatment program that integrates counseling, medication, and case management. Yet without stable housing or employment opportunities, even the most effective interventions can falter once patients leave care. A state might pass legislation to improve public safety by increasing penalties for public drug use or expanding police authority to clear encampments. But without concurrent mental health and housing coordination, enforcement can produce the opposite of its intended outcome. Cities that increase enforcement without increasing services often see more frequent crisis calls, higher incarceration rates, and repeated emergency department visits, because individuals are cycled through short-term punitive responses instead of being stabilized through treatment, housing, or crisis-care coordination. These policy mismatches are a direct result of siloed policymaking, which is built to solve isolated problems rather than address the overlapping complexities of human behavior and institutional systems.

An interdisciplinary harm reduction approach identifies where policies intersect, overlap, or conflict, showing how siloed decisions can generate unintended harms elsewhere. It asks policymakers to view every issue as part of a larger ecosystem—what public health professionals call a “continuum of care.” The goal is to identify where policies may be incongruent, such as through gaps in care, conflicting mandates, or fragmented accountability, and to design coordinated responses that reduce those harms without creating new ones elsewhere. Though harm reduction is often associated with drug policy, its logic is conceptually applicable across disciplines. It is a pragmatic framework for thinking about risk mitigation that recognizes that human beings are not automatons and that each person makes discrete decisions based on their own circumstances, background, and perceptions. A harm reduction approach doesn’t attempt to craft policy for a conceptualized version of humanity, but caters to the needs of real human beings by prioritizing practicality, coordination, and evidence over ideology.

The value of an interdisciplinary approach can be better understood through economist Friedrich Hayek’s work on imperfect knowledge. Hayek argued that no single entity—whether a government agency, a business owner, or an expert committee—possesses all the information needed to make perfect decisions. Knowledge is distributed across countless individuals and institutions and is constantly in flux. This means that sound policymaking cannot rely on centralized control but must instead employ mechanisms that facilitate information sharing, test ideas in real-world conditions, and adapt based on feedback. While harm reduction does not originate from Hayek’s theories, an interdisciplinary harm reduction framework reflects this same insight. It brings together actors from different systems to identify shared goals, map where policies overlap, conflict, or create gaps, and build solutions that are both pragmatic and self-correcting.

In some arenas, these ideas are already being put into practice. For example, when police officers are trained in harm reduction principles, such as recognizing overdose symptoms, using naloxone, and collaborating with health providers, enforcement becomes more effective and safer for both patients and officers. When cities apply behavioral insights to design roads that naturally cue drivers to reduce speed—like using roundabouts instead of traditional intersections, as Golden, Colo., did—speeds and crash severity decline without relying on police presence. In healthcare, supervised consumption sites in Calgary, Alberta, Canada, have managed overdoses on-site, preventing deaths while reducing ambulance calls by 700 each year and saving more than $2.3 million annually in emergency costs. These examples spanning different sectors share the same underlying logic: measure concrete outcomes, coordinate across systems, and reduce avoidable harm.

This same logic can be successfully applied to housing, urban planning, education reform, governance, and beyond. By aligning their goals, data, and evaluation methods, agencies can prevent duplication, save public resources, and craft policy approaches that reinforce, rather than undermine, one another. 

Reason Foundation’s Interdisciplinary Harm Reduction Framework is built on that logic. Drawing on established models—including the National Harm Reduction Coalition’s core principles, continuum-of-care approaches used in public health, and Continuous Quality Improvement methods—it defines harm reduction as a pragmatic and evidence-informed approach to reducing avoidable harms across multiple areas of public policy, including health, housing, education, technology, finance, governance, and public safety. The framework provides policymakers with a guide to identify preventable harms, design proportionate responses, and evaluate their effectiveness in reducing risk for individuals and communities. Ultimately, it moves harm reduction policy design from theory to practice, creating a shared, interdisciplinary language for effective and measurable reform.

How to use this framework

This guide provides a clear explanation of the Interdisciplinary Harm Reduction Framework and its application across different areas of public policy. We begin by outlining the framework’s core principles and defining each one in the context of real-world decision-making. We then walk through the process of operationalizing these principles, offering a step-by-step guide for identifying harm, designing proportionate interventions, aligning incentives, and measuring outcomes. Each section is designed to be accessible for readers, whether or not they have a background in harm reduction or public policy. The ultimate goal is to translate this framework into a practical decision-making tool applicable to any policy area, from health and housing to education, governance, and technology.

Core principles

1. Outcome-Informed Decision-Making: An effective harm reduction approach must be grounded in reliable data, empirical research, and rigorous evaluation. This means prioritizing interventions with a demonstrable record of success in real-world conditions, using measurable indicators of harm reduction to track progress, and maintaining a willingness to adapt as new evidence emerges. Simultaneously, policies must proactively anticipate and minimize unintended consequences, such as fueling illicit markets, displacing harms to other populations or settings, or creating perverse incentives. This requires both pre-implementation analysis and ongoing monitoring to identify and correct harmful trends early. The emphasis should be on facts over ideology, ensuring that policy choices remain tethered to outcomes rather than political whim.

2. Risk Minimization Without Blanket Restrictions: This principle advocates for policies aimed at reducing the severity and likelihood of preventable harm without resorting to one-size-fits-all or authoritarian policy interventions. Overly broad restrictions affect entire populations, often imposing costs on the majority because a relatively small minority engages in higher-risk behaviors or encounters higher-risk conditions. A harm reduction approach focuses instead on identifying higher-risk individuals and areas to tailor interventions to have the greatest positive impact without unnecessarily limiting the freedoms of the general public.

3. Individual Autonomy and Voluntary Action: This principle prioritizes empowering people to make voluntary, informed choices about their own lives, so long as those choices do not cause direct and demonstrable harm to another person. Rather than relying on coercive mandates, the focus is on removing barriers to support and safeguarding personal agency. This allows individuals to voluntarily adopt safer behaviors when they are ready. This approach also recognizes that individual decisions can have ripple effects for families, communities, and broader society, and that these effects must also be addressed to strengthen both personal and collective outcomes. Lasting change is most effective when it is chosen willingly, not compelled. This principle acknowledges that responsibility for outcomes ultimately lies with individuals.

4. Targeted, Context-Specific Solutions: One-size-fits-all approaches are rarely effective and impose high costs, burdens, and harms on the general public. Harm reduction requires a nuanced understanding of specific communities, environments, and markets to tailor strategies that meet their unique needs. Whether applied to health, housing, finance, or technology, interventions should be proportional to the scale of the problem, appropriate for the target population, and feasible for sustained implementation.

5. Cross-Disciplinary Application: Harm reduction needn’t be confined to public health and drug policy. It offers a versatile framework applicable to housing stability, educational access, financial resilience, technology safety, governance reform, and public safety initiatives, among other issues. Viewing harm reduction through multiple policy lenses ensures more comprehensive solutions, prevents siloed thinking, and helps identify overlapping areas where small, well-designed policy changes can yield compounding benefits.

6. Practicality and Real-World Application: Proposed solutions must be operationally feasible, cost-effective, and workable in the real world. This requires an objective assessment of cost-effectiveness to ensure that both public and private resources are directed toward policies that deliver the greatest reduction in harm per dollar spent. Rather than pursuing unattainable ideals, this principle prioritizes tangible, incremental improvements that can be implemented within existing legal, economic, and cultural contexts. The goal is meaningful, sustainable progress over large-scale, disruptive changes that carry a high risk of both failure and unintended consequences.

7. Incentive Alignment: Sustainable harm reduction requires aligning the interests of individuals, communities, and institutions. Policies should be structured so that all stakeholders share a vested interest in achieving positive outcomes. This can be done through market-based incentives, regulatory flexibility, or public–private collaboration. Equally important is ensuring that policies do not create additional harms, allowing harm reduction efforts to gain long-term support based on shared value rather than enforcement or compliance mandates.

Step-by-step operational playbook

A successful operational playbook translates the Interdisciplinary Harm Reduction Framework into a six-step process that moves from problem identification to coordinated solution implementation. It begins with defining the policy problem and desired outcome, clarifying the harm being addressed, what measurable improvement looks like, and who is responsible for leading the effort. The next step involves mapping the systems and actors involved to visualize how different agencies, organizations, and individuals interact across health, justice, and community sectors. This step also includes establishing a steering committee composed of representatives from each partner agency and at least one community member with direct experience with the specific issue being addressed (e.g., substance use, homelessness, or navigating the justice system) to guide coordination and monitor progress.

Once these overlapping dynamics are mapped, the process turns to identifying points of risk, friction, or missed opportunity—areas where harm accumulates or coordination fails—and recording them in a simple risk register to ensure accountability. After these risks are identified, teams apply the framework’s principles to decision-making, using the seven harm reduction principles as a lens to test whether proposed actions are practical, proportionate, and evidence-based. The fifth step focuses on designing coordinated interventions and evaluation plans that align funding, roles, and outcomes across systems while creating shared metrics to track progress transparently. Finally, the process concludes with implementation, learning, and adaptation, during which the steering committee meets regularly to review data, adjust strategies based on results, and share updates publicly to promote accountability and continuous improvement.

Step 1. Define the policy problem and the desired outcome

Begin by clearly describing the specific problem and what measurable improvement would look like. Define the harm you are trying to reduce and how success can be measured. Before moving forward, assign a preliminary lead agency and identify all necessary stakeholders that should be involved in defining the problem. Early clarity about ownership of the issue prevents confusion later.

Questions to consider:

  • What harm or challenge are you trying to reduce?
  • Who is most affected, and in what environments or circumstances?
  • What would improvement look like in both the short- and long-term?
  • How will you measure success?

Step 2. Map the systems and actors involved

List and visualize all systems, organizations, and individuals that influence this issue. Include public agencies, community groups, non-governmental organizations, private entities, and informal supports, such as families or peer networks. Mapping reveals how decisions in one area of life can affect outcomes in another. As you map, identify who has authority, who provides data, and who will make final decisions. Assign a sponsor with budgetary or legal authority, an accountable lead for daily coordination, a data steward for evaluation, and at least one community representative to ensure real-world experiences inform every stage of the process.

Questions to consider:

  • Which systems or organizations currently influence this issue?
  • Where do people most often fall through the cracks?
  • Who are the main decision-makers, funders, or gatekeepers?
  • Where do responsibilities overlap or duplicate?

Step 3. Identify points of risk, friction, or missed opportunity

With the systems mapped, identify where harm accumulates or where efforts are misaligned. These are the points where coordination fails, incentives conflict, or barriers prevent access to support. Political or community pressures can also limit coordination, especially when proposed changes are controversial or misunderstood, and these should be identified as part of the same risk landscape. Recognizing these intersections early allows attention and resources to be focused where they can make the greatest impact.

Once identified, document these friction points in a simple tracking table, or “risk register.” For each risk, include its likelihood, impact, early warning signs, mitigation strategy, and responsible party. Review this document regularly in coordination meetings to ensure potential harms are identified early and addressed proportionately.

Questions to consider:

  • Do any current or proposed laws, statutes, or ordinances create barriers to implementing coordinated policies?
  • Where does harm most often occur within or between systems?
  • Are there communication gaps or conflicting priorities among agencies?
  • Do any current policies create or worsen unintended harms?
  • Which groups or communities are most likely to be overlooked?
  • What new risks could arise from this intervention?
  • How will we monitor for unintended effects or privacy issues?
  • Who is responsible for updating the risk register?

Step 4. Apply the framework’s principles to each decision area

Once the risks are identified, use the seven harm reduction principles to guide decision-making on how to address them. This framework is not meant for exclusive use by government officials. It is better understood as a shared checklist that independent actors can use when they convene to weigh tradeoffs, compare options, and discard approaches that do not work in practice. When public agencies participate, their role is primarily to bring partners together, share existing data, and remove unnecessary regulatory or administrative barriers so that those closest to the problem are free to test and refine solutions.

Apply each principle to the systems and decisions you have mapped to help ensure that responses are realistic, coordinated, and effective. The principles act as a filter to check whether proposed solutions reflect outcome-based, context-specific, and collaborative thinking grounded in local knowledge rather than top-down assumptions.

Every principle should be reviewed through the lens of those directly affected and those implementing support on the ground. Invite both service recipients and frontline practitioners to comment on how each principle applies in practice. When discussing context-specific design, confirm that diverse populations and geographic realities are represented.

Questions to consider:

  • Are desired outcomes clear, measurable, and evidence-based?
  • Is the proposed intervention proportional to the level of harm?
  • Does it respect individual choice and autonomy?
  • Is the approach tailored to local needs and contexts?
  • Are agencies and partners collaborating toward a shared goal?
  • Can it be implemented with available capacity and resources?
  • Are incentives aligned to reinforce positive outcomes rather than process?
  • Have affected communities been asked how proposed changes may impact them?
  • What accommodations are needed for language, disability, or access?
  • How will feedback be tracked and reported back?

Step 5. Design coordinated interventions and evaluation plans

With the principles applied, move from mapping to planning. Develop coordinated interventions across systems, assign clear roles, and clarify how each participating organization chooses to contribute. In an interdisciplinary harm reduction landscape, partners include public agencies, private providers, philanthropic funders, and community organizations. Each of these actors controls its own mission, budget, and internal accountability structures. Public officials may revise the way public programs are funded, contracted, or evaluated, but they do not direct or supervise the internal operations of independent institutions.

Within that constraint, “aligning funding” means using the tools that each actor legitimately controls to support the shared goals identified in earlier steps. Public agencies can decide how to structure their own grants, contracts, or reimbursement rules so that public dollars reward reductions in avoidable harm rather than simple service volume. Philanthropic organizations can voluntarily support parts of the effort that align with their missions. Service providers and community groups can decide how to allocate their own staff time and resources to participate in the coordinated response. No single institution sets funding levels for the others. Coordination emerges because different actors see value in the shared objectives and choose to orient some of their resources toward them.

Accountability is created similarly. Each partner remains accountable first to its own constituents, boards, donors, or voters. To make collaboration workable, partners can record their voluntary commitments in simple memoranda of understanding, contracts, or grant agreements that specify who is responsible for which activities and what indicators will be used to judge success. Where public funds are involved, outcome measures and reporting expectations should be defined clearly and published in advance, so that participation is both informed and voluntary. For purely private or philanthropic efforts, this framework still offers a template that organizations can adopt internally to clarify expectations and track results.

Once roles and commitments are clear, establish a shared evaluation plan that integrates information from these efforts and tracks progress across relevant sectors, not just within a single agency. The goal is to create a transparent picture of whether the overall approach is reducing harm, while respecting the independence of each participating institution.

Establish a feedback loop where results, risks, and community feedback are reviewed together at defined intervals. This integrated review structure replaces fragmented reporting and ensures that decisions remain transparent and data-driven.

Questions to consider:

  • Who will lead and coordinate implementation across systems?
  • How will roles and responsibilities be shared?
  • What data or evaluation tools will be used to track progress?
  • How will feedback and learning be used to improve the program over time?
  • What process is in place for identifying and correcting unintended harms?

Step 6. Implement, learn, and adapt

Implementation should include a standing review meeting—monthly during pilots—to compare data to benchmarks, discuss new risks, and document lessons learned. Decisions about scaling up, sustaining, modifying, or stopping an initiative should be based on those reviews, not on intuition or politics. Publish concise progress reports regularly so partners and the public can follow the evidence and stay invested.

Questions to consider:

  • Are we meeting regularly enough to detect problems early and adjust accordingly?
  • What evidence or benchmarks will guide decisions about scaling, modifying, or discontinuing the intervention?
  • How will we document lessons learned so they meaningfully inform future decisions?
  • Are any political, organizational, or resource pressures influencing implementation decisions?
  • How will we ensure transparency so partners and the public can track progress?
  • Do we have a clear process for deciding when and how to adapt the approach if circumstances change?

Hypothetical example: applying the framework to post-release overdose prevention

This section demonstrates how the Interdisciplinary Harm Reduction Framework can be applied to a real-world issue: preventing overdose deaths among people recently released from prison.

Step 1. Define the policy problem and desired outcome

In this example, we begin with a clear definition of the harm to be addressed: the sharp rise in overdose deaths during the first two weeks after release from prison, a period when overall mortality can be up to 10 times higher than in the general population and overdose mortality up to 15.5 times higher.

In one Colorado cohort of 905 people released from state prison, nearly 78 percent had a chronic medical or psychological condition, yet only about 10 percent had even a single outpatient visit within 30 days of release, and only 31 percent used any health service at the main safety-net system within 180 days. Upon release, individuals frequently face delays in reinstating Medicaid coverage, securing stable housing, and reconnecting with treatment providers, and they often lose access to the medication, housing, and support networks they once had, disrupting the continuity of care.

These administrative and logistical barriers create dangerous interruptions in care precisely when overdose risk is highest. Using the framework, policymakers first define the problem as avoidable harm linked to gaps in post-release coordination. The desired outcome might be to reduce fatal and non-fatal overdoses within 90 days of release and increase access to and voluntary use of medication for opioid use disorder (MOUD).

Applying the principle of outcome-informed decision-making, the team might identify measurable targets as: (1) a 15 percent reduction in 90-day overdoses; (2) a 20 percent increase in MOUD initiation within 14 days of release; and (3) a decrease in emergency department visits or emergency calls related to overdose. These outcomes are clear, evidence-based, and trackable across systems.

Step 2. Map the systems and actors involved

Mapping this issue involves correctional health, probation, public health, community clinics, pharmacies, emergency medical services, and peer recovery organizations. It demonstrates that, while each system plays a role, none is responsible for the transition from custody to care, revealing a high-risk gap at the point of release.

To operationalize the principle of cross-disciplinary collaboration, the example establishes a shared governance model for addressing the target problem. The sponsor (county public health) holds decision-making authority and funding. The accountable lead (correctional health) manages daily coordination. The data steward and evaluator ensure data integrity and oversight. The team also establishes a steering committee composed of representatives from each lead agency, the data steward, and a community advisor. The committee oversees progress, reviews data, and ensures that decisions remain transparent and evidence-based throughout the project. This clear structure transforms the mapping exercise into a functional plan for coordination.

This shared governance structure reflects real-world models that have already reduced deaths after release from prison. For example, Rhode Island’s statewide corrections-based MOUD program is sponsored by a cross-agency overdose task force, with the Department of Corrections as the operational lead and community treatment providers and public health officials jointly responsible for data and evaluation. In that program, everyone entering custody is screened for opioid use disorder, offered all forms of medication treatment while incarcerated, and connected to community clinics and Medicaid coverage before release. Evaluations found that this coordinated approach was associated with a roughly 61 percent reduction in overdose deaths among people recently released from incarceration and a 12 percent decline in overdose fatalities statewide, illustrating how clearly defined roles, shared accountability, and continuous data review can translate into measurable reductions in avoidable harm.

Step 3. Identify points of risk, friction, and missed opportunity

Once the systems are mapped, policymakers can then identify key friction points where harm accumulates. In our example, the team identifies several: evening releases that occur after treatment clinics and community providers have closed, leaving individuals without immediate access to medication or follow-up care; inconsistent naloxone access; inadequate data exchange between correctional facilities, community health providers, and social service agencies; and stigma encountered during the initial stages of treatment engagement in the community.

Each of these issues is logged in the risk register with ratings for likelihood and impact, early indicators, and assigned mitigation responsibilities. For example, risks tied to evening releases may be reduced through partnerships with mobile response teams, while data-related risks are mitigated by implementing role-based access to shared records to protect privacy and improve continuity of care. This keeps risk management transparent, targeted, and proportionate to actual harm.
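As a sketch, the risk register described above can be kept as simple structured records. The field names and the 1-to-5 scoring scale below are illustrative assumptions, not part of the framework itself; the two sample entries are drawn from the worked example’s friction points:

```python
from dataclasses import dataclass, field

# Minimal risk-register sketch following the fields named in Step 3
# (likelihood, impact, early warning signs, mitigation, responsible
# party). The 1-5 scales and entries below are illustrative only.
@dataclass
class RiskEntry:
    description: str
    likelihood: int                 # 1 (rare) to 5 (near-certain)
    impact: int                     # 1 (minor) to 5 (severe)
    warning_signs: list[str] = field(default_factory=list)
    mitigation: str = ""
    owner: str = ""                 # responsible party

    @property
    def priority(self) -> int:
        # Simple likelihood-times-impact score to sort review agendas.
        return self.likelihood * self.impact

register = [
    RiskEntry("Evening releases occur after clinics close",
              likelihood=4, impact=5,
              warning_signs=["release logs show after-hours discharges"],
              mitigation="Partner with mobile response teams",
              owner="Correctional health"),
    RiskEntry("Data not shared between jail and community providers",
              likelihood=3, impact=4,
              mitigation="Role-based access to shared records",
              owner="Data steward"),
]

# Review highest-priority risks first at each coordination meeting.
for entry in sorted(register, key=lambda e: e.priority, reverse=True):
    print(entry.priority, entry.description)
```

Keeping the register in a shared, sortable form like this makes the regular coordination-meeting review concrete: the agenda is simply the entries in descending priority order.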

Step 4. Apply the framework’s principles to decision areas

This step illustrates how the framework’s principles inform design choices:

  • Outcome-informed decision-making anchors each intervention to a specific measure.
  • Risk-minimization keeps the focus on key transition moments without adding barriers.
  • Individual autonomy ensures the program remains voluntary and participant-driven.
  • Targeted, context-specific solutions allow scheduling and staffing to adapt to local needs.
  • Cross-disciplinary collaboration connects correctional, clinical, and community systems.
  • Practicality and real-world application keep interventions feasible with existing resources.
  • Incentive alignment ties payments to performance measures, including successful post-release care coordination, treatment initiation, and retention in recovery services.

In the worked example, these principles directly shape the policy response: naloxone is offered at release, next-day MOUD appointments are reserved, peer recovery coaches facilitate linkage, and data dashboards track both health and justice outcomes.

Embedding input from people who have personally navigated the reentry process is also built into this step. The framework emphasizes participation from those most affected. In this example, individuals who have recently been released from custody review program materials, test the discharge workflow, and highlight gaps such as transportation and stigma. Their feedback is formally documented and integrated into revisions, making engagement an accountability tool, rather than a symbolic exercise.

Step 5. Design coordinated interventions and evaluation plans

Here, the framework moves from planning to execution, including a pilot study for the proposed interventions. The mapped systems and agreed principles guide the design of an integrated pilot:

  • Screening and identification: At the time of incarceration, individuals are screened for opioid use risk during the correctional health intake process and monitored throughout custody and release.
  • Harm reduction at transition: Naloxone is provided at release, with a brief training before discharge.
  • Linkage to treatment: Peer recovery coaches meet people at release or within 24 hours to connect them with clinics.
  • Continuity of care: To prevent treatment interruption, pharmacies issue short-term bridge prescriptions, which are temporary supplies of medications like buprenorphine, to cover the period between release and a confirmed clinic appointment.
  • Monitoring and evaluation: Public health and correctional partners share de-identified data through a secure dashboard.

Evaluation follows the framework’s rule of evidence before expansion. The pilot uses a stepped-wedge design, which means the program is rolled out in phases—starting with one jail and gradually expanding to the others. This allows researchers to compare outcomes before and after implementation at each site and see whether improvements, such as fewer overdoses and stronger treatment connections, are linked to the program rather than other changes over time.
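
The stepped-wedge rollout can be illustrated with a short sketch. In this design, every site begins in the control condition and crosses over to the intervention in a staggered order, so each jail contributes both pre- and post-implementation observations. The site names and number of periods below are hypothetical.

```python
# Generate a stepped-wedge schedule: site i crosses over from "control"
# to "intervention" starting in period i + 1, so the rollout is phased.
def stepped_wedge_schedule(sites, periods):
    return {
        site: ["control" if p <= i else "intervention" for p in range(periods)]
        for i, site in enumerate(sites)
    }

schedule = stepped_wedge_schedule(["Jail A", "Jail B", "Jail C"], periods=4)
for site, conditions in schedule.items():
    print(site, conditions)
```

Because every site eventually receives the program, the design sidesteps the ethical problem of withholding a promising intervention while still allowing before/after comparisons at each site.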

Step 6. Implement, learn, and adapt

The final stage in the framework emphasizes learning as an ongoing function. In the worked example, the steering committee meets monthly to review performance data, risk indicators, and community feedback. New challenges, such as transportation gaps or clinic delays, trigger minor course corrections. Decisions to expand, sustain, or stop the intervention depend entirely on whether the predefined data-driven outcomes are met, ensuring that changes are based on evidence rather than assumptions. Transparent reporting ensures that progress, setbacks, and adaptations are documented and shared with partners and the public.

Outcome of the example

If the coordinated pilot is implemented effectively, the county might see promising indicators within the first year—more people accessing treatment, fewer overdose-related emergency responses, and improved coordination across systems.

However, if these outcomes do not materialize, the framework still provides a structure for identifying where breakdowns occurred, what barriers—political, operational, or resource-related—interfered, and how the approach should be adapted or scaled back. The purpose of the example is to illustrate how the framework guides both improvement and course correction.

Final note for policymakers and advocates

This framework is both a mindset and a method. It encourages policymakers to move beyond assumptions toward evidence, collaboration, and continuous learning. By clearly defining harms, designing proportionate responses, measuring outcomes, and adjusting based on results, public systems can reduce avoidable suffering and wasted public resources while preserving choice, privacy, and dignity.

The goal is progress that is practical, measurable, and humane. When public responses expressly recognize that knowledge is dispersed across individuals and institutions, approaches can be tested through evidence and refined through feedback, officials are able to not only reduce harm but also strengthen trust and accountability across every system they touch.

The post Interdisciplinary harm reduction: A practical guide appeared first on Reason Foundation.

Why teacher salaries are stagnant https://reason.org/commentary/why-teacher-salaries-are-stagnant/ Thu, 04 Dec 2025 11:00:00 +0000 https://reason.org/?post_type=commentary&p=87044 That teachers’ wages have stagnated over two decades of growth in public school funding highlights deep structural problems in K–12 finance.

The post Why teacher salaries are stagnant appeared first on Reason Foundation.

A large body of research shows that effective teachers are the most important school-related factor in determining student success, making teacher compensation a key policy lever. “We need to pay all teachers more—and effective teachers even more,” said Heather Peske of the National Council on Teacher Quality in a recent SCHOOLED debate on teacher pay.

Peske has a point. The nationwide average teacher salary fell by over 6 percent between 2002 and 2022, going from $75,152 to $70,548 in 2023 dollars, according to new Reason Foundation research. In total, inflation-adjusted teacher salaries fell in 40 of 50 states, as shown in Table 1.
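
As a quick check, the decline follows from the standard percent-change formula applied to the inflation-adjusted figures above; the sketch below is purely illustrative.

```python
# Percent change in the inflation-adjusted average teacher salary (2023 dollars).
salary_2002 = 75_152
salary_2022 = 70_548

pct_change = (salary_2022 - salary_2002) / salary_2002 * 100
print(f"{pct_change:.1f}%")  # prints "-6.1%": a decline of just over 6 percent
```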

It’s also a fact that teacher salaries are tied to educational attainment and years of experience, meaning that high-performing teachers—and those in shortage areas like math, science, and special education—aren’t paid more for their results or expertise.

To put the right incentives in place, bold teacher pay reforms are needed. But to maximize the long-term impact of these policies, it’s important to address the root causes of stagnant salaries. Examining data both before and after the COVID-19 pandemic reveals structural problems in K–12 finance that keep dollars out of teacher paychecks.

Table 1: Inflation-adjusted average teacher salary growth (2002 to 2022)


Teacher salaries before COVID-19

Nationwide, inflation-adjusted teacher pay was flat in the nearly two decades leading up to the pandemic, with the average salary falling by 0.6 percent between 2002 and 2020. While a handful of states saw big swings—salaries rose by 22 percent in Washington while falling 19 percent in Indiana—most states saw moderate changes, ranging from -5 percent to +5 percent over that period.

Remarkably, teachers’ salaries weren’t increasing at a time of unprecedented growth in public school funding, which rose by 25 percent per student as all but one state boosted K–12 spending from 2002 to 2020. Figure 1 shows the growth in K–12 revenue and teacher salaries during this period.

With education funding at record levels, why wasn’t more money going to teacher paychecks?

Figure 1: U.S. revenue per student growth vs. average teacher salary growth (2002-2020, inflation-adjusted)


A surge in non-teaching staff

One reason teacher pay didn’t grow with K–12 spending is that public schools spent heavily on hiring non-teaching staff. From 2002 to 2020, as student enrollment grew by 6.6 percent, non-teaching staff expanded by 20 percent. For every five new students, public schools added about one non-teacher. In comparison, the number of classroom teachers rose by 6.6 percent—mirroring enrollment growth, but well below growth in all other staff.

Figure 2 shows the growth of public-school staff by position type. The largest growth category was in student support, which includes social workers, psychologists, speech-language pathologists, and other positions. A nearly identical number of instructional aides—paraprofessionals who assist teachers and often work with students with disabilities—were also added to public school payrolls. Taken together, the data suggest that special education and a greater emphasis on wraparound services played large roles in the growth of non-teaching staff.

Figure 2: Public school staffing growth by position type (2002–2020)


Rising teacher pension debt

Another expense that diverted funding from teacher salaries was employee benefits, a Census Bureau category that includes pensions, Social Security, health insurance, and other costs. Between 2002 and 2020, benefit spending per student rose by 79 percent in inflation-adjusted dollars. While salary and wage spending increased by $573 per student, benefit spending increased by $1,745 per student. Research indicates that rising teacher pension costs were the primary factor behind this trend. States have failed to set aside enough money to pay for the pension benefits promised to teachers, resulting in unfunded liabilities that accumulate over time.

Pension debt can add up to big bucks: In 2024, the Teacher Retirement System of Texas had an estimated $62.6 billion in unfunded liabilities, while the California State Teachers’ Retirement System had $85.5 billion in debt. These costs are usually paid for by increasing school districts’ and teachers’ contribution rates to the pension plans. As a result, K–12 funding that might go to salaries has increasingly been directed toward pension costs, even as many states have reduced teachers’ retirement benefits.

Before the pandemic, the story was relatively straightforward: Teacher salaries stagnated despite significant increases in public school funding, primarily because funds increasingly went to hire non-teachers and cover unfunded pension liabilities. After the pandemic began, however, a different story emerged.

Teacher salaries after COVID-19

After 2020, teacher pay plunged, falling 5.6 percent, from $74,698 in 2020 to $70,548 in 2022 (both in fiscal year 2023 dollars). But this wasn’t because more dollars were going to new hires or benefit spending: the number of non-teachers dropped by 2 percent (before rising sharply again in 2023), and real expenditures on employee benefits inched up by just $39 per student. While teacher turnover during the pandemic might have played a part, it was inflation during those years that took a big bite out of teacher paychecks.

After the onset of COVID-19, price growth reached levels not seen since the 1980s. “We currently face macroeconomic challenges, including unacceptable levels of inflation,” said Janet Yellen, who was Treasury secretary at the time.

The price level during the 2022 school year was nearly 10 percent higher than just two years earlier. Keeping pace with inflation would have required large pay bumps, far more than the 3 percent or 4 percent raises districts typically dole out. Yet this didn’t happen.
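
The squeeze can be made concrete. Using the figures cited above (a roughly 10 percent two-year price-level rise and a typical 3 percent annual raise), real pay falls even though nominal pay rises. This is an illustrative sketch, not district data; the two-year compounding horizon is an assumption.

```python
# Real (inflation-adjusted) pay after two typical raises vs. ~10% cumulative inflation.
salary_2020 = 74_698         # 2020 average salary cited in the article
annual_raise = 0.03          # typical district raise cited in the article
cumulative_inflation = 0.10  # approximate two-year price-level rise

nominal_2022 = salary_2020 * (1 + annual_raise) ** 2
real_2022 = nominal_2022 / (1 + cumulative_inflation)
print(round(nominal_2022), round(real_2022))  # nominal pay rises, real pay falls
```

Under these assumptions, two consecutive 3 percent raises still leave real pay thousands of dollars below its 2020 level.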

While school districts weren’t exactly strapped for cash—education funding rose by 7.4 percent between 2020 and 2022, reaching a new record of $20,097 per student in 2022—most of this $1,386 per-student increase came from federal Covid-19 relief funds. And because these dollars were temporary, many districts were hesitant to allocate them to long-term commitments such as teacher salary increases. Instead, they opted to spend the federal funding on things like building renovations, tutoring, and one-time bonuses.

Groups such as the American Federation of Teachers and the National Education Association lobbied hard for the stimulus funding that ultimately squeezed teachers’ purchasing power. The consensus among economists is that Covid-19 fiscal stimulus—including the $2.2 trillion Coronavirus Aid, Relief, and Economic Security (CARES) Act, signed by President Donald Trump, and President Joe Biden’s $1.9 trillion American Rescue Plan, which sent $122 billion to public schools—helped lift inflation to historic levels. Public school lobbyists won the battle for pandemic relief funding, but that money didn’t increase teachers’ take-home pay even as inflation cut their purchasing power.

Conclusion

Public school staffing decisions and rising pension debt led to teacher salary stagnation in the years leading up to COVID-19. While teacher take-home pay failed to keep up with inflation during the pandemic, the rise in price levels was atypical—and due in part to the stimulus spending that teachers’ unions lobbied for.

For policymakers looking to boost teachers’ salaries today, states like Texas and Arkansas offer bold ideas for targeting dollars to effective teachers and those teaching in shortage areas. But to maximize the long-term impact of such reforms, they’ll also need to pay down pension debt, examine special-education costs, and encourage school districts to prioritize teacher pay over other expenses. That teachers’ wages stagnated over two decades of unprecedented growth in public school funding highlights deep structural problems in K–12 finance that shouldn’t be ignored.

A version of this column first appeared at The Thomas B. Fordham Institute.

San Diego’s government needs more competition, not more taxes https://reason.org/commentary/san-diegos-government-needs-more-competition-not-more-taxes/ Wed, 03 Dec 2025 11:00:00 +0000 https://reason.org/?post_type=commentary&p=87063 San Diego’s rising pension costs and mounting long-term debt are creating significant budget pressures that have city officials turning to tax and fee increases.

The post San Diego’s government needs more competition, not more taxes appeared first on Reason Foundation.

San Diego’s rising pension costs and mounting long-term debt are creating significant budget pressures that have city officials turning to tax and fee increases, such as the recently imposed trash fee on many San Diego property owners.

San Diego’s $2.5 billion unfunded pension liability accounts for about 40% of the city’s total $6.8 billion debt. In the 2024 fiscal year, the city paid about $500 million in pension contributions, nearly 60% of which went toward paying down pension debt. With those liabilities and $540 million in forecasted budget deficits in the coming years, the city has imposed a wave of unpopular taxes and fees on solid waste, parking, hotel stays, and more. Today, taxpayers are more than justified in questioning whether local leaders gave sufficient consideration to spending reductions and service streamlining before raising taxes and fees.

San Diegans felt similarly in 2006 when they passed Proposition C, authorizing “managed competitions” in which private companies would compete against city workers to reduce costs and increase innovation in delivering city services.

Managed competition can be transformational, generating cost savings of 5% to 20%. This technique has been successfully implemented in numerous states and cities, including Phoenix, Charlotte, and Indianapolis.

Former Indianapolis Mayor Stephen Goldsmith told Governing that the robust managed competition program implemented during his terms in the 1990s created over $400 million of value for city taxpayers. In Florida, former Gov. Jeb Bush’s administration conducted more than 100 managed competition initiatives that saved taxpayers more than $550 million.

But managed competition requires leadership committed to driving ongoing improvements, results and value. Politics and special interests can make managed competition challenging to sustain over time, as happened in San Diego.

City leaders didn’t fully embrace the managed competition effort, dragging out its implementation for years. Then they tended to give public employees special treatment relative to private firms regarding their cost estimates to deliver services, projections of future service demand, and the ability to penalize underperformance. It wasn’t a level playing field for private firms, and the city wasn’t pushing for efficiencies from government agencies. Rather than build the capabilities to improve these things, San Diego stopped trying.

From a taxpayer perspective, it is time to give competition another chance.

The rising costs of residential solid waste are a prime example. The private sector already picks up 70% of San Diego’s solid waste from businesses and apartments, with the city’s solid waste operation collecting the remaining 30% from single-family neighborhoods and multiplexes. The city’s costs have gotten so high that it is imposing a new solid waste tax on homeowners. San Diego did not see fit to test the market with the private firms that perform the same job in surrounding cities at significantly lower costs.

Phoenix has been applying managed competition in residential solid waste collection since 1979, dividing the city into zones and competing trash service in each zone every six years. In 2011, Phoenix’s Public Works Department told Government Technology that competition had generated $38 million in cumulative savings to that point.

San Diego’s leaders owe it to taxpayers to test the market and ensure that city workers are performing their jobs at maximum efficiency and at the lowest possible cost. Frustrated San Diegans rightfully wonder why the city didn’t implement this approach instead of raising taxpayers’ costs with the trash fee while continuing to do business with the same city employees. Worse, city officials are executing this implicit city job protection program at a time when every worker hired is adding significant costs and financial risks to San Diego’s already underfunded pension system.

Two decades ago, San Diego’s financial mismanagement earned it the moniker “Enron-by-the-Sea” and prompted taxpayers to demand procurement and pension reforms to save money. But as things improved over time, the city abandoned competition. And after state courts blocked a pension overhaul approved in a landslide in 2012, elected leaders ignored residents’ wishes and made no effort to craft a similar reform.

Instead of asking taxpayers to pay more taxes and fees to cover the city’s spending and debt, San Diego should give managed competition a fair chance to see if government agencies can improve efficiency or if the private sector can deliver better services at lower costs.

A version of this column first appeared at The San Diego Union-Tribune.

The ROAD to Housing Act carries promise but risks bureaucratic expansion https://reason.org/commentary/the-road-to-housing-act-carries-promise-but-risks-bureaucratic-expansion/ Wed, 03 Dec 2025 10:00:00 +0000 https://reason.org/?post_type=commentary&p=87149 While this approach may seem like a balanced first step, it raises important questions about how far federal agencies should go in shaping local decisions.

The post The ROAD to Housing Act carries promise but risks bureaucratic expansion appeared first on Reason Foundation.

Continuing concerns over high home prices have prompted Congress to consider federal solutions. The Renewing Opportunity in the American Dream (ROAD) to Housing Act of 2025 is a broad bipartisan housing bill proposing several responses to the persistent housing shortage. The Senate passed it after it was incorporated into the National Defense Authorization Act in October, and it now awaits approval by the House. The bill’s bipartisan support highlights the urgency of the housing crisis; however, many analysts caution that expanding the federal role in land use carries risks that deserve scrutiny.

One reason the act has gained support is that it avoids preempting local zoning authority outright. Instead of overriding local control, the act focuses on research, guidance, and incentives for localities that choose to reform their zoning and regulatory frameworks. While this approach may seem like a balanced first step, it raises important questions about how far federal agencies should go in shaping local decisions. Incentives and guidance can easily evolve into indirect pressure, administrative burdens, or expectations that narrow the flexibility of states and localities. Even if all the bill’s provisions are implemented, it will not, on its own, significantly reduce price pressures. States and local governments still must reform their restrictive systems.

The ROAD to Housing Act utilizes a range of policy tools, grouped into four general categories: mandated reports, financial incentives for regulatory reform, adjustments to housing finance programs, and updates to existing federal supply-side initiatives. This bill takes the unusual step of focusing on expanding housing supply before turning to subsidy-heavy approaches, which marks a shift from many earlier federal housing proposals. Even with this emphasis on supply, the breadth of the bill makes it difficult to evaluate as a cohesive policy approach, and combining many unrelated programs into a single package increases the risk of mission creep, a problem common across federal housing initiatives.

New reports

A major component of the ROAD to Housing Act is its mandate for a series of reports from the Government Accountability Office (GAO) and the Department of Housing and Urban Development (HUD). Many of the reforms highlighted in these guidelines are supported by evidence, including reducing minimum lot sizes, parking reform, allowing accessory dwelling units (ADUs), and streamlining both zoning and building codes. These reports would be mandated by the Housing Supply Frameworks Act, the portion of the broader bill that directs GAO and HUD to develop the reports and model guidelines, essentially serving as its research and planning section. However, federally curated guidance often becomes an informal standard that localities feel pressured to follow, even when local conditions differ. Analysts at institutions focused on federalism have frequently warned that benchmarking and advisory frameworks can grow into de facto expectations that add new bureaucratic oversight without meaningfully accelerating supply.

However, requiring research and monitoring by HUD and GAO into these reforms is not equivalent to enacting them. Local governments must implement these changes to enable supply adjustment, and that is where they are likely to encounter resistance. Knowing these barriers, this act goes one step further to nudge local governments toward enacting these proposed reforms.

Federal financial incentives for reform

Beyond requiring research, the ROAD to Housing Act establishes several incentives to local governments that expand their housing supply. Most notably, it establishes a $200 million “Innovation Fund,” which will be awarded annually by HUD to local governments that demonstrate measurable supply expansion from 2027 to 2031. Grants will range from $250,000 to $10 million and be awarded to no fewer than 25 recipients annually.

This could encourage cities to take on politically difficult zoning reforms. However, federal grants can also cause jurisdictions to prioritize actions that maximize eligibility rather than reforms that address the most significant structural barriers. Jurisdictions may make symbolic or superficial changes to qualify for funding while avoiding deeper reforms that could truly expand housing options. There is also the possibility that some jurisdictions will benefit from market-driven supply increases unrelated to any policy change, while others with genuine constraints receive little or no support.

In addition, the ROAD to Housing Act establishes several other grant programs to expand home supply through rehabilitation. Notably, the Whole-Home Repairs Act and the Revitalizing Empty Structures Into Desirable Environments (RESIDE) Act give grants and forgivable loans, through differing avenues and terms, to low-income homeowners and small landlords looking to repair old or dilapidated structures. Further, under the Accelerating Home Building Act, grants are provided to local governments to develop pre-approved designs. These grant programs are also to be administered through HUD.

Rehabilitation programs help preserve aging housing and prevent the loss of existing units. Still, they do not meaningfully expand overall supply in markets where zoning and permitting rules limit the addition of new homes. Pre-approved building designs may likewise simplify parts of the construction process, but without broader zoning reform, they will not significantly expand supply. HUD’s growing portfolio of grant programs also raises concerns about administrative complexity.

Mortgage reform

The ROAD to Housing Act includes several demand-side tweaks to the existing housing finance landscape to aid accessibility. Included as part of the act are incentives to increase the role of small-dollar loan originators and the expansion of Title I loans to cover the construction of accessory dwelling units (ADUs) and the purchase or improvement of manufactured homes. Further, this act expands existing financial literacy programs.

While these may help certain borrowers, demand-side tools do not directly address the primary driver of high prices: inadequate supply in many communities. If supply does not increase, new lending programs can unintentionally raise prices by boosting purchasing power without increasing the number of available homes. Because the act also aims to encourage supply-side reform, the risk is smaller than in past demand-driven programs, but it still warrants caution.

Reforming existing housing programs

Finally, this act makes several positive adjustments to existing housing programs. For example, it lifts the cap on the Rental Assistance Demonstration (RAD) program, which allows local Public Housing Authorities (PHAs) to convert public housing into privately managed Section 8 housing and is largely beneficial for tenants. Further, through the Build Now Act, it ties community development block grants, one of the largest federal affordable housing and development programs, to broader housing supply, thereby again incentivizing land-use liberalization. Regarding private investment in affordable housing, it raises the cap on public welfare investments by banks, many of which directly support affordable housing initiatives.

This could encourage better land-use regulation, but it also imposes additional conditions on one of the largest federal development programs. The expansion of caps on public welfare investments for banks will likely increase private capital in affordable housing projects, though it also raises questions about the growing federal influence over private investment decisions.

Conclusion

Taken together, these provisions aim to connect federal programs more directly to local regulatory reform and affordable housing investment. The intent is to support voluntary action rather than mandate it. However, there is a real risk that expanding federal incentives, guidance, and grant programs will overshadow the need for comprehensive local reform. A meaningful improvement in housing affordability still depends on states and cities reducing exclusionary zoning, shortening permitting timelines, and updating outdated building codes.

The ROAD to Housing Act identifies many contributors to high housing costs and encourages local governments to take action. The bill includes several positive elements, especially the emphasis on zoning reform and regulatory streamlining. At the same time, it carries risks of administrative expansion, program duplication, and indirect federal involvement in land-use decisions. A balanced assessment should highlight both the promise and the pitfalls. Federal guidance and financial incentives can only support affordability if they help remove barriers to housing expansion rather than add new layers of oversight. Genuine progress requires local and state governments to confront and reform the regulatory barriers that continue to limit housing supply.

Proposed Model Policy: “Veterans Mental Health Innovations Act”  https://reason.org/backgrounder/proposed-model-policy-veterans-mental-health-innovations-act/ Wed, 03 Dec 2025 00:05:00 +0000 https://reason.org/?post_type=backgrounder&p=87225 This model legislation is intended to authorize state ibogaine research and participation in a larger multistate effort to complete a supervised clinical drug trial.

The post Proposed Model Policy: “Veterans Mental Health Innovations Act”  appeared first on Reason Foundation.

Ibogaine is a psychoactive substance that a growing body of research shows can help treat opioid use disorder, traumatic brain injury, depression, and post-traumatic stress disorder by physically repairing damaged brain tissue. This model legislation is intended to authorize state ibogaine research and authorize participation in a larger multistate effort to complete a Food and Drug Administration (FDA) supervised clinical drug trial.

The trial would seek approval of ibogaine as a treatment for opioid use disorder, depression, post-traumatic stress disorder, and other behavioral health conditions, especially those suffered by military veterans. If the FDA approves ibogaine to treat a medical condition, the legislation would allow licensed physicians to prescribe ibogaine administration for a patient under supervision.  

Download this Resource

EXPLAINER: Veterans Mental Health Innovations Act

Reason Foundation


Surface Transportation News: Key Bridge replacement costs soar https://reason.org/transportation-news/key-bridge-replacement-costs-soar/ Tue, 02 Dec 2025 19:28:13 +0000 https://reason.org/?post_type=transportation-news&p=87153 Plus: Fixing the Highway Trust Fund, Spain de-tolls motorways resulting in problems, and more.

The post Surface Transportation News: Key Bridge replacement costs soar appeared first on Reason Foundation.

In this issue:

More Troubles for the Key Bridge Replacement

Things are not looking good for a speedy replacement of the destroyed Francis Scott Key Bridge in Baltimore. To begin with, last month, the National Transportation Safety Board (NTSB) cited Maryland officials’ failure to conduct a critically important risk assessment (based on guidelines from the American Association of State Highway and Transportation Officials, or AASHTO) on the adequacy of bridge protections from collisions with major ships.

NTSB correctly identifies the Maryland Transportation Authority (MDTA) as having been at least partly responsible for the bridge’s collapse. NTSB noted that countermeasures such as “dolphins” could have been implemented if MDTA had performed the AASHTO risk assessment. As I have reported previously in this newsletter, MDTA also ignored “repeated warnings” from the Baltimore Harbor Safety and Coordination Committee about the lack of meaningful protection of the bridge piers. I believe it can be argued this is what attorneys call “contributory negligence.”

The second piece of bad news is that the estimated cost of the replacement bridge is now between $4.3 billion and $5.2 billion, much higher than the previous estimate of $1.7 billion to $1.9 billion. The reasons include the fact that the new bridge will have a longer span, will be much higher, and (of course) will have pier protections. I think Maryland officials should be taken to task for this. First, they claimed that the bridge would be a simple “replacement” of the old bridge, and therefore no environmental impact study would be needed. But then they went ahead and developed specifications for a very different and obviously much more costly bridge.

Politico recently reported that Senate Environment & Public Works Committee Chair Shelley Moore Capito (R-WV) is outraged by this double-cross, given Congress’s over-hasty commitment to paying 100% of the replacement bridge’s cost. Relating her conversation about this with Gov. Wes Moore, she told Politico: “I felt it was unfair for Maryland to ask for 100 percent on $1.7 billion, when now it’s $5.2.”

Capito said that, at this point, she would not be leading a charge to alter the federal commitment, which she said would need to clear the 60-vote filibuster threshold in the Senate.

My own view is that, due to its contributory negligence in not protecting the Key Bridge piers, in no way should all U.S. taxpayers be on the hook for the new bridge’s construction cost. Maryland should provide funds based on the following sources:

  • The amount of revenue bonds it could issue based on reinstating tolls on the new bridge;
  • Proceeds from its own bridge insurance policies; and,
  • Proceeds from the shipping industry’s insurance pools, which are capable of providing up to $3.1 billion per ship collision.

As Rep. John Garamendi (D-CA) told Bloomberg TV last year, “I don’t think this has to be federal taxpayer money. Let’s go first to the insurance side of it, and then we’ll see what’s left over.”

» return to top

Spain De-Tolls Motorways; Problems Ensue

In 2018, the national government of Spain began de-tolling the country’s long-distance motorways, in an apparently populist move to make all of its extensive highway system (the world’s third-largest) available for free. The consequences were not exactly what the government expected.

Until 2018, nearly all the major motorways were operated and managed via long-term public-private partnerships (P3s). The motorway companies charged tolls, which paid for improvements as well as operating and maintenance costs. They also paid corporate taxes to the national government.

The government is de-tolling by declining to renew these long-term P3s as they reach their final year. Once a P3 is terminated, the tolls are removed, and the obvious consequence is that far more cars and trucks move onto the “free” motorways. The initial de-tolling has led to nearly 40% more personal vehicles and 89% more trucks. Most of these increases came from nearby roadways, but in the freight sector, some of the increased truck traffic has been a shift from rail to truck.

Thus far, according to Julian Nunez, head of the Spanish Association of Construction and Infrastructure Concession Companies, the government is losing €409.8 million per year in tax revenue from the former tollway operators and spending an additional €89.7 million per year in motorway maintenance costs. And this is just the beginning. In 2029, three more long-term P3 agreements are set to expire, potentially de-tolling another 527 km of motorways.

Nunez points out that because there is no dedicated fuel tax to pay for highways in Spain, all the cost of building, upgrading, and maintaining de-tolled motorways comes from the national government’s general budget. By contrast, users of Spain’s railways pay €690 million in taxes per year, maritime transport pays €515 million per year, and airport users pay €2.24 billion per year. But users of the de-tolled motorways pay nothing.

The motorway association has proposed to the government a replacement tolling plan for the entire 13,000 km motorway system. Under the plan, light vehicles would pay €0.03 per km and heavy vehicles €0.14 per km. The plan also includes over €18 billion in motorway investments and proposes new long-term P3 concessions with 25-year terms. It has been submitted to the European Commission as well.

Nunez says the Spanish government appears to be awaiting support for the plan from the European Commission before making any decisions. But it’s pretty clear that the government did not think through the consequences of de-tolling the country’s motorways.

» return to top

Brightline West: Xpress West Reborn

Engineering News-Record reported in its Oct. 22 issue that the estimated cost of the planned Brightline West high-speed rail line between Las Vegas and Rancho Cucamonga, CA, has ballooned from $12 billion (the Dec. 2023 estimate) to $21.5 billion. To cover most of the shortfall, the company has applied for a $6 billion loan from the federal Railroad Rehabilitation and Improvement Financing (RRIF) program. That would be a very high-risk loan.

On Nov. 17, Infralogic, an infrastructure finance newsletter, reported that Brightline West is in deep financial trouble, with mandatory redemption of $2.5 billion in revenue bonds that were due by Nov. 30 and many other dire problems. (See “Brightline West Faces USD 2.5 Billion Bond Redemption Amid Financial Uncertainty—2Q Credit Report”)

We’ve seen this high-speed rail story before, and it did not have a happy ending. The previous attempt to provide a privately financed high-speed rail line between Las Vegas and (in this case) Victorville was called Xpress West. In Aug. 2012, Reason Foundation published “The Xpress West High-Speed Rail Line from Victorville to Las Vegas: A Taxpayer Risk Assessment,” authored by consultant Wendell Cox. Like Brightline West, it planned to use right-of-way in the median of I-15, the primary highway route between Southern California and Las Vegas (which would make future expansion of that highway far more expensive).

The report assessed a number of risks, but the most serious was a speculative consumer market. “There is no parallel for large numbers of drivers and airline passengers to travel well outside the urban areas in which they live to connect to a train to any destination, much less one so close to Southern California as Las Vegas.”

Hence, ridership and revenue would likely be a fraction of what Xpress West projected, making repayment of its federal loan difficult, if not impossible. The study also pointed out that there are six commercial airports throughout the LA metro area that are far more convenient for most Las Vegas-bound travelers than driving out to Victorville. And those air fares are very economical. In short, the Xpress West traffic and revenue numbers were highly exaggerated.

The story did not have a happy ending for Xpress West. Like Brightline West, it had applied for a federal RRIF loan. In March 2013, Rep. Paul Ryan, then chair of the House Budget Committee, and Sen. Jeff Sessions, ranking member of the Senate Budget Committee, sent a letter to Transportation Secretary Ray LaHood opposing the RRIF loan. They also asked the Government Accountability Office to evaluate the project. Those actions led to a U.S. Department of Transportation (DOT) letter on June 28, 2013, rejecting Xpress West’s RRIF loan request. And that was basically the end of Xpress West, though it lingered on for a number of years trying to find other funding.

Brightline West, with its much higher estimated cost and similarly dismal ridership potential, is likely not much longer for this world.

» return to top

Fixing the Highway Trust Fund
By Marc Scribner

In November, the Tax Foundation released a new report by Alex Muresianu and Jacob Macumber-Rosin, “How to Refuel the Highway Trust Fund.” Their brief focuses on the federal Highway Trust Fund’s (HTF’s) persistent structural deficit and examines four alternatives that could eliminate the revenue-outlay imbalance. While these are not the only options for addressing the HTF’s fiscal problems, they are under somewhat serious political consideration. Most importantly, the authors’ comparative analysis accurately highlights the advantages and disadvantages of each approach. However, some of their policy conclusions and recommendations will generate criticism even from those who directionally agree with them.

Muresianu and Macumber-Rosin examine four potential revenue fixes to the Highway Trust Fund, assuming continued baseline growth of HTF expenditures:

  • Option 1: Replacing all existing HTF taxes with mileage-based user fees (MBUFs);
  • Option 2: A combination of replacing existing truck taxes (including the diesel tax) with a truck MBUF, establishing a new flat registration fee on electric vehicles (EVs), and increasing the gas tax;
  • Option 3: Raising gas and diesel taxes and indexing the rates to inflation; and
  • Option 4: Replacing all existing HTF taxes with flat registration fees.

The Option 1 full MBUF approach uses a rate schedule dictated by a gross vehicle weight rating formula (see Appendix Table 1), which is in part based on Oregon’s existing weight-distance tax for heavy trucks. Adopting this rate schedule would roughly double the per-mile tax liability on commercial trucks, while gas-powered passenger cars would face a tax burden similar to what they pay today. This approach would be propulsion-neutral, so hybrid-electric vehicles and EVs would pay to support the system they currently use and future-proof it for any subsequent advances in vehicle propulsion technologies.

The Option 2 hybrid approach recognizes that scaling an MBUF regime for all vehicles may be administratively or politically challenging. So, the authors propose instead to impose MBUFs on heavy trucks only, add a $100 annual fee for EVs, and increase the gas tax by 2 cents, each of which would be indexed to inflation. Muresianu and Macumber-Rosin estimate that total HTF revenue would grow slightly slower under Option 2 than under Option 1, but it would still be sufficient to cover baseline HTF expenditures over the next decade.

Option 3 is the most “conventional” of the alternatives: simply raising fuel tax rates and indexing them to inflation. This has long been proposed in Congress, but raising a tax on nearly all Americans has rendered it a political dead-end. The Tax Foundation proposal would increase gas tax rates from 18.4 cents to 28 cents per gallon and diesel tax rates from 24.4 cents to 40 cents per gallon. Out of the four options, this approach scored the worst. While a large fuel tax increase would be sufficient to cover baseline expenditures for a few years, it would fail to eliminate the HTF’s structural deficit because rising fuel economy and electrification are expected to dramatically decrease per-mile fuel tax collections going forward.

Option 4 is the most dramatic departure from the status quo: decoupling taxation entirely from the intensity of system use (i.e., gallons of fuel consumed while driving, miles driven) and imposing flat annual registration fees instead. Tax Foundation’s registration fee rate schedule (Appendix Table 2) is based on the gross vehicle weight rating formula from Option 1. Under this approach, a 4,000-pound passenger car would pay $68.14 per year, a 6,000-pound full-size SUV or light-duty pickup would pay $118.84, and an 80,000-pound Class 8 semi-truck would pay $7,354.31.

Replacing existing HTF taxes with registration fees has been proposed by the American Highway Users Alliance (NAPA testimony, page 5), under which most passenger cars would pay $135 per year, large SUVs and pickups would pay $165, and the heaviest trucks would pay $4,600. There are clearly vast differences in who would bear the burden in these registration fee proposals, with the Tax Foundation concentrating tax liability on heavy trucks that cause most of the wear and tear on roads and the American Highway Users Alliance shifting the burden to smaller passenger vehicles.

This question of who wins and who loses in a registration fee scheme would likely become a major source of political controversy. The concept itself faced strong backlash earlier this year when House Transportation and Infrastructure Committee Chairman Sam Graves attempted to attach his own registration fee proposal to the Republican reconciliation bill, which was quickly rejected as a new “car tax.” Under the Tax Foundation’s Option 4, registration fees also appear to be a weak revenue-raiser, with HTF baseline outlays exceeding projected revenue by year eight of the 10-year budget window.

Muresianu and Macumber-Rosin conclude that the Option 1 full MBUF approach “is the most efficient and sustainable option for US highway funding amid rapidly changing markets and technologies. It best achieves the user-pays principle, aligning taxes paid with actual road use, vehicle weight, and infrastructure costs.”

However, they acknowledge that a national MBUF system for all vehicles would be difficult to establish and administer, with significant implementation and operating cost uncertainties. They suggest that the Option 2 hybrid approach—which would establish truck-only MBUFs and EV registration fees, as well as modestly increase the gas tax—would deliver most of the benefits of Option 1 with fewer policy challenges.

While it is certainly true that a national truck MBUF and EV registration fee is less complex to administer, the politics on the ground are less favorable to Option 2. The trucking industry has been clear that it will fiercely oppose any MBUF proposal that singles out trucks. As such, MBUF advocates for years have been stressing the importance of developing collection methods capable of scaling across the entire vehicle fleet. A truck-only MBUF could generate a political backlash that kills MBUFs for all. Tying this counterproductive strategy to another proven political lead balloon—federal gas tax increases, however modest—likely dooms not only Option 2 to failure, but potentially Option 1.

As unsatisfying as it may be, there are likely no politically viable Highway Trust Fund fixes that can sustain current baseline expenditures. Perhaps addressing excessive spending rather than insufficient revenue would be more fruitful. One option Muresianu and Macumber-Rosin did not consider is aligning HTF expenditures to expected tax receipts and then relaxing federal constraints on tolling and public-private partnerships. This would make states less dependent on federal-aid grants and expand the users-pay principle at the individual facility level. To be sure, fiscal restraint also faces strong immediate political headwinds, but it might prove to be the most realistic option as entitlement programs become insolvent and the national debt explodes as anticipated over the next decade, as we at Reason Foundation have suggested.

See Alex Muresianu and Jacob Macumber-Rosin’s full Tax Foundation analysis, “How to Refuel the Highway Trust Fund,” which is well worth reading.

» return to top

BBC Report Touts “Electrifying Rail”

In an odd news article, BBC technology reporter Chris Baraniuk wrote that passengers on a British train leaving Aldershot station may not notice a cluster of solar panels beside the tracks. But, he writes, they would be surprised to learn that “the train they are on is drawing power from it.”

Hurrah, the BBC seems to be reporting. The headline writer penned it as, “This is the big one—tech firms bet on electrifying rail.” Well, one of the many things I learned by earning two engineering degrees from MIT is that solar power is very, very dilute. It might light up a few bulbs, but in no conceivable way could it power any train (apart from a model railroad).

But the story goes on to quote the co-founder of start-up company Riding Sunbeams, Leo Murray, who says, “On a sunny afternoon, if you are catching a train through Aldershot, a little bit of the energy for the train will come from those solar panels.” His company installed the solar panels beside the tracks in 2019. They produce 40 kilowatts on a sunny day. Murray adds, “If you are a railway, this is the cheapest energy you can buy.” Also, the most diluted.

So what is solar power actually used for?

It’s never made clear, but Murray is quoted as saying that his panels are the only solar array in the country that delivers power directly to the rail to move trains. Nowhere does the article explain how a tiny bit of electricity fed into the track can help power the train, which does not appear to be powered by electricity. Moreover, by paragraph 12, the story notes that solar panels produce direct current (DC) while overhead lines used to power trains use alternating current (AC). The piece goes on from there to discuss various electric-powered rail ideas in a number of European countries. But it never explains how the tiny bit of solar electricity connected to the track at Aldershot makes any difference at all.

» return to top

News Notes

Work Begins on $4.6 Billion Georgia 400 Express Toll Lanes
The Georgia Department of Transportation’s largest public-private partnership (P3) thus far got underway in October. The entirely privately funded project will add priced express lanes in both directions to 16 miles of this north-south expressway in the Atlanta metro area. The toll-financed P3 project has a 50-year term. In addition to passenger vehicles, the express lanes will be used by a new MARTA Bus Rapid Transit line, and the P3 project will construct several BRT stations along the right of way. The P3 consortium, SR400 Peach Partners, is led by ACS Infrastructure, Acciona, and Meridiam.

Replacing the Cape Fear Bridge May Be a Toll-Financed P3
North Carolina DOT has concluded that the aging steel lift bridge across the Cape Fear River is functionally obsolete and needs to be replaced. With an estimated cost of $1.1 billion, and only about one-third of that covered by a $242 million federal grant and an $85 million state grant, NCDOT and its Turnpike Authority are considering both tolling and a public-private partnership (P3) to finance and manage the replacement. David Roy of the Turnpike Authority pointed out at a recent public hearing that a state agency is not allowed to provide toll discounts to any kind of user, but that a P3 concession company would be able to do that (e.g., for local residents).

South Carolina May Consider I-77 Toll Lanes
North Carolina DOT’s $3.2 billion project will add express toll lanes to I-77 between Charlotte and the South Carolina border. Unless South Carolina adds lanes on its side of the border, there will be a huge bottleneck as 10 NC lanes meet far fewer lanes on the South Carolina side. SCDOT director Justin Powell is aware of the problem. He told the Rock Hill Herald on Nov. 24 that he plans to discuss this with his North Carolina counterpart in the near future.

New Zealand Moving to Road User Charges
Last month, the New Zealand Parliament passed a bill to authorize nationwide road user charges. Local agencies are encouraged to partner with the NZ Transport Agency. The proposed charging is called “time-of-use charging,” with higher rates applying during the busiest hours for roadway use. The Auckland Council is expected to be the first local government to engage with the Transport Agency. Also, in late November, the Agency announced that tolls and road user charges will be indexed to inflation, as measured by the NZ Consumer Price Index.

EPA Changes Definition of Waters of the United States
For decades, the Environmental Protection Agency defined waters of the United States (WOTUS) very comprehensively, to include even ditches that were often dry. Litigation over many years challenged this policy as inconsistent with the legal definition of those waters as “navigable.” On Nov. 17, the EPA announced a revised definition, consistent with a 2023 Supreme Court decision, which has led to cheers from highway organizations.

I-10 Bridge Replacement to Begin in March
Louisiana DOTD has announced that construction of the $2.4 billion I-10 Calcasieu Bridge replacement will begin in March. The bridge is to be designed, built, financed, operated, and maintained by Calcasieu Bridge Partners, formed by Plenary Americas, Acciona Concesiones, and Sacyr Infrastructure USA under a 50-year toll-financed public-private partnership. That river has something of a pirate history, so a Louisiana pirate symbol of crossed pistols will be incorporated into the bridge’s four towers.

Express Toll Lanes Expanding in California’s Bay Area
Eighteen miles of new express toll lanes are nearing completion on I-80 in Solano County, from Red Top Road in Fairfield to I-505 in Vacaville. Variable tolls will be charged between 5 AM and 8 PM, and a FasTrak tag will be required, as on other express toll lanes in the region. Vehicles with two occupants and a switchable FasTrak will pay half price, and those with three or more occupants will travel at no charge with the FasTrak set at 3.

Brightline Florida in Trouble
The privately financed “higher-speed” passenger rail line between Miami and Orlando is in financial trouble. Its tax-exempt revenue bonds are trading below their nominal value, and non-insured bonds have recently traded in the low 80s. While ridership has increased over the last year, it is well below projections. Moreover, due to a number of collisions with motor vehicles and pedestrians in recent years, the rail line is under attack in Miami media. Separately, Brightline and Florida East Coast Railroad are in litigation over Brightline’s planned commuter service, which FEC claims violates Brightline’s agreement on its use of FEC trackage.

Four Dallas Suburbs May Withdraw From Rail Transit System
The cities of Plano, Irving, Farmers Branch, and Highland Park have scheduled referenda for next March on whether they should withdraw from the regional rail transit system DART. The agency says its 93-mile system is the largest light rail system in the United States. Thirteen cities dedicate a share of their sales tax revenue to DART, but these four cities say their sales taxes to DART cover far more than what the agency spends on their DART service. In the Dallas/Fort Worth metro area, only 0.6% of commuters used transit in 2024, down from 1.2% in 2019 and 3.4% in 1980.

Metro Pacific to Sell Shares in Its Toll Roads
Metro Pacific Investments Corporation announced plans to sell 20-30% stakes in its Indonesian and Philippine toll roads via private placement, according to the Manila Standard. MPIC chief finance officer June Cheryl Cabal-Revilla said the sale will involve 20-30% of the Indonesian toll road and a similar stake in Metro Pacific Tollways Corp. MPTC CEO Gilbert Sta. Maria said the company is in talks with overseas and local investors for the private placement.

Japanese Maglev Project Costs Have Doubled
The cost of the main segment of the Chuo maglev line planned for Tokyo to Nagoya is now double the original estimate. That section—between Shinagawa and Nagoya—is now expected to cost $72 billion, compared to less than half that in 2014. The reasons for the large increase were cited as price surges, responses to challenging construction work, and enhanced specifications. This news is based on an article in Infralogic dated Oct. 30, 2025.

Nashville Loop Project in Trouble?
ENR reported late last month that the $240 million Music City Loop tunnel project has experienced a walkout by local contractor Shane Trucking & Excavating partway through boring the nine-mile tunnel. Boring Company CEO Steve Davis, on a Nov. 24 livestream, discussed worker safety innovations and said the project remained on schedule.

Vietnam Considering $1.4 Billion Expressway
Infralogic (Nov. 28) reported that the Vietnamese government is considering a public-private partnership for a 141 km expressway linking two other expressways in Lam Dong province. The national Ministry of Construction has asked the Lam Dong government to assess the pros and cons of a P3 for this project.

» return to top

The post Surface Transportation News: Key Bridge replacement costs soar appeared first on Reason Foundation.

Why the World Health Organization’s anti-nicotine policy could keep millions smoking
https://reason.org/commentary/why-the-world-health-organizations-anti-nicotine-policy-could-keep-millions-smoking/
Dec. 2, 2025

The World Health Organization (WHO) is pushing for countries to regulate e-cigarettes, nicotine pouches, and heated tobacco just as strictly as traditional cigarettes, even suggesting outright bans. If these recommendations are put in place, they could discourage millions of smokers from switching to these safer alternatives, leading to more deaths and diseases from smoking instead of reducing them. 

Promoting a new position paper titled “WHO Position on Tobacco Control and Harm Reduction,” Director-General Dr. Tedros Adhanom Ghebreyesus claims e-cigarettes aren’t promoting harm reduction, via transitioning smokers to a safer source of nicotine, but are instead encouraging a new wave of addiction among young people.

Instead of switching to e-cigarettes or nicotine pouches, the WHO recommends smokers make use of quit helplines and nicotine replacement therapies. But both these methods have notoriously low success rates and are not readily available or affordable in low- and middle-income countries (LMICs), where the majority of smokers live. LMICs often lack the public health infrastructure of countries like the United Kingdom or New Zealand, which have independently and successfully embraced products like e-cigarettes for tobacco harm reduction. LMICs are often more reliant on bodies such as the WHO for health and regulatory advice, placing a great responsibility on these organizations to provide sound, evidence-based guidance.

In 2019, the WHO congratulated India, where there are more than 250 million tobacco users and around one million tobacco-related deaths per year, for its ban on e-cigarettes. In 2024, the WHO honored Brazil’s National Health Surveillance Agency with an award for reaffirming a ban on e-cigarettes. E-cigarettes are also banned in Argentina, Thailand, Brazil, Vietnam, and Mexico, where more than 70 million tobacco users live. Cigarettes, which are by far the most dangerous way of consuming nicotine, remain legal in all these countries. 

The WHO paper doesn’t provide any evidence that e-cigarettes or nicotine pouches are, in fact, just as or more harmful than smoking. The safer profile of these products is not just some self-serving claim from the tobacco industry trying to sell these alternatives. That vaping is safer than smoking is acknowledged by some of the WHO’s largest funders, such as the United States, the United Kingdom, and Canada. These countries have different regulatory regimes for nicotine products, but all of their leading health agencies, the Food and Drug Administration, the Office for Health Improvement and Disparities, and Health Canada, agree that e-cigarettes are safer than cigarettes. The gold standard for evidence-based medicine, the Cochrane Review, consistently finds e-cigarettes to be more effective than nicotine replacement therapies for smoking cessation. 

The UK’s National Health Service (NHS) and Cancer Research UK consistently promote e-cigarettes to smokers, regularly debunking the myths that these products are just as or more dangerous than cigarettes. The NHS even offers some smokers free vape kits as part of its “swap to stop” initiative. 

These efforts are bearing fruit. Smoking rates in the UK have declined significantly since the rise of e-cigarettes. In November 2025, the number of vapers in the UK surpassed the number of smokers for the first time. The spread of e-cigarettes, nicotine pouches, and heated tobacco products has given tens of millions of smokers who want to quit—but have failed through other methods—an alternative. Sweden has the lowest smoking and lung cancer rate in Europe because those who wish to use nicotine typically choose snus, an oral nicotine product that doesn’t involve combustion or inhaling smoke.

There is also a wide-ranging body of evidence demonstrating that the kinds of restrictions Tedros is calling for, whether in the form of higher taxes or bans on e-cigarette flavors consumers prefer, result in more smoking of traditional cigarettes. That’s not a prescription for better public health.

Despite the overwhelming evidence that vaping is dramatically safer than smoking, the WHO persists in its demands that if countries don’t ban e-cigarettes outright, they should be subject to the same taxes and regulations as cigarettes. It should be commonsensical that products presenting vastly different risks should be regulated differently. But the WHO’s advice to put vapes, nicotine pouches, and other nicotine alternatives on a level playing field with cigarettes, if implemented in more countries, will only prolong and sustain death and disease among smokers who want to quit but don’t have the right options that might help them succeed. 

What state policymakers should know about homeschoolers
https://reason.org/commentary/what-state-policymakers-should-know-about-homeschoolers/
Dec. 2, 2025

With homeschooling on the rise, calls for increased government oversight of home-based education are growing. In some states, the process has already begun: Lawmakers in Illinois, New Jersey, and Virginia introduced bills this year to create new regulations for homeschoolers. New Jersey’s Assembly Bill 5825, for example, would require homeschool parents to align their curricula with the state’s learning standards, maintain a portfolio of their child’s work, and undergo an external evaluation of their child’s progress. This may sound to some like reasonable oversight, but such regulations can be a burden for families, interfere with their K–12 education goals, and put them in the crosshairs of hostile public officials.

For state policymakers, it is crucial to have an accurate understanding of modern homeschoolers when considering new laws or regulations. While misconceptions about homeschooling remain prevalent, a growing body of research and data is helping to set the record straight.

The following analysis compiles key information to address two fundamental questions: Who homeschools, and why do they choose to do so? It begins by examining trends in homeschool participation, then looks at the sector’s demographics and the reasons families choose home education, and ends with a brief discussion of policy considerations.

Homeschool participation and growth

Homeschooling has increased over the past few decades, but it saw an especially steep spike at the onset of the COVID-19 pandemic. Participation estimates vary by data source, but it is clear that growth has been anything but linear.

Using figures from the U.S. Census Bureau’s Household Pulse Survey, Alanna Bjorklund-Young and Angela R. Watson estimate that nearly 6 percent of students were homeschooled during the 2022–23 academic year. In comparison, the most recent estimate published by the National Center for Education Statistics’ National Household Education Survey (NHES) puts this figure at 3.4 percent in the same year.

This discrepancy is due to differences in how data are collected. The Pulse survey uses a broader definition of homeschooling than the NHES, whose estimates “can be viewed as a more restrictive definition of homeschooling, providing a more conservative estimate of this population,” write Bjorklund-Young and Watson.

However, the NHES also reports that a total of 5.2 percent of students are schooled at home—comprising homeschoolers and students enrolled full-time in virtual courses—a figure that is more in line with the Pulse survey.

Bjorklund-Young and Watson suggest looking at these data sources as a range of estimates. But to evaluate growth over time, the NHES figures should be used to ensure consistent comparisons. According to the NHES data, the proportion of homeschooling students increased from 1.7 percent of all students in 1999 to 3.4 percent in 2011–12, but dipped to 2.8 percent by 2018–19. Homeschooling then surged during the COVID-19 pandemic, with the latest NHES data showing the aforementioned 3.4 percent homeschooled in the 2022–23 school year. While the total number of homeschoolers in 2022–23—1.76 million—is up from 1.45 million in 2018–19, it is nearly identical to the 2012 figure of 1.77 million students.

Research also indicates that the homeschool population is dynamic, with many students switching school sectors at least once during their K–12 education. A survey conducted by Albert Cheng and Angela Watson found that only 17 percent of adults who were ever homeschooled did so for the entirety of their K–12 education, with 56 percent of respondents doing so for six years or fewer.

Figure 1: Homeschooling’s Pandemic Pull

Homeschool demographics

The National Center for Education Statistics’ NHES program and the Census Bureau’s Pulse survey also provide insight into homeschooling demographics. According to the NHES, a greater share of white students (5.1 percent) were homeschooled in 2022–23 than Black students (1.7 percent) and Hispanic students (1.8 percent), as shown in Figure 1. However, compared to pre-pandemic levels, the proportion of Black homeschoolers increased by half a percentage point—rising from 1.2 percent to 1.7 percent. White participation rose even more, while Hispanic participation fell slightly.

Figure 2: Many Homeschoolers Are Students of Color

Bjorklund-Young and Watson’s research provides additional context for these figures. Using the NHES data, they estimate that in 2022–23, about 29 percent of homeschoolers were students of color: 6 percent Black, 14 percent Hispanic, 7 percent two or more races, and less than 2 percent Asian (see Figure 2). Their estimate rises to 40 percent using Pulse data, indicating that students of color are underrepresented among homeschoolers, regardless of the data source. (About 51 percent of all school-aged students fall into this category.) While students of color have seen only modest growth as a share of all homeschoolers over time—about four percentage points between 1998–99 and 2022–23—Bjorklund-Young and Watson nonetheless conclude that “the stereotypical narratives around homeschooling as a predominantly white population must be updated to represent the modern group of homeschoolers.”

Figure 3: Most Homeschool Parents Are Democrats or Independents

Additionally, a nationally representative survey by Angela Watson and Matthew Lee included comparisons of homeschooling parents with their non-homeschooling peers on characteristics such as political affiliation, religiosity, and schooling sectors. It found that, while Republicans are overrepresented among homeschoolers—44 percent of homeschool parents identified as Republican compared to 36 percent of the general population—the majority of homeschool parents are either Democrats or independents (see Figure 3). The study also found that fewer than half of homeschoolers attend religious services weekly (including 31 percent who never attend services), and about 35 percent of homeschool households have at least one child enrolled in a public school.

Figure 4: Who Is Homeschooling?

The NHES data also provide valuable insight into other key demographic variables, as shown in Figure 4. (Although the National Center for Education Statistics has not yet published a complete 2022–23 dataset, I use its 2018–19 data to supplement the latest figures.)

Not surprisingly, a greater proportion of students living in areas classified as rural or towns are homeschooled than those living in areas classified as cities or suburbs. But while city and suburban students are homeschooled at lower rates, the NHES’s 2019 data indicate they still comprise about 64 percent of all homeschoolers.

Students living in lower-income households homeschool at lower rates than those living in higher-income households. According to the NHES’s 2019 data, nearly half of homeschoolers—about 49 percent—live in households earning more than $75,000.

Finally, in terms of education level, parents who have attended at least some college choose to homeschool at higher rates than parents with only a high school education. The NHES’s 2019 data show that a slight majority of homeschool parents (52 percent) hold a bachelor’s, graduate, or professional degree, while fewer than one-quarter have a high school education or less.

Figure 5: School Environment, Academics Big Factors in Choosing Homeschooling

Reasons for homeschooling

The National Center for Education Statistics’ 2023 NHES report provides insight into why parents choose to homeschool. The survey allowed respondents to select multiple factors that were important in their decision, along with one “most important” factor. These findings, which largely mirror the final NHES survey before the Covid-19 pandemic, are summarized in Figure 5.

Notably, parents’ concerns about school environment stood out above the rest, with 83 percent of respondents identifying it as an important reason to homeschool and 28 percent ranking it as the most important one. Another factor receiving high scores in both categories was dissatisfaction with academic instruction at other schools, which was selected as an important reason by 72 percent of respondents and the most important reason by 17 percent. Taken together, nearly half of the respondents cited school environment or academics as the most important reason to homeschool their kids, underscoring the weight parents put on these factors.

Also notable is that moral instruction was selected by 75 percent of respondents as an important reason to homeschool, while the desire to provide religious instruction was chosen by 53 percent.

Takeaways and policy implications  

Available research and data on homeschooling provide several key takeaways for policymakers.

First, homeschoolers are a sizeable and growing share of U.S. students. Estimates vary, but we can confidently say that homeschoolers comprise between about 3.4 and 6 percent of all U.S. students, with a conservative estimate of 1.76 million total students as of 2022–23. “The U.S. homeschool population is of similar magnitude to the private and charter sectors,” conclude Watson and Lee.

As its popularity grows, homeschooling will face increasing scrutiny and regulatory pressure. It’s important for state policymakers to have an accurate view of who homeschoolers are, how they homeschool, why they choose this model, and what the research says about abuse, neglect, outcomes, and other vital issues. They should also carefully consider whether proposed homeschool regulations—ranging from notification requirements to curricular reviews—will achieve their intended purposes, and what the negative unintended consequences of additional oversight might be.

Next, data show that homeschoolers are a diverse population, putting to rest the stereotype that home education is exclusively the domain of white Christian conservatives. While white families homeschool at higher rates than Black and Hispanic families, between 29 and 40 percent of homeschoolers are students of color.

Furthermore, less than half of homeschool parents say they are Republicans, which should alert policymakers to the fact that homeschooling draws support across the political spectrum. Regardless of partisan affiliation, parents want what is best for their children, and for many, this means a home-based education.

Additionally, and perhaps surprisingly, many homeschooling families are connected to the public education system in some form. It’s rare for children to be homeschooled for their entire K–12 education, and more than one-third of homeschool households have at least one child enrolled in a public school. This underscores that, even within the same family, children have vastly different needs that can only be satisfied by a diverse supply of education providers. If maximizing each child’s unique potential is a primary goal for K–12 education, then it only makes sense to give families robust options through policy mechanisms such as education savings accounts, charter schools, and public school open enrollment. Policymakers should be sector-agnostic when it comes to cultivating K–12 education systems.

Finally, these findings offer valuable insights into where public schools are falling short and provide guidance on how they can improve. The fact that 45 percent of homeschool parents cite either concerns about school environments or dissatisfaction with academic instruction at other schools as the most important reason for homeschooling should raise flags for policymakers, especially at a time when lax discipline, chronic absenteeism, declining enrollment, low academic standards, and curricular controversies are making headlines from public school systems across the country.

The latest data and research clearly demonstrate that homeschoolers are diverse in many ways and that past conceptions about them should be discarded. They also provide lawmakers with an accurate understanding of modern homeschooling, its role in the education system, and insights into what public schools can learn from the reasons parents choose to homeschool. This is a critical time in K–12 education. With public schools falling short for families in a variety of measures, homeschooling is increasingly becoming an attractive alternative. Overregulating the sector in response to that preference won’t solve any of those problems, but it may add to them.

A version of this column first appeared at Education Next.

The post What state policymakers should know about homeschoolers appeared first on Reason Foundation.

Restoring robust hearing practices will protect consumers from defective aviation consumer protection regulations
https://reason.org/testimony/restoring-robust-hearing-practices-will-protect-consumers-from-defective-aviation-consumer-protection-regulations/
Mon, 01 Dec 2025
The recent history of Section 41712 discretionary rulemaking suggests that regulatory analysis has not been sufficiently robust to avoid harm to consumers.

The post Restoring robust hearing practices will protect consumers from defective aviation consumer protection regulations appeared first on Reason Foundation.

A version of the following public comment letter was submitted to the Office of the Secretary of Transportation on December 1, 2025.

On behalf of Reason Foundation, I respectfully submit these comments in response to the Office of the Secretary’s (OST) notice of proposed rulemaking (NPRM) on Procedures in Regulating and Enforcing Unfair or Deceptive Practices.

By way of background, I am a senior transportation policy analyst at Reason Foundation and focus on federal transportation policy, including aviation consumer protection regulation. Reason Foundation is a national 501(c)(3) public policy research and education organization with expertise across a range of policy areas, including transportation.

Reason Foundation previously submitted comments to OST recommending the initiation of this rulemaking proceeding. We write in support of the proposed changes to Subparts F and G contained in the NPRM.

Protecting consumers from defective regulations

The statutory authority (49 U.S.C. § 41712) wielded by the U.S. Department of Transportation to police unfair or deceptive practices in the aviation industry long predates the Department itself. The authority was created as Section 411 of the Civil Aeronautics Act of 1938 and modeled on the “unfair or deceptive acts or practices” language that the Wheeler-Lea Act had added to the Federal Trade Commission Act months before, covering most other commercial contexts. In 1958, Congress expanded Section 411 to cover not only air transportation itself but the sale of air transportation by ticket agents.

When Congress passed the Airline Deregulation Act in 1978, it eliminated most economic regulation in the aviation sector and wound down the Civil Aeronautics Board (“CAB”). When the CAB was terminated in 1985, Section 411 consumer protection authority was transferred to the Department of Transportation’s Office of the Secretary (“OST”). In 1994, Congress reorganized the Title 49 Transportation Code, and Section 411 was recodified as Section 41712.

While reorganizing the Transportation Code, Congress was also working to modernize authorities held by the Federal Trade Commission (“FTC”). The FTC Act Amendments of 1994, among other things, codified longstanding internal FTC policy in dealing with claims of unfair or deceptive acts or practices that had in part been synthesized for Congress in the FTC’s December 1980 “Policy Statement on Unfairness.” The FTC’s approach, as affirmed by Congress, requires that specific elements be met to prove unfairness allegations, one of which necessitates careful benefit/cost analysis.

Specifically, the FTC Act amendments added three standards of proof to the FTC’s broad statutory prohibition on unfair business practices (15 U.S.C. § 45(n)). For conduct to qualify as legally unfair, it must be (1) “likely to cause substantial injury to consumers,” (2) not “reasonably avoidable by consumers themselves,” and (3) “not outweighed by countervailing benefits to consumers or to competition.” It is worth noting that these reforms earned bipartisan support. Similar language was also included in the Dodd-Frank Act of 2010, covering the enforcement responsibilities of the Consumer Financial Protection Bureau (12 U.S.C. § 5531(c)).

While bipartisan recognition of the problem of ill-defined “unfairness” exists in virtually every other federal consumer protection context, Congress has so far not moved to reform the Department of Transportation’s similar Section 41712 aviation consumer protection authority. This failure to act has enabled regulators in recent years to engage in a variety of re-regulatory activities, including new airfare advertising restrictions that prohibit government taxes and fees from being “displayed prominently” (14 C.F.R. § 399.84(a)); a ban on true nonrefundable ticketing (14 C.F.R. § 259.5(b)(4)), which puts upward pressure on airfares by forcing a risk transfer from consumers to air carriers; and an inflexible tarmac delay rule (14 C.F.R. § 259.4) suspected of increasing flight cancellations, particularly at smaller and more rural airports.

Each of the aforementioned aviation consumer protection regulations has been criticized as harming consumers, some with stronger evidence than others. But without the FTC-style standards of proof and evidentiary hearing procedures, the scales were tipped in favor of regulators. These are fact-intensive matters that require careful review of the evidence to ensure potential regulatory actions will not perversely harm consumers.

Despite congressional inaction on modernizing Section 41712, the December 2020 final rule did much to bring the Department’s aviation consumer protection authority into alignment with similar federal authorities. This rule added FTC-style standards of proof to Section 41712 enforcement and rulemaking procedures while also codifying internal agency practices for allowing alleged violators to present evidence defending themselves against possible enforcement or rulemaking activity derived from the aviation consumer protection authority.

While this would have improved airline and ticket agents’ defensive positions, it also would have required the Department of Transportation to clearly explain itself along the way and give consumers better insight into how decisions that affect them are made. In this way, the FTC-style standards of proof in unfairness claims are best understood as promoting regulatory quality and consistency in enforcement.

Following the transition between administrations, the Biden administration quickly moved to reverse these reforms. In February 2022, the Department published a rule modifying the hearing procedures for discretionary aviation consumer protection rulemakings in several ways that would reduce regulatory quality. In August 2022, OST published a guidance document further suggesting it will again take an expansive view of how its Section 41712 powers are defined and limited.

These policy changes reopened the door for future discretionary rulemaking guided more by political whims than careful empirical analysis. The recent history of Section 41712 discretionary rulemaking suggests that regulatory analysis has not been sufficiently robust to avoid harm to consumers. As such, we support the proposed restoration of the 2020 hearing procedures, as modified. While outside the scope of this proceeding, we also support the rescission of the August 2022 Guidance Regarding Interpretation of Unfair or Deceptive Practices, as the Department indicated it will pursue in the future.

Conclusion

Thank you for the opportunity to provide comments in response to this NPRM. We urge the Department to act swiftly to implement these needed reforms to protect consumers from defective regulations derived from the aviation consumer protection authority.

State and local governments are drowning in debt
https://reason.org/commentary/state-and-local-governments-are-drowning-in-debt/
Mon, 01 Dec 2025
To address this mountain of debt and restore fiscal stability, state and local governments must sustainably align spending with revenues.

The post State and local governments are drowning in debt appeared first on Reason Foundation.

The national debt recently surpassed $38 trillion, but America’s debt crisis isn’t limited to the federal government. Less well known is that, nationwide, state and local governments now hold more than $6.1 trillion in debt of their own.

States owe $2.7 trillion in debt, cities hold $1.4 trillion, school districts have $1.3 trillion, and counties owe $760 billion, according to a review by Reason Foundation of more than 20,000 financial statements filed by government entities for their 2023 fiscal years, the most recent period with complete data available.

In total, California’s state and local governments hold $1 trillion in debt, the highest in the nation. New York’s state and local debt is the second-most, at $800 billion, followed by Texas at $550 billion, Illinois at $410 billion, New Jersey at $310 billion, and Florida at $240 billion.

Additionally, Massachusetts, Pennsylvania, Ohio, Washington, Michigan, Georgia, Maryland, Connecticut, North Carolina, and Colorado each have more than $100 billion in state and local government debt.

On a per-capita basis, the state and local debt numbers are even more eye-opening, with states like Hawaii, Delaware, and Wyoming having surprisingly large debt loads per resident.

Nationally, state and local government debt amounts to about $18,400 per person. In New York, Connecticut, New Jersey, Illinois, and Hawaii, state and local debt exceeds $30,000 a person.

Following them are Massachusetts, California, Alaska, North Dakota, Delaware, Wyoming, and Maryland, all of which have state and local liabilities in excess of $20,000 per resident.
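The national per-capita figure is straightforward arithmetic, and a quick sketch makes the scale concrete. The population number below is an assumption for illustration (roughly the U.S. population in 2023); the column itself cites only the totals.

```python
# Back-of-the-envelope check of the column's national per-capita figure.
# ASSUMPTION: ~331.5 million U.S. residents (illustrative; not stated above).
TOTAL_STATE_LOCAL_DEBT = 6.1e12  # $6.1 trillion, from the column
US_POPULATION = 331.5e6          # assumed population

per_capita = TOTAL_STATE_LOCAL_DEBT / US_POPULATION
print(f"${per_capita:,.0f} per person")  # ≈ $18,400, matching the column
```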

Over 40 percent of state and local government debt consists of unfunded pension and healthcare benefits promised to public workers. State and local pension debt amounts to $1.5 trillion, with an additional $1 trillion in healthcare benefits promised to retirees.

The bonds that governments issue to fund infrastructure projects, such as roads and bridges, to build and upgrade schools, and to pay for other programs, represent an additional 33 percent of all state and local debt.

These debts have three negative consequences for taxpayers. First, the annual interest costs and debt payments are starting to crowd out essential services. Many local governments are already being forced to divert funds from taxpayers’ priorities, such as education, policing, and transportation, to pay for promised public pension benefits that they haven’t set aside the necessary money for.

Second, as governments struggle to cover rising interest and pension payments, some politicians will seek to raise taxes and fees, placing a growing burden on taxpayers. The scale of tax increases needed to pay for these public pension debts could also hinder economic activity within communities, reducing revenues and further increasing debt woes.

Third, current levels of debt weaken long-term balance sheets, harming the future. Some cities and states haven’t borrowed or spent wisely, so they’ll be looking to borrow more money to modernize their infrastructure, schools, and technology in the years ahead. However, today’s debt burden will make borrowing more expensive and potentially raise the interest rates on new bond issuances, costing taxpayers even more.

To address this mountain of debt and restore fiscal stability, state and local governments must sustainably align spending with revenues. In years with a robust economy, governments should use budget surpluses to pay down debt rather than funding new or existing programs.

For mega-infrastructure projects, such as major highway and bridge repair, replacement, and expansion, public-private partnerships can be used, allowing the private sector to bear the initial construction costs and any overruns, rather than taxpayers.

Ultimately, the most significant drivers of state and local debt are pensions and retiree healthcare benefits, which must be reformed to ensure they are fully funded and prevent the accrual of debt.

State and local governments have far less ability to keep piling up debt the way the federal government does. The bill is coming due, and cities and states that pay down debt quickly and right-size government will be best positioned for the future.

A version of this column first appeared at The DC Journal.

Connecticut’s pensions shouldn’t make political investment in WNBA team
https://reason.org/commentary/connecticut-pensions-not-piggy-bank-wnba/
Wed, 26 Nov 2025
Saving the Connecticut Sun may be good politics, but it is a bad financial move that puts the state’s taxpayers at risk.

The post Connecticut’s pensions shouldn’t make political investment in WNBA team appeared first on Reason Foundation.

Connecticut Gov. Ned Lamont has floated a plan to use state pension assets to purchase a stake in the Women’s National Basketball Association’s (WNBA) Connecticut Sun and keep the team from moving to Boston. Saving a local team may be good politics, but it is bad finance that would put taxpayers at risk.

Connecticut’s $60 billion Retirement Plans and Trust Funds exist to fund the retirement benefits promised to public workers, not to serve as a bailout vehicle for a professional sports franchise or to promote Connecticut’s economic development.

Gov. Lamont’s efforts to keep the Connecticut Sun in the state come as a Boston-based private equity group led by Celtics minority owner Steve Pagliuca reached a $325 million deal to buy the team and relocate it to Boston in 2027, pending league approval.

Connecticut’s state and local governments have recently made an impressive fiscal recovery after decades of budget neglect. The adoption of spending guardrails and mechanisms that enabled surplus pension contributions has stabilized finances, reduced bonded debt, improved pension funding, and led to credit rating upgrades. But recent fiscal improvements are no excuse for pursuing pension investments for political symbolism rather than financial merit.

The state’s fiscal job is unfinished: Connecticut still ranks second in the nation in terms of per-capita public employee debt (which includes unfunded pension and retiree healthcare liabilities). Furthermore, the state’s pension trust—which calculates contributions presuming a 7% annual return—has earned an average return of just 5.7% over the past 24 years (2001-2024), while the S&P 500 returned 10.6% over the same period, according to the Reason Foundation’s 2025 Annual Pension Solvency and Performance Report.
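The cost of that return gap compounds. As a minimal sketch using only the rates and the 24-year window cited above, and ignoring contributions, withdrawals, and fees, here is what $1 grows to at each rate:

```python
# Growth of $1 compounded annually over the 24-year window (2001-2024)
# at the three rates cited in the column. Simplified sketch: no cash
# flows, fees, or volatility -- just the compounding arithmetic.
YEARS = 24

def grow(rate, years=YEARS):
    """Future value of $1 compounded annually at `rate` for `years` years."""
    return (1 + rate) ** years

print(f"Realized 5.7%: ${grow(0.057):.2f}")  # ≈ $3.78
print(f"Assumed 7.0%:  ${grow(0.07):.2f}")   # ≈ $5.07
print(f"S&P 500 10.6%: ${grow(0.106):.2f}")  # ≈ $11.22
```

Even the seemingly small 1.3-point gap between the assumed and realized rates leaves roughly 25 percent less than projected after 24 years; shortfalls like that are what taxpayers must ultimately backfill.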

If the goal of this proposed investment in the Connecticut Sun were to maximize investment returns, it wouldn’t have been paraded in such a manner. A financially motivated investment is one where Connecticut’s pension funds would be comfortable selling at any time deemed advantageous; it is one where the state feels comfortable advocating for management decisions that benefit the team the most, which could very well mean moving out of Connecticut.

In fact, politically motivated investment harms not only the state’s pension plans but also the Connecticut Sun itself. The team would be better off with investors who are genuinely interested in its success, rather than ones whose primary goal is merely to retain the team in a particular geographical location, even if that is not conducive to the team’s ability to compete and win.

Sports teams can be great investments if managed correctly, but the upside comes with a delicate bundle of risks. Professional sports franchises are relatively illiquid, with valuations that fluctuate depending on revenue trends, media rights, and local market conditions. Future exit prospects are also limited because, unlike an investment in a publicly traded company, only so many people and institutions are willing and able to buy a professional sports team.

If the Connecticut Sun turned out to be a bad investment for the state’s pension plan, public employees in the plan would likely not bear the harm: they are still guaranteed their retirement benefits regardless of investment outcomes or cost increases. The risk instead falls on taxpayers, who must cover any funding gap in the pension fund through higher property, income, and sales taxes.

Connecticut’s pension funds are fiduciary funds established to fund retirement obligations while avoiding unnecessary risk, not a piggy bank for pet political projects. Any sports investments, like all other pension fund investments, must be evaluated on a single criterion: maximizing risk-adjusted returns for beneficiaries and taxpayers.

A version of this column first appeared at The Connecticut Mirror.

Pension Reform News: Reason analysis shows debt drives the rise in pension costs
https://reason.org/pension-newsletter/analysis-shows-debt-drives-the-rise-in-pension-costs/
Tue, 25 Nov 2025
Plus: Ohio bill would advance shared pension responsibility, Florida has decades to go before fully funding benefits, and more.

The post Pension Reform News: Reason analysis shows debt drives the rise in pension costs appeared first on Reason Foundation.

In This Issue:

Articles, Research & Spotlights
  • Analysis Shows Debt Drives the Rise in Pension Costs
  • Ohio Bill Would Advance Shared Pension Responsibility
  • California Pensions Rank High on Investment Risk, But Low on Returns
  • Florida Still Has Decades to Go Before Fully Funding Pension Benefits

News in Brief

Quotable Quotes on Pension Reform

Data Highlight

Reason Foundation in the News

Articles, Research & Spotlights

Most Pension Contributions Go Toward Paying Off Debt, Not Funding Benefits

Pension benefits promised to public workers have become increasingly expensive, squeezing state and local budgets nationwide. A new analysis from Mariana Trujillo uses Reason Foundation’s Annual Pension Solvency and Performance Report to dive into the growth of public pension costs over the last decade. Since 2014, annual pension costs have risen by 26% nationwide, with some states, like New Jersey and Alaska, seeing their pension costs rise more rapidly than others. With employee contributions remaining relatively stable, taxpayers have had to bear the bulk of this growing burden. Trujillo’s analysis finds that public pension debt, not new retirement benefits, is the primary driver behind these trends. In fact, more than half of employer pension contributions (55%) are now allocated to address the estimated $1.5 trillion aggregate state and local public pension funding shortfall.

Ohio House Bill 473 Could Balance Public Pension Plan Contributions

New legislation under consideration in Ohio aims to improve transparency and balance the burden of pension costs between employees and employers. House Bill 473 would restrict state and local government employers from paying all or a portion of an employee’s contribution obligation, a practice commonly known as a “pickup.” While governments use pickups to attract quality workers, this practice masks the true cost of a retirement benefit and distorts market signals that are important for informed policymaking. In comments submitted to the Ohio legislature, Reason Foundation’s Zachary Christensen explained the value of collaboration between employees and the taxpayer (represented by lawmakers) in a retirement plan and the importance of transparency in that partnership. 

California’s Pensions Are Relying on Riskier Investment Strategies

Facing more than $265 billion in unfunded pension liabilities and ever-increasing costs on local governments, California’s pension systems are turning toward high-risk investment strategies they hope will offer high rewards. As Reason Foundation’s Zachary Christensen explains in a recent op-ed, every resident in the Golden State is on the hook for about $6,000 in pension debt, so there is real pressure for the state’s pensions to catch up with above-average investment returns. The California Public Employees’ Retirement System (CalPERS) and the California State Teachers’ Retirement System (CalSTRS) aim to achieve higher returns by increasing their investments in alternatives, such as private equity and hedge funds. However, this strategy also carries significant downside risk, which will ultimately be borne by increased costs on taxpayers.  

Florida Must Stay the Course to Pay for Promised Pension Benefits

New estimates indicate that the Florida Retirement System (FRS) will need at least 17 more years before reaching full funding, but lawmakers are considering adding to these already underfunded pension benefits with proposals to bring back cost-of-living adjustments (COLA) for retirees. Zachary Christensen and Steve Vu from the Reason Foundation provide analysis relevant to this discussion, finding that even without a new COLA, a single year of bad returns (0%) could undo years of progress in the system’s funding. A major recession could push the full funding date beyond 30 years and would require significant increases in annual costs to taxpayers. With these remaining risks in mind, lawmakers need to avoid diverting from the state’s current path by making riskier promises to public workers.

News in Brief

Market Volatility Poses a Bigger Threat to Pension Stability Than Long-Term Averages Suggest

A new pair of whitepapers from Sage Advisory and First Actuarial Consulting shows that many public pension boards assume the same return every year—typically around 7%—even though markets rarely behave that way. These fixed-return models make plans appear stable and fully funded, but they hide the real risks facing systems that pay out far more in benefits than they take in. When a plan has a large negative cash flow, early market losses matter much more than average returns. In these cases, trustees may be forced to sell assets during downturns, locking in losses and creating a long-term funding problem that the “smoothed” projections never reveal.

The second paper focuses specifically on this timing problem—known as sequence-of-returns risk—and explains why it is a structural issue for mature pension systems. When contributions are too small relative to benefit payments, the plan depends heavily on investment gains to maintain its funded status. But if significant losses occur early, the plan must liquidate assets at low prices to keep paying retirees. This shrinks the asset base, reduces future compounding, and can drive down the funded ratio even if markets recover later. Data from the largest plans illustrate this clearly: systems with the most negative cash flow experienced the most significant funding declines over time. The papers are available here and here.
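The timing effect described above is easy to demonstrate with a toy model. The sketch below is not from the whitepapers; it uses made-up numbers to show two return sequences with the same 5% average applied to a fund that pays a fixed benefit each year. Once withdrawals are in the picture, the order of returns alone changes the ending balance.

```python
# Toy sequence-of-returns illustration (made-up numbers, not from the
# whitepapers): a fund starts at 100, pays out 8 at each year-end, and
# earns two return sequences that differ only in order (both average 5%).

def ending_balance(returns, start=100.0, payout=8.0):
    """Apply annual returns to a fund that withdraws `payout` each year-end."""
    balance = start
    for r in returns:
        balance = balance * (1 + r) - payout
    return balance

losses_first = [-0.20, -0.10, 0.10, 0.20, 0.15, 0.15]  # downturn up front
losses_last = list(reversed(losses_first))             # same returns, reversed

print(f"Losses first: {ending_balance(losses_first):.1f}")  # ≈ 58.7
print(f"Losses last:  {ending_balance(losses_last):.1f}")   # ≈ 82.8
```

With `payout=0.0` both sequences end at the same value, since compounding alone is order-independent; the roughly 24-point gap here comes entirely from liquidating assets after early losses to keep paying benefits, which is exactly the structural risk the papers flag for cash-flow-negative plans.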

Quotable Pension Quotes 

“Any time you give a benefit, and you don’t pay for it today, it’s like buying it on a credit card. You’re eventually going to have to pay the bill. And those decisions in the ‘90s have left us a large bill in 2026.”
–Mississippi State Sen. Daniel Sparks (R-District 5), quoted in “Mississippi’s PERS faces $26 billion debt,” WJTV, Nov. 6, 2025.

“In the late ‘90s and early 2000, there were some additional benefits placed into law without additional funding at the time. Also, in the two subsequent decades, we had a declining active to retiree ratio, meaning there were fewer active PERS covered members paying into the system and more retirees coming onto the system and retirees living longer.”
–Ray Higgins, executive director of Mississippi PERS, quoted in “Mississippi’s PERS faces $26 billion debt,” WJTV, Nov. 6, 2025.

“If the state fails Safe Harbor, then we would have to enroll everybody into Social Security. So that would more than double what we’re paying right now, […] Almost half our budget would have to go to pensions and Social Security. … So the cost of doing nothing is extreme.”
–Illinois state Rep. Stephanie Kifowit (D-Oswego), quoted in “Tier 2 pension reform bill moves forward, but Pritzker says there’s ‘a lot more work’ to do,” Capitol News Illinois, Oct. 30, 2025.

Data Highlight

Reason Foundation’s Mariana Trujillo explains why most state and local government pension contributions no longer fund current employee benefits. More than half of every dollar contributed to public pension plans now goes toward amortizing legacy pension debt—driven by decades of underfunding and overly optimistic return assumptions—rather than paying for benefits earned each year. Read the full analysis here.

Reason Foundation in the News

“Most plans are taking a lot more risk in their investment portfolio than they used to, and so there’s a lot more volatility than there ever was in pension plan returns.”
—Reason’s Ryan Frost quoted in “Illinois is tops in unfunded state and local pension liabilities per capita,” The Bond Buyer, Oct. 31, 2025.

“Over 40 percent of state and local government debt consists of unfunded pension and healthcare benefits promised to public workers. State and local pension debt amounts to $1.5 trillion, with an additional $1 trillion in healthcare benefits promised to retirees.”
—Reason’s Mariana Trujillo and Jordan Campbell writing in “State and Local Governments Are Drowning in Debt,” Inside Sources, Nov. 19, 2025.

“Yet few governments have set aside money to pay for their retirees’ future healthcare costs. The Reason Foundation reports that state and local governments faced $958 billion in retiree medical obligations in 2023, about $2,900 per American. The liabilities are largest in blue states like New York ($15,017 per capita), New Jersey ($10,599) and Connecticut ($6,657), which let workers retire early with generous health benefits.”
—Allysia Finley writing in “The ObamaCare Blue-City Bailout,” The Wall Street Journal, Nov. 7, 2025.

“The saying in the pension world is that most pension funds have been 20 years away from paying off their unfunded liabilities for the past 20 years.”
—Reason’s Mariana Trujillo quoted in “Unfunded pensions make up a large portion of California’s $1 trillion debt,” State Affairs, Oct. 31, 2025.

The post Pension Reform News: Reason analysis shows debt drives the rise in pension costs appeared first on Reason Foundation.

Model legislation would authorize groundbreaking research into ibogaine for mental health https://reason.org/backgrounder/model-legislation-would-authorize-groundbreaking-research-into-ibogaine-for-mental-health/ Tue, 25 Nov 2025 11:30:00 +0000 https://reason.org/?post_type=backgrounder&p=87010

Growing research has demonstrated the promise of ibogaine in treating a wide range of intractable conditions, from post-traumatic stress disorder (PTSD) to traumatic brain injury (TBI). But because ibogaine is classified as a Schedule I drug under the federal Controlled Substances Act, it remains out of reach for both researchers and patients. Model legislation from Reason Foundation, titled the Veterans Mental Health Innovations Act (VMHI), would bypass this restriction by authorizing a multistate research collaboration to advance treatment and healing.

State-based research and clinical trials

  • After years of advocacy by veterans’ organizations and researchers, a bipartisan coalition of state legislators in Texas voted to fund ibogaine research programs (Texas Senate Bill 2308). In 2025, Texas launched a multimillion-dollar endeavor that will allow any state that enacts the VMHI to join the effort on ibogaine clinical trials.
  • The most effective way to ensure those in need benefit from ibogaine is to conduct clinical trials using ibogaine as an investigational new drug. Clinical trials are a costly and lengthy endeavor for any one entity, but through VMHI, multiple states will conduct their own local trials, advancing a single unified application to the Food and Drug Administration (FDA).
  • Under the VMHI, each participating state selects and funds a research grantee of its choice to conduct ibogaine clinical trials locally with in-state participants.

Multistate collaboration and shared success

  • A multistate consortium allows states with limited resources to take part in what could be nearly a billion-dollar endeavor. This public effort to conduct FDA-approved clinical trials will be in partnership with a private drug developer, which will assume financial risk and responsibility for advancing the treatment through the clinical trial process. 
  • Under the VMHI, states retain the long-term benefits of the research they fund. Instead of handing over value to pharmaceutical companies, the bill keeps the research and development process rooted locally and ensures states are compensated if an application is successful.

Federal government and the role of the FDA

  • Ibogaine is deemed a Schedule I drug by the federal government. Engaging in FDA-approved research is the surest way to prove its medicinal and treatment value.
  • Once ibogaine is approved by the FDA to treat a medical condition, the VMHI would allow licensed physicians to prescribe supervised ibogaine administration for their patients.
  • The VMHI leaves direct engagement with the FDA to the drug developer, eliminating the need for states to navigate the complex clinical trial application process.

The model legislation for the Veterans Mental Health Innovations Act is available below. The template is designed to be easily adapted by states, with the sections that need customization highlighted.

Download this Resource

Veterans Mental Health Innovations Act Model Legislation

Southern California school districts spend big, but student outcomes have barely budged https://reason.org/commentary/southern-california-school-districts-spend-big-but-student-outcomes-have-barely-budged/ Tue, 25 Nov 2025 11:00:00 +0000 https://reason.org/?post_type=commentary&p=87076 California's per student spending increased by nearly 79 percent between 2002 and 2023.

At least 32 school districts, including Los Angeles and Anaheim, have joined the California Teachers Association’s latest effort to extract more money from taxpayers. The “We Can’t Wait” campaign, endorsed by United Teachers Los Angeles, demands more funding for smaller class sizes, additional counselors and mental health professionals, and other spending.

“It’s no surprise that public schools are underfunded throughout the state,” claimed Anaheim Union High School District trustee Jessica Guerrero.

In reality, new Reason Foundation research shows that taxpayers are getting a poor return on their investment in California’s public schools, and the last thing those schools need is more money. Between 2002 and 2023, the state’s public school funding increased by nearly 79%, rising from $14,526 per student to $25,941 per student after adjusting for inflation.

The most recent data show that Los Angeles Unified spends $27,073 per student in average daily attendance (per student, for short). Santa Ana Unified spends $25,099 per student, San Bernardino Unified spends $24,881, Long Beach Unified spends $22,379, and Corona-Norco Unified spends $18,321.

California led the nation in K-12 spending growth over the past two decades, and you would expect commensurate gains in student outcomes. But results on the National Assessment of Educational Progress (NAEP) from 2003 to 2024 were disappointing, with declines in 8th-grade math and only modest gains in 4th-grade math and 8th-grade reading. The lone bright spot was 4th-grade reading, where the share of students scoring below basic on NAEP improved from 50.4% to 43.7%. That means that just over half of 4th graders are meeting the lowest reading threshold.

Despite record education funding, student outcomes have barely budged. While there is plenty of blame to go around, Reason Foundation’s data reveal structural problems with how K-12 dollars are spent. For starters, California’s public schools are a textbook case of mission creep. From 2002 to 2023, enrollment fell by 317,253 students while the number of non-teachers—including counselors, psychologists, social workers, administrators, and instructional aides—increased by 74,428.

There are fewer kids and more staff because public schools are increasingly focused on things like “whole child” development and content about everything from climate change to ethnic studies, which takes time away from core classes like math, English language arts, and science.

For example, California is spending $4 billion on community schools that provide both students and their families with healthcare, mental health support, legal clinics, and other services. These things aren’t bad, but it doesn’t make sense to turn public schools into social service hubs when nearly 46% of 8th-graders can’t do basic math and districts like Los Angeles Unified can’t cover their bills.

Teacher pension debt is also diverting resources from classrooms. In 2023, California’s public schools spent $4,900 per student on employee benefits, which include pension costs, health insurance, workers’ compensation, and other expenses. These costs have increased by a massive 134.9% since 2002, when schools spent $2,086 per student in real terms.

While benefit spending is up, teachers’ benefits aren’t any better. That’s because the main cause of this increase is unfunded pension liabilities. For years, lawmakers failed to set aside enough cash to cover pension promises made to teachers, and the bill has come due.

In 2024, the California State Teachers’ Retirement System reported $85 billion in debt, which is more than the state spends on K-12 education each year. Allowing this debt to accumulate means even fewer dollars will be spent on instruction in the years ahead, as money is shifted to pay for benefits promised to retirees and workers.

Finally, empty school buildings are eating up resources. Between 2020 and 2023, public school enrollment dropped by 5.1%. Yet only seven of the state’s public schools closed in 2023-24—well below pre-COVID-19 school closure levels and fewer than in rural states like South Dakota and Utah.

Public school closures are challenging for communities, but the alternative is worse. Underutilized schools are inefficient and costly, siphoning away dollars that could be used to boost student achievement with reading programs, tutoring, and increasing pay for high-performing teachers.

There are no easy fixes to California’s student achievement woes, and even more money won’t help. Policymakers must address structural issues with how education funding is spent, with a focus on academics, reducing pension debt, and closing underutilized schools.

A version of this column first appeared at The Orange County Register.
