Just one night of poor sleep can boost Alzheimer’s proteins

How well, not how much, people sleep may affect Alzheimer’s disease risk.

Healthy adults built up Alzheimer’s-associated proteins in their cerebrospinal fluid when prevented from getting slow-wave sleep, the deepest stage of sleep, researchers report July 10 in Brain. Just one night of deep-sleep disruption was enough to increase the amount of amyloid-beta, a protein that clumps into brain cell‒killing plaques in people with Alzheimer’s. People in the study who slept poorly for a week also had more of a protein called tau in their spinal fluid than they did when well rested. Tau snarls itself into tangles inside brain cells of people with the disease.
These findings support a growing body of evidence that lack of Zs is linked to Alzheimer’s and other neurodegenerative diseases. Specifically, “this suggests that there’s something special about deep, slow-wave sleep,” says Kristine Yaffe, a neurologist and psychiatrist at the University of California, San Francisco who was not involved in the study.

People with Alzheimer’s are notoriously poor sleepers, but scientists aren’t sure if that is a cause or a consequence of the disease. Evidence from recent animal and human studies suggests the problem goes both ways, Yaffe says. Lack of sleep may make people more prone to brain disorders. And once a person has the disease, disruptions in the brain may make it hard to sleep. Still, it wasn’t clear why not getting enough shut-eye promotes Alzheimer’s disease.
Researchers led by neurologist David Holtzman of Washington University School of Medicine in St. Louis speculated that the lower levels of brain cell activity during deep sleep would produce less A-beta, tau and other proteins than other sleep stages or wakefulness do. Holtzman, Washington University sleep medicine physician Yo-El Ju and colleagues recruited 17 volunteers, all healthy adults between ages 35 and 65 with no sleep disorders. “These are good sleepers,” Ju says.
Volunteers wore activity monitors to track their sleep at home and visited the sleep lab at least twice. On one visit, the volunteers slept normally while wearing headphones. On the other visit, researchers played beeps through headphones whenever the volunteers were about to go into deep sleep. The sounds usually didn’t wake the people up but kept them from getting any slow-wave sleep. Volunteers slept just as much on the night when deep sleep was disrupted as they did on the night when no sound was played through the headphones.
Spinal taps showed that the more deep sleep people missed out on, the higher their levels of A-beta in the morning. Tau levels didn’t budge because of just one night of slow-wave sleep disruption, but people whose activity monitors indicated they had slept poorly the week before the test also had higher levels of that protein.

“This study in humans is really an elegant experimental demonstration” that bolsters Holtzman’s hypothesis that lack of rest for brain cells could be detrimental, says Adam Spira, a psychologist at Johns Hopkins Bloomberg School of Public Health. Without proper deep sleep, brain cells keep churning away, producing more A-beta and tau than a well-rested brain does.

Some research has suggested that toxic proteins get flushed out of the brain during sleep (SN: 11/16/13, p. 7). Messing with slow-wave sleep doesn’t seem to interfere with this wash cycle, Ju says. Levels of other proteins made by nerve cells didn’t vary with the lack of deep sleep, she says.

Elephant seals recognize rivals by the tempo of their calls

The tempo of a male elephant seal’s call broadcasts his identity to rival males, a new study finds.

Every male elephant seal has a distinct vocalization that sounds something like a sputtering lawnmower — pulses of sound in a pattern and at a pace that stays the same over time.

At a California state park where elephant seals breed each year, researchers played different variations of an alpha male’s threat call to subordinate males who knew him. The seals weren’t as responsive when the tempo of that call was modified substantially, suggesting they didn’t recognize it as a threat. Modifying the call’s timbre — the acoustic quality of the sound — had the same effect, researchers report August 7 in Current Biology. Unlike dolphins and songbirds, elephant seals don’t seem to vary pitch to communicate.
Those vocal name tags serve a purpose. During breeding season, male elephant seals spend three months on land without food or water, competing with rivals for social status and mating rights. Fights between two blubbery, car-sized animals can be brutal.

“We’ve seen males lose their noses,” says Caroline Casey, a biologist at the University of California, Santa Cruz. For lower-ranking males, identifying an alpha male by his call and then backing off might prevent a beach brawl.

Borrowed genes give mums the blues

Mums are now a flower of a different color. Japanese researchers have added a hint of clear sky to the humble plant’s palette, genetically engineering the first-ever “true blue” chrysanthemum.

“Obtaining blue-colored flowers is the Holy Grail for plant breeders,” says Mark Bridgen, a plant breeder at Cornell University. The results are “very exciting.”

Compounds called delphinidin-based anthocyanin pigments are responsible for the natural blues in such flowers as pansies and larkspur. Mums lack those compounds. Instead, the flowers come in a variety of other colors, evoking fiery sunsets, new-fallen snow and all things chartreuse.
In previous attempts to engineer a blue hue in chrysanthemums — and roses and carnations — researchers inserted the gene for a key enzyme that controls production of these compounds, causing them to accumulate. But the resulting blooms skewed more violet-purple than blue.
True blue pigment remained elusive, scientists thought, because its origin was complex; multiple genes have been shown to be involved in its generation. But Naonobu Noda, of the National Agriculture and Food Research Organization in Tsukuba, Japan, and colleagues were surprised to find that inserting only two borrowed genes into chrysanthemums created blue flowers. One gene, from Canterbury bells, got the enzyme process started; the other, from butterfly peas, further tweaked the pigment molecules.

Together, the gene double-team transformed 19 of 32 mums of the Taihei variety, or 59 percent, from pink or magenta blooms into blue beauties. Additional analyses revealed that the blue color arose because of molecular interactions between the tweaked pigment and certain colorless compounds naturally found in many plants, including chrysanthemums. The two-part method could possibly be used in the production of other blue flowers, the researchers report July 26 in Science Advances.

Gene editing creates virus-free piglets

Pigs are a step closer to becoming organ donors for people.

Researchers used molecular scissors known as CRISPR/Cas9 to snip embedded viruses out of pig DNA. Removing the viruses — called porcine endogenous retroviruses, or PERVs — creates piglets that can’t pass the viruses on to transplant recipients, geneticist Luhan Yang and colleagues report online August 10 in Science.

Yang, a cofounder of eGenesis in Cambridge, Mass., and colleagues had previously sliced 62 PERVs at a time from pig cells grown in the lab (SN: 11/14/15, p. 6). Many of the embedded viruses are already damaged and can’t make copies of themselves to pass on an infection. So in the new study, the researchers removed just 25 viruses that were still capable of infecting other cells.
The team had to overcome several technical hurdles to make PERV-less pig cells that still had the normal number of chromosomes. In a process similar to the one that created Dolly the sheep (SN: 3/1/97, p. 132), researchers sucked the DNA-containing nuclei from the virus-cleaned cells and injected them into pig eggs. The technique, called somatic cell nuclear transfer, is better known as cloning. Embryos made from the cloned cells were transplanted to sows to develop into piglets.

The process is still not very efficient. Researchers placed 200 to 300 embryos in each of 17 sows. Only 37 piglets were born, and 15 are still living. The oldest are about 4 months old. Such virus-free swine could be a starting point for further genetic manipulations to make pig organs compatible with humans.

What happens in Earth’s atmosphere during an eclipse?

As the moon’s shadow races across North America on August 21, hundreds of radio enthusiasts will turn on their receivers — rain or shine. These observers aren’t after the sun. They’re interested in a shell of electrons hundreds of kilometers overhead, which is responsible for heavenly light shows, GPS navigation and the continued existence of all earthly beings.

This part of the atmosphere, called the ionosphere, absorbs extreme ultraviolet radiation from the sun, protecting life on the ground from its harmful effects. “The ionosphere is the reason life exists on this planet,” says physicist Joshua Semeter of Boston University.
It’s also the stage for brilliant displays like the aurora borealis, which appears when charged material in interplanetary space skims the atmosphere. And the ionosphere is important for the accuracy of GPS signals and radio communication.

This layer of the atmosphere forms when radiation from the sun strips electrons from, or ionizes, atoms and molecules in the atmosphere between about 75 and 1,000 kilometers above Earth’s surface. That leaves a zone full of free-floating negatively charged electrons and positively charged ions, which warps signals passing through it.
Without direct sunlight, though, ionization stops. Electrons start to rejoin the atoms and molecules they abandoned, neutralizing the atmosphere’s charge. With fewer free electrons bouncing around, the ionosphere reflects radio waves differently, like a distorted mirror.
We know roughly how this happens, but not precisely. The eclipse will give researchers a chance to examine the charging and uncharging process in almost real time.

“The eclipse lets us look at the change from light to dark to light again very quickly,” says Jill Nelson of George Mason University in Fairfax, Va.

Joseph Huba and Douglas Drob of the U.S. Naval Research Laboratory in Washington, D.C., predicted some of what should happen to the ionosphere in the July 17 Geophysical Research Letters. At higher altitudes, the electrons’ temperature should decrease by 15 percent. Between 150 and 350 kilometers above Earth’s surface, the density of free-floating electrons should drop by a factor of two as they rejoin atoms, the researchers say. This drop in free-floating electrons should create a disturbance that travels along Earth’s magnetic field lines. That echo of the eclipse-induced ripple in the ionosphere may be detectable as far away as the tip of South America.

Previous experiments during eclipses have shown that the degree of ionization doesn’t simply die down and then ramp back up again, as you might expect. The amount of ionization you see seems to depend on how far you are from being directly in the moon’s shadow.

For a project called Eclipse Mob, Nelson and her colleagues will use volunteers around the United States to gather data on how the ionosphere responds when the sun is briefly blocked from the largest land area ever.
About 150 Eclipse Mob participants received a build-it-yourself kit for a small radio receiver that plugs into the headphone jack of a smartphone. Others made their own receivers after the project ran out of kits. On August 21, the volunteers will receive signals from radio transmitters and record the signal’s strength before, during and after the eclipse.
Nelson isn’t sure what to expect in the data, except that it will look different depending on where the receivers are. “We’ll be looking for patterns,” she says. “I don’t know what we’re going to see.”

Semeter and his colleagues will be looking for the eclipse’s effect on GPS signals. They would also like to measure the eclipse’s effects on the ionosphere using smartphones — eventually.

For this year’s solar eclipse, they will observe radio signals using an existing network of GPS receivers in Missouri, and intersperse it with small, cheap GPS receivers that are similar to the kind in most phones. The eclipse will create a big cool spot, setting off waves in the atmosphere that will propagate away from the moon’s shadow. Such waves leave an imprint on the ionosphere that affects GPS signals. The team hopes to combine high-quality data with messier data to lay the groundwork for future experiments to tap into the smartphone crowd.

“The ultimate vision of this project is to leverage all 2 billion smartphones around the planet,” Semeter says. Someday, everyone with a phone could be a node in a global telescope.

If it works, it could be a lifesaver. Similar atmospheric waves were seen radiating from the source of the 2011 earthquake off the coast of Japan (SN Online: 6/16/11). “The earthquake did the sort of thing the eclipse is going to do,” Semeter says. Understanding how these waves form and move could potentially help predict earthquakes in the future.

Does the corona look different when solar activity is high versus when it’s low?

Carbondale, Ill., is just a few kilometers north of the point where this year’s total solar eclipse will linger longest — the city will get two minutes and 38 seconds of total darkness when the moon blocks out the sun. And it’s the only city in the United States that will also be in the path of totality when the next total solar eclipse crosses North America, in 2024 (SN: 8/5/17, p. 32). The town is calling itself the Eclipse Crossroads of America.
“Having a solar eclipse that goes through the entire continent is rare enough,” says planetary scientist Padma Yanamandra-Fisher of the Space Science Institute’s branch in Rancho Cucamonga, Calif. “Having two in seven years is even more rare. And two going through the same city is rarer still.”

That makes Carbondale the perfect spot to investigate how the sun’s atmosphere, or corona, looks different when solar activity is high versus low.

Every 11 years or so, the sun cycles from periods of high magnetic field activity to low activity and back again. The frequency of easy-to-see features — like sunspots on the sun’s visible surface, solar flares and the larger eruptions of coronal mass ejections — cycles, too. But it has been harder to trace what happens to the corona’s streamers, the long wispy tendrils that give the corona its crownlike appearance and originate from the magnetic field.
The corona is normally invisible from Earth, because the bright solar disk washes it out. Even space telescopes that are trained on the sun can’t see the inner part of the corona — they have to block some of it out for their own safety (SN Online: 8/11/17). So solar eclipses are the only time researchers can get a detailed view of what the inner corona, where the streamers are rooted, is up to.
Right now, the sun is in a period of exceptionally low activity. Even at the most recent peak in 2014, the sun’s number of flares and sunspots was pathetically wimpy (SN: 11/2/13, p. 22). During the August 21 solar eclipse, solar activity will still be on the decline. But seven years from now during the 2024 eclipse, it will be on the upswing again, nearing its next peak.

Yanamandra-Fisher will be in Carbondale for both events. This year, she’s teaming up with a crowdsourced eclipse project called the Citizen Continental-America Telescope Eclipse experiment. Citizen CATE will place 68 identical telescopes along the eclipse’s path from Oregon to South Carolina.

As part of a series of experiments, Yanamandra-Fisher and her colleagues will measure the number, distribution and extent of streamers in the corona. Observations of the corona during eclipses going back as far as 1867 suggest that streamers vary with solar activity. During low activity, they tend to be more squat and concentrated closer to the sun’s equator. During high activity, they can get more stringy and spread out.

Scientists suspect that’s because as the sun ramps up its activity, its strengthening magnetic field lets the streamers stretch farther out into space. The sun’s equatorial magnetic field also splits to straddle the equator rather than encircle it. That allows streamers to spread toward the poles and occupy new space.

Although physicists have been studying the corona’s changes for 150 years, that’s still only a dozen or so solar cycles’ worth of data. There is plenty of room for new observations to help decipher the corona’s mysteries. And Yanamandra-Fisher’s group might be the first to collect data from the same point on Earth.

“This is pure science that can be done only during an eclipse,” Yanamandra-Fisher says. “I want to see how the corona changes.”

Scientists create the most cubic form of ice crystals yet

Cube-shaped ice is rare, at least at the microscopic level of the ice crystal. Now researchers have coaxed typically hexagonal 3-D ice crystals to form the most cubic ice ever created in the lab.

Cubed ice crystals — which may exist naturally in cold, high-altitude clouds — could help improve scientists’ understanding of clouds and how they interact with Earth’s atmosphere and sunlight, two interactions that influence climate.

Engineer Barbara Wyslouzil of Ohio State University and colleagues made the cubed ice by shooting nitrogen and water vapor through nozzles at supersonic speeds. The gas mixture expanded and cooled, and the vapor condensed into nanodroplets. Quickly cooling the droplets further kept them liquid below water’s usual freezing point, a state called supercooling. Then, at around –48° Celsius, the droplets froze in about one millionth of a second.

The low-temperature quick freeze allowed the cubic ice to form, the team reports in the July 20 Journal of Physical Chemistry Letters. The crystals weren’t perfect cubes but were about 80 percent cubic. That’s better than previous studies, which made ice that was 73 percent cubic.

Fiery re-creations show how Neandertals could have easily made tar

Neandertals took stick-to-itiveness to a new level. Using just scraps of wood and hot embers, our evolutionary cousins figured out how to make tar, a revolutionary adhesive that they used to make formidable spears, chopping tools and other implements by attaching sharp-edged stones to handles, a new study suggests.

Researchers already knew that tar-coated stones date to at least 200,000 years ago at Neandertal sites in Europe, well before the earliest known evidence of tar production by Homo sapiens, around 70,000 years ago in Africa. Now, archaeologist Paul Kozowyk of Leiden University in the Netherlands and colleagues have re-created the methods that these extinct members of the human genus could have used to produce tar.
Three straightforward techniques could have yielded enough adhesive for Neandertals’ purposes, Kozowyk’s team reports August 31 in Scientific Reports. Previous studies have shown that tar lumps found at Neandertal sites derive from birch bark. Neandertal tar makers didn’t need ceramic containers or kilns and didn’t have to heat the bark to precise temperatures, the scientists conclude.
These findings fuel another burning question about Neandertals: whether they had mastered the art of building and controlling a fire. Some researchers suspect that Neandertals had specialized knowledge of fire control and used it to make adhesives; others contend that Neandertals only exploited the remnants of wildfires. The new study suggests they could have invented low-tech ways to make tar with fires, but it’s not clear whether those fires were intentionally lit.

“This new paper demystifies the prehistoric development of birch-bark tar production, showing that it was not predicated on advanced cognitive or technical skills but on knowledge of familiar, readily available materials,” says archaeologist Daniel Adler of the University of Connecticut in Storrs, who did not participate in the study.
Kozowyk’s group tested each of three tar-making techniques between five and 11 times. The lowest-tech approach consisted of rolling up a piece of birch bark, tying it with wood fiber and covering it in a mound of ashes and embers from a wood fire. Tar formed between bark layers and was scraped off the unrolled surface. The experimenters collected up to about one gram of tar this way.

A second strategy involved igniting a roll of birch bark at one end and placing it in a small pit. In some cases, embers were placed on top of the bark. The researchers either scraped tar off bark layers or collected it as it dripped onto a rock, strip of bark or a piece of bark folded into a cup. The most tar gathered with this method, about 1.8 grams, was in a trial using a birch-bark cup placed beneath a bark roll with its lit side up and covered in embers.

Repeating either the ash-mound or pit-roll techniques once or twice would yield the relatively small quantity of tar found at one Neandertal site in Europe, the researchers say. Between six and 11 repetitions would produce a tar haul equal to that previously unearthed at another European site.
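The repetition estimates above come down to simple division against per-trial yield. Here is a minimal back-of-the-envelope sketch in Python; the per-trial yields are the figures reported in the trials described above, but the target tar masses are illustrative stand-ins, not the actual quantities excavated at the two Neandertal sites.

```python
import math

# Per-trial tar yields from the re-creation experiments (grams).
YIELD_ASH_MOUND = 1.0   # up to ~1 g per ash-mound trial
YIELD_PIT_ROLL = 1.8    # best single pit-roll trial

def trials_needed(target_grams, per_trial_yield):
    """Minimum number of repetitions to accumulate target_grams of tar."""
    return math.ceil(target_grams / per_trial_yield)

# A small find of ~2 g (hypothetical figure) takes only a couple of ash-mound runs:
print(trials_needed(2.0, YIELD_ASH_MOUND))   # 2
# A larger ~10 g haul (also hypothetical) takes several pit-roll repetitions:
print(trials_needed(10.0, YIELD_PIT_ROLL))   # 6
```

Plugging in the actual excavated masses would presumably recover the one-to-two and six-to-11 repetition counts the researchers report.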

In a third technique, the scientists placed a birch-bark vessel for collecting tar into a small pit. They placed a layer of twigs across the top of the pit and placed pebbles on top, then added a large, loose bark roll covered in a dome-shaped coat of wet soil. A fire was then lit on the earthen structure. This method often failed to produce anything. But after some practice with the technique, one trial resulted in 15.7 grams of tar — enough to make a lump comparable in size to the largest chunks found at Neandertal sites.

An important key to making tar was reaching the right heat level. Temperatures inside bark rolls, vessels, fires and embers varied greatly, but at some point each procedure heated bark rolls to between around 200° and 400° Celsius, Kozowyk says. In that relatively broad temperature range, tar can be produced from birch bark, he contends.

If they exploited naturally occurring fires, Neandertal tar makers had limited time and probably relied on a simple technique such as ash mounds, Kozowyk proposes. If Neandertals knew how to start and maintain fires, they could have pursued more complex approaches.

Some researchers say that excavations point to sporadic use of fire by Neandertals, probably during warm, humid months when lightning strikes ignited wildfires. But other investigators contend that extinct Homo species, including Neandertals, built campfires (SN: 5/5/12, p. 18).

Whatever the case, Kozowyk says, “Neandertals could have invented tar with only basic knowledge of fire and birch bark.”

Why bats crash into windows

Walls can get the best of clumsy TV sitcom characters and bats alike.

New lab tests suggest that smooth, vertical surfaces fool some bats into thinking their flight path is clear, leading to collisions and near misses.

The furry fliers famously use sound to navigate — emitting calls and tracking the echoes to hunt for prey and locate obstacles. But some surfaces can mess with echolocation.

Stefan Greif of the Max Planck Institute for Ornithology in Seewiesen, Germany, and colleagues put bats to the test in a flight tunnel. Nineteen of 21 greater mouse-eared bats (Myotis myotis) crashed into a vertical metal plate at least once, the scientists report in the Sept. 8 Science. In some crashes, bats face-planted without even trying to avoid the plate.
Smooth surfaces act as acoustic mirrors, the team says: Up close, they reflect sound at an angle away from the bat, producing fuzzier, harder-to-read echoes than rough surfaces do. From farther away, smooth surfaces don’t produce any echoes at all.

Infrared camera footage of wild bat colonies showed that vertical plastic plates trick bats in more natural settings, too.

Crash reel
This video shows three experiments on how smooth surfaces affect bat flight. In one lab test, a vertical metal plate gives a bat the illusion of a clear flight path, causing it to crash into the barrier. In a second lab test, a horizontal metal plate creates the illusion of water; the bat dips to the surface to take a sip. Finally, near a natural bat colony, a bat collides with a vertically hung plastic plate, showing that smooth surfaces can fool bats in the wild as well.

We’re probably undervaluing healthy lakes and rivers

For sale: Pristine lake. Price negotiable.

Most U.S. government attempts to quantify the costs and benefits of protecting the country’s bodies of water are likely undervaluing healthy lakes and rivers, researchers argue in a new study. That’s because some clean water benefits get left out of the analyses, sometimes because these benefits are difficult to pin numbers on. As a result, the apparent value of many environmental regulations is probably discounted.

The study, published online October 8 in the Proceedings of the National Academy of Sciences, surveyed 20 government reports analyzing the economic impacts of U.S. water pollution laws. Most of these laws have been enacted since 2000, when cost-benefit analyses became a requirement. Analysis of a measure for restricting river pollution, for example, might find that it increases costs for factories using that river for wastewater disposal, but boosts tourism revenues by drawing more kayakers and swimmers.
Only two studies out of 20 showed the economic benefits of these laws exceeding the costs. That’s uncommon among analyses of environmental regulations, says study coauthor David Keiser, an environmental economist at Iowa State University in Ames. Usually, the benefits exceed the costs.

So why does water pollution regulation seem, on paper at least, like such a losing proposition?

Keiser has an explanation: Summing up the monetary benefits of environmental policies is really hard. Many of these benefits are intangible and don’t have clear market values. So deciding which benefits to count, and how to count them, can make a big difference in the results.
Many analyses assume water will be filtered for drinking, Keiser says, so they don’t count the human health benefits of clean lakes and rivers (SN: 8/18/18, p. 14). That’s different from air pollution cost-benefit studies, which generally do include the health benefits of cleaner air by factoring in data tracking things like doctor’s visits or drug prescriptions. That could explain why Clean Air Act rules tend to get more favorable reviews, Keiser says — human health accounts for about 95 percent of the measured benefits of air quality regulations.

“You can avoid a lake with heavy, thick, toxic algal blooms,” Keiser says. “If you walk outside and have very polluted air, it’s harder to avoid.”

But even if people can avoid an algae-choked lake, they still pay a price for that pollution, says environmental scientist Thomas Bridgeman, director of the Lake Erie Center at the University of Toledo in Ohio.
Communities that pull drinking water from a lake filled with toxic blooms of algae or cyanobacteria spend more to make the water safe to drink. Bridgeman’s seen it firsthand: In 2014, Lake Erie’s cyanobacteria blooms from phosphorus runoff shut down Toledo’s water supply for two days and forced the city to spend $500 million on water treatment upgrades.

Most of the studies surveyed by Keiser and his team were missing other kinds of benefits, too. The reports usually left out the value of eliminating certain toxic and nonconventional pollutants — molecules such as bisphenol A, or BPA, and perfluorooctanoic acid, or PFOA (SN: 10/3/15, p. 12). In high quantities, these compounds, which are used to make some plastics and nonstick coatings, can cause harm to humans and wildlife. Many studies also didn’t include discussion of how the quality of surface waters can affect groundwater, which is a major source of drinking water for many people.

A lack of data on water quality may also limit studies, Keiser’s team suggests. While there’s a national database tracking daily local air pollution levels, the data from various water quality monitoring programs aren’t centralized. That makes gathering and evaluating trends in water quality harder.

Plus, there are the intangibles — the value of aquatic species that are essential to the food chain, for example.
“Some things are just inherently difficult to put a dollar [value] on,” says Robin Craig, an environmental law professor at the University of Utah in Salt Lake City. “What is it worth to have a healthy native ecosystem?… That’s where it can get very subjective very fast.”

That subjectivity can allow agencies to analyze policies in ways that suit their own political agendas, says Matthew Kotchen, an environmental economist at Yale University. An example: the wildly different assessments by the Obama and Trump administrations of the value gained from the 2015 Clean Water Rule, also known as the Waters of the United States rule.

The rule, passed under President Barack Obama, clarified the definition of waters protected under the 1972 Clean Water Act to include tributaries and wetlands connected to larger bodies of water. The Environmental Protection Agency estimated in 2015 that the rule would result in yearly economic benefits ranging from $300 million to $600 million, edging out the predicted annual costs of $200 million to $500 million. But in 2017, Trump’s EPA reanalyzed the rule and proposed rolling it back, saying that the agency had now calculated just $30 million to $70 million in annual benefits.

The difference in the conclusions came down to the consideration of wetlands: The 2015 analysis found that protecting wetlands, such as marshes and bogs that purify water, tallied up to $500 million in annual benefits. The Trump administration’s EPA, however, left wetlands out of the calculation entirely, says Kotchen, who analyzed the policy swing in Science in 2017.
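The arithmetic behind the reversal is straightforward interval subtraction. A minimal sketch in Python, using the rounded dollar figures above; note that the 2017 reanalysis differed from the 2015 one in more ways than wetlands alone, so simply subtracting the $500 million wetlands line does not exactly reproduce its $30 million to $70 million benefit estimate.

```python
# Annual figures from the two EPA analyses, in millions of dollars.
BENEFITS_2015 = (300, 600)   # 2015 estimated benefit range
COSTS_2015 = (200, 500)      # 2015 estimated cost range
WETLANDS_BENEFIT = 500       # benefits the 2015 analysis attributed to wetlands

def net_benefit_range(benefits, costs):
    """Worst case (low benefits, high costs) and best case (high benefits, low costs)."""
    return (benefits[0] - costs[1], benefits[1] - costs[0])

# With wetlands counted, the rule can come out ahead:
print(net_benefit_range(BENEFITS_2015, COSTS_2015))   # (-200, 400)

# Strip wetlands from the benefits side and the calculus flips negative:
no_wetlands = (BENEFITS_2015[0] - WETLANDS_BENEFIT,
               BENEFITS_2015[1] - WETLANDS_BENEFIT)
print(net_benefit_range(no_wetlands, COSTS_2015))     # (-700, -100)
```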

Currently, the rule has gone into effect in 26 states, but is still tied up in legal challenges.

It’s an example of how methodology — and what counts as a benefit — can have a huge impact on the apparent value of environmental policies and laws.

The squishiness in analyzing environmental benefits underlies many of the Trump administration’s proposed rollbacks of Obama-era environmental legislation, not just ones about water pollution, Kotchen says. There are guidelines for how such cost-benefit analyses should be carried out, he says, but there’s still room for researchers or government agencies to choose what to include or exclude.

In June, the EPA, then under the leadership of Scott Pruitt, proposed revising the way the agency does cost-benefit analyses to no longer include so-called indirect benefits. For example, in evaluating policies to reduce carbon dioxide emissions, the agency would ignore the fact that those measures also reduce other harmful air pollutants. The move would, overall, make environmental policies look less beneficial.

These sharp contrasts in how presidential administrations approach environmental impact studies are not unprecedented, says Craig, the environmental law professor. “Pretty much every time we change presidents, the priorities for how to weigh those different elements change.”