Drugs for reflux disease in infants may come with unintended consequences

When my girls were newborns, I spent a lot of time damp. Fluids were everywhere, some worse than others. One of the main contributors was milk, which, in various stages of digestion, came back to haunt me in a sloppy trail down my back.

I was sometimes alarmed at the volume of fluid that came flying out of my tiny babies. And I remember asking our pediatrician if it was a problem. We were lucky in that the amount and frequency of the regurgitations didn’t seem to signal trouble.

But some babies spit up a lot more, and seem to be in distress while doing so. That’s led doctors to prescribe antacids to treat reflux disease in these infants. A U.S.-based survey found that from 2000 to 2003, infant use of a class of antacids called proton-pump inhibitors quadrupled.

Those numbers point to worried doctors and parents who want to help babies feel better. The problem, though, is that antacids come with side effects. Mucking with acid levels can affect the body beyond the stomach, and these unintended effects may be even more meddlesome in babies.

“What we found in adults and what we’re starting to see more in children is that [the drugs] are not as benign as we used to think,” says U.S. Air Force Captain Laura Malchodi, a pediatrician at Walter Reed National Military Medical Center in Bethesda, Md.

Infants who took proton-pump inhibitors, a class of drugs that includes Prilosec and Nexium, in their first six months of life broke more bones over the next several years than children who didn’t receive the drugs. That example comes from research Malchodi presented May 7 at the 2017 Pediatric Academic Societies Meeting in San Francisco.

Malchodi and her colleagues examined medical records of nearly 900,000 healthy children. Of those, about 7,000 were prescribed proton-pump inhibitors by the time they were 6 months old. About 67,000 were prescribed histamine H2-blocking drugs, such as Zantac or Pepcid, and about 11,000 babies were prescribed both types of drugs.
Children who had received proton-pump inhibitors, either alone or in combination with a histamine H2-blocker, had more fractures over the next five years than children who weren’t prescribed that type of drug. The researchers tried to rule out other differences between the groups of babies that might explain the higher number of fractures. When those differences were removed from the analysis, proton-pump inhibitor prescriptions were still linked to fractures.

The study can’t say whether proton-pump inhibitors definitely caused weaker bones. But that’s not an unreasonable hypothesis given what’s seen in adults, for whom the link between long-term use of proton-pump inhibitors and broken bones is stronger.

If proton-pump inhibitors do interfere with bones, it’s still a mystery exactly how. One idea was that the drugs hinder calcium absorption, leading to weaker bones. That idea has fallen out of favor, Malchodi says. Another proposal centers on cells called osteoclasts. To do their job, these cells rely on proton pumps to create acidic pockets around bones, and the drugs may interfere with those pumps too. But if osteoclasts aren’t working properly, “in the end, what you get is disorganized bone,” Malchodi says.

Reflux disease is not the same thing as reflux, which babies are nearly guaranteed to experience. For one thing, the amount of liquid they’re slurping down relative to their body weight is huge. And that liquid is held down by an esophageal sphincter that’s often underdeveloped in babies. (One technical term for reflux is “poor gastric compliance,” but I bet you’ve got more colorful descriptions.)

Antacids won’t stop babies from spitting up, says Malchodi. “We definitely counsel parents all the time that this is not going to stop the reflux,” she says. Instead, the drugs are thought to raise the pH of the liquid coming back up, making it less acidic and, ideally, less irritating.

Some babies may need that pharmaceutical help. But many may not. If babies are growing well and don’t seem to be in long-lasting distress, then they may simply need the “tincture of time” to outgrow the reflux. (Malchodi points out that so-called “happy spitters” are probably not smiling while they’re barfing, because obviously, throwing up is not fun. It’s just that these babies don’t seem to be bothered long after the spitting.)

She hopes that her research and other studies like it will prompt more careful discussions between parents and doctors before antacids are prescribed. And if they are deemed necessary, “have a stop point in mind,” she says.

Why you can hear and see meteors at the same time

For centuries, skywatchers have reported seeing and simultaneously hearing meteors whizzing overhead, which doesn’t make sense given that light travels roughly 800,000 times as fast as sound. Now scientists say they have a potential explanation for the paradox.

The sound waves aren’t coming from the meteor itself, atmospheric scientists Michael Kelley of Cornell University and Colin Price of Tel Aviv University propose April 16 in Geophysical Research Letters. As the leading edge of the falling space rock vaporizes, it becomes electrically charged. The charged head produces an electric field, which yields an electric current that blasts radio waves toward the ground. As a type of electromagnetic radiation, radio waves travel at the speed of light and can interact with metal objects near the ground, generating a whistling sound that people can hear.

Just 0.1 percent of the radio wave energy needs to be converted into sound for the noise to be audible as the meteor zips by, the researchers estimate. This same process could explain mysterious noises heard during the aurora borealis, or northern lights (SN: 8/9/14, p. 32). Like meteors, auroras have been known to emit radio wave bursts.

Jupiter’s precocious birth happened in the solar system’s first million years

Jupiter was an early bloomer. New measurements of meteorite ages suggest that the giant planet’s core must have formed within the solar system’s first million years. If so, Jupiter’s presence could help explain why the inner planets are so small — and possibly even be responsible for Earth’s existence.

Previously, astronomers’ best constraints on Jupiter’s age came from simulations of how solar systems form in general. Gas giants like Jupiter grow by accreting gas from spinning disks of gas and dust around a young star. Those disks typically don’t last more than 10 million years, so astronomers inferred that Jupiter formed by the time that disk dissipated.
“Now we can use actual data from the solar system to show Jupiter formed even earlier,” says Thomas Kruijer, who did the research while at the University of Münster in Germany. Kruijer, now at Lawrence Livermore National Laboratory in California, and his team report Jupiter’s new age in the Proceedings of the National Academy of Sciences the week of June 12.

To study one of the biggest objects in the solar system, Kruijer and colleagues turned to some of the smallest: meteorites. Most meteorites come from the asteroid belt, whose rocky residents now sit between Mars and Jupiter but probably were born elsewhere.

Luckily, meteorites carry a signature of their birthplaces. The gas and dust disk that the planets formed from had distinct neighborhoods. Each had its own “zip code,” an enrichment in certain isotopes, or different masses of the same elements. Careful measurements of a meteorite’s isotopes can point to its home.

Kruijer and colleagues selected 19 samples of rare iron meteorites from the Natural History Museum in London and the Field Museum in Chicago. These rocks represent the metal cores of the first asteroid-like bodies to congeal as the solar system was forming.

The team dissolved about a gram of each sample in a solution of nitric acid and hydrochloric acid. “It smells terrible,” Kruijer says.
Then the researchers separated out the elements tungsten — a good tracer of both a meteorite’s age and birthplace — and molybdenum, another tracer of a meteorite’s home.

By measuring the relative amounts of molybdenum-94, molybdenum-95, tungsten-182 and tungsten-183, Kruijer and his team identified two distinct groups of meteorites. One group formed closer to the sun than Jupiter is today; the other formed farther from the sun.

The tungsten isotopes also showed that both groups existed at the same time, between about 1 million and 4 million years after the start of the solar system about 4.57 billion years ago (SN Online: 8/23/10). That means something must have kept them separated.

The most likely candidate is Jupiter, Kruijer says. His team’s calculations suggest that Jupiter’s core had probably grown to about 20 times the mass of the Earth in the solar system’s first million years, making it the oldest planet. Its presence would have created a gravitational barrier that kept the two meteorite neighborhoods segregated. Jupiter would then have continued growing at a slower rate for the next few million years.

“I have high confidence that their data is excellent,” says cosmochemist Meenakshi Wadhwa of Arizona State University in Tempe. The suggestion that Jupiter held the different meteorites apart is “a little more speculative, but I buy it,” she adds.

Jupiter’s early entrance could also explain why the inner solar system lacks any planets larger than Earth. Many extrasolar planetary systems have large close-in planets, from rocky super-Earths (about two to 10 times the mass of Earth) to gassy mini-Neptunes or hot Jupiters. Astronomers have puzzled over why our solar system looks so different.

An early Jupiter’s gravity could have kept most of the planet-forming disk away from the sun, meaning there was less raw material for the inner planets. This picture is consistent with other work suggesting a young Jupiter wandered through the inner solar system and swept it clean (SN: 4/2/16, p. 7), Kruijer says.

“Without Jupiter, we could have had Neptune where Earth is,” Kruijer says. “And if that’s the case, there would probably be no Earth.”

Gecko-inspired robot grippers could grab hold of space junk

Get a grip. A new robotic gripping tool based on gecko feet can grab hold of floating objects in microgravity. The grippers could one day help robots move dangerous space junk to safer orbits or climb around the outside of space stations.

Most strategies for sticking don’t work in space. Chemical adhesives can’t withstand the wide range of temperatures, and suction doesn’t work in a vacuum.

Adhesives inspired by gecko feet — which use van der Waals forces to cling without feeling sticky (SN Online: 11/18/14) — could fit the bill, says Mark Cutkosky of Stanford University, whose team has been designing such stickers for more than a decade. Now his team has built robotic gripper “hands” that can grapple objects many times their size without pushing them away, the researchers report June 28 in Science Robotics.
The team first tested the grippers in the Robo-Dome, a giant air hockey table at NASA’s Jet Propulsion Laboratory in Pasadena, Calif., where two 370-kilogram robots gently pushed each other around using a small square of gecko gripper.

Then last summer, Aaron Parness and Christine Fuller, of the Jet Propulsion Lab, and Hao Jiang of Stanford took the full gripper hand, which includes several patches of gripping material in a specific arrangement, on a microgravity flight in NASA’s Weightless Wonder aircraft. The team used the hand to grab and release a cube, cylinder and beach ball, which represented satellites, spent rockets or fuel tanks, and pressure vessels.

Gripper hands could be used to repair or move dead satellites, or help miniature satellites called CubeSats stick to larger spacecraft like barnacles, Parness says.

Readers question hominid family tree

Hominid hubbub
In “Hominid roots may go back to Europe” (SN: 6/24/17, p. 9), Bruce Bower reported that the teeth of Graecopithecus, a chimp-sized primate that lived in southeastern Europe 7 million years ago, suggest it was a member of the human evolutionary family.

“Is it appropriate to use the terms ‘hominid’ and ‘ape’ as if the two are mutually exclusive categories?” asked online reader Tim Cliffe. “The distinction being made is between our clade in particular and all other apes. It seems to me that ‘hominids’ should be described as a subset of apes, not a separate category,” he wrote.
“Yes, hominids are apes,” Bower says. “The terminology gets pretty thick in evolutionary studies, so researchers (and journalists) use some shortcuts.”

Fossils of many ancient apes dating to between 25 million and 5 million years ago have been found, but the interest in this case is in a key transition to a particular kind of ape that walked upright and displayed skeletal traits resembling those unique to the human evolutionary family. “That’s why one source in the story, Bernard Wood, wonders whether Graecopithecus was an apelike hominid or a hominid-like ape,” Bower says. “But it’s important to remember that hominids diverged from other, ancestral apes. So did chimps.”

Science News defines “hominid” as a member of the human evolutionary family.

Laser, camera, action
The world’s fastest video camera films 5 trillion frames every second, Ashley Yeager reported in “A different kind of camera captures speedy actions” (SN: 6/24/17, p. 5). The camera works by flashing a laser at a subject and using a computer program to combine the still images into a video. Researchers tested the device by filming particles of light as the particles traveled a short distance.

Online reader JHoughton1 wondered if the researchers really filmed a light particle in their tests. “I thought light ‘sometimes behaves like a wave, sometimes like a particle,’ but that there isn’t really any particle that’s a particle in the usual sense. Is this really a picture of a ‘particle’ of light? A photon-as-ball-of-stuff?”

The camera captured the forward progression of a laser pulse, which is an ensemble of photons, Yeager says.

Photons themselves aren’t “balls of stuff” on quantum scales, says physics writer Emily Conover. All particles, including photons, are spread out in space, propagating like waves. “Only when scientists measure or observe a photon or any other particle do they find it in one place, like the ball of stuff that people typically imagine. I think in that sense, photons are about as tangible as any other quantum particle,” Conover says.

Bringing down the mucus house
Little-known sea animals called giant larvaceans can catch a lot of carbon in disposable mucus casings called “houses,” Susan Milius reported in “ ‘Mucus houses’ catch sea carbon fast” (SN: 6/10/17, p. 13).

Online reader Robert Stenton wondered what happens to mucus houses as they fall to the bottom of the ocean.

What happens to discarded houses isn’t yet clear, Milius says, though researchers have proposed that the houses might carry substantial portions of carbon to life on the sea bottom. And if bits of a house fall fast enough to reach great depths, the carbon could get trapped in water masses that move around the planet for centuries before surfacing. Bits drifting down slowly may be intercepted by microbes and other debris feeders and would not end up sequestered.

Correction
In “Human noises invade wilderness” (SN: 6/10/17, p. 14), Science News incorrectly reported that official wilderness areas in the United States do not allow livestock grazing. Grazing is permitted in protected wilderness areas at preprotection levels under the Wilderness Act of 1964, which created the National Wilderness Preservation System.

Giant armored dinosaur may have cloaked itself in camouflage

Sometimes body armor just isn’t enough. A car-sized dinosaur covered in bony plates may have sported camo, too, researchers report online August 3 in Current Biology. That could mean the Cretaceous-period herbivore was a target for predators that relied on sight more than smell to find prey.

The dinosaur, dubbed Borealopelta markmitchelli, has already made headlines for being one of the best preserved armored dinosaurs ever unearthed. It was entombed on its back some 110 million years ago under layers of fine marine sediments that buried the animal very quickly — ideal preservation conditions, says study coauthor Caleb Brown, a paleontologist at the Royal Tyrrell Museum of Palaeontology in Drumheller, Canada. The fossil, found in Alberta in 2011, captured not only large amounts of skin and soft tissue but also the animal’s three-dimensional shape.
“Most of the other armored dinosaurs are described based on the skeleton. In this case, we can’t see the skeleton because all the skin is still there,” Brown says.

That skin contains clues to the dinosaur’s appearance, including its coloration. “We’re just beginning to realize how important color is, and we’re beginning to have the methods to detect color” in fossils, says Martin Sander, a paleontologist at Bonn University in Germany who wasn’t part of the study.

But despite ample tissue, the researchers didn’t find any melanosomes, cellular structures that often preserve evidence of pigment in fossilized remains. Instead, Brown and colleagues turned to less direct evidence: molecules that appear when pigments break down. The researchers found about a dozen types of those molecules, including substantial amounts of benzothiazole, a by-product of the reddish pigment pheomelanin. That might mean the dinosaur was reddish-brown.
The distribution of pigment by-products also gives clues about the dinosaur’s appearance. B. markmitchelli had a thin film of pigment-hinting organic molecules on its back, but that layer disappeared on the belly. That pattern is reminiscent of countershading, when an animal is darker on its back than its underside, Brown says. Countershading is a simple form of camouflage that helps animals blend in with the ground when seen from above or with the sky when seen from below.
This is not the first time countershading has been proposed for a dinosaur (SN: 11/26/16, p. 24). But finding the camouflage on such a large herbivore is somewhat surprising, Brown says. Modern plant eaters that don similar camouflage tend to be smaller and at greater risk of becoming someone’s dinner. B. markmitchelli’s skin patterning suggests that at least some top Cretaceous predators might have relied more on eyesight than today’s top carnivores, which often favor smell when hunting, Brown says.

Some experts, however, want stronger evidence for the coloration claims. Molecules like benzothiazole can come from melanin, but they can also come from a number of other sources, such as oils, says Johan Lindgren, a paleontologist at Lund University in Sweden. “What this paper nicely highlights is how little we actually know about the preservation of soft tissues in animal remains. There’s definitely something there — the question is, what are those [molecules], and where do they come from?”

Sander does buy the evidence for the reddish tint, but it might not be the full story, he says. The dino could have displayed other colors that didn’t linger in the fossil record. But the countershading findings “point out the importance of vision” for dinosaurs, he says. Sharp-eyed predators might have made camouflage a perk for herbivores — even ones built like tanks.

A mutation may explain the sudden rise in birth defects from Zika

A single genetic mutation made the Zika virus far more dangerous by enhancing its ability to kill nerve cells in developing brains, a new study suggests.

The small change — which tweaks just one amino acid in a protein that helps Zika exit cells — may cause microcephaly, researchers report September 28 in Science. The mutation arose around May 2013, shortly before a Zika outbreak in French Polynesia, the researchers calculate.

Zika virus was discovered decades ago but wasn’t associated with microcephaly — a birth defect characterized by a small head and brain — until the 2015–2016 outbreak in Brazil. Women who had contracted the virus while pregnant started giving birth to babies with the condition at higher-than-usual rates (SN: 4/2/16, p. 26).
Researchers weren’t sure why microcephaly suddenly became a complication of Zika infections, says Pei-Yong Shi, a virologist at the University of Texas Medical Branch at Galveston. Maybe the virus did cause microcephaly before, scientists suggested, but at such low rates that no one noticed. Or people in South America might be more vulnerable to the virus. Perhaps their immune systems don’t know how to fight it, they have a genetic susceptibility or prior infections with dengue made Zika worse (SN: 4/29/17, p. 14). But Shi and colleagues in China thought the problem might be linked to changes in the virus itself.
The researchers compared a strain of Zika isolated from a patient in Cambodia in 2010 with three Zika strains collected from patients who contracted the virus in Venezuela, Samoa and Martinique during the epidemic of 2015–2016. The team found seven differences between the Cambodian virus and the three epidemic strains.

Researchers engineered seven versions of the Cambodian virus, each with one of the epidemic strains’ mutations, and injected the viruses into fetal mouse brains. The virus carrying one particular mutation, dubbed S139N, killed brain cells in fetal mice and destroyed human brain cells grown in lab dishes more aggressively than the 2010 Cambodian strain did, the researchers found.
“That’s pretty convincing evidence that it at least plays some role in what we’re seeing now,” says Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases.

The mutation changes an amino acid in a Zika protein called prM. That protein helps the virus mature within infected cells and get out of the cells to infect others. Shi and colleagues don’t yet know why tweaking the protein makes the virus kill brain cells more readily.

The alteration in that protein probably isn’t the entire reason epidemic strains cause microcephaly, Shi says. The Cambodian strain also led to the death of a few brain cells, but perhaps not enough to cause microcephaly. “We believe there are other changes in the virus that collectively enhance its virulence,” he says. In May in Nature, Shi and colleagues described a different mutation that allows the virus to infect mosquitoes more effectively.

Brain cells from different people vary in their susceptibility to Zika infections, says infectious disease researcher Scott Weaver, also at the University of Texas Medical Branch but not involved in the study. He says more work on human cells and in nonhuman primates is needed to confirm whether this mutation is really the culprit in microcephaly.

Step away from the cookie dough. E. coli outbreaks traced to raw flour

Eggs, long condemned for making raw cookie dough a forbidden pleasure, can stop taking all the blame. There’s another reason to resist the sweet uncooked temptation: flour.

The seemingly innocuous pantry staple can harbor strains of E. coli bacteria that make people sick. And, while not a particularly common source of foodborne illness, flour has been implicated in two E. coli outbreaks in the United States and Canada in the last two years.

Pinning down tainted flour as the source of the U.S. outbreak, which sickened 63 people between December 2015 and September 2016, was trickier than the average food poisoning investigation, researchers recount November 22 in the New England Journal of Medicine.
Usually, state health departments rely on standard questionnaires to find a common culprit for a cluster of reported illnesses, says Samuel Crowe, an epidemiologist at the Centers for Disease Control and Prevention in Atlanta, who led the study. But flour isn’t usually tracked on these surveys. So when the initial investigation yielded inconclusive results, public health researchers turned to in-depth personal interviews with 10 people who had fallen ill.

Crowe spent up to two hours asking each person detailed questions about what he or she had eaten around the time of getting sick. Asking people what they ate eight weeks ago can be challenging, Crowe says: Many people can’t even remember what they ate for breakfast that morning.

“I got a little lucky,” Crowe says. Two people remembered eating raw cookie dough before getting sick. They each sent Crowe pictures of the bag of flour they had used to make the batter. It turned out that both bags had been produced in the same plant. That was a “pretty unusual thing,” he says.
Follow-up questioning helped Crowe and his team pin down flour as the likely source. Eventually, U.S. Food and Drug Administration scientists analyzed the flour and isolated strains of E. coli bacteria that produce Shiga toxins, which make E. coli dangerous.

Disease-causing bacteria, including E. coli, usually thrive in moist environments, like bags of prewashed lettuce (SN: 12/24/16, p. 4). But the bacteria can also survive in a desiccated state for months and be re-activated with water, says Crowe. So as soon as dry flour mingles with eggs or oil, dormant bacteria can reawaken and start to replicate.

Cookie dough wasn’t the culprit in every case. A few children who got sick had been given raw tortilla dough to play with while waiting for a table at a restaurant. The cases all involved wheat flour from the same facility, leading to a recall of more than 250 flour-containing products.

There are ways to kill bacteria in flour before it reaches grocery store shelves, but they aren’t in use in the United States. Heat treatment, for example, will rid flour of E. coli and other pathogens. But the process also changes the structure of the flour, which affects the texture of baked goods, says Rick Holley, a food safety expert at the University of Manitoba in Canada who wasn’t part of the study. Irradiation, used to kill parasites and other pests in flour, might be a better option, Holley says. But it takes a higher dose of radiation to zap bacteria than it does to kill pests.

Or, of course, people could hold out for warm, freshly baked cookies.

Climate foiled Europeans’ early exploration of North America

Many people may be fuzzy on the details of North America’s colonial history between Columbus’ arrival in 1492 and the Pilgrims’ landing on Plymouth Rock in 1620. But Europeans were actively attempting to colonize North America from the early 16th century onward, even though few colonies survived.

As historian Sam White explains in A Cold Welcome, most early attempts were doomed by fatally incorrect assumptions about geography and climate, poor planning and bad timing.
White weaves together evidence of past climates and written historical records in a comprehensive narrative of these failures. One contributing factor: Explorers assumed climates at the same latitude were the same worldwide. But in fact, ocean currents play a huge role in moderating land temperatures, which means Western Europe is warmer and less variable in temperature from season to season than eastern North America at the same latitude.

On top of that, explorations occurred during a time of global cooling known as the Little Ice Age, which stretched from the 13th to early 20th centuries. The height of exploration may have occurred at the peak of cooling: Starting in the late 16th century, a series of volcanic eruptions likely chilled the Northern Hemisphere by as much as 1.8 degrees Celsius below the long-term average, White says.

This cooling gave Europeans an especially distorted impression of their new lands. For instance, not long after Spanish explorer Sebastián Vizcaíno landed in California’s Monterey Bay in December 1602, men’s water jugs froze overnight — an unlikely scenario today. Weather dissuaded Spain from further attempts at colonizing California for over a century.
Harsh weather also heightened conflict when underprepared Europeans met Native Americans, whose own resources were stretched thin by unexpectedly bad growing seasons.

A Cold Welcome is organized largely by colonial power, which means findings on climate are repeated in each chapter. But White’s synthesis of climate and history is novel, and readers will see echoes of today’s ignorance about the local consequences of climate change. “Human psychology may be both too quick to grasp at false patterns and yet too slow to let go of familiar expectations,” White writes.


Laser experiment hints at weird in-between ice

A proposed form of ice acts like a cross between a solid and a liquid. Now, a new study strengthens the case that the weird state of matter really exists.

Hints of the special phase, called superionic ice, appeared in water ice exposed to high pressures and temperatures, researchers report February 5 in Nature Physics. Although such unusual ice isn’t found naturally on Earth, it might lurk deep inside frozen worlds like Uranus and Neptune (SN Online: 3/5/12).
Normal ice is composed of water molecules, each made of an oxygen atom bonded to two hydrogen atoms. As water freezes, those molecules link up to form a solid. But superionic ice is made up of ions, which are atoms with a positive or negative electric charge. Within the material, hydrogen ions flow freely through a solid crystal of oxygen ions.

“That’s really strange behavior for water,” says study coauthor Marius Millot, a physicist at Lawrence Livermore National Laboratory in California. Although the superionic state was first predicted 30 years ago, “up until now we didn’t really know whether this was something that was real.”

At extremely high pressures, familiar substances like water can behave in unusual ways (SN: 1/14/12, p. 26). Working with a sample of ice that was crushed between two diamonds, Millot and colleagues used a laser to create a shock wave that plowed through the ice, boosting the pressure even more. At first, the density and temperature of the ice ramped up smoothly as the pressure increased. But at around 1.9 million times atmospheric pressure and 4,800 kelvins (about 4,500° Celsius), the scientists observed a jump in density and temperature. That jump, the researchers say, is evidence that superionic ice melted at that point. Although we normally think of ice as being cold, at high pressures, superionic ice can form even when heated. The melting occurred at just the conditions at which theoretical calculations predict such ice should melt. The physicists didn’t measure the pressure at which the superionic phase first formed.

The electrical conductivity of the material provided another hint of superionic ice: The level of conductivity was consistent with expectations for that phase of matter. Whereas metals conduct electricity via the motion of electrons, in superionic ice, the flowing hydrogen ions transmit electricity.
The researchers “provide quite good evidence” of the new phase, says Alexander Goncharov, a physicist at the Carnegie Institution for Science in Washington, D.C., who was not involved with the study.

Others are more cautious about the significance of the work. “It’s definitely providing more insight into water at these conditions,” says physicist Marcus Knudson of Washington State University in Pullman. But, he says, “I don’t see strong evidence that there’s a melting transition in their data.”

So more work remains before this weird kind of ice is fully understood. For now, the superionic state of water seems more likely than ever, but the case is still on thin ice.