You wouldn’t expect wardrobe classics like leather jackets or denim jeans at an exhibit celebrating fashion at its most forward. But “#techstyle” at the Museum of Fine Arts in Boston features those sartorial mainstays and others, each with a technological twist.
The exhibit is a feast for the eyes, its diversity of pieces matched by a diversity of artists and approaches. Yet a single theme unites them: The fusion of technology and fashion will increasingly influence both.

Visitors are introduced to this theme via a room featuring works by prominent designers already known for merging fashion and tech: A digitally printed silk dress by Alexander McQueen hangs next to a fiberglass “airplane dress” by Hussein Chalayan whose flaps open and shut via remote control.

The largest part of the exhibit focuses on how technology is changing design and construction strategies. In addition to clothes made with mainstream techniques like laser-cutting, several 3-D printed garments are on display. These include a kinematic dress made of more than 1,600 interlocking pieces that can be customized to a wearer’s body via a 3-D scan; the dress comes off the printer fully assembled. Other pieces are made with technologies still in development, such as the laser-welded fabrics from sustainable textile researcher Kate Goldsworthy.

The real standouts are in the “Performance” section, which displays attire that uses data from the immediate environment to generate some visible aspect of the garment. These interactive pieces “reveal something to the eye that you wouldn’t see normally, something that science often captures with graphs and charts,” says Pamela Parmal, a curator of the exhibit. For instance, the interactive dress “Incertitudes” is adorned with pins that flex in response to nearby voices, creating waves in the fabric; a dress embedded with thousands of tiny LEDs can display tweeted messages or other illuminated patterns. And there are two leather jackets that, at first glance, look as if their innovation is merely a stylish cut. But the jackets are coated in reactive inks that shimmer with iridescent colors in response to the wind and heat generated by heat guns in the display case.
(These creations were born after designer and trained chemist Lauren Bowker used the reactive compounds to reveal the aerodynamics of race cars in a wind tunnel in a project for Formula One.)
Visitors seeking in-depth explanations of the science behind the fashions will have to look elsewhere. But “#techstyle” still has something for everyone, whether fashionista or engineer. And while the fashions represented are all cutting edge, the show harks back to an era when clothes were custom-made. Technology might have brought us mass-produced cookie-cutter clothing, but it can also enable clothing tailored to the individual.
From within the dark confines of the skull, the brain builds its own version of reality. By weaving together expectations and information gleaned from the senses, the brain creates a story about the outside world. For most of us, the brain is a skilled storyteller, but to spin a sensible yarn, it has to fill in some details itself.
“The brain is a guessing machine, trying at each moment of time to guess what is out there,” says computational neuroscientist Peggy Seriès. Guesses just slightly off — like mistaking a smile for a smirk — rarely cause harm. But guessing gone seriously awry may play a part in mental illnesses such as schizophrenia, autism and even anxiety disorders, Seriès and other neuroscientists suspect. They say that a mathematical expression known as Bayes’ theorem — which quantifies how prior expectations can be combined with current evidence — may provide novel insights into pernicious mental problems that have so far defied explanation. Bayes’ theorem “offers a new vocabulary, new tools and a new way to look at things,” says Seriès, of the University of Edinburgh.
Experiments guided by Bayesian math reveal that the guessing process differs in people with some disorders. People with schizophrenia, for instance, can have trouble tying together their expectations with what their senses detect. And people with autism and high anxiety don’t flexibly update their expectations about the world, some lab experiments suggest. That missed step can muddy their decision-making abilities. Given the complexity of mental disorders such as schizophrenia and autism, it is no surprise that many theories of how the brain works have fallen short, says psychiatrist and neuroscientist Rick Adams of University College London. Current explanations for the disorders are often vague and untestable. Against that frustrating backdrop, Adams sees great promise in a strong mathematical theory, one that can be used to make predictions and actually test them.
“It’s really a step up from the old-style cognitive psychology approach, where you had flowcharts with boxes and labels on them with things like ‘attention’ or ‘reading,’ but nobody having any idea about what was going on in [any] box,” Adams says.
Applying math to mental disorders “is a very young field,” he adds, pointing to Computational Psychiatry, which plans to publish its first issue this summer. “You know a field is young when it gets its first journal.”
A mind for math
Bayesian reasoning may be new to the mental illness scene, but the math itself has been around for centuries. First described by the Rev. Thomas Bayes in the 18th century, this computational approach truly embraces history: Evidence based on previous experience, known as a “prior,” is essential to arriving at a good answer, Bayes argued. He may have been surprised to see his math meticulously applied to people with mental illness, but the logic holds. To make a solid guess about what’s happening in the world, the brain must not rely just on current input from occasionally unreliable senses. The brain must also use its knowledge about what has happened before. Merging these two streams of information correctly is at the heart of perceiving the world as accurately as possible.
Bayes figured out a way to put numbers to this process. By combining probabilities that come from prior evidence and current observations, Bayes’ formula can be used to calculate an overall estimate of the likelihood that a given suspicion is true. A properly functioning brain seems to do this calculation intuitively, behaving in many cases like a skilled Bayesian statistician, some studies show (SN: 10/8/11, p. 18).
This reckoning requires the brain to give the right amount of weight to prior expectations and current information. Depending on the circumstances, those weights change. When the senses falter, for instance, the brain should lean more heavily on prior expectations. Say the mail carrier comes each day at 4 p.m. On a stormy afternoon when visual cues are bad, we rely less on sight and more on prior knowledge to guess that the late-afternoon noise on the front porch is probably the mail carrier delivering letters. In certain mental illnesses, this flexible balancing act may falter.
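The weighting Bayes’ theorem prescribes can be sketched numerically. Below is a minimal Python illustration of the mail carrier example, with made-up probabilities: when vision is reliable, the glimpse of the porch strongly updates the estimate; when a storm makes the glimpse nearly uninformative, the guess stays close to the prior.

```python
def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    return p_evidence_given_h * prior / p_evidence

prior = 0.9  # the mail carrier almost always comes at 4 p.m.

# Clear day: vision is reliable, so the shape on the porch strongly
# discriminates "mail carrier" from "something else."
clear = posterior(prior, p_evidence_given_h=0.8, p_evidence_given_not_h=0.1)

# Stormy day: vision is poor, the same glimpse is nearly uninformative,
# and the prior dominates the guess.
stormy = posterior(prior, p_evidence_given_h=0.5, p_evidence_given_not_h=0.4)

print(round(clear, 3))   # → 0.986
print(round(stormy, 3))  # → 0.918 (barely moved from the 0.9 prior)
```

The probabilities here are invented for illustration; the point is only that the same piece of evidence shifts the estimate more or less depending on how reliable the senses are.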
People with schizophrenia often suffer from hallucinations and delusions, debilitating symptoms that arise when lines between reality and imagination blur. That confusion can lead to hearing voices that aren’t there and believing things that can’t possibly be true. These departures from reality could arise from differences in how people integrate new evidence with previous beliefs. There’s evidence for such distorted calculations. People with schizophrenia don’t fall for certain visual illusions that trick most people, for instance. When shown a picture of the inside of a hollowed-out face mask, most people’s brains mistakenly convert the image to a face that pops outward off the page. People with schizophrenia, however, are more likely to see the face as it actually is — a concave mask. In that instance, people with schizophrenia give more weight to information that’s coming from their eyes than to their expectation that noses protrude from the rest of the face. To complicate matters, the opposite can be true, too, says neuropsychologist Chris Frith of the Wellcome Trust Centre for Neuroimaging at University College London. “In this case, their prior is too weak, but in other cases, their prior is too strong,” he says.
In a recent study, healthy people and those who recently began experiencing psychosis, a symptom of schizophrenia, were shown confusing shadowy black-and-white images. Participants then saw color versions of the images that were easier to interpret. When shown the black-and-white images again, people with early psychosis were better at identifying the images, suggesting that they used their prior knowledge — the color pictures — to truly “see” the images. For people without psychosis, the color images weren’t as much help. That difference suggests that the way people with schizophrenia balance past knowledge and present observations is distinct from the behavior of people without the disorder. Sometimes the balance tips too far — in either direction.
In a talk at the annual Computational and Systems Neuroscience meeting in February in Salt Lake City, Seriès described the results of a different visual test: A small group of people with schizophrenia had to describe which way a series of dots was moving on a screen. The dots moved in some directions more frequently than others — a statistical feature that let the scientists see how well people could learn to predict the dots’ directions. The 11 people with schizophrenia seemed just as good at learning which way the dots were likely to move as the 10 people without, Seriès said. In this situation, people with schizophrenia seemed able to learn priors just fine.
But when another trick was added, a split between the two groups emerged. Sometimes, the dots were almost impossible to see, and sometimes, there were no dots at all. People with schizophrenia were less likely to claim that they saw dots when the screen was blank. Perhaps they didn’t hallucinate dots because of the medication they were on, Seriès says. In fact, very early results from unmedicated people with schizophrenia suggest that they actually see dots that aren’t there more than healthy volunteers. Preliminary results so far on schizophrenia are sparse and occasionally conflicting, Seriès admits. “It’s the beginning,” she says. “We don’t understand much.”
The research is so early that no straightforward story exists yet. But that’s not unexpected. “If 100 years of schizophrenia research have taught us anything, it’s that there’s not going to be a nice, simple explanation,” Adams says. But using math to describe how people perceive the world may lead to new hunches about how that process goes wrong in mental illnesses, he argues.
“You can instill expectations in subjects in many different ways, and you can control what evidence they see,” Adams says. Bayesian theory “tells you what they should conclude from those prior beliefs and that evidence.” If their conclusions diverge from predictions, scientists can take the next step. Brain scans, for instance, may reveal how the wrong answers arise. With a clear description of these differences, he says, “we might be able to measure people’s cognition in a new way, and diagnose their disorders in a new way.”
Now vs. then
The way the brain combines incoming sensory information with existing knowledge may also be different in autism, some researchers argue. In some cases, people with autism might put excess weight on what their senses take in about the world and rely less on their expectations. Old observations fit with this idea. In the 1960s, psychologists discovered that children with autism were just as good at remembering nonsense sentences (“By is go tree stroke lets”) as meaningful ones (“The fish swims in the pond”). Children without autism struggled to remember the non sequiturs. But the children with autism weren’t thrown by the random string of words, suggesting that their expectations of sentence meaning weren’t as strong as their ability to home in on each word in the series.
Another study supports the notion that sensory information takes priority in people with autism. People with and without autism were asked to judge whether a sight and a sound happened at the same time. They saw a white ring on a screen, and a tone played before, after or at the same moment the ring appeared. Adults without autism were influenced by previous trials in which the ring and tone were slightly out of sync. But adults with autism were not swayed by earlier trials, researchers reported in February in Scientific Reports.
This literal perception might get in the way of speech perception, Marco Turi of the University of Pisa in Italy and colleagues suggest. Comprehending speech requires a listener to mentally stitch together sights and sounds that may not arrive at the eyes and ears at the same time. Losing that flexibility could make speech harder to understand.
A different study found that children with autism perceive moving dots more clearly than children without autism (SN Online: 5/5/15). The brains of people with autism seem to prioritize incoming sensory information over expectations about how things ought to work. Elizabeth Pellicano of University College London and David Burr of the University of Western Australia in Perth described the concept in 2012 in an opinion paper in Trends in Cognitive Sciences. Intensely attuned to information streaming in from the senses, people with autism experience the world as “too real,” Pellicano and Burr wrote.
New data, however, caution against a too-simple explanation. In an experiment presented in New York City in April at the annual meeting of the Cognitive Neuroscience Society, 20 adults with and without autism had to quickly hit a certain key on a keyboard when they saw its associated target on a screen. Their job was made easier because the targets came in a certain sequence. All of the participants improved as they learned which keys to expect. But when the sequence changed to a new one, people with autism faltered. This result suggests that they learned prior expectations just fine, but had trouble updating them as conditions changed, said cognitive neuroscientist Owen Parsons of the University of Cambridge.

Distorted calculations — and the altered versions of the world they create — may also play a role in depression and anxiety, some researchers think. While suffering from depression, people may hold on to distorted priors — believing that good things are out of reach, for instance. And people with high anxiety can have trouble making good choices in a volatile environment, neuroscientist Sonia Bishop of the University of California, Berkeley and colleagues reported in 2015 in Nature Neuroscience.
In their experiment, people had to choose a shape, which sometimes came with a shock. People with low anxiety quickly learned to avoid the shock, even when the relationship between shape and shock changed. But people with high anxiety performed worse when those relationships changed, the researchers found. “High-anxious individuals didn’t seem able to adjust their learning to handle how volatile or how stable the environment was,” Bishop says. Scientists can’t yet say what causes this difficulty adjusting to a new environment in anxious people and in people with autism. It could be that once some rule is learned (a sequence of computer keys, or the link between a shape and a shock), these two groups struggle to update that prior with newer information.
This rigidity might actually contribute to anxiety in the first place, Bishop speculates. “When something unexpected happens that is bad, you wouldn’t know how to respond,” and that floundering “is likely to be a huge source of anxiety and stress.”
Recalculating
“There’s been a lot of frustration with a failure to make progress” on psychiatric disorders, Bishop says. Fitting mathematical theories to the brain may be a way to move forward. Researchers “are very excited about computational psychiatry in general,” she says.
Computational psychiatrist Quentin Huys of the University of Zurich is one of those people. Math can help clarify mental illnesses in a way that existing approaches can’t, he says. In the March issue of Nature Neuroscience, Huys and colleagues argued that math can demystify psychiatric disorders, and that thinking of the brain as a Bayesian number cruncher might lead to a more rigorous understanding of mental illness. Huys says that a computational approach is essential. “We can’t get away without it.” If people with high anxiety perform differently on a perceptual test, then that test could be used to both diagnose people and monitor how well a treatment works, for instance.
Scientists hope that a deeper description of mental illnesses may lead to clearer ways to identify a disorder, chart how well treatments work and even improve therapies. Bishop raises the possibility of developing apps to help people with high anxiety evaluate situations — outsourcing the decision making for people who have trouble. Frith points out that cognitive behavioral therapy could help depressed people recalculate their experiences by putting less weight on negative experiences and perhaps breaking out of cycles of despondence.
Beyond these potential interventions, simply explaining to people how their brains are working might ease distress, Adams says. “If you can give people an explanation that makes sense of some of the experiences they’ve had, that can be a profoundly helpful thing,” he says. “It destigmatizes the experience.”
Our home planet is young at heart. According to new calculations, Earth’s center is more than two years younger than its surface.
In Einstein’s general theory of relativity, massive objects warp the fabric of spacetime, creating a gravitational pull and slowing time nearby. So a clock placed at Earth’s center will tick ever-so-slightly slower than a clock at its surface. Such time shifts are determined by the gravitational potential, a measure of the amount of work it would take to move an object from one place to another. Since climbing up from Earth’s center would be a struggle against gravity, clocks down deep would run slow relative to surface timepieces.

Over the 4.5 billion years of Earth’s history, the gradual shaving off of fractions of a second adds up to a core that’s 2.5 years younger than the planet’s crust, researchers estimate in the May European Journal of Physics. Theoretical physicist Richard Feynman had suggested in the 1960s that the core was younger, but only by a few days.

The new calculation neglects geological processes, which have a larger impact on the planet’s age. For example, Earth’s core probably formed earlier than its crust. Instead, says study author Ulrik Uggerhøj of Aarhus University in Denmark, the calculation serves as an illustration of gravity’s influence on time — very close to home.
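The size of the effect can be estimated in a few lines. The sketch below assumes a uniform-density Earth, for which the gravitational potential difference between center and surface is GM/(2R) and the fractional slowing of a clock is that difference divided by c²; the paper’s 2.5-year figure comes from a realistic density profile, which roughly doubles this simple estimate.

```python
# Order-of-magnitude estimate of how much younger Earth's core is,
# assuming a uniform-density Earth (a deliberate simplification).
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24   # Earth's mass, kg
R = 6.371e6    # Earth's mean radius, m
c = 2.998e8    # speed of light, m/s

# For a uniform sphere, the potential difference between center and
# surface is GM/(2R); a clock runs slow by the fraction dPhi / c^2.
fractional_shift = G * M / (2 * R * c**2)

# Accumulate that tiny fractional lag over Earth's 4.5-billion-year history.
age_gap_years = fractional_shift * 4.5e9

print(round(age_gap_years, 2))  # → 1.57
```

With a uniform sphere the core comes out about 1.6 years younger than the surface, the same order as the published 2.5-year result.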
The green hairstreak butterfly (Callophrys rubi) gets its blue-green hue from complex nanoscale structures on its wings. The structures, called gyroids, are repeating patterns of spiral-shaped curls. Light waves bouncing off the patterned surface (top inset above) interfere with one another, amplifying green colors while washing out other shades (SN: 6/7/08, p. 26).
Scientists led by Min Gu of the Royal Melbourne Institute of Technology in Australia have now painstakingly re-created the gyroid structure by sculpting the shapes out of a special resin that solidifies when hit with laser light. The technique, called optical two-beam lithography, uses a pair of lasers to set the material in just the right pattern. Afterward, the remaining resin can be washed away, leaving only the gyroid structure. The fabricated version repeats its pattern every 360 nanometers, or billionths of a meter.
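The gyroid geometry itself is easy to sketch numerically. The surface is commonly approximated by the level set sin X cos Y + sin Y cos Z + sin Z cos X = 0; the snippet below (an illustration using that standard approximation, not the researchers’ code) voxelizes one 360-nanometer unit cell and picks out one of the two intertwined chiral networks.

```python
import numpy as np

a = 360e-9  # unit-cell period of the fabricated structure, meters

def gyroid(x, y, z, a=a):
    """Level-set approximation to the gyroid surface; zero on the surface."""
    X, Y, Z = (2 * np.pi * v / a for v in (x, y, z))
    return (np.sin(X) * np.cos(Y)
            + np.sin(Y) * np.cos(Z)
            + np.sin(Z) * np.cos(X))

# Sample one unit cell on a 64^3 grid; points where the level-set
# function is positive form one of the two interlocking spiral networks.
n = 64
coords = np.linspace(0, a, n, endpoint=False)
x, y, z = np.meshgrid(coords, coords, coords, indexing="ij")
solid = gyroid(x, y, z) > 0

# The surface divides the cell into two equal halves by symmetry,
# so the solid fraction is close to 0.5.
print(solid.mean())
```

A boolean grid like this is the usual starting point for feeding a structure to a two-photon lithography tool path or an optics simulation, though the fabrication details in the study are far more involved.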
The gyroid structures determine more than just color. They also divvy up light that is circularly polarized — its electric fields spiral either clockwise or counterclockwise. In the butterfly, this effect is weak because of irregularities in the structure. But the artificial version sorts the light according to polarization, reflecting one type much more than the other, the researchers report May 13 in Science Advances.
The ability to control circular polarization of light with structures like these could allow scientists to increase the bandwidth of optical communications, the researchers say. The two polarizations of light could each carry different information, which could then be separated and decoded down the line.
In Colorado’s Rocky Mountains, male and female valerian plants have responded differently to hotter, drier conditions, a new study shows. Rapidly changing ratios of the sexes could be a quick sign of climate change, the researchers say.
Valerian (Valeriana edulis) plants range from hot, scrubby lowlands to cold alpine slopes. In each patch of plants, some are male and some are female. The exact proportion of each sex varies with elevation. High on the mountain, females are much more common than males; they can make up 80 percent of some populations. Four decades ago, in patches of valerian growing in the middle of the plant’s elevation range, 33.4 percent of the plants were males. Those patches grew in the Rockies at elevations around 3,000 meters. Today, you would have to hike considerably higher to find the same proportion of male plants. Males, now 5.5 percent more common on average, are reaching higher elevations than in the past, researchers report in the July 1 Science.
“We think climate is acting almost like a filter on males and females,” says Will Petry of ETH Zurich, who led the study while at the University of California, Irvine. “The settings on this filter are controlling the sex ratio.” Those settings are sweeping up the mountainside like a rising tide at a rate of 175 meters per decade, Petry and colleagues found. Ecologists already knew that the ratio of male to female plants can vary with altitude or water availability, says ecologist Spencer Barrett of the University of Toronto, who was not involved in this study. But “the idea that a sex ratio is moving upslope — nobody’s ever done that before.”
Those moving sex ratios have kept pace with climate change since the late 1970s. Today, winter snows are melting earlier and summers are hotter, with less rain. As a result, the same amount of precipitation that would have fallen at one elevation in 1978 now falls at higher elevations instead; it has moved upslope by 133 meters per decade. Soil moisture has moved up the mountain, too, by 195 meters per decade.
The parallel shifts mean that changing sex ratios could be a marker of climate change, says population biologist Tom Miller of Rice University in Houston, a coauthor of the study. Today, movements of whole species — often up in latitude or altitude — are a hallmark of climate change. But proportions of males and females are changing “substantially faster than species are moving,” Miller says. They “might be a much more rapid fingerprint of climate change than where species are migrating to.”

Petry’s team found that fingerprint while hiking around the Rocky Mountain Biological Laboratory in Crested Butte, Colo. As the scientists walked through the mountains in Chaffee and Gunnison counties, they counted flowering males and females at 31 sites in 2011, then compared their modern data with historical counts from nine of the same populations, made by coauthor Judy Soule from 1978 to 1980. When Petry saw that the percentage of males and females had changed, “we also started thinking about the consequences,” he says.
If one sex vastly outnumbers the other, populations could die out. “Imagine if it became an Amazonia situation,” says Kailen Mooney, whose lab at UC Irvine led the new study. A 100 percent female population wouldn’t be pollinated, and would disappear once the mature females died, he says.
If those female-only populations grew above a certain altitude and died out because males couldn’t reach them, then male plants would set the upper boundary for the whole species. Sex ratios “add nuance” to the way scientists think about climate-driven migration, Mooney says, because one sex could determine geographic limits for whole species.
NASA’s Juno spacecraft has sent back its first picture of Jupiter since arriving at the planet July 4 (SN: 7/23/16, p. 14). The image, taken July 10 when the spacecraft was 4.3 million kilometers from Jupiter, shows off the planet’s clouds, its Great Red Spot (a storm a bit wider than Earth) and three of its moons (Io, Europa and Ganymede).
Juno is on the outbound leg of its first of two 53.5-day orbits of the gas giant (Juno will then settle into 14-day orbits). During orbit insertion, all of Juno’s scientific instruments were turned off while the spacecraft made its first dive through the harsh radiation belts that encircle the planet. This first image indicates that Juno is in good health and ready to study the largest planet in the solar system.
The probe is the ninth to visit Jupiter and the second to stay in orbit (SN: 6/25/2016, p. 32). For the next 20 months, Juno will investigate what lurks beneath the opaque clouds that enshroud the planet (SN: 6/25/2016, p. 16). The spacecraft won’t take its first intimate pictures of Jupiter until August 27, when it flies within 5,000 kilometers of the cloud tops.
U.S. drivers love to hit the road. The problem is doing so safely.
In 2013, 32,894 people in the United States died in motor vehicle crashes. Although down since 2000, the overall death rate — 10.3 per 100,000 people — tops that of 19 other high-income countries, the U.S. Centers for Disease Control and Prevention reported July 8. Belgium is a distant second with 6.5 deaths per 100,000. Researchers reviewed World Health Organization and other data on vehicle crash deaths, seat belt use and alcohol-impaired driving in 2000 and 2013. Canada had the highest percentage of fatal crashes caused by drunk drivers: 33.6 percent. New Zealand and the United States tied for second at 31 percent. But Canada and 16 other countries outperformed the United States on seat belt use — even though, in 2013, 87 percent of people in the United States reported wearing safety belts while riding in the front seat.
Spain saw the biggest drop — 75 percent — in its crash death rate. That country improved nearly all aspects of road safety, including decreasing alcohol-impaired driving and increasing seat belt use, the researchers say.
Thirst drove one of the last populations of woolly mammoths to extinction.
A small group of holdouts on an isolated Alaskan island managed to last about 8,000 years longer than most of their mainland-dwelling brethren. But by about 5,600 years ago, the island’s lakes — the only source of freshwater — became too small to support the mammoths (Mammuthus primigenius), scientists report online the week of August 1 in the Proceedings of the National Academy of Sciences. “I don’t think I’ve ever seen something so conclusive about an extinction before,” says Love Dalén, an evolutionary geneticist at the Swedish Museum of Natural History in Stockholm who was not involved in the research. The study highlights “how sensitive small populations are and how easily they can become extinct.”
Surprisingly recent woolly mammoth bones had previously been discovered in a cave on St. Paul Island, which became isolated from the mainland roughly 14,000 years ago. Since there’s no evidence that prehistoric humans lived on St. Paul, the find provided a chance to study extinction in the absence of human influence, says Russell Graham, a paleontologist at Penn State who led the study.
The scientists extracted a core of sediment from a lake bed near the cave to see how environmental conditions had changed over the last 11,000 years. The team found remnants of ancient plants, animals and fungi in the sediment — including traces of mammoth DNA in some layers. By analyzing and dating the different sediment layers, the team could infer when and how the mammoths went extinct.
“We initially thought that vegetation change and habitat would be the major driving factor,” Graham says. Instead, his team found a wealth of evidence — including an increase in salt-tolerant algae and crustaceans 6,000 years ago — suggesting freshwater shortages as the culprit. A warmer climate after the last Ice Age ended contributed to the St. Paul mammoths’ downfall. Sea level rise shrank the mammoths’ island habitat and cut into their freshwater supplies by raising the water table and making the lake saltier over time, the team concluded. Warmer, drier conditions also caused water to evaporate more quickly from the lake surface.
The study highlights an often-overlooked vulnerability of island and coastal communities. Some islands in the South Pacific are currently experiencing similar freshwater shortages thanks to rising seas, Graham says, and Florida could be next in line. That’s particularly bad news for large island-dwelling and coastal mammals, which tend to need more water to survive than smaller species.
In a few highly specialized laboratories, scientists bombard matter with the world’s most powerful electrical pulses or zap it with sophisticated lasers. Other labs squeeze heavy-duty diamonds together hard enough to crack them.
All this is in pursuit of a priceless metal. It’s not gold, silver or platinum. The scientists’ quarry is hydrogen in its most elusive of forms.
Several rival teams are striving to transform hydrogen, ordinarily a gas, into a metal. It’s a high-stakes, high-passion pursuit that sparks dreams of a coveted new material that could unlock enormous technological advances in electronics. “Everybody knows very well about the rewards you could get by doing this, so jealousy and envy [are] kind of high,” says Eugene Gregoryanz, a physicist at the University of Edinburgh who’s been hunting metallic hydrogen for more than a decade.
Metallic hydrogen in its solid form, scientists propose, could be a superconductor: a material that allows electrons to flow through it effortlessly, with no loss of energy. All known superconductors function only at extremely low temperatures, a major drawback. Theorists suspect that superconducting metallic hydrogen might work at room temperature. A room-temperature superconductor is one of the most eagerly sought goals in physics; it would offer enormous energy savings and vast improvements in the transmission and storage of energy.
Metallic hydrogen’s significance extends beyond earthly pursuits. The material could also help scientists understand our own solar system. At high temperatures, compressed hydrogen becomes a metallic liquid — a form that is thought to lurk beneath the clouds of monstrous gas planets, like Jupiter and Saturn. Sorting out the properties of hydrogen at extreme heat and high pressure could resolve certain persistent puzzles about the gas giants. Researchers have reported brief glimpses of the liquid metal form of hydrogen in the lab — although questions linger about the true nature of the material.
While no lab has yet produced solid metallic hydrogen, the combined efforts of many scientists are rapidly closing in on a more complete understanding of the element itself — as well as better insight into the complex inner workings of solids. Hydrogen, the first element in the periodic table and the most common element in the universe, ought to be easy to understand: a single proton paired with a single electron. “What could be more simple than an assembly of electrons and protons?” asks theoretical physicist Neil Ashcroft of Cornell University. But at high pressures, the physics of hydrogen rapidly becomes complex.
At room temperature and atmospheric pressure, hydrogen is a gas. But as with other materials, altered conditions can transform hydrogen into a solid or a liquid. With low enough temperatures or a sufficiently forceful squeeze, hydrogen shape-shifts into a solid. Add heat while squeezing, and it becomes a liquid.
If subjected to still more extreme conditions, hydrogen can — at least theoretically — undergo another transformation, into a metal. All metals have one thing in common: They conduct electricity, due to free-flowing electrons that can go where they please within the material. Squeeze anything hard enough and it will become a metal. “Pressure does a great job of dislodging the outer electrons,” Ashcroft says. This is what scientists are aiming to do with hydrogen: create a sloshing soup of roving electrons in either a liquid or a solid.
When hydrogen is compressed, its atoms, paired in molecules of two hydrogen atoms each, begin to interact with one another in great numbers, and the underlying physics becomes a thorny jumble. “It is amazing; the stuff takes up incredibly complex arrangements in the solid state,” says Ashcroft, the first scientist to propose, in 1968, that metallic hydrogen could be a high-temperature superconductor.
Hydrogen’s complexity fascinates scientists. “It’s not just the metallization question that’s of interest to me,” says Russell Hemley, a chemist at the Carnegie Institution for Science in Washington, D.C., and Lawrence Livermore National Laboratory in California. Studying the intricacies of hydrogen’s behavior can help scientists refine their understanding of the physics of materials.
In 1935, when physicists Eugene Wigner and Hillard Bell Huntington of Princeton University first predicted that compressed solid hydrogen would be metallic, they thought the transition to a metal might occur at a pressure 250,000 times that of Earth’s atmosphere. That may sound like a lot, but scientists have since squeezed hydrogen to pressures more than 10 times as high — and still no solid metal.
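High-pressure physicists usually quote these squeezes in gigapascals rather than in multiples of Earth's atmosphere. As an illustrative sketch of the arithmetic (the pressure figures come from the article; the conversion factor is the standard value of one atmosphere):

```python
# Convert "times atmospheric pressure" into gigapascals (GPa).
# 1 standard atmosphere = 101,325 Pa (standard physical constant).
ATM_PA = 101_325

def atmospheres_to_gpa(n_atm: float) -> float:
    """Pressure given as a multiple of Earth's atmosphere, expressed in GPa."""
    return n_atm * ATM_PA / 1e9

# Wigner and Huntington's 1935 prediction: 250,000 atmospheres.
predicted = atmospheres_to_gpa(250_000)    # roughly 25 GPa

# Modern experiments exceed 10 times that figure: over 2.5 million atmospheres.
modern = atmospheres_to_gpa(2_500_000)     # roughly 253 GPa

print(f"1935 prediction: {predicted:.0f} GPa; modern squeezes: {modern:.0f} GPa")
```

Even at ten times the originally predicted pressure, solid hydrogen has stubbornly refused to metallize.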
Scientists originally expected that the transition would be a simple flip to metallic behavior. Not so, says theoretical physicist David Ceperley of the University of Illinois at Urbana-Champaign. “Nature has a lot more possibilities.” Solid hydrogen exists in multiple forms, each with a different crystal structure. As the pressure climbs, the wily hydrogen molecules shift into ever-more-complex arrangements, or phases. (For physicists, the “phase” of matter goes deeper than the simple states of solid, liquid or gas.) The number of known solid phases of hydrogen has grown steadily as higher pressures are reached, with four phases now well established. The next phase scientists find could be a metal — they hope.
Timeline: The race to make metallic hydrogen
Rival research teams are rushing to transform solid or liquid hydrogen into a metal. With each experiment, the pressure rises.

They have two very different aims: a room-temperature superconductor and a window into the gas giants. If solid metallic hydrogen turns out to be a room-temperature superconductor, it would have to be crushed to work, making it impractical for many applications. But if hydrogen could hold its metallic form after the pressure is released, as some researchers have suggested, “it would be revolutionary,” says physicist Isaac Silvera, who leads the metallic hydrogen hunt at Harvard University. Such a material could be used in electrical wires to reduce loss of energy and decrease the world’s power consumption. And it might lead to efficient, magnetically levitated trains and technological advances in nuclear fusion, supercomputing and more.
While one group of would-be metallurgists is searching for solid metal, other investigators seek the scientifically intriguing liquid hydrogen metal. Their techniques differ in timescale and size. To produce liquid metal, scientists violently slam hydrogen for fractions of a second at a time, using enormous machines at national laboratories. Scientists searching for a solid metal, on the other hand, use fist-sized devices to capture hydrogen between the tips of two tiny diamonds and slowly squeeze.
Diamonds have it rough
Crushing an ethereal, normally gaseous substance between two diamonds sounds nearly impossible. Such tricky experiments make for a field where researchers are in regular disagreement over their latest results. “We’re still missing high-quality, reliable data,” says physicist Alexander Goncharov. “The issue is the experiments are too challenging.”
In his office at the Geophysical Laboratory of the Carnegie Institution, Goncharov opens a desk drawer and pulls out a device called a diamond anvil cell. The cylinder of metal is small enough for Goncharov to cradle in his palm. Bits of precisely machined steel and tough tungsten carbide are held together with four screws. Through portals in the top and bottom, bright sparkles shimmer: diamonds.
Inside the capsule, two diamonds taper to tiny points a few hundredths of a millimeter wide. They pinch the material within, squashing it at over a million times atmospheric pressure. The gap between the minuscule anvils can be as small as a few thousandths of a millimeter, about the size of a human red blood cell.
Once they’ve been pressurized, diamond anvil cells will hold the pressure almost indefinitely. The prepared cells can be carried around — inspected in the laboratory, transported to specialized facilities around the world — or simply stored in a desk drawer. Goncharov regularly travels with them. (Tip from the itinerant scientist: If questions arise at airport security, “never use the word ‘diamonds.’ ”)

The diamond anvil can squeeze more than hydrogen — materials from iron to sodium to argon can be crushed in the diamond vise. To identify new, potentially metallic phases of solid hydrogen within the pressurized capsules, scientists shine laser light onto the material, measuring how molecules vibrate and rotate — a technique called Raman spectroscopy (SN: 8/2/08, p. 22). If a new phase is reached, molecules shift configurations, altering how they jiggle. Certain types of changes in how the atoms wobble are a sign that the new phase is metallic. If the material conducts electricity, that’s another dead giveaway. A final telltale sign: The normally translucent hydrogen should acquire a shiny, reflective surface.
Significant hurdles exist for diamond anvil cell experiments. Diamonds, which cost upwards of $600 a pop, can crack under such intense pressures. Hydrogen can escape from the capsule, or diffuse into the diamonds, weakening them. So scientists coat their diamonds with thin layers of protective material. The teams each have their own unique recipe, Goncharov says. “Of course, everyone believes that their recipe is the best.”
Three phases of solid hydrogen have been known since the late 1980s. With the discovery of a fourth phase in 2011, “the excitement was enormous,” says Eugene Gregoryanz, a physicist at the University of Edinburgh. In Nature Materials, Mikhail Eremets and Ivan Troyan at the Max Planck Institute for Chemistry in Mainz, Germany, reported that a new phase appeared when they squashed room-temperature hydrogen to over 2 million times atmospheric pressure. Goncharov, Gregoryanz and colleagues created the new phase and deduced its structure in Physical Review Letters in 2012. In phase IV, as it’s known, hydrogen arranges itself into thin sheets — somewhat like the single-atom-thick sheets of carbon known as graphene, the scientists wrote.
Progress doesn’t come easy. With each new paper, scientists disagree about what the results mean. When Eremets and colleagues discovered the fourth phase, they thought they also had found metallic hydrogen (SN: 12/17/11, p. 9). But that assertion was swiftly criticized, and it didn’t stand up to scrutiny.
The field has been plagued by hasty claims. “If you look at the literature for the last 30 years,” Gregoryanz says, “I think every five years there is a claim that we finally metallized hydrogen.” But the claims haven’t been borne out, leaving scientists perpetually skeptical of new results.
In a recent flurry of papers, scientists have proposed new phases — some that might be metallic or metal-like precursors to a true metal — and they are waiting to see which claims stick. Competing factions have volleyed papers back and forth, alternately disagreeing and agreeing.
A paper from Gregoryanz’s group, published in Nature in January, provided evidence for a phase that was enticingly close to a metal (SN Online: 1/6/16), at more than 3 million times atmospheric pressure. But other scientists disputed the evidence. In their own experiments, Eremets and colleagues failed to confirm the new phase. In a paper posted online at arXiv.org just days after Gregoryanz’s paper was published, Eremets’ team unveiled hints of a “likely metallic” phase, which occurred at a different temperature and pressure than Gregoryanz’s new phase.
A few months later, Silvera’s group squeezed hydrogen hard enough to make it nearly opaque, though not reflective — not quite a metal. “We think we’re just below the pressure that you need to make metallic hydrogen,” Silvera says. His findings are consistent with Eremets’ new phase, but Silvera disputes Eremets’ speculations of metallicity. “Every time they see something change they call it metallic,” Silvera says. “But they don’t really have evidence of metallic hydrogen.”
All this back and forth may seem chaotic, but it’s also a sign of a swiftly progressing field, the researchers say. “I think it’s very healthy competition,” Gregoryanz says. “When you realize that somebody is getting ahead of you, you work hard.”
The current results are “not very well consistent, so everybody can criticize each other,” Eremets says. “For me it’s very clear. We should do more experiments. That’s it.”
There are signs of progress. In 2015, Eremets and colleagues discovered a record-breaking superconductor: hydrogen sulfide, a compound of hydrogen and sulfur. When tightly compressed into solid form, hydrogen sulfide superconducts at temperatures higher than ever seen before: 203 kelvins (−70° Celsius) (SN: 8/8/15, p. 12).
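The record temperature is easy to check against the familiar Celsius scale; a one-line conversion confirms the figure quoted above:

```python
# Kelvin-to-Celsius conversion for the hydrogen sulfide superconducting record.
def kelvin_to_celsius(t_k: float) -> float:
    """Convert a temperature in kelvins to degrees Celsius."""
    return t_k - 273.15

record = kelvin_to_celsius(203)   # about -70 degrees Celsius
print(f"203 K = {record:.0f} degrees C")
```

That is frigid by everyday standards, but far warmer than the liquid-helium temperatures most superconductors demand.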
Adding sulfur to the mix stabilizes and strengthens the hydrogen structure but doesn’t contribute much to its superconducting properties. Hydrogen sulfide is so similar to pure hydrogen, Eremets says, that, “in some respects we already found superconductivity in metallic hydrogen.”
Brief glimmers
Giant, gassy planets are chock-full of hydrogen, and the element’s behavior under pressure could explain some of these planets’ characteristics. A sea of flowing liquid hydrogen metal may be the source of Jupiter’s magnetic field (SN: 6/25/16, p. 16). Learning more about metallic hydrogen’s behavior deep inside such planets could also help resolve a long-standing puzzle regarding Saturn: The ringed behemoth is unexpectedly bright. The physics of hydrogen’s interactions with helium inside the planet could provide the explanation.
Using a radically different set of technologies, a second band of scientists is on the hunt for such liquid metallic hydrogen. These researchers have gone big, harnessing the capabilities of new, powerful machines designed for nuclear fusion experiments at government-funded national labs. These experiments show the most convincing evidence of metallic behavior so far — but in hydrogen’s liquid, not solid, form.

These enormous machines blast hydrogen for brief instants, temporarily sending pressures and temperatures skyrocketing. Such experiments reach searing temperatures, thousands of kelvins. With that kind of heat, metallic hydrogen appears at lower, more accessible pressures.

Creating such conditions requires sophisticated equipment. The Z machine, located at Sandia National Laboratories in Albuquerque, generates extremely intense bursts of electrical power and strong magnetic fields; for a tiny instant, the machine can deliver about 80 terawatts (one terawatt is about the total electrical power–generating capacity in the entire United States).
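To put the Z machine's output in perspective, a rough back-of-the-envelope comparison (the one-terawatt figure for U.S. generating capacity is the approximation used in the text; the exact value varies by year):

```python
# Compare the Z machine's momentary peak power to total U.S. generating capacity.
Z_MACHINE_TW = 80      # peak output in terawatts, from the text
US_CAPACITY_TW = 1.0   # approximate total U.S. electrical generating capacity

ratio = Z_MACHINE_TW / US_CAPACITY_TW
print(f"For an instant, the Z machine delivers roughly {ratio:.0f} times "
      "the entire U.S. generating capacity.")
```

The machine can sustain that output only for a tiny fraction of a second, which is why such experiments yield brief glimpses rather than steady samples.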
A group of scientists recently used the Z machine to launch a metal plate into a sample of deuterium — an isotope of hydrogen with one proton and one neutron in its nucleus — generating high pressures that compressed the material. The pummeled deuterium showed reflective glimmers of shiny metal, the scientists reported last year in Science. “It starts out transparent, it goes opaque, and then later we see this reflectivity increase,” says Marcus Knudson of Sandia and Washington State University in Pullman.
Another group is pursuing a different tactic, using some of the most advanced lasers in the world, at the National Ignition Facility at Lawrence Livermore. Scientists there zap hydrogen to produce high pressures and temperatures. Though the conclusions of this experiment are not yet published, says physicist Gilbert Collins of Lawrence Livermore, one of the leaders of the experiment, “we have some really beautiful results.”
The first experiment to show evidence of liquid metallic hydrogen was performed at Lawrence Livermore in the 1990s. A team of physicists including William Nellis, now at Harvard, used a sophisticated gunlike apparatus to shoot projectiles into hydrogen at blisteringly fast speeds. The resulting hydrogen briefly conducted electricity (SN: 4/20/96, p. 250).
These experiments face hurdles of their own — it’s a struggle to measure temperature in such systems, so scientists calculate it rather than measure it directly. But many researchers are still convinced by these results. Metallic hydrogen “certainly has been produced by shock techniques,” Cornell’s Ashcroft says.
Some scientists still have questions. “It’s certainly difficult to tell if something is a metal or not at such high temperature,” Collins says. Although they need high temperatures to reach the metal liquid phase, some physicists define a metal based on its behavior at a temperature of absolute zero. Current experiments hit high pressures at temperatures as close to zero as possible to produce relatively cool liquid metallic hydrogen.
Scientists who conduct palm-sized diamond anvil cell experiments refuse to be left out of the liquid-metal action. They’ve begun to use laser pulses to heat and melt hydrogen crammed into the cells. The results have stirred up new disagreements among competing groups. In April’s Physical Review B, Silvera and coauthors reported forming liquid metallic hydrogen. But under similar conditions, Goncharov and others found only semiconducting hydrogen, not a metal. They reported their results in Physical Review Letters in June.
“There’s kind of a crisis now with these different experiments,” says Illinois theorist Ceperley. “And there’s a lot of activity trying to see who’s right.” For now, scientists will continue refining their techniques until they can reach agreement.
The major players have managed to reach consensus before. Four phases of solid hydrogen are now well established, and researchers agree on certain conditions under which solid hydrogen melts.
Solid metallic hydrogen, however, has perpetually seemed just out of reach, as theoretical predictions of the pressure required to produce it have gradually shifted upward. As the goalposts have moved, physicists have reached further, achieving ever-higher pressures. Current theoretical predictions put the metal tantalizingly close — perhaps only an additional half a million times atmospheric pressure away.
The quest continues, propelled by a handful of hydrogen-obsessed scientists.
“We all love hydrogen,” Collins says. “It has the essence of being simple, so that we think we can calculate something and understand it, while at the same time it has such a devious nature that it’s perhaps the least understandable material there is.”
Fractions of a second after food hits the mouth, a specialized group of energizing nerve cells in mice shuts down. After the eating stops, the nerve cells spring back into action, scientists report August 18 in Current Biology. This quick response to eating offers researchers new clues about how the brain drives appetite and may also provide insight into narcolepsy.
These nerve cells have intrigued scientists for years. They produce a molecule called orexin (also known as hypocretin), thought to have a role in appetite. But their bigger claim to fame came when scientists found that these cells were largely missing from the brains of people with narcolepsy. People with narcolepsy are more likely to be overweight than other people, and this new study may help explain why, says neuroscientist Jerome Siegel of UCLA. These cells may have more subtle roles in regulating food intake in people without narcolepsy, he adds.
Results from earlier studies hinted that orexin-producing nerve cells are appetite stimulators. But the new results suggest the opposite. These cells actually work to keep extra weight off. “Orexin cells are a natural obesity defense mechanism,” says study coauthor Denis Burdakov of the Francis Crick Institute in London. “If they are lost, animals and humans gain weight.”
Mice were allowed to eat normally while researchers eavesdropped on the behavior of their orexin nerve cells. Within milliseconds of eating, orexin nerve cells shut down and stopped sending signals. This cellular quieting was consistent across foods. Peanut butter, mouse chow, a strawberry milkshake and a calorie-free drink all prompted the same response. “Foods with different flavors and textures had a similar effect, implying that it is to do with the act of eating or drinking, rather than with what is being eaten,” Burdakov says. When the eating ended, the cells once again resumed their activity.

When Burdakov and colleagues used a genetic technique to kill orexin nerve cells, mice ate more food than normal, behavior that led to weight gain, the team found. But a reduced-calorie diet slimmed these mice down.

The results suggest that giving orexin to people who lack it may reduce obesity. But that might not be a good idea. An overactive orexin system has been tied to stress and anxiety, Burdakov says. Orexin’s link to stress raises a different possibility — that anxiety can be reduced by curbing orexin nerve cell activity. “And our study suggests that the act of eating can do just that,” Burdakov says. “This provides a candidate explanation for why people turn to eating at times of anxiety.”