The moon might have formed from the filling during Earth’s jelly doughnut phase.
Around 4.5 billion years ago, something hit Earth, and the moon appeared shortly after. A new simulation of how the moon formed suggests it took shape in the midst of a hot cloud of rotating rock and vapor, which (in theory) forms when big planetary objects smash into each other at high speeds and energies. Planetary scientists Simon Lock of Harvard University and Sarah Stewart of the University of California, Davis proposed this doughnut-shaped planetary blob in 2017 and dubbed it a synestia (SN: 8/5/17, p. 5). Radiation at the surface of this swirling cloud of vaporized, mixed-together planet matter sent rocky rain inward toward bigger debris. The gooey seed of the moon grew from fragments in this hot, high-pressure environment, with a bit of iron solidifying into the lunar core. Some elements, such as potassium and sodium, remained aloft in vapor, accounting for their scarcity in moon rocks today.
After a few hundred years, the synestia shrank and cooled. Eventually, a nearly full-grown moon emerged from the cloud and condensed. While Earth ended up with most of the synestia material, the moon spent enough time in the doughnut filling to gain similar ingredients, Lock, Stewart and colleagues write February 28 in Journal of Geophysical Research: Planets. The simulation shakes up the prevailing explanation for the moon’s birth: A Mars-sized protoplanet called Theia collided with Earth, and the moon coalesced from the leftover rubble, much of it from Theia. If that’s true, moon rocks should have a chemical composition very different from Earth’s. But they don’t.
Other recent studies have wrestled with why rocks from the moon and Earth are so alike (SN: 4/15/17, p. 18). Having a synestia in the mix shifts the focus from the nature of the collision to what happened in its aftermath, potentially resolving the conundrum.
On the hormonal roller coaster of life, the ups and downs of childbirth are the Tower of Power. For nine long months, a woman’s body and brain absorb a slow upwelling of hormones, notably progesterone and estrogen. The ovaries and placenta produce these two chemicals in a gradual but relentless rise to support the developing fetus.
With the birth of a baby, and the immediate expulsion of the placenta, hormone levels plummet. No other physiological change comes close to this kind of free fall in both speed and intensity. For most women, the brain and body make a smooth landing, but more than 1 in 10 women in the United States may have trouble coping with the sudden crash. Those new mothers are left feeling depressed, isolated or anxious at a time society expects them to be deliriously happy. This has always been so. Mental struggles following childbirth have been recognized for as long as doctors have documented the experience of pregnancy. Hippocrates described a woman’s restlessness and insomnia after giving birth. In the 19th century, some doctors declared that mothers were suffering from “insanity of pregnancy” or “insanity of lactation.” Women were sent to mental hospitals.
Modern medicine recognizes psychiatric suffering in new mothers as an illness like any other, but the condition, known as postpartum depression, still bears stigma. Both depression and anxiety are thought to be woefully underdiagnosed in new mothers, given that many women are afraid to admit that a new baby is anything less than a bundle of joy. It’s not the feeling they expected when they were expecting.
Treatment — when offered — most commonly involves some combination of antidepression medication, hormone therapy, counseling and exercise. Still, a significant number of mothers find these options wanting. Untreated, postpartum depression can last for years, interfering with a mother’s ability to connect with and care for her baby.
Although postpartum depression entered official medical literature in the 1950s, decades have passed with few new options and little research. Even as brain imaging has become a common tool for looking at the innermost workings of the mind, its use to study postpartum depression has been sparse. A 2017 review in Trends in Neurosciences found only 17 human brain imaging studies of postpartum depression completed through 2016. For comparison, more than four times as many have been conducted on a problem called “internet gaming disorder” — an unofficial diagnosis acknowledged only five years ago. Now, however, more researchers are turning their attention to this long-neglected women’s health issue, peering into the brains of women to search for the root causes of the depression. At the same time, animal studies exploring the biochemistry of the postpartum brain are uncovering changes in neural circuitry and areas in need of repair.
And for the first time, researchers are testing an experimental drug designed specifically for postpartum depression. Early results have surprised even the scientists.
Women’s health experts hope that these recent developments signal a new era of research to help new moms who are hurting.
“I get this question all the time: Isn’t it just depression during the postpartum period? My answer is no,” says neuroscientist Benedetta Leuner of Ohio State University. “It’s occurring in the context of dramatic hormonal changes, and that has to be impacting the brain in a unique way. It occurs when you have an infant to care for. There’s no other time in a woman’s life when the stakes are quite as high.”
Brain drain

Even though progesterone and estrogen changes create hormonal whiplash, pregnancy wouldn’t be possible without them. Progesterone, largely coming from the ovaries, helps orchestrate a woman’s monthly menstrual cycle. The hormone’s primary job is to help thicken the lining of the uterus so it will warmly welcome a fertilized egg. In months when conception doesn’t happen, progesterone levels fall and the uterine lining disintegrates. If a woman becomes pregnant, the fertilized egg implants in the uterine wall and progesterone production is eventually taken over by the placenta, which acts like an extra endocrine organ.
Like progesterone, estrogen is a normal part of the menstrual cycle that kicks into overdrive after conception. In addition to its usual duties in the female body, estrogen helps encourage the growth of the uterus and fetal development, particularly the formation of the hormone-producing endocrine system.
These surges in estrogen and progesterone, along with other physiological changes, are meant to support the fetus. But the hormones, or chemicals made from them, cross into the mother’s brain, which must constantly adapt. When it doesn’t, signs of trouble can appear even before childbirth, although they are often missed. Despite the name “postpartum,” about half of women who become ill are silently distressed in the later months of pregnancy.
Decades ago, controversy churned over whether postpartum depression was a consequence of fluctuating hormones alone or something else, says neuroscientist Joseph Lonstein of Michigan State University in East Lansing. He studies the neurochemistry of maternal caregiving and postpartum anxiety. Lonstein says many early studies measured hormone levels in women’s blood and tried to determine whether natural fluctuations were associated with the risk of postpartum depression. Those studies found “no clear correlations with [women’s] hormones and their susceptibility to symptoms,” he says. “While the hormone changes are certainly thought to be involved, not all women are equally susceptible. The question then became, what is it about their brains that makes particular women more susceptible?”

Seeking answers, researchers have examined rodent brains and placed women into brain scanners to measure the women’s responses to pictures or videos of babies smiling, babbling or crying. Though hormones likely underlie the condition, many investigations have led to the amygdalae. These two almond-shaped clumps of nerve cells deep in the brain are sometimes referred to as the emotional thermostat for their role in the processing of emotions, particularly fear.
The amygdalae are entangled with many structures that help make mothers feel like mothering, says neuroscientist Alison Fleming of the University of Toronto Mississauga. The amygdalae connect to the striatum, which is involved in experiencing reward, and to the hippocampus, a key player in memory and the body’s stress response. And more: They are wired to the hypothalamus, the interface between the brain and the endocrine system (when you are afraid, the endocrine system produces adrenaline and other chemicals that get your heart racing and palms sweating). The amygdalae are also connected to the prefrontal cortex and insula, involved in decision making, motivation and other functions intertwined with maternal instinct.
Fleming and colleagues have recently moved from studies in postpartum rodents to human mothers. In one investigation, reported in 2012 in Social Neuroscience, women were asked to look at pictures of smiling infants while in a functional MRI, which images brain activity. In mothers who were not depressed, the researchers found a higher amygdala response, more positive feelings and lower stress when women saw their own babies compared with unfamiliar infants.
But an unexpected pattern emerged in mothers with postpartum depression, as the researchers reported in 2016 in Social Neuroscience. While both depressed and not-depressed mothers showed elevated amygdala activity when viewing their own babies, the depressed mothers also showed heightened responses to happy, unknown babies, suggesting reactions to the women’s own children were blunted and not unique. This finding may mean that depressed women had less inclination to emotionally attach to their babies.
Mothers with postpartum depression also showed weaker connectivity between the amygdalae and the insula. Mothers with weaker connectivity in this area had greater symptoms of depression and anxiety. Women with stronger connectivity were more responsive to their newborns.
While there’s still no way to definitively know that the amygdalae are responding to postpartum chemical changes, “it’s very likely,” Lonstein says, pointing out that the amygdalae are influenced by the body’s reaction to hormones in other emotional settings.
Maternal rewards

While important, the amygdalae are just part of the puzzle that seems to underlie postpartum depression. Among others is the nucleus accumbens, famous for its role in the brain’s reward system and in addiction, largely driven by the yin and yang of the neurotransmitters dopamine and serotonin. In studies, mothers who watched films of their infants (as opposed to watching unknown infants) experienced increased production of feel-good dopamine. The women also had a strengthening of the connection between the nucleus accumbens, the amygdalae and other structures, researchers from Harvard Medical School and their collaborators reported in February 2017 in Proceedings of the National Academy of Sciences.
That’s not entirely surprising given that rodent mothers find interacting with their newborn pups as neurologically rewarding as addictive drugs, says Ohio State’s Leuner. Rodent mothers that are separated from their offspring “will press a bar 100 times an hour to get to a pup. They will step across electrified grids to get to their pups. They’ve even been shown in some studies to choose the pups over cocaine.” Mothers find their offspring “highly, highly rewarding,” she says.
When there are postpartum glitches in the brain’s reward system, women may find their babies less satisfying, which could increase the risk for impaired mothering. Writing in 2014 in the European Journal of Neuroscience, Leuner and colleagues reported that in rats with symptoms of postpartum depression (induced by stress during pregnancy, a major risk factor for postpartum depression in women), nerve cells in the nucleus accumbens atrophied and showed fewer protrusions called dendritic spines — suggesting weaker connections to surrounding nerve cells compared with healthy rats. This is in contrast to other forms of depression, which show an increase in dendritic spines. Unpublished follow-up experiments conducted by Leuner’s team also point to a role for oxytocin, a hormone that spikes with the birth of a baby as estrogen and progesterone fall. Sometimes called the “cuddle chemical,” oxytocin is known for its role in maternal bonding (SN Online: 4/16/15). Leuner hypothesizes that maternal depression is associated with deficits in oxytocin receptors that enable the hormone to have its effects as part of the brain’s reward system.
If correct, the idea may help explain why oxytocin treatment failed women in some studies of postpartum depression. The hormone may simply not have the same potency in some women whose brains are short on receptors the chemical can latch on to. The next step is to test whether reversing the oxytocin receptor deficits in rodents’ brains relieves symptoms.
Leuner and other scientists emphasize that the oxytocin story is complex. In 2017, in a study reported in Depression & Anxiety, women without a history of depression who received oxytocin — which is often given to promote contractions or stem bleeding after delivery — had a 32 percent higher likelihood of developing postpartum depression than women who did not receive the hormone. In more than 46,000 births, 5 percent of women who did not receive the hormone were diagnosed with depression, compared with 7 percent who did.
“This was the opposite of what we predicted,” says Kristina Deligiannidis, a neuroscientist and perinatal psychiatrist at the Feinstein Institute for Medical Research in Manhasset, N.Y. After all, oxytocin is supposed to enhance brain circuits involved in mothering. “We had a whole group of statisticians reanalyze the data because we didn’t believe it,” she says. While the explanation is unknown, one theory is that perhaps the women who needed synthetic oxytocin during labor weren’t making enough on their own — and that could be why they are more prone to depression after childbirth.
But postpartum depression can’t be pinned to any single substance or brain malfunction — it doesn’t reside in one tidy nest of brain cells, or any one chemical process gone haywire. Maternal behavior is based on complex neurological circuitry. “Multiple parts of the brain are involved in any single function,” Deligiannidis says. “Just to have this conversation, I’m activating several different parts of my brain.” When any kind of depression occurs, she says, multiple regions of the brain are suffering from a communication breakdown.
Looking further, Deligiannidis has also examined the role of certain steroids synthesized from progesterone and other hormones and known to affect maternal brain circuitry. In a 2016 study in Psychoneuroendocrinology involving 32 new mothers at risk for postpartum depression and 24 healthy mothers, Deligiannidis and colleagues reported that concentrations of some steroids that affect the brain, also called neurosteroids, were higher in women at risk for developing depression (because of their past history or symptoms), compared with women who were not. The higher levels suggest a system out of balance — the brain is making too much of one neurosteroid and not enough of another, called allopregnanolone, which is thought to protect against postpartum depression and is being tested as a treatment.

Beyond mom

Postpartum depression doesn’t weigh down just mom. Research suggests it might have negative effects on her offspring that can last for years. Risks include:

Newborns: higher levels of cortisol and other stress hormones; more time fussing and crying; more “indeterminate sleep,” hovering between deep and active sleep.

Infants and children: increased risk of developmental problems; slower growth; lower cognitive function; elevated cortisol levels.

Adolescents: higher risk of depression.

Treating pregnancy withdrawal

Tufts University neuroscientist Jamie Maguire, based in Boston, got interested in neurosteroids during her postgraduate studies in the lab of Istvan Mody at UCLA. Maguire and Mody reported in 2008 in Neuron that during pregnancy, the hippocampus has fewer receptors for neurosteroids, presumably to protect the brain from the massive levels of progesterone and estrogen circulating at that time. When progesterone drops after birth, the receptors repopulate.
But in mice genetically engineered to lack those receptors, something else happened: The animals were less interested in tending to their offspring, failing to make nests for them.
“We started investigating. Why are these animals having these abnormal postpartum behaviors?” Maguire recalls. Was an inability to recover these receptors making some women susceptible? Interestingly, similar receptors are responsible for the mood-altering and addictive effects of some antianxiety drugs, suggesting that the sudden progesterone drop after childbirth could be leaving some women with a kind of withdrawal effect.
Further experiments demonstrated that giving the mice a progesterone-derived neurosteroid — producing levels close to what the mice had in pregnancy — alleviated the symptoms.
Today, Maguire is on the scientific advisory board of Boston area–based Sage Therapeutics, which is testing a formulation of allopregnanolone called brexanolone. Results of an early clinical trial published last July in The Lancet assessed whether brexanolone would alleviate postpartum symptoms in women with severe postpartum depression. The study involved 21 women randomly assigned to receive a 60-hour infusion of the drug or a placebo within six months after delivery.
At the end of treatment, the women who received the drug reported a 21-point reduction on a standard scale of depression symptoms, compared with about 9 points for the women on a placebo. “These women got better in about a day,” says Deligiannidis, who is on the study’s research team. “The results were astonishing.”
In November, Sage Therapeutics announced the results of two larger studies, although neither has been published. Combined, the trials involved 226 women with severe or moderate postpartum depression. Both trials showed similar improvements that lasted for the month the women were followed. The company has announced plans to request approval from the U.S. Food and Drug Administration to market brexanolone in the United States. This is an important first step, researchers say, toward better treatments.
“We are just touching on one small piece of a bigger puzzle,” says Jodi Pawluski, a neuroscientist at the Université de Rennes 1 in France who coauthored the 2017 review in Trends in Neurosciences. She was surprised at the dearth of research, given how common postpartum depression is. “This is not the end, it’s the beginning.”
Ask a classroom of children to draw a scientist, and you’ll see plenty of Crayola-colored lab coats, goggles and bubbling beakers. That image hasn’t changed much since the 1960s. But the person wearing the lab coat is shifting.
A new analysis finds that more female scientists have appeared in kids’ drawings in recent decades — going from nearly nonexistent in the 1960s to about a third in 2016.
“A lot has changed since the 1960s,” says David Miller, a Ph.D. candidate in psychology at Northwestern University who reports the findings with colleagues March 20 in Child Development. The first of many “draw-a-scientist” studies asked nearly 5,000 children to draw a scientist between 1966 and 1977. “Of those 5,000 drawings,” Miller says, “only 28 … depicted a female scientist.” That’s just 0.6 percent.
Today, “more women are becoming scientists, and there’s some evidence that female scientists are being represented more in the media,” he says. For instance, in a content analysis of the magazine Highlights for Children, 13 percent of people pictured in science feature stories of the 1960s were women or girls, compared with 44 percent in the 2000s. To look for changes in children’s perceptions over time, the researchers conducted a meta-analysis, combining data from 78 studies that included a total of more than 20,000 U.S. children in kindergarten through 12th grade.
On average, 28 percent of children drew female scientists in studies conducted from 1985 to 2016, the researchers found.
What hasn’t changed much: Kids pick up gender stereotypes as they grow up. At age 6, girls in the more recent studies drew female scientists about 70 percent of the time. By age 16, 75 percent drew male scientists.
“This is a critical period in which kids are learning stereotypes,” Miller says. “It’s important that teachers and parents present diverse examples of both male and female scientists.”
Editors’ note: This story was corrected on March 21, 2018, to note that by age 16, girls drew only 25 percent of scientists as female.
Making day-to-day activities more vigorous for a few minutes — such as briefly stepping up the pace of a walk — could offer people who don’t exercise some of the health benefits that exercisers enjoy.
That’s according to a new study of roughly 25,000 adults who reported no exercise in their free time. Those who incorporated three one- to two-minute bursts of intense activity per day saw nearly a 40 percent drop in the risk of death from any cause compared with those whose days didn’t include such activity. The risk of death from cancer also fell by nearly 40 percent, and the risk of death from cardiovascular disease dropped almost 50 percent, researchers report online December 8 in Nature Medicine.
In a comparison with around 62,000 people who exercised regularly, including runners, gym-goers and recreational cyclists, the mortality risk reduction was similar.
“This study adds to other literature showing that even short amounts of activity are beneficial,” says Lisa Cadmus-Bertram, a physical activity epidemiologist at the University of Wisconsin–Madison, who was not involved in the research. “So many people are daunted by feeling that they don’t have the time, money, motivation, transportation, etc. to go to a gym regularly or work out for long periods of time,” she says. “The message we can take is that it is absolutely worth doing what you can.”
Emmanuel Stamatakis, an epidemiologist at the University of Sydney, and his colleagues analyzed a subset of records from the UK Biobank, a biomedical database containing health information on half a million people in the United Kingdom. The study’s non-exercising participants — more than half of whom were women and were 62 years old on average — had worn movement-tracking devices for a week.
Over an average seven years of follow-up, for those whose days included three to four bursts of activity, the mortality rate was 4.2 deaths from any cause per 1,000 people per year. For those with no activity bursts, it was 10.4 deaths per 1,000 people per year.
The researchers were looking for bursts of vigorous activity that met a definition determined in a laboratory study, including reaching at least 77 percent of maximum heart rate and at least 64 percent of maximum oxygen consumption. In real life, the signs that someone has reached the needed intensity level are “an increase in heart rate and feeling out of breath” in the first 15 to 30 seconds of an activity, Stamatakis says.
Regular daily activities offer several opportunities for these vigorous bursts, he says. “The simplest one is maximizing walking pace for a minute or two during any regular walk.” Other options, he says, include carrying grocery bags to the car or taking the stairs. “The largest population health gains will be realized by finding ways to get the least physically active people to move a little more.”
Look closely at a snowflake, and you’ll observe a one-of-a-kind gossamer lattice, its growth influenced by ambient conditions like temperature and humidity. Turns out, this sort of intricate self-assemblage can also occur in metals, researchers report in the Dec. 9 Science.
In pools of molten gallium, physicist Nicola Gaston and colleagues grew zinc nanostructures with symmetrical, hexagonal crystal frameworks. Such metal snowflakes could be useful for catalyzing chemical reactions and constructing electronics, says Gaston, of the MacDiarmid Institute for Advanced Materials and Nanotechnology at the University of Auckland in New Zealand.
“Self-assembly is the way nature makes nanostructures,” she says. “We’re trying to learn to do the same things.” Figuring out how to craft tiny, complex metal shapes in fewer steps and with less energy could be a boon for manufacturers.
The researchers chose gallium as a growth medium, due to its relatively low melting point, ability to dissolve many other metals and the tendency for its atoms to loosely organize while in a liquid state.
After mixing zinc into the gallium, the team subjected the alloy to elevated temperatures and different pressures, and then let the mixture cool to room temperature. The loose ordering of gallium atoms appeared to coax the crystallizing zinc to bloom into symmetrical, hexagonal structures resembling natural snowflakes and other shapes, the team found. It’s somewhat like how a fruit tray imparts order on the fruits stacked within, Gaston says.
The future may be bright for research into applications of gallium and other low-temperature liquid metals. “Not to take that snowflake metaphor too far, but [this work] really hints at new branches for scientific discovery,” Gaston says.
Meet the metric system’s newest prefixes: ronna-, quetta-, ronto- and quecto-.
Adopted November 18 at the 27th General Conference on Weights and Measures in Versailles, France, ronna- and quetta- describe exceedingly large numbers while ronto- and quecto- describe the exceedingly small. This is the first time that the International System of Units, or SI, has expanded since 1991, when the prefixes zetta-, yotta-, zepto- and yocto- were added (SN: 1/16/93).
Numerically, ronna- is 10²⁷ (that’s a 1 followed by 27 zeroes) and quetta- is 10³⁰ (30 zeroes). Their tiny counterparts ronto- (10⁻²⁷) and quecto- (10⁻³⁰) also refer to 27 and 30 zeroes, but those come after a decimal point. Until now, yotta- and yocto- (24 zeroes) capped off the metric system’s range.
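For readers who like to see the arithmetic, the prefixes are simply shorthand for powers of ten. A minimal sketch (the prefix table and function names here are our own, not part of the SI standard) might look like this:

```python
# Powers of ten for the newest SI prefixes, alongside the 1991 additions.
SI_PREFIXES = {
    "yotta": 24, "ronna": 27, "quetta": 30,    # very large
    "yocto": -24, "ronto": -27, "quecto": -30,  # very small
}

def to_base_units(value, prefix):
    """Convert a prefixed value to base units, e.g. ronnagrams to grams."""
    return value * 10 ** SI_PREFIXES[prefix]

print(to_base_units(6, "ronna"))   # 6 ronnagrams, expressed in grams
print(to_base_units(1, "quecto"))  # one quecto-unit in base units
```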
Science News spoke with Richard Brown, head of metrology at the National Physical Laboratory in Teddington, England, about what the latest SI expansion means for science. The following conversation has been edited for clarity and brevity.
SN: Why do we need the new prefixes?
Brown: The quantity of data in the world is increasing exponentially. And we expect that to continue to increase and probably accelerate because of quantum computing, digitalization and things like that. At the same time, this quantity of data is starting to get close to the top range of the prefixes we currently use. People start to ask: What comes next?
SN: Where do the prefix names come from?
Brown: About five years ago, I heard a BBC podcast about these new names for quantities of data. And the two that they mentioned were brontobyte and hellabyte. Brontobyte, I think, comes from brontosaurus being a big dinosaur, and hellabyte comes from “hell of a big number.”
The problem with those from a metrology point of view, or measurement point of view, is they start with letters B and H, which already are in use for other units and prefixes. So we can’t have those as names. [It was clear] that we had to do something official because people were starting to need these prefixes. R and Q are not used for anything else, really, in terms of units or SI prefixes. [The prefix names themselves are] very, very loosely based on the Greek and Latin names for nine and 10.

SN: How will the prefixes be used?
Brown: The whole point of the International System of Units is it’s an accepted global system, which if you use, you will be understood.
When you use a prefix with a unit, it means that the number associated with the unit changes. And people like small numbers that they can understand. So you can express the mass of the Earth in terms of ronnagrams; it’s six ronnagrams. And equally the mass of Jupiter is two quettagrams. Some good examples of [small numbers] are that the mass of an electron is about one rontogram, and the mass of one bit of data as stored on a mobile phone is around one quectogram.
I think the use of a suitable prefix makes things more understandable. And I think we shouldn’t forget that even if there’s not always a direct scientific usage immediately, they will gain traction over time.
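Brown’s examples are easy to sanity-check. The sketch below uses standard reference masses (in kilograms, our own figures rather than values from the interview) and converts them to grams before dividing by the prefix’s power of ten:

```python
# Standard reference masses, in kilograms.
EARTH_KG = 5.972e24
JUPITER_KG = 1.898e27
ELECTRON_KG = 9.109e-31

def in_prefix(grams, exponent):
    """Express a mass in grams as a multiple of 10**exponent grams."""
    return grams / 10 ** exponent

earth_rg = in_prefix(EARTH_KG * 1000, 27)        # ronnagrams
jupiter_qg = in_prefix(JUPITER_KG * 1000, 30)    # quettagrams
electron_rg = in_prefix(ELECTRON_KG * 1000, -27) # rontograms

# Earth is about 6 ronnagrams, Jupiter about 2 quettagrams,
# and an electron roughly 1 rontogram, matching Brown's figures.
print(round(earth_rg), round(jupiter_qg), round(electron_rg, 2))
```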
Tyrannosaurus rex isn’t just a king to paleontologists — the dinosaur increasingly reigns over the world of art auctions. A nearly complete skeleton known as Stan the T. rex smashed records in October 2020 when a bidding war drove its price to $31.8 million, the highest ever paid for any fossil. Before that, Sue the T. rex held the top spot; it went for $8.3 million in 1997.
That kind of publicity — and cachet — means that T. rex’s value is sky-high, and the dinosaur continues to have its teeth firmly sunk into the auction world in 2022. In December, Maximus, a T. rex skull, will be the centerpiece of a Sotheby’s auction in New York City. It’s expected to sell for about $15 million.
Another T. rex fossil named Shen was anticipated to sell for between $15 million and $25 million at a Christie’s auction in Hong Kong in late November. However, the auction house pulled it over concerns about the number of replica bones used in the fossil. “These are astronomical sums of money, really surprising sums of money,” says Donna Yates, a criminologist at Maastricht University in the Netherlands who studies high-value collectibles.
Stan’s final price “was completely unexpected,” Yates says. The fossil was originally appraised at about $6 million — still a very large sum, though nothing like the final tally, which was the result of a three-way bidding war.
But the staggering amounts of money T. rex fossils now fetch at auction can mean a big loss for science. At those prices, the public institutions that might try to claim these glimpses into the deep past are unable to compete with deep-pocketed private buyers, researchers say.
One reason for the sky-high prices may be that T. rex fossils are increasingly being treated more like rare works of art than bits of scientific evidence, Yates says. The bones might once have been bought and sold at dusty “cowboy fossil” dealerships. But nowadays these fossils are on display in shiny gallery spaces and are being appraised and marketed as rare objets d’art. That’s appealing to collectors, she adds: “If you’re a high-value buyer, you’re a person who wants the finest things.”
But fossils’ true value is the information they hold, says Thomas Carr, a paleontologist at Carthage College in Kenosha, Wis. “They are our only means of understanding the biology and evolution of extinct animals.”
Keeping fossils of T. rex and other dinosaurs and animals in public repositories, such as museums, ensures that scientists have consistent access to study the objects, including being able to replicate or reevaluate previous findings. But a fossil sold into private or commercial hands is subject to the whim of its owner — which means anything could happen to it at any time, Carr says. “It doesn’t matter if [a T. rex fossil] is bought by some oligarch in Russia who says scientists can come and study it,” he says. “You might as well take a sledgehammer to it and destroy it.”
A desire for one’s own T. rex

There are only about 120 known specimens of T. rex in the world. At least half of them are owned privately and aren’t available to the public. That loss is “wreaking havoc on our dataset. If we don’t have a good sample size, we can’t claim to know anything about [T. rex],” Carr says.
For example, to be able to tell all the ways that T. rex males differed from females, researchers need between 70 and 100 good specimens for statistically significant analyses, an amount scientists don’t currently have.
Similarly, scientists know little about how T. rex grew, and studying fossils of youngsters could help (SN: 1/6/20). But only a handful of juvenile T. rex specimens are publicly available to researchers. That number would double if private specimens were included.
Museums and academic institutions typically don’t have the kind of money it takes to compete with private bidders in auctions or any such competitive sales. That’s why, in the month before Stan went up for auction in 2020, the Society for Vertebrate Paleontology, or SVP, wrote a letter to Christie’s asking the auction house to consider restricting bidding to public institutions. The hope was that this would give scientists a fighting chance to obtain the specimens.
But the request was ignored — and unfortunately may have only increased publicity for the sale, says Stuart Sumida, a paleontologist at California State University in San Bernardino and SVP’s current vice president. That’s why SVP didn’t issue a public statement this time ahead of the auctions for Shen and Maximus, Sumida says, though the organization continues to strongly condemn fossil sales — whether of large, dramatic specimens or less well-known creatures. “All fossils are data. Our position is that selling fossils is not scientific and it damages science.”
Sumida is particularly appalled at statements made by auction houses that suggest the skeletons “have already been studied,” an attempt to reassure researchers that the data contained in that fossil won’t be lost, regardless of who purchases it. That’s deeply misleading, he says, because of the need for reproducibility, as well as the always-improving development of new analysis techniques. “When they make public statements like that, they are undermining not only paleontology, but the scientific process as well.”
And the high prices earned by Stan and Sue are helping to drive the market skyward, not only for other T. rex fossils but also for less famous species. “It creates this ripple effect that is incredibly damaging to science in general,” Sumida says. Sotheby’s, for example, auctioned off a Gorgosaurus, a T. rex relative, in July for $6.1 million. In May, a Deinonychus antirrhopus — the inspiration for Jurassic Park’s velociraptor — was sold by Christie’s for $12.4 million.
Protecting T. rex from collectors
Compounding the problem is the fact that the United States has no protections in place for fossils unearthed from the backyards or dusty fields of private landowners. The U.S. is home to just about every T. rex skeleton ever found. Stan, Sue and Maximus hail from the Black Hills of South Dakota. Shen was found in Montana.
As of 2009, U.S. law prohibits collecting scientifically valuable fossils, particularly fossils of vertebrate species like T. rex, from public lands without permits. But fossils found on private lands are still considered the landowner’s personal property. And landowners can grant digging access to whomever they wish. Before the discovery of Sue the T. rex (SN: 9/6/14), private owners often gave scientific institutions free access to hunt for fossils on their land, says Bridget Roddy, currently a researcher at the legal news company Bloomberg Law in Washington, D.C. But in the wake of Sue’s sale in 1997, researchers began to have to compete for digging access with commercial fossil hunters.
These hunters can afford to pay landowners large sums for the right to dig, or even a share of the profits from fossil sales. And many of these commercial dealers sell their finds at auction houses, where the fossils can earn far more than most museums are able to pay.
Lack of federal protections for paleontological resources found on private land — combined with the large available supply of fossils — is a situation unique to the United States, Roddy says. Fossil-rich countries such as China, Canada, Italy and France consider any such finds to be under government protection, part of a national legacy.
In the United States, seizing such materials from private landowners — under an eminent domain argument — would require the government to pay “just compensation” to the landowners. But using eminent domain to generally protect such fossils wouldn’t be financially sustainable for the government, Roddy says, not least because most fossils dug up aren’t of great scientific value anyway.
There may be other, more grassroots ways to at least better regulate fossil sales, she says. While still a law student at DePaul University in Chicago, Roddy outlined some of those ideas in an article published in Texas A&M Journal of Property Law in May.
One option, she suggests, is for states to create a selective sales tax attached to fossil purchases, specifically for buyers who intend to keep their purchases in private collections that are not readily available to the public. It’s “similar to if you want to buy a pack of cigarettes, which is meant to offset the harm that buying cigarettes does to society in general,” Roddy says. That strategy could be particularly effective in states with large auction houses, like New York.
Another possibility is to model any new, expanded fossil preservation laws on existing U.S. antiquities laws, intended to preserve cultural heritage. After all, Roddy says, fossils aren’t just bones; they’re also part of the human story. “Fossils have influenced our folklore; they’re a unifier of humanity and culture rather than a separate thing.”
Though fossils from private lands aren’t protected, many states do impose restrictions on searches for archaeological and cultural artifacts, by requiring those looking for antiquities to restore excavated land or by fining the excavation of certain antiquities without state permission. Expanding those restrictions to fossil hunting, perhaps by requiring state approval through permits, could also give states the opportunity to purchase any significant finds before they’re lost to private buyers.
Preserving fossils for science and the public
Such protections could be a huge boon to paleontologists, who may not even know what’s being lost. “The problem is, we’ll never know” all the fossils that are being sold, Sumida says. “They’re shutting scientists out of the conversation.”
And when it comes to dinosaurs, “so many of the species we know about are represented by a single fossil,” says Stephen Brusatte, a paleontologist at the University of Edinburgh. “If that fossil was never found, or disappeared into the vault of a collector, then we wouldn’t know about that dinosaur.”
Or, he says, sometimes a particularly complete or beautifully preserved dinosaur skeleton is found, and without it, “we wouldn’t be able to study what that dinosaur looked like, how it moved, what it ate, how it sensed its world, how it grew.”
The point isn’t to put restrictions on collecting fossils so much as to make sure they remain in public view, Brusatte adds. “There’s nothing as magical as finding your own fossils, being the first person ever to see something that lived millions of years ago.” But, he says, unique and scientifically invaluable fossils such as dinosaur skeletons should be placed in museums “where they can be conserved and studied and inspire the public, rather than in the basements or yachts of the oligarch class.”
After its record-breaking sale, Stan vanished for a year and a half, its new owners a mystery. Then in March 2022, news surfaced that the fossil had been bought by the United Arab Emirates, which stated it intends to place Stan in a new natural history museum.
Sue, too, is on public view. The fossil is housed at Chicago’s Field Museum of Natural History, thanks to the pooled financial resources of the Walt Disney Company, the McDonald’s Corporation, the California State University system and others. That’s the kind of money it took to get the highest bid on a T. rex 25 years ago.
And those prices only seem to be going up. Researchers got lucky with Sue, and possibly Stan.
As for Shen, the fossil’s fate remains in limbo: It was pulled from auction not due to outcry from paleontologists, but over concerns about intellectual property rights. The fossil, at 54 percent complete, may have been supplemented with a polyurethane cast of bones from Stan, according to representatives of the Black Hills Institute of Geological Research in Hill City, S.D. That organization, which discovered Stan, retains a copyright over the skeleton.
In response to those concerns, Christie’s pulled the lot, and now says that it intends to loan the fossil to a museum. But this move doesn’t reassure paleontologists. “A lot of people are pleased that the sale didn’t go through,” Sumida says. “But it sort of just kicks the can down the road.… It doesn’t mean they’re not going to try and sell it in another form, somewhere down the road.”
Ultimately, scientists simply can’t count on every important fossil finding its way to the public, Carr says. “Those fossils belong in a museum; it’s right out of Indiana Jones,” he says. “It’s not like they’re made in a factory somewhere. Fossils are nonrenewable resources. Once Shen is gone, it’s gone.”
The infant universe transforms from a featureless landscape to an intricate web in a new supercomputer simulation of the cosmos’s formative years.
An animation from the simulation shows our universe changing from a smooth, cold gas cloud to the lumpy scattering of galaxies and stars that we see today. It’s the most complete, detailed and accurate reproduction of the universe’s evolution yet produced, researchers report in the November Monthly Notices of the Royal Astronomical Society.
This virtual glimpse into the cosmos’s past is the result of CoDaIII, the third iteration of the Cosmic Dawn Project, which traces the history of the universe, beginning with the “cosmic dark ages” about 10 million years after the Big Bang. At that point, hot gas produced at the very beginning of time, about 13.8 billion years ago, had cooled to a featureless cloud devoid of light, says astronomer Paul Shapiro of the University of Texas at Austin. Roughly 100 million years later, tiny ripples left over from the Big Bang caused the gas to clump together (SN: 2/19/15). This led to long, threadlike strands that formed a web of matter where galaxies and stars were born.
As radiation from the early galaxies illuminated the universe, it ripped electrons from atoms in the once-cold gas clouds during a period called the epoch of reionization, which continued until about 700 million years after the Big Bang (SN: 2/6/17).
CoDaIII is the first simulation to fully account for the complicated interaction between radiation and the flow of matter in the universe, Shapiro says. It spans the time from the cosmic dark ages through the next several billion years, as the distribution of matter in the modern universe formed.
The animation from the simulation, Shapiro says, graphically shows how the structure of the early universe is “imprinted on the galaxies today, which remember their youth, or their birth or their ancestors from the epoch of reionization.”
An ancient hominid dubbed Homo naledi may have lit controlled fires in the pitch-dark chambers of an underground cave system, new discoveries hint.
Researchers have found remnants of small fireplaces and sooty wall and ceiling smudges in passages and chambers throughout South Africa’s Rising Star cave complex, paleoanthropologist Lee Berger announced in a December 1 lecture hosted by the Carnegie Institution for Science in Washington, D.C.
“Signs of fire use are everywhere in this cave system,” said Berger, of the University of the Witwatersrand, Johannesburg.
H. naledi presumably lit the blazes in the caves, since no remains of other hominids have turned up there, the team says. But the researchers have yet to date the fire remains. And researchers outside Berger’s group have yet to evaluate the new finds.
H. naledi fossils date to between 335,000 and 236,000 years ago (SN: 5/9/17), around the time Homo sapiens originated (SN: 6/7/17). Many researchers suspect that regular use of fire by hominids for light, warmth and cooking began roughly 400,000 years ago (SN: 4/2/12).
Such behavior has not been attributed to H. naledi before, largely because of its small brain. But it’s now clear that a brain roughly one-third the size of human brains today still enabled H. naledi to achieve control of fire, Berger contends.
Last August, Berger climbed down a narrow shaft and examined two underground chambers where H. naledi fossils had been found. He noticed stalactites and thin rock sheets that had partly grown over older ceiling surfaces. Those surfaces displayed blackened, burned areas and were also dotted by what appeared to be soot particles, Berger said.
Meanwhile, expedition codirector and Wits paleoanthropologist Keneiloe Molopyane led excavations of a nearby cave chamber. There, the researchers uncovered two small fireplaces containing charred bits of wood, and burned bones of antelopes and other animals. Remains of a fireplace and nearby burned animal bones were then discovered in a more remote cave chamber where H. naledi fossils have been found.
Still, the main challenge for investigators will be to date the burned wood and bones and other fire remains from the Rising Star chambers and demonstrate that the fireplaces there come from the same sediment layers as H. naledi fossils, says paleoanthropologist W. Andrew Barr of George Washington University in Washington, D.C., who wasn’t involved in the work.
“That’s an absolutely critical first step before it will be possible to speculate about who may have made fires for what reason,” Barr says.
Josep Cornella doesn’t deal in absolutes. While chemists typically draw rigid lines between organic and inorganic chemistry, Cornella, a researcher at Max-Planck-Institut für Kohlenforschung in Mülheim an der Ruhr, Germany, believes in just the opposite.
“You have to be open to cross boundaries,” he says, “and learn from it.” The fringes are “where the rich new things are.”
Cornella is an organic chemist by industry standards; he synthesizes molecules that contain carbon. But he’s put together a team from a wide range of backgrounds: inorganic chemists, physical organic chemists, computational chemists. Together, the team brainstorms novel approaches to designing new catalysts, so that chemical reactions essential to pharmaceuticals and agriculture can be made more efficient and friendly for the environment. Along the way, Cornella has unlocked mysteries that stumped chemists for years.
“He has told us about catalysts … that we didn’t have before, and which were just pipe dreams,” says Hosea Nelson, a chemist at Caltech who has not worked with Cornella.
Bold idea
When Cornella heard a speaker at a 2014 conference say that bismuth was nontoxic, he was sure it was a mistake. Bismuth is a heavy metal that sits between toxic lead and polonium on the periodic table. But it is indeed relatively nontoxic — it’s even used in the over-the-counter nausea medicine Pepto-Bismol.
Still, bismuth remains poorly understood. That’s one reason it attracted him. “It was a rather forgotten element of the periodic table,” Cornella says. But, “it’s there for a reason.”
Cornella started wondering if an element like bismuth could be trained for use as a catalyst. For the last century, scientists have been using transition metals, like palladium and iron, as the main catalysts in industrial synthesis. “Could we actually train [bismuth] to do what these guys do so well?” he asked. It was a conceptual question that “was completely naïve, or maybe stupid.”
Far from stupid: His team successfully used bismuth as a catalyst to make a carbon-fluorine bond. And bismuth didn’t just mimic a transition metal’s role — it worked better. Only a small amount of bismuth was required, much less than the amount of transition metal needed to complete the same task.
“A lot of people, including myself and other [researchers] around the world, have spent a lot of time thinking about how to make bismuth reactions catalytic,” Nelson says. “He’s the guy who cracked that nut.”
Standout research
While the bismuth research is “weird” and “exciting,” Cornella says, it remains a proof of concept. Bismuth, though cheap, is not as abundant as he had hoped, so it’s not a very sustainable option for industry.
But other Cornella team findings are already being used in the real world. In 2019, the group figured out how to make an alternative to Ni(COD)2, a finicky catalyst commonly used by chemists in the lab. If it’s not kept at freezing temperatures and protected from oxygen by a layer of inert gases, the nickel complex falls apart.
The alternative complex, developed by Lukas Nattmann, a Ph.D. student in Cornella’s lab at the time, stays stable in oxygen at room temperature. It’s a game changer: It saves energy and materials, and it’s universal. “You can basically take all those reactions that were developed for 60 years of Ni(COD)2 and basically replace all of them with our catalyst, and it works just fine,” Cornella says.
Cornella’s lab is also developing new reagents, substances that transform one material into another. The researchers are looking to transform atoms in functional groups — specific groupings of atoms that behave in specific ways regardless of the molecules they are found in — into other atoms in a single step. Doing these reactions in one step could cut preparation time from two weeks to a day, which would be very useful in the pharmaceutical industry.
Taking risks
It’s the success that gets attention, but failure is “our daily basis,” Cornella says. “It’s a lot of failure.” As a student, when he couldn’t get a reaction to work, he’d set up a simple reaction called a TBS protection — the kind of reaction that’s impossible to get wrong — to remind himself that he wasn’t “completely useless.”
Today he runs a lab that champions taking risks. He encourages students to learn from one another about areas they know nothing about. For instance, a pure organic chemist could come into Cornella’s lab and leave with a good understanding of organometallic chemistry after spending long days working alongside a colleague who is an expert in that area.
To Cornella, this sharing of knowledge is crucial. “If you tackle a problem from just one unique perspective,” he says, “maybe you’re missing some stuff.”
While Cornella might not like absolutes, Phil Baran, who advised Cornella during his postdoctoral work at Scripps Research in San Diego, sees Cornella as fitting into one of two distinct categories: “There are chemists who do chemistry in order to eat, like it’s a job. And there are chemists who eat in order to do chemistry,” Baran says. Cornella fits into the latter group. “It’s his oxygen.”